EP1658755A1 - Tonquelle-raumklangssystem - Google Patents

Tonquelle-raumklangssystem

Info

Publication number
EP1658755A1
Authority
EP
European Patent Office
Prior art keywords
sound
module
spatialization
source
transfer functions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP03748189A
Other languages
English (en)
French (fr)
Other versions
EP1658755B1 (de)
Inventor
Gérard Reynaud (c/o THALES Intellectual Property)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Thales SA
Original Assignee
Thales SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thales SA filed Critical Thales SA
Publication of EP1658755A1
Application granted
Publication of EP1658755B1
Anticipated expiration
Legal status: Expired - Lifetime

Links

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04S: STEREOPHONIC SYSTEMS
    • H04S1/00: Two-channel systems
    • H04S1/007: Two-channel systems in which the audio signals are in digital form
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04S: STEREOPHONIC SYSTEMS
    • H04S7/00: Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/30: Control circuits for electronic adaptation of the sound field
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04S: STEREOPHONIC SYSTEMS
    • H04S7/00: Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/30: Control circuits for electronic adaptation of the sound field
    • H04S7/302: Electronic adaptation of stereophonic sound system to listener position or orientation
    • H04S7/303: Tracking of listener position or orientation
    • H04S7/304: For headphones
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04S: STEREOPHONIC SYSTEMS
    • H04S2400/00: Details of stereophonic systems covered by H04S but not provided for in its groups
    • H04S2400/11: Positioning of individual sound objects, e.g. moving airplane, within a sound field
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04S: STEREOPHONIC SYSTEMS
    • H04S2420/00: Techniques used in stereophonic systems covered by H04S but not provided for in its groups
    • H04S2420/01: Enhancing the perception of the sound image or of the spatial distribution using head related transfer functions [HRTF's] or equivalents thereof, e.g. interaural time difference [ITD] or interaural level difference [ILD]

Definitions

  • The present invention relates to a sound source spatialization system with improved performance, allowing in particular the production of a spatialization system compatible with modular avionic data processing equipment of the IMA type ("Integrated Modular Avionics"), also called EMTI (Modular Information Processing Equipment).
  • 3D sound follows the same approach as the helmet-mounted visual display, by allowing pilots to acquire spatial situation information (position of team members, threats, etc.) in their own frame of reference, through a communication channel other than the visual one, following a natural modality.
  • 3D sound enriches the transmitted signal with situation information, static or dynamic, in space. Beyond locating teammates or threats, its use can cover other applications such as multi-speaker intelligibility.
  • The system described in the above-mentioned application notably includes, for each source to be spatialized, a binaural processor with two convolution channels whose role is, on the one hand, to calculate by interpolation the head transfer functions (left/right) at the point at which the sound source is to be placed and, on the other hand, to create the spatialized signal on two channels from the original monophonic signal.
  • The aim of the present invention is to define a spatialization system with improved performance, such that it can be integrated into avionics modular information processing equipment (EMTI), which imposes constraints in particular on the number of processors and their type.
  • The invention proposes a spatialization system in which it is no longer necessary to perform an interpolation calculation of the head transfer functions. The convolution operations that create the spatialized signals can then be carried out by a single computer, instead of the n binaural processors required in the prior-art system to spatialize n sources.
  • The invention relates to a system for spatializing at least one sound source, creating for each source two spatialized monophonic channels intended to be received by a listener, comprising:
  • a filter database comprising a set of head transfer functions specific to the listener,
  • a data presentation processor receiving the information from each source and comprising in particular a module for calculating the relative positions of the sources with respect to the listener,
  • wherein said data presentation processor comprises a module for selecting head transfer functions with a variable resolution adapted to the relative position of the source with respect to the listener.
  • The use of head transfer function bases whose resolution is adapted to the precision required for a given piece of information to be spatialized (threat, position of a drone, etc.), combined with optimal use of the spatial information contained in each of the positions of these bases, makes it possible to considerably reduce the number of operations to be carried out for spatialization without degrading performance.
  • FIG. 1, a general diagram of a spatialization system according to the invention;
  • FIG. 2, a block diagram of an exemplary embodiment of the system according to the invention;
  • FIG. 3, a diagram of the calculation unit of a spatialization system according to the example of FIG. 2;
  • FIG. 4, a layout diagram of the system according to the invention in modular avionics equipment of the IMA type.
  • The invention is described below with reference to an aircraft audio system, in particular that of a combat aircraft, but it is understood that it is not limited to such an application and that it can be used in other types of vehicles (land or sea) as well as in fixed installations.
  • The user of this system is, in this case, the pilot of an airplane, but there can be several users simultaneously, in particular in a civil transport airplane, the devices specific to each user then being provided in sufficient number.
  • FIG. 1 represents a general diagram of a system for spatializing sound sources according to the invention, the role of which is to make a listener hear sound signals (tones, words, alarms, etc.) through stereophonic headphones so that they are perceived as if they came from a particular point in space, this point being either the effective position of the sound source or an arbitrary position.
  • For example, the detection of a missile by a countermeasure device may generate a sound whose apparent origin coincides with the direction of the attack, allowing the pilot to react more quickly.
  • The system according to the invention mainly comprises a data presentation processor CPU1 and a calculation unit CPU2 generating the spatialized monophonic channels.
  • The data presentation processor CPU1 notably comprises a module 101 for calculating the relative positions of the sources with respect to the listener, that is to say in the listener's head frame of reference. These positions are for example calculated from information received from an attitude detector 11 on the listener's head and from a module 12 for determining the position of the source to be rendered (this module can include an inertial unit, a tracking device such as a goniometer, a radar, etc.).
  • The processor CPU1 is connected to a "filter" database 13 comprising a set of head transfer functions (HRTF) specific to the listener.
  • The head transfer functions are for example acquired during a prior learning phase. They account for the listener's interaural delay (the difference in the arrival time of sound between the two ears) and for the morphological characteristics of each listener. It is these transfer functions that give the listener the sensation of spatialization.
  • The calculation unit CPU2 generates the spatialized left and right monophonic channels by convolving each monophonic sound signal characteristic of the source to be spatialized, contained in the "sounds" database 14, with the head transfer functions of said database 13 estimated at the position of the source in the head frame of reference.
  • In the systems according to the prior art, the calculation unit comprises as many processors as there are sound sources to be spatialized. Indeed, these systems must carry out a spatial interpolation of the head transfer functions in order to know the transfer functions at the point at which the source will be placed.
  • This architecture requires multiplying the number of processors in the computing unit, which is incompatible with a modular spatialization system for integration into modular avionics information processing equipment.
  • The spatialization system according to the invention has a specific algorithmic architecture which makes it possible in particular to reduce the number of processors in the calculation unit.
  • The applicant has shown that the calculation unit CPU2 can then be produced by means of a programmable logic component of the EPLD type (Erasable Programmable Logic Device).
  • The data presentation processor of the system according to the invention comprises a module 102 for selecting head transfer functions with a variable resolution adapted to the relative position of the source with respect to the listener (i.e. the position of the source in the head frame of reference). Thanks to this selection module, it is no longer necessary to carry out interpolation calculations to estimate the transfer functions at the location where the sound source must be placed. It is therefore possible to considerably simplify the architecture of the calculation unit, an exemplary embodiment of which is described later.
  • Since the selection module selects the resolution of the transfer functions as a function of the relative position of the sound source with respect to the listener, it is possible to work with a database 13 of head transfer functions comprising a large number of functions distributed regularly over the whole of space, knowing that only a subset of these will be selected to perform the convolution calculations (a minimal sketch of such a nearest-position selection is given after this section).
  • For example, the applicant worked with a database in which the transfer functions are collected with a step of 7° in azimuth, from 0° to 360°, and with a step of 10° in elevation, from −70° to +90°.
  • The applicant has shown that, thanks to the resolution selection module 102 of the system according to the invention, it is possible to limit the number of coefficients of each head transfer function used to 40 (compared with 128 or 256 in most prior-art systems) without degrading the sound spatialization results, which further reduces the computing power required for the spatialization function (see the convolution sketch after this section).
  • The calculation unit CPU2 can thus be reduced, for example, to a single component of the EPLD type, even when several sources have to be spatialized, which makes it possible to dispense with the dialogue protocols between the different binaural processors needed to spatialize several sound sources in prior-art systems.
  • FIG. 2 represents a functional diagram of an exemplary embodiment of the system according to the invention.
  • The spatialization system comprises a data presentation processor CPU1 receiving the information from each source and a calculation unit CPU2 generating the spatialized left and right monophonic channels.
  • The processor CPU1 notably comprises the module 101 for calculating the relative position of a sound source in the listener's head frame of reference, this module receiving in real time information on the head attitude (listener position) and on the position of the source to be rendered, as described above.
  • The module 102 for selecting the resolution of the HRTF transfer functions contained in the database 13 makes it possible to select, for each source to be spatialized and as a function of the relative position of the source, the transfer functions which will be used for the generation of the spatialized sounds.
  • A sound selection module 103 connected to the sound database 14 makes it possible to select the monophonic signal from the database which will be sent to the calculation unit CPU2 to be convolved with the selected left and right head transfer functions.
  • The sound selection module 103 establishes a hierarchy between the sound sources to be spatialized. Depending on system events and on the choices of the platform management logic, a choice of concomitant sounds to be spatialized is made. All of the information used to define this spatial presentation priority logic travels on the EMTI broadband bus.
  • The sound selection module 103 is for example connected to a configuration and parameterization module 104 in which personalization criteria specific to the listener are recorded.
  • The data concerning the choice of the HRTF transfer functions as well as the sounds to be spatialized are sent to the calculation unit CPU2 by means of a communication link 15. They are temporarily stored in a filter and digital-sound memory 201.
  • The part of the memory containing the digital sounds called "earcons" (the name given to sounds used as alarms or alerts and having a strong significance) is for example loaded at initialization. It contains the samples of the audio signals previously digitized in the sound database 14.
  • The spatialization of one or more of these signals can be activated or suspended. As long as the activation persists, the signal concerned is read in a loop.
  • The convolution calculations are performed by a computer 202, for example a component of the EPLD type, which generates the spatialized sounds as described previously.
  • A processor interface 203 constitutes a memory used for the filtering operations. It is composed of buffer registers for the sounds, the HRTF filters, as well as the coefficients used for other functions such as soft switching and the simulation of atmospheric absorption, which are described later.
  • FIG. 3 represents the diagram of a calculation unit of a spatialization system according to the example of FIG. 2.
  • The spatialization system comprises an audio input/output conditioning module 16 which recovers the spatialized left and right monophonic channels at the output in order to format them before sending them to the listener.
  • If "live" communications must be spatialized, these communications are shaped by the conditioning module with a view to their spatialization by the computer 202 of the calculation unit.
  • By default, a sound coming from a so-called live source always has priority over the sounds to be spatialized.
  • The processor interface 203 constitutes a short-term memory for all the parameters used.
  • The computer 202 constitutes the heart of the calculation unit. In the example of FIG. 3, it includes a module 204 for activating and selecting sources, which performs the mixing function between the live inputs and the earcon-type sounds.
  • The computer 202 can perform the calculation functions for the n sources to be spatialized; in the example of FIG. 3, four sound sources can be spatialized.
  • It includes a double spatialization module 205, which receives the adapted transfer functions and performs the convolution with the monophonic signal to be spatialized. This convolution is carried out in the time domain, using the shifting capability of the FIR filters (finite impulse response filters) associated with the interaural delays.
  • The computer also includes a soft switching module 206, connected to a calculation parameter register 207 which optimizes the choice of the transition parameters as a function of the speed of movement of the source and of the listener's head.
  • The soft switching module allows a transition, without audible switching noise, when switching from one pair of filters to the next (see the crossfade sketch after this section).
  • This function is performed by a double linear weighting ramp. It involves a double convolution: each sample of each output channel results from the weighted sum of two samples, each obtained by convolving the input signal with a spatialization filter taken from the HRTF base. At a given instant, there are therefore two pairs of spatialization filters in input memory per channel to be processed.
  • A module 208 for simulating atmospheric absorption is also provided.
  • This function is for example performed by a linear filtering with 30 coefficients and a gain, applied to each channel (left, right) of each source after the spatialization processing. It allows the listener to perceive the depth effect necessary for his operational decision (see the absorption and mixing sketch after this section).
  • Dynamic weighting modules 209 and a summation module 210 are provided to perform the weighted sum of the channels of each source so as to provide a single stereophonic signal compatible with the output dynamic range. The only constraint associated with this stereophonic reproduction is linked to the bandwidth necessary for sound spatialization (typically 20 kHz).
  • FIG. 4 shows the hardware architecture of modular avionics information processing equipment 40 of the EMTI type. It includes a broadband bus 41 to which all the equipment functions are connected, including in particular the sound spatialization system 42 according to the invention as described above, the other man-machine interface functions 43, such as, for example, voice control, head-up symbology management and helmet-mounted display, and a system management card 44 which acts as an interface with the other equipment of the aircraft.
  • The sound spatialization system 42 according to the invention is connected to the high-speed bus via the data presentation processor CPU1. It also comprises the calculation unit CPU2, as described above and formed for example of an EPLD component, compatible with the technical requirements of the EMTI (number and type of operations, memory space, coding of the audio samples, digital bit rate).
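
The sketches below illustrate, in plain Python/NumPy, the processing steps referred to above. They are illustrative reconstructions only: all function and variable names (select_hrtf, spatialize, soft_switch, absorption_filter, mix) are hypothetical, the filter coefficients are random placeholders, and the real system runs on an EPLD rather than in floating-point Python. First, the variable-resolution selection performed by module 102: with a database gridded at 7° in azimuth and 10° in elevation, the filter pair nearest to the requested direction is simply looked up, so no interpolation is needed; a coarser resolution can be requested for information that needs less angular precision.

    import numpy as np

    # Hypothetical HRTF database: one 40-coefficient FIR per ear, stored
    # every 7 degrees in azimuth (0..360) and every 10 degrees in
    # elevation (-70..+90), as in the example of the description.
    AZ_STEP, EL_STEP = 7.0, 10.0
    AZIMUTHS = np.arange(0.0, 360.0, AZ_STEP)       # stored azimuth positions
    ELEVATIONS = np.arange(-70.0, 91.0, EL_STEP)    # stored elevation positions
    N_COEFS = 40

    rng = np.random.default_rng(0)
    hrtf_left = rng.standard_normal((len(AZIMUTHS), len(ELEVATIONS), N_COEFS))   # placeholder filters
    hrtf_right = rng.standard_normal((len(AZIMUTHS), len(ELEVATIONS), N_COEFS))
    itd_table = rng.integers(-30, 31, size=(len(AZIMUTHS), len(ELEVATIONS)))     # placeholder interaural delays

    def select_hrtf(azimuth_deg, elevation_deg, az_resolution_deg=AZ_STEP):
        """Return the stored (left, right, itd) nearest to the requested direction.

        az_resolution_deg may be a multiple of the database step: the requested
        azimuth is first snapped to that coarser grid, which is the 'variable
        resolution' selection - a plain lookup, with no interpolation.
        """
        az = (round(azimuth_deg / az_resolution_deg) * az_resolution_deg) % 360.0
        d_az = np.abs((AZIMUTHS - az + 180.0) % 360.0 - 180.0)   # circular distance
        ia = int(np.argmin(d_az))
        ie = int(np.argmin(np.abs(ELEVATIONS - elevation_deg)))
        return hrtf_left[ia, ie], hrtf_right[ia, ie], int(itd_table[ia, ie])

    left_fir, right_fir, itd = select_hrtf(azimuth_deg=93.0, elevation_deg=12.0)
    print(left_fir.shape, right_fir.shape, itd)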
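
Next, the binaural convolution performed per source by the double spatialization module 205: two 40-tap time-domain FIR filterings of the monophonic signal, plus an integer interaural delay applied to the lagging ear. The sign convention and the delay handling shown here are assumptions; the patent only states that the shifting capability of the FIR filters is used for the interaural delays.

    import numpy as np

    def spatialize(mono, left_fir, right_fir, itd_samples):
        """Convolve one monophonic source with a left/right HRTF pair.

        Two 40-tap time-domain FIR filterings plus an integer interaural
        delay applied to the ear farther from the source (assumed convention).
        """
        left = np.convolve(mono, left_fir)[:len(mono)]
        right = np.convolve(mono, right_fir)[:len(mono)]
        if itd_samples > 0:        # assumed: positive delay lags the right ear
            right = np.concatenate([np.zeros(itd_samples), right[:-itd_samples]])
        elif itd_samples < 0:      # negative delay lags the left ear
            d = -itd_samples
            left = np.concatenate([np.zeros(d), left[:-d]])
        return left, right

    fs = 44100
    t = np.arange(fs) / fs
    earcon = 0.5 * np.sin(2.0 * np.pi * 440.0 * t)   # 1 s test tone standing in for an "earcon"
    rng = np.random.default_rng(1)
    left_fir = rng.standard_normal(40)               # placeholder HRTF coefficients
    right_fir = rng.standard_normal(40)
    left_out, right_out = spatialize(earcon, left_fir, right_fir, itd_samples=12)
    print(left_out.shape, right_out.shape)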
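
The soft switching of module 206 can be pictured as a double convolution weighted by a double linear ramp: during the transition, each output sample is the weighted sum of a sample filtered with the old pair and a sample filtered with the new pair, which removes audible clicks when the selected filter pair changes. The ramp length used below is an arbitrary choice, not a value from the patent.

    import numpy as np

    def soft_switch(mono, old_fir, new_fir, ramp_len):
        """Crossfade the output from one spatialization filter to the next.

        Each output sample is (1 - w)*old + w*new, where w ramps linearly
        from 0 to 1 over ramp_len samples. The real system does this per
        ear; a single channel is shown here.
        """
        y_old = np.convolve(mono, old_fir)[:len(mono)]
        y_new = np.convolve(mono, new_fir)[:len(mono)]
        w = np.clip(np.arange(len(mono)) / float(ramp_len), 0.0, 1.0)   # 0 -> 1 ramp
        return (1.0 - w) * y_old + w * y_new

    rng = np.random.default_rng(2)
    signal = rng.standard_normal(4096)
    old_fir = rng.standard_normal(40)     # filter before the head/source moved
    new_fir = rng.standard_normal(40)     # filter after the move
    out = soft_switch(signal, old_fir, new_fir, ramp_len=512)
    print(out.shape)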
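
Finally, a sketch of the atmospheric-absorption filtering (module 208) and of the weighted summation (modules 209 and 210). The description specifies a 30-coefficient linear filter plus a gain per channel but gives neither the coefficients nor the gain law, so a distance-dependent windowed-sinc low-pass and a simple 1/(1 + d/100) gain are used purely as stand-ins.

    import numpy as np

    N_ABS_COEFS = 30     # the description mentions a 30-coefficient linear filter

    def absorption_filter(distance_m, fs=44100.0):
        """Toy 30-tap low-pass whose cutoff falls with distance (stand-in only)."""
        cutoff_hz = max(500.0, 16000.0 / (1.0 + distance_m / 100.0))
        n = np.arange(N_ABS_COEFS) - (N_ABS_COEFS - 1) / 2.0
        h = np.sinc(2.0 * cutoff_hz / fs * n) * np.hamming(N_ABS_COEFS)
        return h / np.sum(h)                      # unity gain at DC

    def mix(stereo_sources, weights):
        """Weighted sum of per-source stereo pairs into one stereo output."""
        left = sum(w * s[0] for w, s in zip(weights, stereo_sources))
        right = sum(w * s[1] for w, s in zip(weights, stereo_sources))
        peak = max(np.max(np.abs(left)), np.max(np.abs(right)), 1e-9)
        return left / peak, right / peak          # keep within the output dynamics

    rng = np.random.default_rng(3)
    fs = 44100
    sources = []
    for distance in (50.0, 400.0):                # two sources at different ranges
        s = rng.standard_normal(fs)               # 1 s of noise per spatialized source
        h = absorption_filter(distance, fs)
        gain = 1.0 / (1.0 + distance / 100.0)     # assumed distance-dependent gain
        filtered = gain * np.convolve(s, h)[:len(s)]
        sources.append((filtered, filtered.copy()))   # same signal on both ears for simplicity
    out_left, out_right = mix(sources, weights=[1.0, 1.0])
    print(out_left.shape, out_right.shape)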

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Stereophonic System (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
  • Holography (AREA)
  • Surface Acoustic Wave Elements And Circuit Networks Thereof (AREA)
EP03748189A 2002-07-02 2003-06-27 Tonquelle-raumklangssystem Expired - Lifetime EP1658755B1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR0208265A FR2842064B1 (fr) 2002-07-02 2002-07-02 Systeme de spatialisation de sources sonores a performances ameliorees
PCT/FR2003/001998 WO2004006624A1 (fr) 2002-07-02 2003-06-27 Systeme de spatialisation de sources sonores

Publications (2)

Publication Number Publication Date
EP1658755A1 true EP1658755A1 (de) 2006-05-24
EP1658755B1 EP1658755B1 (de) 2008-03-19

Family

ID=29725087

Family Applications (1)

Application Number Title Priority Date Filing Date
EP03748189A Expired - Lifetime EP1658755B1 (de) 2002-07-02 2003-06-27 Tonquelle-raumklangssystem

Country Status (10)

Country Link
US (1) US20050271212A1 (de)
EP (1) EP1658755B1 (de)
AT (1) ATE390029T1 (de)
AU (1) AU2003267499C1 (de)
CA (1) CA2490501A1 (de)
DE (1) DE60319886T2 (de)
ES (1) ES2302936T3 (de)
FR (1) FR2842064B1 (de)
IL (1) IL165911A (de)
WO (1) WO2004006624A1 (de)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2865096B1 (fr) * 2004-01-13 2007-12-28 Cabasse Systeme acoustique pour vehicule et dispositif correspondant
JP2006180467A (ja) * 2004-11-24 2006-07-06 Matsushita Electric Ind Co Ltd 音像定位装置
EP1855474A1 (de) * 2006-05-12 2007-11-14 Sony Deutschland Gmbh Verfahren zur Erzeugung eines interpolierten Bildes zwischen zwei Bildern einer Eingangsbildsequenz
DE102006027673A1 (de) 2006-06-14 2007-12-20 Friedrich-Alexander-Universität Erlangen-Nürnberg Signaltrenner, Verfahren zum Bestimmen von Ausgangssignalen basierend auf Mikrophonsignalen und Computerprogramm
US9031242B2 (en) 2007-11-06 2015-05-12 Starkey Laboratories, Inc. Simulated surround sound hearing aid fitting system
EP2255359B1 (de) * 2008-03-20 2015-07-15 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Vorrichtung und verfahren zur akustischen anzeige
FR2938396A1 (fr) * 2008-11-07 2010-05-14 Thales Sa Procede et systeme de spatialisation du son par mouvement dynamique de la source
US9264812B2 (en) * 2012-06-15 2016-02-16 Kabushiki Kaisha Toshiba Apparatus and method for localizing a sound image, and a non-transitory computer readable medium
GB2574946B (en) * 2015-10-08 2020-04-22 Facebook Inc Binaural synthesis
GB2544458B (en) 2015-10-08 2019-10-02 Facebook Inc Binaural synthesis
US10331750B2 (en) 2016-08-01 2019-06-25 Facebook, Inc. Systems and methods to manage media content items
WO2018084769A1 (en) * 2016-11-04 2018-05-11 Dirac Research Ab Constructing an audio filter database using head-tracking data
US10394929B2 (en) * 2016-12-20 2019-08-27 Mediatek, Inc. Adaptive execution engine for convolution computing systems
CN113039509A (zh) * 2018-11-21 2021-06-25 谷歌有限责任公司 使用位置传感器和虚拟声学建模提供情境感知的装置和方法
US11363402B2 (en) 2019-12-30 2022-06-14 Comhear Inc. Method for providing a spatialized soundfield
FR3110762B1 (fr) 2020-05-20 2022-06-24 Thales Sa Dispositif de personnalisation d'un signal audio généré automatiquement par au moins un équipement matériel avionique d'un aéronef
KR20230157331A (ko) * 2021-03-16 2023-11-16 파나소닉 인텔렉츄얼 프로퍼티 코포레이션 오브 아메리카 정보 처리 방법, 정보 처리 장치, 및, 프로그램
JPWO2022219881A1 (de) * 2021-04-12 2022-10-20

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4583075A (en) * 1980-11-07 1986-04-15 Fairchild Camera And Instrument Corporation Method and apparatus for analyzing an analog-to-digital converter with a nonideal digital-to-analog converter
US4817149A (en) * 1987-01-22 1989-03-28 American Natural Sound Company Three-dimensional auditory display apparatus and method utilizing enhanced bionic emulation of human binaural sound localization
US5645074A (en) * 1994-08-17 1997-07-08 Decibel Instruments, Inc. Intracanal prosthesis for hearing evaluation
US6043676A (en) * 1994-11-04 2000-03-28 Altera Corporation Wide exclusive or and wide-input and for PLDS
JP3258195B2 (ja) * 1995-03-27 2002-02-18 シャープ株式会社 音像定位制御装置
US5742689A (en) * 1996-01-04 1998-04-21 Virtual Listening Systems, Inc. Method and device for processing a multichannel signal for use with a headphone
FR2744277B1 (fr) * 1996-01-26 1998-03-06 Sextant Avionique Procede de reconnaissance vocale en ambiance bruitee, et dispositif de mise en oeuvre
FR2744320B1 (fr) * 1996-01-26 1998-03-06 Sextant Avionique Systeme de prise de son et d'ecoute pour equipement de tete en ambiance bruitee
FR2744871B1 (fr) * 1996-02-13 1998-03-06 Sextant Avionique Systeme de spatialisation sonore, et procede de personnalisation pour sa mise en oeuvre
KR0175515B1 (ko) * 1996-04-15 1999-04-01 김광호 테이블 조사 방식의 스테레오 구현 장치와 방법
JP3976360B2 (ja) * 1996-08-29 2007-09-19 富士通株式会社 立体音響処理装置
DE69733956T2 (de) * 1996-09-27 2006-06-01 Honeywell, Inc., Minneapolis Integration und steuerung von flugzeugdienstleistungssystemen
US6181800B1 (en) * 1997-03-10 2001-01-30 Advanced Micro Devices, Inc. System and method for interactive approximation of a head transfer function
US6173061B1 (en) * 1997-06-23 2001-01-09 Harman International Industries, Inc. Steering of monaural sources of sound using head related transfer functions
FR2765715B1 (fr) * 1997-07-04 1999-09-17 Sextant Avionique Procede de recherche d'un modele de bruit dans des signaux sonores bruites
FR2771542B1 (fr) * 1997-11-21 2000-02-11 Sextant Avionique Procede de filtrage frequentiel applique au debruitage de signaux sonores mettant en oeuvre un filtre de wiener
US6996244B1 (en) * 1998-08-06 2006-02-07 Vulcan Patents Llc Estimation of head-related transfer functions for spatial sound representative
FR2786107B1 (fr) * 1998-11-25 2001-02-16 Sextant Avionique Masque inhalateur d'oxygene avec dispositif de prise de son
GB2374772B (en) * 2001-01-29 2004-12-29 Hewlett Packard Co Audio user interface
US7123728B2 (en) * 2001-08-15 2006-10-17 Apple Computer, Inc. Speaker equalization tool
US20030223602A1 (en) * 2002-06-04 2003-12-04 Elbit Systems Ltd. Method and system for audio imaging

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2004006624A1 *

Also Published As

Publication number Publication date
ATE390029T1 (de) 2008-04-15
AU2003267499B2 (en) 2008-04-17
IL165911A0 (en) 2006-01-15
DE60319886T2 (de) 2009-04-23
WO2004006624A1 (fr) 2004-01-15
IL165911A (en) 2010-04-15
DE60319886D1 (de) 2008-04-30
FR2842064A1 (fr) 2004-01-09
ES2302936T3 (es) 2008-08-01
CA2490501A1 (fr) 2004-01-15
AU2003267499C1 (en) 2009-01-15
FR2842064B1 (fr) 2004-12-03
AU2003267499A1 (en) 2004-01-23
EP1658755B1 (de) 2008-03-19
US20050271212A1 (en) 2005-12-08

Similar Documents

Publication Publication Date Title
EP1658755B1 (de) Tonquelle-raumklangssystem
CA2197166C (fr) Systeme de spatialisation sonore, et procede de personnalisation pour sa mise en oeuvre
EP2898707B1 (de) Optimierte kalibrierung eines klangwiedergabesystems mit mehreren lautsprechern
EP1992198B1 (de) Optimierung des binauralen raumklangeffektes durch mehrkanalkodierung
EP2508011B1 (de) Audiozoomverfahren in einer audioszene
Shilling et al. Virtual auditory displays
US9992602B1 (en) Decoupled binaural rendering
CN113889125B (zh) 音频生成方法、装置、计算机设备和存储介质
US11257478B2 (en) Signal processing device, signal processing method, and program
US20150245158A1 (en) Apparatus and method for reproducing recorded audio with correct spatial directionality
EP2194734A1 (de) Verfahren und System zur Anpassung des Klangs an Weltraumbedingungen durch dynamische Bewegung der Quelle
US20180220251A1 (en) Ambisonic audio with non-head tracked stereo based on head position and time
FR3025325A1 (fr) Dispositif et procede de localisation et de cartographie
EP1652406B1 (de) System und verfahren zur bestimmung einer repräsentation eines akustischen feldes
US9715366B2 (en) Digital map of a physical location based on a user's field of interest and a specific sound pattern
EP3400599A1 (de) Verbesserter ambisonic-codierer für eine tonquelle mit mehreren reflexionen
Maempel et al. Audiovisual perception of real and virtual rooms
EP1994526B1 (de) Gemeinsame schallsynthese und -spatialisierung
US11451931B1 (en) Multi device clock synchronization for sensor data fusion
WO2023043963A1 (en) Systems and methods for efficient and accurate virtual accoustic rendering
Han On the relation between directional bands and head movements
CN115335899A (zh) 用于目标语音分离的具有神经网络的多抽头最小方差无失真响应波束成形器
Ludovico et al. Head in space: A head-tracking based binaural spatialization system
FR3070568A1 (fr) Systemes et procedes pour reduire les effets d'annulation de phase lors de l'utilisation de casque audio
Sandler Surround Sound Impact over Large Areas

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20041216

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: THALES

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REF Corresponds to:

Ref document number: 60319886

Country of ref document: DE

Date of ref document: 20080430

Kind code of ref document: P

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: FRENCH

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080319

REG Reference to a national code

Ref country code: ES

Ref legal event code: FG2A

Ref document number: 2302936

Country of ref document: ES

Kind code of ref document: T3

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080319

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080319

REG Reference to a national code

Ref country code: IE

Ref legal event code: FD4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080319

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080826

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080619

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080319

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080319

BERE Be: lapsed

Owner name: THALES

Effective date: 20080630

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080319

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080319

Ref country code: MC

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20080630

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

26N No opposition filed

Effective date: 20081222

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20080630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080619

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080319

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20080630

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20080630

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20090603

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IT

Payment date: 20090619

Year of fee payment: 7

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080319

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: ES

Payment date: 20090709

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20090626

Year of fee payment: 7

Ref country code: GB

Payment date: 20090624

Year of fee payment: 7

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20080627

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080920

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080319

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080620

REG Reference to a national code

Ref country code: NL

Ref legal event code: V1

Effective date: 20110101

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20100627

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20100627

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20110101

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20110101

REG Reference to a national code

Ref country code: ES

Ref legal event code: FD2A

Effective date: 20110715

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20110705

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20100627

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20100628

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20120619

Year of fee payment: 10

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20140228

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130701