EP2194734A1 - Method and system for sound spatialization by dynamic movement of the source - Google Patents

Method and system for sound spatialization by dynamic movement of the source Download PDF

Info

Publication number
EP2194734A1
EP2194734A1 (application EP09174579A)
Authority
EP
European Patent Office
Prior art keywords
sound
signal
movement
information
spatialized
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP09174579A
Other languages
English (en)
French (fr)
Inventor
Vincent Clot
Jean-Noël Perbet
Pierre-Albert Breton
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Thales SA
Original Assignee
Thales SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thales SA filed Critical Thales SA
Publication of EP2194734A1
Legal status: Withdrawn

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04SSTEREOPHONIC SYSTEMS 
    • H04S7/00Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/30Control circuits for electronic adaptation of the sound field
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04SSTEREOPHONIC SYSTEMS 
    • H04S2400/00Details of stereophonic systems covered by H04S but not provided for in its groups
    • H04S2400/11Positioning of individual sound objects, e.g. moving airplane, within a sound field
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04SSTEREOPHONIC SYSTEMS 
    • H04S2420/00Techniques used in stereophonic systems covered by H04S but not provided for in its groups
    • H04S2420/01Enhancing the perception of the sound image or of the spatial distribution using head related transfer functions [HRTF's] or equivalents thereof, e.g. interaural time difference [ITD] or interaural level difference [ILD]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04SSTEREOPHONIC SYSTEMS 
    • H04S7/00Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/30Control circuits for electronic adaptation of the sound field
    • H04S7/302Electronic adaptation of stereophonic sound system to listener position or orientation
    • H04S7/303Tracking of listener position or orientation
    • H04S7/304For headphones

Definitions

  • the field of the invention relates to a method of algorithmic signal processing for sound spatialization that improves the localization of the original virtual positions of sound signals. Hereafter, the terms sound spatialization and 3D sound are used interchangeably.
  • the invention applies in particular to spatialization systems compatible with avionics modular information-processing equipment of the IMA type (Integrated Modular Avionics), also called EMTI (from the French term for modular information-processing equipment).
  • 3D sound follows the same approach as the helmet-mounted display: it allows pilots to acquire spatial situation information in their own frame of reference via a communication channel other than vision, following a natural mode that is less loaded than the visual channel.
  • combat aircraft typically include threat detection systems, for example detecting enemy radar lock-on or collision risk, associated with visualization systems inside the cockpit. These systems alert the pilot to threats in his environment by combining a display with a sound signal. 3D sound techniques indicate the location of the threat through the auditory input channel, which is intuitive and not overloaded. The pilot is thus informed of the threat by means of a sound spatialized in the direction corresponding to the information.
  • a sound spatialization system consists of a calculation system performing algorithmic processing on the sound signals.
  • the size of aircraft cockpits limits the integration of audio equipment; systems of multiple loudspeaker arrays and/or mobile loudspeakers, which allow spatialization without algorithmic processing, are therefore rarely used for 3D sound reproduction.
  • also known is patent WO 2004/006624 A1, which describes an avionic 3D sound system using an HRTF database to improve detection of the position of the sound.
  • the present invention aims to avoid ambiguities of location of a sound source.
  • the invention relates to an algorithmic signal processing method for sound spatialization for associating sound signals with information to be located by a listener.
  • the spatialized sound signals are defined by an original virtual position corresponding to the position of the information; the method is characterized in that, by algorithmic processing, a movement describing a series of virtual positions is applied to a spatialized sound signal around the original virtual position of the information.
  • the movement around the original virtual position of the information is of oscillatory type.
  • the movement around the original virtual position of the information is of random type.
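The movement claimed above (a series of virtual positions around the original one) can be sketched as follows. This is an illustrative reconstruction, not code from the patent; the function name, the sinusoidal form of the oscillation, and all parameter values are assumptions.

```python
import math

def oscillatory_trajectory(az0, el0, az_bearing, el_bearing,
                           rate_hz=5.0, duration_s=1.0, step_s=0.01):
    """Series of virtual positions (azimuth, elevation) oscillating
    around the original virtual position (az0, el0), bounded by the
    angular bearings in azimuth and elevation (all in degrees)."""
    positions = []
    steps = round(duration_s / step_s)
    for i in range(steps):
        # Sinusoidal back-and-forth sweep between the two extremes.
        phase = math.sin(2.0 * math.pi * rate_hz * i * step_s)
        positions.append((az0 + az_bearing * phase,
                          el0 + el_bearing * phase))
    return positions
```

The random variant mentioned in the claims would simply draw each offset from a bounded random distribution instead of a sinusoid.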
  • This solution for improving the location of 3D sounds is particularly intended for listeners who are subject to movement and workloads.
  • a listener naturally moves his auditory receptors, i.e. turns his head, to better locate a sound.
  • the invention allows the listener to remain motionless: since the original virtual position is the spatialized position of the sound, moving the sound source around this position provides better localization information than a continuous monodirectional movement.
  • the spatialization system can also be coupled with a pilot's helmet position detection device.
  • the movement is then correlated to the angle of difference between the listening direction of the listener and the original virtual position of said sound signal.
  • the movement then varies according to the orientation of the pilot vis-à-vis the information to be detected which is associated with the sound signal.
  • the amplitude of motion is correlated with the value of said deviation angle and the orientation of movement is also correlated with the orientation of the plane of said deviation angle.
  • the pilot thus receives information indicating whether he is moving in the direction of the information to be acquired.
  • This spatialisation effect of the sound simulates a movement of the sound signal converging towards the original virtual position of the information. This dynamic effect improves sound detection.
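The deviation angle used above can be computed from the helmet tracker data. A minimal sketch, assuming the tracker supplies the listening direction and the source direction as unit 3-D vectors; the function name and vector representation are illustrative, not taken from the patent.

```python
import math

def deviation_angle_deg(gaze_dir, source_dir):
    """Angle between the pilot's listening direction (from the helmet
    position detector) and the direction of the original virtual
    position of the sound. Both arguments are unit 3-D vectors."""
    dot = sum(g * s for g, s in zip(gaze_dir, source_dir))
    dot = max(-1.0, min(1.0, dot))  # guard against rounding outside [-1, 1]
    return math.degrees(math.acos(dot))
```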
  • the invention also relates to the algorithmic signal processing system for sound spatialization comprising a first sound spatialization calculation means for associating sound signals with information to be located by a listener.
  • said system is characterized in that it comprises a second, trajectory calculation means providing data enabling the first, spatialization calculation means to apply a movement to a spatialized sound signal around its original virtual position. This movement of the sound signal is preferably oscillatory.
  • the system also comprises a third means for calculating at least one law for varying the intensity of a sound signal to modify the intensity of the spatialized sound signal during the oscillatory movement.
  • the second, trajectory calculation means calculates the difference between the original virtual position of the sound source and the position supplied by the receiving means, and calculates a movement correlated with this difference.
  • the position data receiving means is connected to a helmet position detector carried by a listener.
  • the position data receiving means is connected to a camera detecting the positioning of the listener. This camera is not worn by the listener.
  • the sound signals come from a sound database of the aircraft and said sound signals are associated with information of at least one avionic device.
  • a first avionic device is a display device.
  • a second avionics device is a navigation device.
  • a third avionics device is an alert device.
  • the invention relates to sound spatialization systems and a method for improving the localization of a sound in the environment of a listener.
  • results obtained empirically show that an individual detects the origin of a sound more easily when the sound is moving.
  • the works cited above show better results in localization tests with a continuously moving sound.
  • the essential characteristic of the spatialization process is to confer an oscillatory movement to a sound around its original virtual position.
  • a sound spatialization system lies at the border between the avionics systems and the man-machine interface.
  • figure 3 schematizes a spatialization system in an aircraft cockpit, in particular in a combat aircraft where the pilot wears a helmet incorporating a device 6 for detecting the position of the helmet.
  • this type of aircraft includes a plurality of avionics systems 7, including anti-collision navigation systems 71, and systems for military operations such as target detection devices 72 and target attack devices. Avionics systems may also include a meteorological device 73. These systems are most often coupled to visualization devices.
  • figure 3 does not represent all the avionics systems that can be associated with the spatialization system. The skilled person knows avionics architectures and is able to implement a sound spatialization system with any avionics device transmitting information to the pilot.
  • figure 4 schematizes a particular situation showing the benefit of a spatialization system combined with an anti-collision system.
  • the field of vision 24 of the pilot flying the aircraft is oriented to the left at a given instant.
  • this pilot has a speaker system 21 positioned inside the helmet at the level of his ears.
  • the cockpit of the aircraft comprises several visualizations 21-23 and the pilot's field of vision is oriented towards the visualization 23.
  • an event such as a risk of collision with the ground detected by the anti-collision system alerts the pilot by displaying on visualization 21 the risk situation with the navigation data to be monitored and the flight instructions to be established.
  • the system also issues audible alerts associated with the information on the screen.
  • the spatialized sound associated with the alerts indicates to the pilot the location of the information to be taken into account and thus reduces his mental workload thanks to the audio stimulus given by the spatialization system.
  • the reaction time of the pilot is thus reduced.
  • a sound spatialization system 1 is generally associated with a sound reception device and a sound database system 81 storing prerecorded sounds, such as synthesized alert messages, signaling sounds, software application sounds, or sounds from communication systems internal or external to the aircraft. For example, the spatialization of audio communications gives additional information on the interlocutor with whom the pilot is in communication.
  • the sound reproduction system 5 includes the earphones inside the pilot's helmet and also includes the cockpit loudspeaker system.
  • the sound reproduction system must be stereophonic so that temporal-difference effects can be applied between the loudspeakers' signals.
  • the output of the spatialization module also comprises a signal processing device for adding additional effects on the spatialized signals, such as Tremolo effects or Doppler effect for example.
  • the sound spatialization calculation means 2 performs the algorithmic processing of the sounds to produce the monaural signals (modifying the sound intensity) and the binaural signals (modifying the phase of the signals to simulate a time shift), and to apply the anatomical transfer functions (HRTF).
  • Binaural signals are used for the localization of sound sources in azimuth and require a stereophonic reproduction system.
  • the calculation means establish algorithmic processing to simulate the distance of the sound sources by modifying the sound level (ILD) and the temporal offset between the sounds (ITD).
  • Monaural signals are used for elevation location and to distinguish a sound source positioned in front of or behind the listener. Monaural signals do not require a stereophonic reproduction system.
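The patent does not give formulas for the ITD. As an illustration only, a common textbook approximation (Woodworth's spherical-head model, not taken from the patent) shows how azimuth maps to an interaural time difference; the head radius and constant names are assumptions.

```python
import math

SPEED_OF_SOUND_M_S = 343.0
HEAD_RADIUS_M = 0.0875  # average adult head radius (assumed value)

def itd_seconds(azimuth_deg):
    """Woodworth's spherical-head approximation of the interaural
    time difference for a far-field source; azimuth is measured from
    straight ahead, in [-90, +90] degrees."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND_M_S) * (theta + math.sin(theta))
```

For a source at 90 degrees this gives roughly 0.65 ms, the order of magnitude of the largest interaural delays a binaural renderer has to reproduce.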
  • the spatialization system is connected to an HRTF database 82 storing the anatomical transfer functions of known pilots. These transfer functions can be customized for each pilot by an individual measurement.
  • the database may also include several standard anatomical profiles, to be matched to a pilot at first use of the system in order to select the best-suited profile. This procedure is faster than an individual measurement.
  • the trajectory calculation means 3 has the function of calculating the trajectory of the oscillatory movement to be imparted to the spatialized sound.
  • the oscillatory movement has a trajectory that can vary in elevation and azimuth with respect to the listener.
  • the oscillatory trajectory is in an angular range whose apex of the angle is centered on the listener.
  • the calculation means 3 determines a deviation angle between the orientation of the pilot's gaze, obtained indirectly from the position of the helmet, and the original virtual position of the sound signal.
  • the figure 5 schematizes the application of the invention for calculating the trajectory of the oscillatory movement.
  • the drawing on the left represents the case where the pilot's orientation is such that the direction of his field of vision 42 deviates strongly from the direction 43 his field of vision would have if it were oriented towards the original position of the sound signal.
  • the deviation angle 31 is calculated by the calculation means 3. This deviation angle can be in a plane varying in azimuth and in elevation as a function of the orientation 42 of the pilot's field of view.
  • the calculation means 3 also calculates the trajectory of the oscillatory movement 32 as a function of this deviation angle 31.
  • the trajectory of the oscillatory movement, 32 or 41 in the right-hand drawing of figure 5, is a function of the deviation angle, 31 or 36 respectively.
  • the coordinates of the original virtual position 33 are defined by an azimuth coordinate a1 and an elevation coordinate e1.
  • the trajectory of the oscillatory movement 32 is bidirectional and continuous, producing a back-and-forth oscillatory movement along an arc of a circle delimited by the angular bearings 44 and 45.
  • the trajectory calculation means can define a trajectory that can be oval or other shape.
  • the scan speed of this path is also configurable. It is preferably greater than the speed of movement of the head with a latency of less than 70 ms to preserve the natural sound.
  • the spatialization process calculates, by the calculation means 2, the trajectory of the oscillatory movement 32 around the virtual position 33.
  • the calculated angles are used by the monaural and binaural signal calculation functions to determine the trajectory 32 around the original virtual position 33.
  • this trajectory 32 comprises a series of several virtual positions delimited by two extreme positions 34 and 35. These two extreme positions are located at the coordinates of the original position 33 offset by the angular bearings in azimuth 44 and elevation 45.
  • the signals ITD and ILD depend on the angles in azimuth and elevation.
  • a sound signal is defined as a vibration perceived by the human ear, described in the form of a sound wave and which can be represented in the time and frequency domain (spectrum of the wave).
  • the ITD (interaural time difference) signals include a phase change to simulate a different azimuth position by a time shift of the signal between the ears of a listener.
  • the figure 6 represents the timing diagram of the sound for each ear for different positions of the sound source.
  • the sound source A21 represents a first position where the sound source is on the left ear side of the listener and the sound source A22 represents a second position where the sound source is on the side of the right ear of the listener.
  • the sound source A21 is closer to the listener than the source A22 is.
  • the angle values evolving in the angular bearing range are used for the calculation of the different virtual positions constituting the trajectory 32. These values are injected into the formula 2.
  • the various angle values included in the angular bearing range define the different virtual positions composing the trajectory of the sound signal.
  • the sounds should have a periodicity ideally less than 70 ms, to avoid artificial dragging effects when the head is turned. In the same way, the overall sound footprint should ideally last at least 250 ms.
  • the loudness is different between the left ear and the right ear of a listener.
  • the difference in sound level between the two ears varies with the angle of the source.
  • the difference in sound level between the two ears varies between the ILD curve associated with the original virtual position 33 of the sound signal and the ILD curve associated with the extreme position of the sound signal on the trajectory of movement.
  • the sound shift varies according to the angular bearings 44 and 45 (azimuth and elevation).
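The ILD curves mentioned above are not given numerically in the patent. As a crude illustrative stand-in (an assumption, not the patent's law), constant-power panning shows how an azimuth can be mapped to a left/right level difference:

```python
import math

def stereo_gains(azimuth_deg):
    """Constant-power panning sketch of an interaural level
    difference: azimuth in [-90, +90] degrees (negative = left)
    mapped to a (left, right) gain pair whose powers sum to 1."""
    pan = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
    return math.cos(pan), math.sin(pan)
```

A real system would instead look up the measured ILD curve for the listener's HRTF profile at each virtual position of the trajectory.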
  • the figure 1 illustrates for example a law of loudness that can be applied to sounds.
  • the value of the angular bearings 44 and 45 varies depending on the orientation of the pilot.
  • the law governing the aforementioned bearings is such that the more the listener turns towards the original position of the sound signal, the lower the value of the angular bearings, until the bearings vanish for a listening direction substantially equal to the direction of the original virtual position.
  • the position detection device 6 transmits the position coordinates to the calculation means 3 and according to these coordinates the angular bearings 44 and 45 will decrease or increase.
  • the right-hand drawing of figure 5 corresponds to a situation where the orientation of the pilot approaches the original virtual position 33 of the sound signal.
  • the deviation angle 36 is reduced and therefore the new trajectory 41 calculated by the calculation module 2 has a smaller amplitude, delimited by the two positions 36 and 37.
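The shrinking of the angular bearings as the pilot turns towards the source can be sketched as follows. The linear form of the law and all parameter values are assumptions for illustration; the patent only states that the bearing amplitude is correlated with the deviation angle and vanishes at zero deviation.

```python
def angular_bearing_deg(deviation_deg, max_bearing_deg=15.0,
                        full_deviation_deg=90.0):
    """Angular bearing correlated with the deviation angle: maximal
    when the pilot looks far away from the original virtual position,
    shrinking to zero as his orientation converges on it."""
    frac = min(abs(deviation_deg), full_deviation_deg) / full_deviation_deg
    return max_bearing_deg * frac
```

Re-evaluating this law on every helmet-tracker update yields the converging trajectories 32 and 41 of figure 5.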
  • the invention also includes, for each subject, calculation means 4 for processing the sound signal at the output of the calculation means 2 for spatializing the sound.
  • This module 4 realizes a variation of the sound intensity of a sound signal according to the positions of the sound on the oscillatory movement.
  • a linear sound-regulation law is applied to the spatialized sound performing the oscillatory movement, so that the intensity 39 of the sound signal is reduced by a predefined number of dB when the sound signal lies at an extreme position of the oscillatory movement (corresponding, in figure 5, to positions 34 and 35 of the oscillatory movement 32), and the intensity 40 of the sound signal is maximum when the sound signal lies at the original virtual position 33.
  • a linear regression law can for example determine the intermediate positions having an intermediate level of sound intensity between the maximum intensity and the decreased intensity.
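The intensity law described above can be sketched as a linear interpolation between the maximum level at the origin and the reduced level at the extremes. The 6 dB attenuation and the function name are illustrative assumptions; the patent only specifies "a predefined number of dB".

```python
def intensity_offset_db(angle_from_origin_deg, bearing_deg,
                        attenuation_db=6.0):
    """Linear intensity law: 0 dB at the original virtual position,
    -attenuation_db at the extreme positions of the oscillation
    (offset +/- bearing_deg), linearly interpolated in between."""
    if bearing_deg == 0.0:
        return 0.0  # no oscillation, no attenuation
    frac = min(abs(angle_from_origin_deg) / bearing_deg, 1.0)
    return -attenuation_db * frac
```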
  • the intensity of the sound signal is indeed weaker for a signal far from its actual position.
  • this variation in intensity simulates a large distance 51 between the listener and an extreme position of the oscillatory movement, while for the original virtual position the simulated distance 52 is small.
  • the change in intensity simulates a spatial convergence of the oscillatory movement towards the original position.
  • the law of variation of the sound intensity can be independent of the angular difference between the pilot's gaze and the direction of the sound source.
  • the variation can also be random, i.e. not continuous between the original position and the extreme positions.
  • the duration of a sound is greater than 250 ms. Ideally, the duration should even be greater than 5 s to take full advantage of the associated dynamic signals.
  • any type of sound reproduction system may be used: a system comprising one loudspeaker, a system comprising several loudspeakers, a system of cartilage-conduction transducers, an earbud system with or without wires, etc.
  • the sound spatialization method applies to any type of application requiring the localization of a sound. It is particularly intended for applications associating a sound with information to be taken into account by a listener. It applies to the field of aeronautics for the human-machine interaction of avionics systems with the pilot, to simulator applications immersing an individual (virtual-reality systems or flight simulators, for example), and also to the automotive field for systems alerting the driver to a hazard and indicating the origin of the hazard.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Stereophonic System (AREA)
EP09174579A 2008-11-07 2009-10-30 Method and system for sound spatialization by dynamic movement of the source Withdrawn EP2194734A1 (de)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
FR0806229A FR2938396A1 (fr) 2008-11-07 2008-11-07 Procede et systeme de spatialisation du son par mouvement dynamique de la source

Publications (1)

Publication Number Publication Date
EP2194734A1 true EP2194734A1 (de) 2010-06-09

Family

ID=40763225

Family Applications (1)

Application Number Title Priority Date Filing Date
EP09174579A 2008-11-07 2009-10-30 Method and system for sound spatialization by dynamic movement of the source

Country Status (3)

Country Link
US (1) US20100183159A1 (de)
EP (1) EP2194734A1 (de)
FR (1) FR2938396A1 (de)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3061150A1 (fr) * 2016-12-22 2018-06-29 Thales Systeme de designation interactif pour vehicule, notamment pour aeronef, comportant un serveur de donnees
FR3137810A1 (fr) * 2022-07-06 2024-01-12 Psa Automobiles Sa Contrôle d’une ambiance sonore dans un véhicule

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130010967A1 (en) * 2011-07-06 2013-01-10 The Monroe Institute Spatial angle modulation binaural sound system
US8183997B1 (en) * 2011-11-14 2012-05-22 Google Inc. Displaying sound indications on a wearable computing system
US9591427B1 (en) * 2016-02-20 2017-03-07 Philip Scott Lyren Capturing audio impulse responses of a person with a smartphone
CN105929367A (zh) * 2016-04-28 2016-09-07 乐视控股(北京)有限公司 一种手柄的定位方法、装置及系统
DE102016115449B4 (de) 2016-08-19 2020-02-20 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Verfahren zur Erzeugung eines Raumklangs aus einem Audiosignal, Verwendung des Verfahrens sowie Computerprogrammprodukt
EP4085660A4 (de) 2019-12-30 2024-05-22 Comhear Inc. Verfahren zum bereitstellen eines räumlichen schallfeldes
FR3110762B1 (fr) 2020-05-20 2022-06-24 Thales Sa Dispositif de personnalisation d'un signal audio généré automatiquement par au moins un équipement matériel avionique d'un aéronef

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030059070A1 (en) * 2001-09-26 2003-03-27 Ballas James A. Method and apparatus for producing spatialized audio signals
FR2842064A1 (fr) * 2002-07-02 2004-01-09 Thales Sa Systeme de spatialisation de sources sonores a performances ameliorees
US20060018497A1 (en) * 2004-07-20 2006-01-26 Siemens Audiologische Technik Gmbh Hearing aid system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2696258B1 (fr) * 1992-09-25 1994-10-28 Sextant Avionique Dispositif de gestion d'un système d'interaction homme-machine.
FR2696574B1 (fr) * 1992-10-06 1994-11-18 Sextant Avionique Procédé et dispositif d'analyse d'un message fourni par des moyens d'interaction à un système de dialogue homme-machine.
FR2786308B1 (fr) * 1998-11-20 2001-02-09 Sextant Avionique Procede de reconnaissance vocale dans un signal acoustique bruite et systeme mettant en oeuvre ce procede
FR2808917B1 (fr) * 2000-05-09 2003-12-12 Thomson Csf Procede et dispositif de reconnaissance vocale dans des environnements a niveau de bruit fluctuant

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030059070A1 (en) * 2001-09-26 2003-03-27 Ballas James A. Method and apparatus for producing spatialized audio signals
FR2842064A1 (fr) * 2002-07-02 2004-01-09 Thales Sa Systeme de spatialisation de sources sonores a performances ameliorees
WO2004006624A1 (fr) 2002-07-02 2004-01-15 Thales Systeme de spatialisation de sources sonores
US20060018497A1 (en) * 2004-07-20 2006-01-26 Siemens Audiologische Technik Gmbh Hearing aid system

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
DURAND R BEGAULT: "3-D Sound for Virtual Reality and Multimedia", NASA/TM-2000-000000, XX, XX, 1 January 2000 (2000-01-01), pages 1 - 246, XP002199910 *
F.L. WIGHTMAN; D.J. KISTLER: "Resolution of front-back ambiguity in spatial hearing by listener and source movement", THE JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA, vol. 105, no. 5, May 1999 (1999-05-01), pages 2841 - 2853
H. WALLACH: "The role of head movements and vestibular and visual cues in sound localization", J. EXP. PSYCHOL., vol. 27, 1940, pages 339 - 368
S. PERRETT; W. NOBLE: "The effect of head rotations on vertical plane sound localization", THE JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA, vol. 102, 1997, pages 2325 - 2332
W.R. THURLOW; P.S. RUNGE: "Effects of induced head movements on localisation of direct sound", THE JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA, vol. 42, 1967, pages 480 - 487
WIGHTMAN FREDERIC L ET AL: "Resolution of front-back ambiguity in spatial hearing by listener and source movement", JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA, AIP / ACOUSTICAL SOCIETY OF AMERICA, MELVILLE, NY, US, vol. 105, no. 5, 1 May 1999 (1999-05-01), pages 2841 - 2853, XP012000962, ISSN: 0001-4966 *


Also Published As

Publication number Publication date
US20100183159A1 (en) 2010-07-22
FR2938396A1 (fr) 2010-05-14

Similar Documents

Publication Publication Date Title
EP2194734A1 (de) Method and system for sound spatialization by dynamic movement of the source
EP0813688B1 (de) Persönliches ortungsgerät
EP3622386B1 (de) Räumliches audio für dreidimensionale datensätze
US10038966B1 (en) Head-related transfer function (HRTF) personalization based on captured images of user
US11112389B1 (en) Room acoustic characterization using sensors
EP1658755B1 (de) Tonquelle-raumklangssystem
US11364844B2 (en) Systems and methods for verifying whether vehicle operators are paying attention
US20110164768A1 (en) Acoustic user interface system and method for providing spatial location data
EP3189389B1 (de) Lokalisierungs- und kartographievorrichtung mit entsprechenden verfahren
US10798515B2 (en) Compensating for effects of headset on head related transfer functions
US11638110B1 (en) Determination of composite acoustic parameter value for presentation of audio content
US20190114052A1 (en) Human-machine interface tethered to a user position in a three- dimensional vr or ar environment
US11979803B2 (en) Responding to a signal indicating that an autonomous driving feature has been overridden by alerting plural vehicles
US10795038B2 (en) Information presentation system, moving vehicle, information presentation method, and non-transitory storage medium
Carlander et al. Uni-and bimodal threat cueing with vibrotactile and 3D audio technologies in a combat vehicle
US20220021999A1 (en) Spatialized audio rendering for head worn audio device in a vehicle
FR3066635A1 (fr) Procede et dispositif d'aide au stationnement d'un vehicule
JPH10230899A (ja) 航空宇宙飛行機のマンマシンインターフェース
FR3038101A1 (fr) Procede de guidage d'un individu et systeme de navigation
Vasilijevic et al. Acoustically aided HMI for ROV navigation
WO2020115896A1 (ja) 情報処理装置および情報処理方法
Ernst et al. Influence of background noise on spatial audio localization for helicopter pilot headsets
FR2855932A1 (fr) Systeme d'avertissement audiospatialise pour l'evitement de collision
EP3607430B1 (de) Verfahren und system zur bestimmung der periodizität von daten
Daniels et al. Improved performance from integrated audio video displays

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

AX Request for extension of the european patent

Extension state: AL BA RS

17P Request for examination filed

Effective date: 20100630

17Q First examination report despatched

Effective date: 20100722

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20101202