WO2022010465A1 - Visual display of sound for direction localization with audio feedback for hearing aids - Google Patents

Visual display of sound for direction localization with audio feedback for hearing aids

Info

Publication number
WO2022010465A1
WO2022010465A1 (PCT/US2020/041127)
Authority
WO
WIPO (PCT)
Prior art keywords
leds
sound
audio
array
localization
Prior art date
Application number
PCT/US2020/041127
Other languages
English (en)
Inventor
Luis Z. LOPEZ
Original Assignee
Lopez Luis Z
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lopez Luis Z filed Critical Lopez Luis Z
Publication of WO2022010465A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R25/00 Deaf-aid sets, i.e. electro-acoustic or electro-mechanical hearing aids; Electric tinnitus maskers providing an auditory perception
    • H04R25/40 Arrangements for obtaining a desired directivity characteristic
    • H04R25/407 Circuits for combining signals of a plurality of transducers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00 Details of transducers, loudspeakers or microphones
    • H04R1/10 Earpieces; Attachments therefor; Earphones; Monophonic headphones
    • H04R1/1008 Earpieces of the supra-aural or circum-aural type
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S7/00 Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/40 Visual indication of stereophonic sound image
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2499/00 Aspects covered by H04R or H04S not otherwise provided for in their subgroups
    • H04R2499/10 General applications
    • H04R2499/15 Transducers incorporated in visual displaying devices, e.g. televisions, computer displays, laptops
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R5/00 Stereophonic arrangements
    • H04R5/033 Headphones for stereophonic communication

Definitions

  • the present invention is in the technical area of eyeglasses and hearing aids, and pertains more particularly to eyewear enhanced for the generation and display of visual localization cues.
  • Hearing aids in the art are well-known, including behind the ear hearing aids having two microphones at different positions, enabling some degree of localization in sound reception.
  • the use of localization information is, however, limited in the prior art to processing of sound for presentation in the ears of a subject.
  • a hearing aid apparatus including a glasses assembly having a lens frame, a left temple frame arm, and a right temple frame arm joined to the lens frame.
  • a behind-the-ear hearing aid having a front and a rear microphone at the end of the left and right temple frames in a manner to be positioned behind a user's left and right ear when wearing the glasses assembly.
  • a plurality of LEDs implemented in an array in a back surface of the lens frame above each lens area such that the user is enabled to see the LEDs when wearing the glasses assembly.
  • circuitry is included comprising a processor implemented in one of the frames, the circuitry adapted to process audio events arriving at the front and rear microphones of the left and right hearing aids, and to light individual ones of the LEDs, indicating localization for the arriving audio events.
  • the array of LEDs comprises a left array over the left lens and a right array over the right lens, wherein one specific LED in each of the left and right arrays is lit if the audio event comes from directly in front of the user.
  • one specific LED in each of the left and right arrays is lit if the audio event comes from behind the user.
  • the array of LEDs comprises a set of LEDs in each of the left and right arrays, and LEDs in the set are lit according to intensity of sound in the audio event.
  • the LEDs are arranged in a vertical column in each set, and the number of LEDs lit increases in proportion to the intensity of the sound in the audio event.
  • at a first threshold intensity the LED lowest in the column is lit;
  • at a second, higher threshold intensity the lowest and the next higher LED are lit;
  • at a third, higher threshold the first, second, and a third next-higher LED are lit;
  • and at a highest threshold intensity all four LEDs are lit.
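The stepped threshold behavior described above can be sketched as a small helper. The four threshold values here are illustrative assumptions; the patent specifies only that successively higher thresholds light successively more LEDs in the four-element column:

```python
def lit_led_count(intensity, thresholds=(0.1, 0.3, 0.6, 0.9)):
    """Return how many LEDs in a 4-element VU column to light.

    The threshold values are placeholders, not taken from the patent:
    the lowest LED lights at the first threshold, and each next-higher
    LED lights as each further threshold is crossed.
    """
    return sum(1 for t in thresholds if intensity >= t)
```

For example, an intensity of 0.5 crosses the first two thresholds, so the two lowest LEDs in the column light.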
  • circuitry processing audio events processes the audio signals received by the microphones and feeds audio via the hearing aids to earmolds in the ears of the user.
  • a method for indicating sound localization is also provided with the components listed above, including implementing behind-the-ear hearing aids at the end of temple frames of a glasses assembly having a lens frame with lenses, a left temple frame, and a right temple frame joined to the lens frame, such that hearing aids are positioned behind the ears of a user wearing the glasses assembly.
  • a next step is provided implementing a plurality of LEDs in an array in a back surface of the lens frame above the lenses such that the user is enabled to see the LEDs when wearing the glasses assembly; and lighting individual ones of the LEDs in the array by circuitry including a processor implemented in one of the frames, the circuitry adapted to process audio events arriving at the front and rear microphones of the left and right hearing aids, indicating localization for arriving audio events.
  • Fig. 1 is a perspective view of a glasses frame combined with hearing aids in an embodiment of the invention.
  • Fig. 2 is a straight-on view of a portion of the glasses frame of Fig. 1.
  • Fig. 3A is an enlarged view of a set of light-emitting diodes implemented on a left side of the glasses frame of Figs. 1 and 2.
  • Fig. 3B is an enlarged view of a set of light emitting diodes implemented on a right side of the glasses frame of Figs. 1 and 2.
  • Fig. 4 is a flow diagram illustrating steps in processing in an embodiment of the invention.
  • Fig. 5 is a diagram of logic in processing in an embodiment of the invention.
  • Fig. 6 illustrates LED lighting in response to a sound event from the left front in an embodiment of the invention.
  • Fig. 7 illustrates LED lighting in response to a sound event from directly in front in an embodiment of the invention.
  • Fig. 8 illustrates LED lighting in response to a sound event from directly in front in an embodiment of the invention, at a different intensity than in Fig. 7.
  • Fig. 9 illustrates LED lighting indicating sound intensity on a volume unit (VU) meter in an embodiment of the invention.
  • Fig. 10A illustrates LED lighting in a specific circumstance in an embodiment of the invention.
  • Fig. 10B illustrates LED lighting in another specific circumstance in an embodiment of the invention.
  • Fig. 11 illustrates sound intensity in a specific circumstance in an embodiment of the invention.
  • Fig. 12 illustrates an alternative embodiment of the invention.
  • Fig. 13 illustrates yet another alternative embodiment of the invention.
  • Fig. 14 illustrates yet another alternative embodiment of the invention.

DETAILED DESCRIPTION OF THE INVENTION
  • Sound localization is an ability to determine the direction from which a sound is arriving. Horizontal azimuth and vertical elevation coordinates are used to localize sound. Localization is a complex process that involves sound intensity and frequency, sound propagation time, head and ear anatomy, nerve paths and nerve transmission delays, left and right side hearing data integration, and brain processing.
  • Azimuth coordinates are the points on a horizontal plane that localize the direction of sounds on that axis. Elevation refers to the vertical direction coordinates that lie on a vertical plane. Both of our ears are on a horizontal plane and have evolved as two listening points of reference for comparison, triangulation, and localization. Sound cues originating from a single point on that plane are received by the two ears separated by the width of the head. This slight distance between ears results in a relative time shift as perceived by the two ears. These differences can be used to triangulate and determine azimuth.
  • a comparison of cues related to sound intensity and time delay that reach each ear determines azimuth localization. Sound closer to a given ear is louder than sound received by the more distant ear and results in an inter-aural level difference (ILD).
  • ILD: inter-aural level difference
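The triangulation of azimuth from the relative time shift between the two ears can be illustrated with a simple far-field model. The head width, speed of sound, and the relation ITD = (d / c) · sin(azimuth) are textbook approximations used here for illustration, not values taken from the patent:

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, approximate speed of sound in air
HEAD_WIDTH = 0.18        # m, illustrative ear-to-ear distance

def azimuth_from_itd(itd_seconds):
    """Estimate azimuth in degrees (0 = straight ahead, positive = toward
    the leading ear) from the inter-aural time difference (ITD), using
    the simple far-field model ITD = (d / c) * sin(azimuth).
    """
    s = itd_seconds * SPEED_OF_SOUND / HEAD_WIDTH
    s = max(-1.0, min(1.0, s))  # clamp against measurement/model error
    return math.degrees(math.asin(s))
```

A zero time difference yields an azimuth of 0° (sound directly ahead), while the maximum possible delay of d/c ≈ 0.52 ms maps to 90° (sound directly to one side).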
  • Elevation coordinates add a vertical dimension in the localization of sound. Like azimuth, elevation can also be triangulated, but requires using two reference pickup points on a vertical plane. Tilting the head to position one ear higher than the other provides two reference points almost on a vertical plane that can triangulate vertical position. Normal hearing, however, resolves both azimuth and elevation with two ears on a horizontal plane. This makes determining elevation more complex since elevation involves a vertical plane.
  • the human outer ear plays a crucial role in differentiating sounds on a vertical plane. Humans with two functional ears can resolve elevation and front-rear distinctions by processing changes in the spectral shape of sounds as they pass the convoluted folds of the outer ear and uniquely resonate in the ear canal or inner ear. Azimuth of sound directly in front of the listener is easily determined because both ears hear the sound simultaneously. Changes in vertical elevation do not change azimuth.
  • For sounds equidistant to both ears, like sounds directly in front, the elevation component identifies sounds that are above, below, or even directly behind the listener. The subtle elevation and front-back cues necessary for precise sound localization are possible due to the outer ear and physical and neurological pathways.
  • Hearing aids typically are designed to improve hearing by compensating for lost or damaged capabilities. Amplifiers, filters, and even implants are incorporated with ranges of success. Localization is tied to hearing ability and particularly to bilateral hearing ability. In cases where the user has some bilateral hearing, hearing aid devices can correct hearing loss with improvement in localization. In the case of monaural hearing or Single Sided Deafness (SSD), hearing aid devices do not improve or restore localization. In fact, hearing aids can interfere with localization by filtering, amplifying, or changing incoming sounds essential for determining direction. Hearing aids inserted into the ear canal also bypass the outer ear that gathers and directs subtle localization signals.
  • SSD: Single Sided Deafness
  • Severe hearing loss is the most difficult to restore with hearing aids because the audio neurobiological channels and transmission mechanisms to the brain are often severely impaired, damaged, or missing. Monaural or binaural signals simply do not reach the brain. In the case of SSD, hearing aids alone cannot restore localization or directionality of sound because bilateral reference signals are not available.
  • visual localization cues are created to work with a subject's existing audio directional cues in both healthy and mildly impaired binaural circumstances.
  • Conventional hearing aids combine detected audio cues with visual directional cues to generate an additional intensity cue that can be discriminated by monaural and SSD subjects.
  • visual localization cues can fill the place of audio directional cues permanently lost to audio neurobiological path damage.
  • the subject can rely on directional sound information converted into visual directional cues that have been added or substituted to localize sound.
  • such visual directional cues are created and displayed to the subject.
  • a subject with normal hearing in both ears does not require hearing aids which potentially could interfere with hearing and localization.
  • CROS: Contra-lateral Routing Of Signal
  • One of the hearing aids equipped with at least one microphone, a transmitter, and receiver, is installed in the functional ear.
  • the second hearing aid with only the microphone and transmitter activated is installed in the impaired or deaf ear to pick up sounds that would otherwise be lost.
  • These sounds picked up from the deaf ear are then channeled to the functional side hearing aid receiver by sound pipes, radio waves, or bone conduction, and added to the audio stream of the hearing aid installed on the functional ear. While this sound transfer eliminates the hearing shadow on the impaired side of the head and improves the user’s overall hearing, localization does not improve.
  • Fig. 1 is a perspective illustration of a pair of eyeglasses 12 having a lens frame 14 and left and right temple frames 38L and 38R connected to the lens frame 14.
  • a pair of behind-the-ear hearing aids, 32L and 32R, are integrated at the ends of the temple frames, to sit behind the ears of a subject wearing the eyeglasses 12.
  • the hearing aids 32L and 32R are of a type that has both a front and a rear microphone (mic) to be able to differentiate direction by the intensity or time of arrival of audio: hearing aid 32L has a front mic 36L and a rear mic 34L and hearing aid 32R has a front mic 36R and a rear mic 34R.
  • LEDs: light-emitting diodes
  • the LEDs are not annotated in Fig. 1 because of scale.
  • Fig. 2 is a straight-on view of lens frame 14, from the wearer's perspective, of the eyeglasses of Fig. 1, showing two sets of LED-based displays 90L and 90R in the lens frame. Each set comprises a four-element volume unit (VU) meter and two single-LED indicators.
  • In Figs. 3A and 3B there is an LED 30L on the left and another 30R on the right for indicating sounds that originate directly in front of the subject. These are red LEDs in this example. There are additionally LEDs 20L on the left and 20R on the right for indicating sounds that originate directly behind the subject. These are green LEDs in this example but could be another color. There is an array of four LEDs 22L through 28L on the left and 22R through 28R on the right, aligned vertically. These are termed by the inventor VU meters 48L and 48R, for indicating sound intensity in an audio event.
  • FIG. 4 is a flowchart of the logical sequence of the signals and functions of circuit board 42.
  • the top three blocks relate to the left side LED 20L display and left VU meter 48L.
  • LED 20L indicates that a sound originates from behind the wearer.
  • the VU meter 48L displays the sound intensity of that left front microphone 36L.
  • the bottom three blocks are a mirror image of the top blocks and relate to the displays on the right side of the eyeglasses.
  • the right side operates VU meter 48R identically to but independently of the left side VU meter 48L.
  • the middle three blocks have functions common to both left and right front microphones 36L and 36R.
  • the intensity of sound signal picked up by each of the two front microphones located within the hearing aids is constantly evaluated. Anytime both sound signals are equal, the summing amplifier doubles the signal amplitude, provides feedback for the hearing aids' speakers, and turns on LEDs 30L and 30R simultaneously.
  • FIG. 5 is a more detailed schematic representation of circuit board 42.
  • amplifier 44L compares the signal amplitude of rear microphone 34L with that of front microphone 36L. When the rear microphone 34L signal is more intense than that of front mic 36L, LED 20L turns on to identify the sound as predominantly originating from the rear.
  • Front microphone 36L output also passes to LED driver 46L that drives left LED VU array meter 48L that displays sound intensity levels on the left side.
  • the right side circuits on the bottom are a mirror image of the left side.
  • the middle shared circuits have four basic functions: detect sounds located directly in front, double the amplitude of such sounds, activate LEDs 30L and 30R, and route this amplified sound cue to the hearing aids. This sharp jump in amplitude, discriminated even by monaural users with SSD, localizes and amplifies sounds directly in front.
  • LEDs 30L and 30R turn on in tandem with the amplified audio cue to highlight sounds directly in front. Sound is localized by moving the head from side to side while listening for a narrow band of amplified sound and looking for a simultaneous flash of light.
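The per-side rear detection and shared front detection described for circuit board 42 can be sketched as follows. The equality tolerance is an assumption for the sketch, since the text says only that the summing amplifier acts when the two front-microphone signals are equal:

```python
def localization_display(front_l, rear_l, front_r, rear_r, tol=0.05):
    """Sketch of the display logic described for circuit board 42.

    Returns (rear_led_l, rear_led_r, front_leds, gain). The rear LED on
    each side lights when that side's rear microphone level exceeds its
    front microphone level; the front LEDs light and the audio gain
    doubles when the two front-microphone levels match within `tol`
    (an assumed tolerance).
    """
    rear_led_l = rear_l > front_l   # sound predominantly from rear, left
    rear_led_r = rear_r > front_r   # sound predominantly from rear, right
    front_leds = abs(front_l - front_r) <= tol
    gain = 2.0 if front_leds else 1.0
    return rear_led_l, rear_led_r, front_leds, gain
```

With equal front levels, the front LEDs light and the cue is doubled; with a strong left-rear source, only the left rear LED lights and the gain stays at unity.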
  • FIG. 6 illustrates the activation of LEDs in display 90L caused by audio signals determined to be coming from the left front.
  • the location of the origin of the sound at the left front results in LEDs 20L and 20R not being lit, as these are for indicating sound from the rear.
  • the sound is greater on the left, causing all four VU meter LEDs 22L through 28L on display 90L to be lit, while only one, LED 22R, on display 90R is lit.
  • Fig. 7 illustrates displays 90L and 90R.
  • the sound intensity is the same on each side and is indicated by LEDs 22L, 24L and 26L lit on the left, and 22R, 24R and 26R lit on the right.
  • LEDs 30R and 30L are not shown in this view.
  • Fig. 8 illustrates a circumstance like that of Fig. 7 where the sound is from directly in front, but at a somewhat lesser intensity than in the circumstance of Fig. 7.
  • Just four of the VU meter 48L and 48R LEDs are lit: 22L and 24L on the left, and 22R and 24R on the right.
  • LEDs 30L and 30R are shown lit, indicating the sound is from directly in front.
  • FIG. 9 shows five possible displays for the VU meters that show intensity levels from 0 to 4 with samples of sound signal intensities above each level number.
  • In column 0 there is no sound, and all four LEDs in the VU meter are off.
  • In column 1 there is a low-intensity sound, resulting in one LED, 22 (either L or R), being lit.
  • In column 2 the sound is a bit more intense, and LEDs 22 and 24 are lit.
  • In column 3 there is more intensity, and 22, 24 and 26 are lit.
  • In column 4 the sound intensity is at the highest level, at the upper threshold, so all four LEDs in the VU meter are lit.
  • FIG. 10A shows a louder, level 3 sound with 22L, 24L and 26L lit on the left side as indicated by left VU display 48L, and a somewhat less intense level 2 sound on the right side with only 22R and 24R lit as indicated by VU meter 48R. Uneven levels keep LEDs 30L and 30R off in this embodiment.
  • FIG. 10B shows equal sound levels on both the left VU meter 48L and the right VU meter 48R, the displays indicating a directly-in-front localization with LEDs 30L and 30R both lit, and equal intensity indicated by LEDs 22L and 22R, and 24L and 24R, lit.
  • FIG. 11 shows an intensity meter display of a recording of a word as may be perceived by the user when spoken on the right and left sides and when spoken directly in front. It is seen that the intensity is less on the left and right than directly in front. A continuous sine wave of constant amplitude is shown for all three locations, which spikes in intensity when sound passes directly in front, where amplification doubles.
  • FIG. 12 illustrates an alternative embodiment on a pair of eyeglasses 56 comprising two LED arrays 58 implemented horizontally along the top edge of the frame to indicate both direction and intensity of oncoming sound.
  • LEDs light horizontally to the left or right in arrays 58, to indicate localization, and change in brightness to indicate intensity.
  • LED 60 identifies sounds from the rear. Incoming sound turns on only the LED in line with the source. The result is that when an LED is lit, pointing to the perceived origin of sound, moving the head right or left will shift the lit LED right or left, constantly pointing to the direction of sound.
  • the extreme right and left LEDs 60 of the arrays are a different color and identify sound coming from the right rear or left rear. Turning the head toward the rear will cause the display to once again point to the sound.
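The pointing behavior of the horizontal arrays can be sketched as a mapping from an estimated azimuth to the index of the LED to light; the array size and field of view here are assumptions for illustration, as the patent does not specify them:

```python
def led_index(azimuth_deg, n_leds=16, fov_deg=180.0):
    """Map an azimuth estimate to the LED to light (0 = leftmost) in a
    horizontal array spanning `fov_deg` across the top of the frame.

    Azimuth 0 is straight ahead, negative values are to the left.
    `n_leds` and `fov_deg` are illustrative assumptions.
    """
    half = fov_deg / 2.0
    a = max(-half, min(half, azimuth_deg))   # clamp to the array's span
    frac = (a + half) / fov_deg              # 0.0 (far left) .. 1.0 (far right)
    return min(n_leds - 1, int(frac * n_leds))
```

As the head turns, the azimuth estimate changes and the lit LED shifts left or right, which matches the described behavior of the display constantly pointing at the perceived sound origin.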
  • FIG. 13 is a perspective view of an alternative embodiment that implements small LED screens 64 and 66 secured in a wearable frame in front of the user’s eyes to display azimuth, elevation, and intensity of oncoming sounds. Loud intense sounds are displayed as larger circles on the screen. The circles follow the sound origin on the screen to localize direction. Color is used to identify audio frequency. Of the six sound samples detected, sound bubble 68 in this example is the largest in intensity with a left azimuth and positive elevation.
  • FIG. 13 illustrates an embodiment on a headset 62 similar to welding or virtual reality goggles comprising two video screens (64, 66) that display sound as bubbles that localize sound in coordinated space and grow or shrink with changes in sound intensity.
  • the wearer can move the head to center sounds of interest while still being aware of other smaller intensity sounds coming from other directions.
  • Screens present displays to the user's two eyes in two-dimensional or three-dimensional mode, displaying circles or bubbles.
  • the screens can be transparent with superimposed images, to allow interaction with a real environment or completely closed to external view.
  • This embodiment supports hearing aid test lab and hearing aid fitting environments.
  • the user can evaluate hearing aids while wearing this eyewear for visual feedback while a technician adjusts the hearing aid features for better results.
  • a hearing professional can view the same user screens on a computer or alternate video screen as required adjustments and subtle programming changes are made to the hearing aid systems for an individually tailored fit.
  • In alternate embodiments, screens can display phase and frequency, speech spectrographs in color, and even real-time hearing test results in graphic form as the technician explores the wearer's capabilities.
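The bubble display described for this embodiment can be sketched as a mapping from one localized sound to a screen position, radius, and hue. The screen dimensions, radius scale, and log-frequency hue mapping are illustrative assumptions; the patent states only that position follows azimuth and elevation, size follows intensity, and color identifies audio frequency:

```python
import math

def sound_bubble(azimuth_deg, elevation_deg, intensity, freq_hz,
                 width=320, height=240, max_radius=40):
    """Convert one localized sound into a screen bubble, as in the
    headset embodiment: position follows azimuth/elevation, radius
    grows with intensity, and hue encodes frequency.
    """
    x = int((azimuth_deg + 90.0) / 180.0 * width)     # left .. right
    y = int((90.0 - elevation_deg) / 180.0 * height)  # up .. down
    radius = max(1, int(intensity * max_radius))
    # crude log-frequency hue: 20 Hz -> 0.0, 20 kHz -> 1.0 (assumed mapping)
    hue = (math.log10(freq_hz) - math.log10(20.0)) / 3.0
    return x, y, radius, max(0.0, min(1.0, hue))
```

A loud, low-frequency sound from directly ahead then renders as a large bubble at screen center, while quieter sounds from other directions render as smaller bubbles off-center, consistent with the wearer centering sounds of interest by moving the head.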
  • FIG. 14 illustrates an alternative embodiment on a headset and binoculars to localize sounds like bird calls by means of a projected arrow in the field of view.
  • Headphone set 70 contains microphones and a localization processor. There is a front azimuth microphone on each side of the headphones at 74R and 74L, a front high-elevation microphone at 76R and 76L, a front low-elevation microphone at 78R and 78L, and a rear microphone at 80R (not shown) and 80L.
  • Binocular set 72 projects a localization arrow in the viewing field.
  • the four microphones on each side of the headset localize sounds by azimuth and elevation, localizing sounds originating from the left, right, front, rear, up, and down.
  • a processor contained in the headset amplifies sound and controls arrow 82 in the viewing field of the binoculars connected to headphone set 70 by wireless or other means.
  • the user scans with the binoculars while a projected arrow in the field of view points to the right or left and floats upward or downward to localize the origin.
  • when the sound is on the right, the arrow points to the right, and when on the left, it points to the left.
  • a double amplitude audio cue heard through the headphones identifies sounds directly in front.
  • the arrow disappears and the sound intensity is doubled and remains amplified as long as the target stays directly in front within the binoculars’ viewing field.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Neurosurgery (AREA)
  • Otolaryngology (AREA)
  • Stereophonic System (AREA)

Abstract

Apparatus and method for indicating sound localization, including implementing behind-the-ear hearing aids at the ends of the temple frames of a glasses assembly having a lens frame with lenses, a left temple frame, and a right temple frame joined to the lens frame, such that hearing aids are positioned behind the ears of a user wearing the glasses assembly. A processor interacting with the hearing aids, and a plurality of LEDs in an array in a back surface of the lens frame above the lenses, are implemented such that the user can see the LEDs when wearing the glasses assembly, indicating the localization of arriving audio events.
PCT/US2020/041127 2020-07-07 2020-07-08 Visual display of sound for direction localization with audio feedback for hearing aids WO2022010465A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202016922964A 2020-07-07 2020-07-07
US16/922,964 2020-07-07

Publications (1)

Publication Number Publication Date
WO2022010465A1 (fr) 2022-01-13

Family

ID=79552652

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/041127 WO2022010465A1 (fr) 2020-07-07 2020-07-08 Visual display of sound for direction localization with audio feedback for hearing aids

Country Status (1)

Country Link
WO (1) WO2022010465A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022165317A1 (fr) * 2021-01-29 2022-08-04 Quid Pro Consulting, LLC Systems and methods for improving functional hearing

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160100258A1 (en) * 2014-10-03 2016-04-07 Umm Al-Qura University Direction indicative hearing apparatus and method
US20200077206A1 (en) * 2018-09-02 2020-03-05 Oticon A/S Hearing device configured to utilize non-audio information to process audio signals
US20200236475A1 (en) * 2015-09-18 2020-07-23 Ear Tech Llc Hearing aid for people having asymmetric hearing loss

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022165317A1 (fr) * 2021-01-29 2022-08-04 Quid Pro Consulting, LLC Systems and methods for improving functional hearing
US11581008B2 (en) 2021-01-29 2023-02-14 Quid Pro Consulting, LLC Systems and methods for improving functional hearing

Similar Documents

Publication Publication Date Title
EP3346730B1 Headset device for 3D audio reproduction
US5272757A Multi-dimensional reproduction system
US5764778A Hearing aid headset having an array of microphones
US9020168B2 Apparatus and method for audio delivery with different sound conduction transducers
JP6092151B2 Hearing aid that spatially enhances signals
US8638959B1 Reduced acoustic signature loudspeaker (RSL)
EP0695109B1 Visual and sound reproduction system
US20110157327A1 3d audio delivery accompanying 3d display supported by viewer/listener position and orientation tracking
US4418243A Acoustic projection stereophonic system
US20090252360A1 Hearing aid glasses using one omni microphone per temple
EP3468228B1 Binaural hearing system with localization of sound sources
JPS6295098A Hearing aid
JP6732890B2 Hearing assistance
Johnson et al. Impact of hearing aid technology on outcomes in daily life III: Localization
CN102598718A Loudspeaker system for reproducing multi-channel sound with improved sound image
WO2022010465A1 Visual display of sound for direction localization with audio feedback for hearing aids
JP2008113118A Sound reproduction system and sound reproduction method
US6990210B2 System for headphone-like rear channel speaker and the method of the same
US8666080B2 Method for processing a multi-channel audio signal for a binaural hearing apparatus and a corresponding hearing apparatus
WO2019198194A1 Audio output device
WO2005034574B1 Three-dimensional acoustic reproduction device using a headset
US7050596B2 System and headphone-like rear channel speaker and the method of the same
US6983054B2 Means for compensating rear sound effect
EP0549836B1 Multi-dimensional sound reproduction system
WO2023061130A1 Earphone, user device and signal processing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20943957

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20943957

Country of ref document: EP

Kind code of ref document: A1