GB2557178A - Improvements relating to hearing assistance in vehicles - Google Patents


Info

Publication number
GB2557178A
GB2557178A (application GB1620115.4A / GB201620115A)
Authority
GB
United Kingdom
Prior art keywords
user
vehicle
processor
hearing
audio
Prior art date
Legal status
Granted
Application number
GB1620115.4A
Other versions
GB201620115D0 (en)
GB2557178B (en)
Inventor
Wells Andrew
Current Assignee
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB201620115A (GB2557178B)
Publication of GB201620115D0
Priority to PCT/EP2017/077978 (WO2018099677A1)
Priority to US 16/349,911 (US20200066070A1)
Publication of GB2557178A
Application granted
Publication of GB2557178B
Status: Active


Classifications

    • G06F3/16: Sound input; sound output
    • G06F3/165: Management of the audio stream, e.g. setting of volume, audio stream path
    • G06F3/167: Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • B60K35/00: Arrangement of adaptations of instruments
    • B60R16/00: Electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60W10/30: Conjoint control of vehicle sub-units including control of auxiliary equipment, e.g. air-conditioning compressors or oil pumps
    • B60W2540/00: Input parameters relating to occupants
    • H04R25/00: Deaf-aid sets, i.e. electro-acoustic or electro-mechanical hearing aids; electric tinnitus maskers providing an auditory perception
    • H04R25/407: Circuits for combining signals of a plurality of transducers
    • H04R25/554: Hearing aids using an external wireless connection, e.g. between microphone and amplifier or using T-coils
    • H04R1/1041: Earpieces or headphones having mechanical or electronic switches, or control elements
    • H04R3/12: Circuits for distributing signals to two or more loudspeakers
    • H04R2205/041: Adaptation of stereophonic signal reproduction for the hearing impaired
    • H04R2420/03: Connection circuits to selectively connect loudspeakers or headphones to amplifiers
    • H04R2420/07: Applications of wireless loudspeakers or wireless microphones
    • H04R2430/01: Aspects of volume control, not necessarily automatic, in sound systems
    • H04R2499/13: Acoustic transducers and sound field adaptation in vehicles

Abstract

A system for a vehicle for providing improved audio for a hearing impaired user, comprising: an input configured to receive electrical input signals from at least one vehicle mounted device 22, wherein the electrical input signals are representative of an operative state of the vehicle; a user input module 20 configured to enable user input of at least one user characteristic; and a processor configured to generate electrical output signals corresponding to audio and to transmit the electrical output signals to at least one audio output device e.g. hearing aid or speaker 24, for outputting the audio to the user, wherein the generation of the electrical output signals is dependent upon the at least one user characteristic. The processor may automatically determine a hearing impairment level of the user or the user may manually input the level, for both right and left ears. The user input module may allow selection of a vehicle zone for the user.

Description

(71) Applicant(s): Jaguar Land Rover Limited (Incorporated in the United Kingdom), Abbey Road, Whitley, Coventry, Warwickshire, CV3 4LF, United Kingdom
(72) Inventor(s): Andrew Wells
(74) Agent and/or Address for Service: Jaguar Land Rover, Patents Department W/1/073, Abbey Road, Whitley, Coventry, CV3 4LF, United Kingdom
(51) INT CL: B60W 10/30 (2006.01); B60R 16/00 (2006.01); H04R 25/00 (2006.01); B60K 35/00 (2006.01); G06F 3/16 (2006.01)
(56) Documents Cited: WO 2008/015293 A2; US 20150365771 A1; DE 102014218065 A1; US 20050086058 A1
(58) Field of Search: INT CL B60K, B60Q, B60R, B60W, G06F, H04R; Other: EPODOC, WPI, Fulltext
(54) Title of the Invention: Improvements relating to hearing assistance in vehicles
Abstract Title: Vehicle audio for hearing impaired users
(57) Abstract: as reproduced above.
[Drawings, sheets 1/7 to 7/7: Fig. 1, schematic of the vehicle and its interior; Fig. 2, block diagram of the hearing assistance system; Fig. 3, flow chart of the operating process 150, including the step "Transmit output signal to output location(s)"; Fig. 4, flow chart of system initiation; Fig. 5, HMI screen 300, "Hearing impairment support - SEAT SELECT", with Seat, Left Ear and Right Ear selection portions; Fig. 6, HMI screen 400; Fig. 7, HMI screen 500]
IMPROVEMENTS RELATING TO HEARING ASSISTANCE IN VEHICLES
TECHNICAL FIELD
The present disclosure relates to a system for a vehicle for providing improved audio for a hearing impaired user. Aspects of the invention relate to a system and a vehicle incorporating the system.
BACKGROUND
Sound is an important aspect of the driving experience. In a vehicle, users, drivers and passengers alike, receive audible information and cues from a variety of sources including in-car entertainment systems, navigation systems, audible alert systems, and the external environment. In many situations, it is important that users receive this information clearly so that the vehicle can be operated correctly and appropriately.
However, if a user has a hearing impairment, it may be difficult for them to receive this information clearly if the vehicle does not have assistive features. The World Health Organisation estimates that 15% of the world’s adult population have some degree of hearing loss, while at least 5% have disabling hearing loss (loss greater than 40dB). It is therefore imperative that vehicles provide for hearing-impaired vehicle users so that audible information can be clearly communicated.
Current vehicle systems seeking to address this issue use audio induction loop technology. A disadvantage associated with audio induction loop systems is that losses are still introduced into the system, creating difficulty for the user in situations where background noise is high. Similarly, if a user has only a single hearing aid, the information received at each ear may be different and may be even more difficult to comprehend. Furthermore, if a user has a hearing impairment but does not require hearing aids, there is currently very little provision other than increasing the volume of the systems, which may cause problems for non-hearing impaired users sharing the vehicle at the same time.
There is therefore a need to provide greater accessibility for hearing-impaired vehicle users, especially when considering important information that is communicated audibly.
At least in certain embodiments, the present invention seeks to mitigate or overcome at least some of the above-mentioned problems.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a system, a method, a processor, a media device storing computer readable code and a vehicle as claimed in the appended claims.
According to an aspect of the present invention there is provided a system for a vehicle for providing improved audio for a hearing impaired user, the system comprising: an input configured to receive electrical input signals from at least one vehicle mounted device, wherein the electrical input signals are representative of an operative state of the vehicle; a user input module configured to enable user input of at least one user characteristic; and a processor configured to generate electrical output signals corresponding to audio and to transmit the electrical output signals to at least one audio output device for outputting the audio to the user, wherein the generation of the electrical output signals is dependent upon the at least one user characteristic.
Advantageously, the system ensures clear communication of vehicle alerts to hearing impaired users according to their specific preference and needs. A user input module allows for information particular to a user to be input to the system. The user characteristic is easily variable and allows for personalised signals to be transmitted to audio devices within a vehicle. The signals are particularly tailored to the user’s needs or wishes, which means that the user experience in the vehicle can be improved significantly and that signals are communicated most efficiently and intelligibly.
The processor may be configured to determine the hearing impairment level, and optionally determines the hearing impairment level automatically. The processor may be arranged to determine the hearing impairment level by conducting an overt or covert audiometric test to provide an audiogram indicative of the user’s hearing acuity and/or impairment. Optionally, the processor is configured to conduct an audiometric test using otoacoustic emission tracking (OAE). This further benefits the user and enhances the user experience.
The user input module may be configured to enable user input of a hearing impairment level for the user.
The hearing impairment level may comprise a first hearing impairment level and a second hearing impairment level corresponding to a left ear and a right ear of the user, respectively.
The processor may be configured to generate electrical output signals in dependence on the hearing impairment level or in dependence upon the user characteristic.
The user input module may be configured to enable user selection of the at least one audio output device.
Optionally, the processor is configured to transmit the electrical output signals to a hearing assistance device and/or a loudspeaker. The processor may be configured to transmit the electrical output signals to at least two hearing assistance devices and/or at least two loudspeakers. This advantageously caters for all hearing impaired users, ensuring that the system can be used by all those who require assistance.
In some embodiments, the user input module may be configured to enable user selection of a zone for the hearing impaired user, and wherein the system is configured to transmit electrical output signals in dependence on the selected zone. Additionally, the user input module may be configured to enable user selection of a zone for a passenger within the vehicle, and wherein the processor is configured to transmit electrical output signals corresponding to signals received from a microphone associated with the selected zone. Selecting zones is particularly important in allowing the user to hear requested sounds from anywhere within the vehicle. In one example, the or each zone may relate to a seating position or a seat of a user of the vehicle.
The user characteristic may relate to at least one characteristic or feature of the electrical output signal, and the feature may comprise a waveform, an amplitude, or frequency content of the electrical output signal. Allowing variability of the features of the electrical output signal further allows for an improved user experience, as well as further catering for differing hearing impairments. For example, if a hearing impairment is particular to a certain band of frequencies, a different frequency content of the output signal may be required to enable the signal to be heard by the user.
The user input module may comprise an infotainment system. In this case, the system may be further configured to generate electrical output signals comprising audio content from the infotainment system.
According to another aspect of the invention, there is provided a vehicle comprising the above system.
According to another aspect of the invention, there is provided a processor for use in the system as described herein before, or for use in the vehicle as described in the foregoing aspect.
According to another aspect of the invention, there is provided a method of providing improved audio for a hearing impaired user in a vehicle, the method comprising:
receiving from at least one vehicle mounted device electrical signals representative of an operative state of the vehicle;
receiving from a user input module electrical signals representative of at least one user characteristic;
processing the received electrical signals representative of the operative state of the vehicle in dependence on the received electrical signals representative of the at least one user characteristic;
generating electrical output signals corresponding to audio in dependence on said processed electrical signals;
transmitting the electrical output signals to at least one audio output device for outputting the audio to the user.
According to yet another aspect of the invention, there is provided a media device storing computer program code adapted, when loaded into or run on a computer or processor, to cause the computer or processor to become a system as described in one or more of the foregoing aspects.
According to a further aspect of the invention, there is provided a media device storing computer program code adapted, when loaded into or run on a computer or processor, to cause the computer or processor to perform the method as described above.
The media device described in the above aspects may comprise a non-transitory computer readable media.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram of a vehicle having a vehicle interior provided with a hearing assistance system according to an embodiment of the invention;
Figure 2 is a schematic block diagram of a hearing assistance system that may be installed in the vehicle of Figure 1;
Figure 3 is a flow chart illustrating operation of the hearing assistance system of Figure 2;
Figure 4 is a flow chart illustrating initiation of the hearing assistance system of Figure 2; and
Figures 5 to 7 illustrate example displays for a human-machine interface that forms part of the hearing assistance system of Figure 2.
DETAILED DESCRIPTION
Figure 1 is a schematic diagram of a vehicle 10 having a vehicle interior 12 provided with a hearing assistance system according to an embodiment of the invention. The hearing assistance system 14, which is shown in Figure 2, improves user experience and accessibility options available to a user of the vehicle 10 by allowing users with hearing impairments to implement audio settings tailored to their requirements. More specifically, the system 14 ensures that audio signals can be communicated to hearing impaired users who wear a hearing assistance device quickly and clearly by directly transmitting warning signals or other sounds to the hearing assistance device.
For the purposes of this application, it should be assumed that any reference to a hearing assistance device includes medical devices used for improvement of hearing of users with hearing impairments, such as wireless hearing aids, as well as wireless headphones such as the type used with portable audio devices.
Considering both Figures 1 and 2, the vehicle 10 incorporates a hearing assistance system 14, comprising a plurality of modules 16 including an on-board computer 18 that is electrically connected to a user input module comprising a human-machine interface (HMI) 20, input hardware 22, output hardware 24 and an exchange module 26.
The connections 28 between modules 16 and the directions of travel of information between these modules 16 are indicated in Figure 2 by arrows. Connections 29 that may not require physical connections are illustrated using a dashed line. It will be noted that Figure 2 is a schematic view, so the way in which the commands are transferred between modules 16 or hardware is not depicted explicitly. However, it will be appreciated that suitable cabling may be provided to interconnect the modules 16, and that these interconnections may be direct or ‘point to point’ connections, or may be part of a local area network (LAN) operated under a suitable protocol (CAN-bus or Ethernet for example). Vehicle bus communications networks form part of the art and will not be discussed in detail in this application.
It should also be noted that the output hardware modules 86, 88 that are outlined using a dashed line are not considered to be part of the hearing assistance system 14 as it is manufactured and/or sold. Similarly, it should be noted that the sensors and microphones may be separate from the system and information may be received from the sensors by the system.
The on-board computer 18 implements commands according to pre-defined user instructions. The commands alter various settings of the hearing assistance system 14, and are implemented in response to data inputs received in the form of electrical input signals at a processor 30. The commands comprise electrical output signals that are generated at the on-board computer 18 based upon the input signals.
More specifically, the on-board computer 18 receives input signals from the HMI 20 or from the input hardware 22. Within the on-board computer 18, a processor 30 may receive input signals from an exchange module 26 or may request and receive information from a data storage memory 36. Similarly, within the on-board computer 18, the processor 30 generates output signals that are sent to an amplifier 38 for amplification of the output signal, to the exchange module 26 either directly or via the amplifier 38, or written to the data storage memory 36 for later recall. The on-board computer 18 provides output signals and/or commands to the HMI 20.
The HMI 20 comprises a display screen 44 and a user input interface 46. Via the HMI 20, information can be presented to a user on the display screen 44 and the user can instruct the operation of the processor 30, and thus influence the operation of the hearing assistance system 14, using the user input interface 46. The user input interface 46 may be provided, for example, in the form of a physical keyboard or as a touch-sensitive screen forming part of the display screen 44. In use, the HMI 20 allows the user to input information relating to their hearing impairment to tailor the operation of the system 14 accordingly, as will be discussed in more detail later in relation to Figure 4.
Still considering Figures 1 and 2, the HMI 20 is located centrally in a dashboard 48 of the vehicle 10 along a central axis 50 of the vehicle 10 so that it is within reach of a user seated in a first seating row 52 of the vehicle 10. The first seating row 52 is roughly aligned with a front pair of doors 54 of the vehicle 10, and comprises a driver’s seat 56 and a passenger’s seat 58. A second seating row 60 including three seats 62, 64, 66 roughly aligned with a rear pair of doors 68, is spaced from the first seating row 52, and the interior 12 further includes a luggage space 70 disposed between the second seating row 60 and a tailgate 72.
As mentioned above, input signals are received by the processor 30 from input hardware 22. The input hardware 22 comprises one or more of a plurality of vehicle mounted devices such as a satellite navigation system 34, an infotainment system 33, a plurality of sensors 74 and a plurality of microphones 76. In alternative embodiments the satellite navigation and infotainment systems 34, 33 may be incorporated into the on-board computer 18 or may be external to the system 14, such as within a mobile phone or other portable device.
In practice, a sensor may comprise a single sensing element or a plurality of sensing elements configured to measure a parameter. The plurality of sensors 74 are configured to take measurements relating to operational parameters or states of the vehicle 10 and to send an electrical input signal or signals representative of each measurement to the processor 30. Typical examples of sensors 74 are speedometers, tachometers, indicator sensors that sense the current status of the indicator (i.e. on or off), and fuel level sensors or gauges.
The microphones 76 are dispersed within the interior 12 of the vehicle 10 and detect sound within the interior 12, being optimally positioned within the vehicle interior 12 to ensure adequate coverage of the volume of the interior 12. Some or all of the microphones 76 may be arranged into one or more microphone arrays, with an array comprising at least two microphones positioned close to a predetermined position. A microphone or microphone array 78 is positioned above each seat 56, 58, 62, 64, 66 to specifically pick up sound in the vicinity of that seat 56, 58, 62, 64, 66 in the interior 12 of the vehicle 10. At least two microphones 80, 82 are placed to register ambient sound within the interior 12 of the vehicle 10; in this case one is located above the dashboard 48 and one is located above the luggage space 70.
The processor 30 is in communication with the exchange module 26, either directly or via the amplifier 38. The exchange module 26 is capable of detecting, distinguishing between and communicating with, audio output hardware including headphones 86 or hearing assistance devices 88. It is envisaged that this hardware 86, 88 is enabled for wireless communication and connects or ‘pairs’ with the exchange module 26 using a wireless communication technology such as a suitable Bluetooth protocol. However, it would be possible to implement the system 14 using a physical connection between the listed output hardware 86, 88 and the exchange module 26 without departing from the scope of the invention.
In the depicted arrangement, the exchange module 26 is incorporated within the on-board computer 18. However, the exchange module 26 and the on-board computer 18 may be separate modules. Additionally, the amplifier 38 may be separate from the on-board computer 18. Alternatively, the exchange module 26 or amplifier 38 may form part of the processor 30.
The exchange module 26 is capable of transmitting electrical signals corresponding to audio to the audio output hardware 86, 88, 40, where they are converted to audio signals. For example, the exchange module 26 is capable of transmitting audio signals to a plurality of loudspeaker arrays dispersed throughout the vehicle 10. The term loudspeaker array may refer to a plurality of loudspeakers 40 of differing sizes, typically a trio of woofer, tweeter and sub-woofer, or to a single loudspeaker.
As shown in Figure 1, a pair of loudspeaker arrays 90, 92 is disposed in the front pair of doors 54 of the vehicle 10 and a pair of loudspeaker arrays 94, 96 is disposed in the rear pair of doors 68 of the vehicle 10. Each pair of loudspeaker arrays 90, 92 and 94, 96, flanks a respective seating row 52, 60. At least one loudspeaker array 98 is disposed in the luggage space 70 and at least one loudspeaker array 100 is disposed at the front end of the vehicle 10 in the vicinity of the dashboard 48.
In addition to these more general loudspeaker arrays, seat-specific arrays 102 are incorporated into headrests of the seats 56, 58, 62, 64, 66. These arrays 102 are each formed of two small loudspeakers 104, 106 disposed in the headrest of their respective seat 56, 58, 62, 64, 66 so that when a user is seated at least one loudspeaker 104, 106 is directed towards each of the user’s ears.
In operation, the system 14 operates according to the process 150 shown in Figure 3. For simplicity, the process 150 of Figure 3 is described only with reference to a system 14 having a single sensor. In practice, it is expected that the system 14 will comprise a plurality of sensors 74 and that the on-board computer 18 will receive multiple input signals simultaneously, all of which are processed according to the process 150 shown in Figure 3.
Initially, at step 152, an electrical input signal is received by the on-board computer 18 from a sensor, for example a speedometer measuring vehicle speed. The input signal received by the on-board computer 18 is analysed at the processor 30, and the information gathered by the analysis is compared with a respective criterion stored in the memory 36 of the on-board computer 18 at step 154. For example, for a speedometer sensor, the sensor sends input signals to the on-board computer 18 that are indicative of the instantaneous speed of the vehicle 10. The speed signal is compared at step 154 with the criterion for this sensor, which in this case is a threshold speed. The threshold speed may be a fixed threshold corresponding to the upper speed limit of the country in which the vehicle 10 is used. Alternatively, the threshold speed may be dictated by the satellite navigation system 34, being the speed limit on the road on which the vehicle 10 is travelling.
At step 156, the criterion is deemed to be met if the comparison shows that the signal received from the sensor indicates that the instantaneous speed of the vehicle 10 is above the threshold.
If the criterion is met 158 then an output signal is generated at step 160. If the criterion is not met 162, the process returns to the first step 152 and continues monitoring the input signals received from the sensor. It is envisaged that in some embodiments the user may specify whether or not an output signal is required when a criterion is met by a selection made at the HMI 20 when setting up the system 14.
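By way of illustration only, the monitoring loop of steps 152 to 162 could be sketched as below; the criterion functions, sensor names and alert identifiers are assumptions introduced for the example and do not appear in the patent.

```python
# Minimal sketch of the monitoring loop (steps 152 to 162), assuming a simple mapping
# from each sensor to a criterion function; every name here is illustrative only.
from typing import Callable, Dict, Optional

def over_speed(reading: float, limit_kph: float = 112.0) -> bool:
    """Criterion for a speedometer input: true if the vehicle exceeds the limit."""
    return reading > limit_kph

def indicator_on(reading: float) -> bool:
    """Criterion for an indicator sensor: any non-zero reading means 'operated'."""
    return reading != 0

CRITERIA: Dict[str, Callable[[float], bool]] = {
    "speedometer": over_speed,
    "indicator": indicator_on,
}

ALERTS = {"speedometer": "overspeed_beep", "indicator": "indicator_tick"}

def process_input(sensor: str, reading: float) -> Optional[str]:
    """Return an output-signal identifier if the sensor's criterion is met (step 158);
    otherwise return None so the loop simply keeps monitoring (step 162)."""
    criterion = CRITERIA.get(sensor)
    if criterion is not None and criterion(reading):
        return ALERTS[sensor]          # step 160: render an output signal for this sensor
    return None

if __name__ == "__main__":
    for sensor, reading in [("speedometer", 95.0), ("speedometer", 120.0), ("indicator", 1.0)]:
        print(sensor, reading, "->", process_input(sensor, reading))
```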
In the case that an output signal is generated 160, the processor 30 renders an electrical output signal, in dependence on the input signal, the criterion that is met, and the sensor from which the input signal has been received. For example, the processor 30 may render an output signal that, when converted to an audio signal, is a beeping sound to indicate excessive speed.
Where the sensor is an indicator sensor, the input signal contains information as to whether either of the left or right indicator switches has been operated or not. If one of the indicator switches has been operated, the criterion at step 156 is met 158. The sensor may not continually send information to the on-board computer 18, but rather only send a signal if the indicator is on. In the event that an indicator signal is detected (i.e. the criterion is met) 158, the output signal generated at step 160 may be a familiar ticking sound of an operative indicator.
The processor 30 designates 164 the relevant output locations to which the output signal is to be sent according to user settings that have been specified. The specification of user settings will be considered below in relation to Figures 5 to 7. The output locations may comprise any combination of output hardware 24 including the hearing assistance device or a plurality of hearing assistance devices 88, and/or the loudspeakers 40. When multiple output signals are to be sent to an output location such as the output hardware 24, the process would include a step (not shown) whereby the signals are mixed accordingly, or certain signals may be delayed for maximum impact.
The output signal is amplified 166 at the amplifier 38, before being transmitted 168 to the relevant output locations by the exchange module 26. Transmission 168 of an output signal to the hearing assistance device(s) 88 and/or the loudspeakers 40 will result in the conversion of the output signal to an audio alert. Optionally, an audio alert output signal is transmitted to a hearing assistance device 88 while the ambient noise audio signal from the microphones 76 is used to create a signal that is the inverse of the ambient noise at the loudspeakers 40 in order to attenuate noise within the interior 12 by the mechanism of destructive interference. This further aids the user in hearing the signal clearly by improving the signal to noise ratio.
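The following sketch, again purely illustrative, shows one way the amplification and noise-attenuation steps described above might be combined: the alert is amplified for the hearing assistance device while an inverted copy of the ambient microphone signal is prepared for the loudspeakers. The gain value and function names are assumed, and a practical active noise control path would also need delay compensation and filtering.

```python
# Illustrative only: route an amplified alert to the hearing assistance device while
# preparing an inverted copy of the ambient microphone signal for the loudspeakers
# (destructive interference). The gain and function names are assumptions.
from typing import List, Tuple

def amplify(samples: List[float], gain: float) -> List[float]:
    return [gain * s for s in samples]

def anti_noise(ambient_samples: List[float]) -> List[float]:
    """Phase-inverted ambient signal intended to attenuate cabin noise."""
    return [-s for s in ambient_samples]

def route_alert(alert: List[float], ambient: List[float],
                gain: float = 2.0) -> Tuple[List[float], List[float]]:
    to_hearing_aid = amplify(alert, gain)    # amplification (166) then transmission (168)
    to_loudspeakers = anti_noise(ambient)    # noise attenuation within the interior
    return to_hearing_aid, to_loudspeakers

if __name__ == "__main__":
    aid, speakers = route_alert([0.1, -0.2, 0.3], [0.05, 0.02, -0.04])
    print(aid, speakers)
```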
In some embodiments, the system 14 may also transmit 168 information from the infotainment system 33 and/or the satellite navigation system 34 directly to a user’s hearing assistance device 88. In some embodiments, sound registered by the microphones 76 within the interior of the vehicle 10 can be used to aid the user in hearing other users. The system may identify the position of a speaker by a threshold comparison of all microphones and amplify the audio signal input from the microphone having the highest signal input. The input from the microphone would then be sent to the hearing assistance device, or may be processed to filter out unwanted noise first. As will be discussed in more detail in relation to Figure 5, by specifying a zone in which they are seated, which may comprise their seating position, the hearing impaired user can tailor the system 14 to amplify ambient sound from all other locations in the vehicle 10, or from specific locations. It is also possible to amplify the voice of a speaker other than the user through the headrest arrays 102 of the user’s seat to improve the signal-to-noise ratio within the vehicle 10.
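A minimal sketch of the 'loudest microphone' selection just described is given below, assuming each zone supplies a short buffer of samples and using an RMS level for the threshold comparison; the zone names, threshold and buffer format are illustrative assumptions.

```python
# Sketch of selecting the zone whose microphone carries the strongest speech signal.
import math
from typing import Dict, List, Optional

def rms(samples: List[float]) -> float:
    return math.sqrt(sum(s * s for s in samples) / len(samples)) if samples else 0.0

def loudest_zone(mic_buffers: Dict[str, List[float]], threshold: float = 0.01) -> Optional[str]:
    """Return the zone whose microphone shows the highest level, if above the threshold."""
    zone, buf = max(mic_buffers.items(), key=lambda kv: rms(kv[1]))
    return zone if rms(buf) > threshold else None

if __name__ == "__main__":
    buffers = {"front-left": [0.01, -0.02], "rear-right": [0.2, -0.25, 0.18]}
    print(loudest_zone(buffers))   # expected: rear-right
```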
Figure 4 shows a start-up process 200 of the system 14. It is envisaged that the process 200 of Figure 4 will take place immediately upon activation of the vehicle 10 (i.e. the engine is turned on), although it would also be possible for the process 200 to be implemented when the vehicle 10 has not been activated. For example, the system 14 may implement the process 200 when a user is detected in the interior 12 of the vehicle 10.
At a first step 202 of the process 200 of Figure 4, the system 14 determines whether a user within the interior 12 of the vehicle 10 is wearing a wireless-enabled hearing assistance device 88. The exchange module 26 is configured to search for and detect hearing assistance devices 88, and upon detection of the hearing assistance devices 88, communicates the information about the detected devices 88 to the on-board computer 18. For example, Bluetooth enabled devices 88 have a default or user-chosen identifier associated with them, which can be displayed to the user to help with selection of the correct device 88.
Upon detection 204 of a hearing assistance device 88 and communication of the associated information, the on-board computer 18 determines 206 whether the hearing assistance device 88 has been connected with the hearing assistance system 14 before. This determination is made by comparison of the information received concerning the device 88 with information stored within the data storage memory 36 where a record of the device history is stored. If it is determined that the hearing assistance device 88 has been identified and connected previously 208, the exchange module 26 connects 210 automatically to the hearing assistance device 88 and the processor 30 implements 212 associated user settings that were specified when the device 88 was connected before.
If it is determined 206 that the detected hearing assistance device 88 has not been connected before 214, the system 14 invites 216 the user to connect or ‘pair’ the detected hearing assistance device 88 to the hearing assistance system 14. In this case, following connection 218 of the hearing assistance device 88 to the system 14, the system 14 invites 220 users to specify their preferred accessibility settings. This is performed via the HMI 20 and will be discussed in more detail later when considering Figures 5 to 7. Following specification of the user’s preferred settings, the settings are implemented 212.
If, at the first step 202 of the process 200 of Figure 4, no hearing assistance device 88 is detected 222 within a predetermined time period, 10 seconds for example, the processor 30 instructs the exchange module 26 to cease searching for hearing assistance devices 88 and determines 224 whether any settings have been specified before by comparison with the data storage memory 36. These are specifically settings where no hearing assistance device 88 is present. If settings have been specified before 226, the system 14 implements these settings 212. If no settings are found 228, the user is invited 220 to specify their preferred accessibility settings, preferences or characteristics via the HMI 20. The system 14 may also communicate to the user that a Bluetooth device 88 has not been found and/or a connection has not been established (if the data storage memory 36 contains information relating to a hearing assistance device 88), and subsequently prompt the user to enable Bluetooth and restart the process if they wish to connect a hearing assistance device 88 to the system 14. The user settings are implemented 212 if the user consequently sets new accessibility settings for the system 14.
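A hedged sketch of the start-up logic of process 200 follows; the device identifier, stored-settings lookup and HMI prompt below are hypothetical stand-ins for the exchange module 26, the data storage memory 36 and the HMI 20.

```python
# Illustrative start-up flow: reuse stored settings for a known device, invite pairing
# for a new device, and fall back to stored or newly specified no-device settings.
from typing import Dict, Optional

def pair_device(device_id: str) -> None:
    print(f"pairing with {device_id} (placeholder for steps 216/218)")

def prompt_user_for_settings() -> dict:
    # In the real system these would be gathered via the HMI screens of Figures 5 to 7.
    return {"zone": "driver", "left_ear_level": None, "right_ear_level": None}

def start_up(detected_device: Optional[str],
             known_settings: Dict[str, dict],
             no_device_settings: Optional[dict]) -> dict:
    if detected_device is None:                  # step 222: nothing found within the time limit
        if no_device_settings is not None:       # step 226: previously stored no-device settings
            return no_device_settings            # step 212
        return prompt_user_for_settings()        # step 220
    if detected_device in known_settings:        # steps 206/208: device connected before
        return known_settings[detected_device]   # step 212: implement the stored settings
    pair_device(detected_device)                 # steps 216/218: invite the user to pair
    return prompt_user_for_settings()            # step 220

if __name__ == "__main__":
    stored = {"aid-01": {"zone": "rear-right", "left_ear_level": 4, "right_ear_level": 1}}
    print(start_up("aid-01", stored, None))
    print(start_up(None, stored, None))
```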
It is envisaged that step 224 and step 202 may be implemented simultaneously. For example, if accessibility settings have been previously specified 226 and do not require a hearing assistance device 88 connection, then the process may implement 212 these settings immediately. If, following the implementation 212 of these settings, a hearing assistance device 88 is detected 204, the system 14 may communicate to the user via the HMI 20 that a hearing assistance device 88 has been detected and that the user should provide guidance on how the system 14 should proceed.
Alternatively, the system 14 may implement a ranking system to allow the user to rank configurations of the system 14 in order of preference. A configuration comprises information regarding the number of connections to hearing assistance devices 88, and which devices 88 to connect to, and information regarding groups of user settings. Groups of user settings may include which alert signals to send directly to the hearing assistance devices 88, and/or the level of reinforcement to be provided by the loudspeakers 40.
For example, the user may alter the rankings of the configurations so that upon activation, the system 14 firstly attempts to operate according to a first configuration, whereby the system 14 connects to two previously identified hearing assistance devices 88 and implements a first group of settings. If the first configuration cannot be achieved, the system 14 would then attempt to operate according to a second configuration, in which the system 14 connects to only one previously identified hearing assistance device 88 and implements a second group of settings. In the event that the system 14 cannot operate according to the first and second configurations, a third configuration would be operated. It is envisaged that the final configuration would be that the system 14 implements a third group of settings as no hearing assistance devices 88 are detected. By operating the system 14 using rankings, if the user is not using or wearing one or both of their hearing assistance devices 88 when using the vehicle 10, the system 14 automatically defaults to the first implementable group of settings.
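The ranking behaviour described above might be sketched as follows, assuming each configuration records the hearing assistance devices it requires; the class and field names are invented for the example.

```python
# Illustrative ranked-configuration fallback: try configurations in the user's order of
# preference and implement the first whose required devices are all present.
from typing import List, Set

class Configuration:
    def __init__(self, name: str, required_devices: Set[str], settings: dict):
        self.name = name
        self.required_devices = required_devices
        self.settings = settings

def select_configuration(ranked: List[Configuration], detected: Set[str]) -> Configuration:
    for config in ranked:
        if config.required_devices <= detected:   # every required device was detected
            return config
    return ranked[-1]                             # last configuration requires no devices

if __name__ == "__main__":
    ranked = [
        Configuration("both aids", {"left-aid", "right-aid"}, {"route": "aids"}),
        Configuration("one aid", {"left-aid"}, {"route": "aid plus loudspeakers"}),
        Configuration("no aids", set(), {"route": "loudspeakers only"}),
    ]
    print(select_configuration(ranked, detected={"left-aid"}).name)   # expected: one aid
```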
Figures 5 to 7 illustrate an example of how the specification of preferred user characteristics, preferences or settings at step 220 of Figure 4 may be realised on a touch-sensitive HMI 20. Figures 5 to 7 show consecutive screen arrangements 300, 400, 500 where the user is prompted to make selections to tailor the system 14 to their needs. It will be assumed here that one hearing assistance device 88 has been identified in the vehicle 10 by the activation process, that the one hearing assistance device 88 had not been previously connected with the system 14, and that the user has already connected the hearing assistance device 88 with the system 14 following invitation to do so at step 216.
Figure 5 illustrates an HMI screen 300 whereby the hearing impaired user is prompted to select a zone on a representation 302 of the vehicle 10. The zone may correspond, for example, to the user’s seating position within the vehicle. All screens in Figures 5 to 7 include a title portion 304, a user identifier 306, and three selection portions 308, 310, 312 that outline the selections made on previous screens as the user moves through the setup process. The selection portions in Figures 5 to 7 include the zone selection portion 308, the left ear selection portion 310, and the right ear selection portion 312, although many other selections may also be indicated here.
Once the user has used the input interface 46 of the HMI 20 to select their zone, the HMI 20 displays the second screen 400 shown in Figure 6 which relates to the level of hearing impairment of the user. It will be noticed that the zone selection portion 308 has been updated, here identifying that the rear-right zone has been selected.
In Figure 6, a screen 400 is presented to the user in which they are asked to specify the level of hearing impairment for each ear using a list 402 of predetermined hearing loss intervals measured in decibels (dB). It is expected that a user will know their hearing loss level, having been diagnosed appropriately. If the user is hearing impaired in one ear only, it is possible to not select a level of hearing loss for a particular ear. The system 14 would then default to settings for no hearing loss and no hearing assistance device 88 for that ear.
The user is also asked to specify with which ear their hearing assistance device 88 is used by selecting the correct option using selection boxes 404 shown to the left of each level list 402. As the case being considered is one where only a single hearing assistance device 88 has been identified and connected to the system 14, if the user selects that they use a hearing assistance device 88 in their left ear, then the system 14 will assume a default setting of no right ear hearing assistance device 88, unless otherwise indicated. If the user selects the hearing assistance device option for both ears to indicate that they use a hearing assistance device 88 in each ear, then the system 14 would communicate to the user that only a single hearing assistance device 88 has been connected to the system 14.
Following selection of the appropriate options in the screen 400 of Figure 6, the HMI 20 displays the screen 500 of Figure 7. The left ear selection portion 310 and right ear selection portion 312 have been updated to reflect the options chosen by the user at the screen 400 of Figure 6. The user has indicated that their hearing assistance device 88 is worn in their left ear and that their hearing loss in that ear is at level 4 (60-69dB hearing loss). The user has specified that they do not wear a hearing assistance device 88 in their right ear, and has a hearing loss of between 20dB and 39dB, level 1.
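Purely as an illustration, the selections gathered through the screens of Figures 5 to 7 could be held in a structure of the following shape; the field names and the integer level encoding are assumptions, not part of the patent.

```python
# Possible shape for the user settings collected via the HMI screens.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class EarProfile:
    loss_level: Optional[int] = None     # e.g. 1 for 20-39 dB, 4 for 60-69 dB, None for no loss
    uses_hearing_aid: bool = False

@dataclass
class UserSettings:
    zone: str = "driver"
    left: EarProfile = field(default_factory=EarProfile)
    right: EarProfile = field(default_factory=EarProfile)
    alerts_to_device: List[str] = field(default_factory=list)

if __name__ == "__main__":
    user = UserSettings(
        zone="rear-right",
        left=EarProfile(loss_level=4, uses_hearing_aid=True),
        right=EarProfile(loss_level=1, uses_hearing_aid=False),
        alerts_to_device=["indicator", "seat_belt", "navigation"],
    )
    print(user)
```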
At the screen 500 of Figure 7, the user is asked to select 502 which audio alerts or other sounds 504 they wish to hear through their hearing assistance device 88 during vehicle operation. This is by no means an exhaustive list of the potential alert audio that may be routed to the hearing assistance device 88, and the system 14 can be configured to transmit many different vehicle audio alerts to a connected hearing assistance device 88.
It should be noted that the HMI screen configurations 300, 400, 500 of Figures 5 to 7 are by no means a complete set but are just illustrative of how the implementation of settings may be achieved. Many other screen configurations are possible, and the settings that may be implemented vary highly. For example, the system 14 may be highly automated, with the user only required to choose their seating position or zone and their level of hearing impairment. The system 14 would then implement appropriate settings without any further user input.
In some embodiments, the user may choose to have their hearing tested by the vehicle 10 to ascertain a correct level selection. For example, in some embodiments, a user may instruct the hearing assistance system 14 via the HMI 20 to conduct an audiometric test. During the audiometric test the hearing assistance system 14 outputs a plurality of audible sounds having different frequencies and/or volume levels. The user listens to the plurality of audible sounds and responds to each via the HMI 20 to indicate how well they have heard each sound. In this manner an audiogram may be generated for each of the user’s ears. The audiogram is indicative of the user’s hearing acuity and/or impairment (specific to a particular ear) and is used by the hearing assistance system 14 to derive the user characteristics or settings to be implemented 212.
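A simplified sketch of such an overt audiometric test is shown below: tones are notionally presented at several frequencies and levels, the HMI response is read through a callback, and the quietest heard level per frequency forms a crude audiogram. The playback and response mechanisms are stubbed and all names are illustrative.

```python
# Illustrative overt audiometric test producing a frequency -> threshold mapping per ear.
from typing import Callable, Dict, List, Optional

def run_audiometric_test(ear: str,
                         frequencies_hz: List[int],
                         levels_db: List[int],
                         user_heard: Callable[[str, int, int], bool]) -> Dict[int, Optional[int]]:
    """Return a mapping frequency -> quietest level reported as heard (None if never heard)."""
    audiogram: Dict[int, Optional[int]] = {}
    for freq in frequencies_hz:
        threshold = None
        for level in sorted(levels_db):           # quietest level first
            if user_heard(ear, freq, level):
                threshold = level                 # hearing threshold at this frequency
                break
        audiogram[freq] = threshold
    return audiogram

if __name__ == "__main__":
    # Simulated listener who needs at least 40 dB at high frequencies in the left ear.
    sim = lambda ear, f, lvl: lvl >= (40 if (ear == "left" and f >= 2000) else 20)
    print(run_audiometric_test("left", [500, 1000, 2000, 4000], [10, 20, 30, 40, 50], sim))
```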
In a further embodiment, the hearing assistance system 14 may conduct a covert audiometric test. During such a covert audiometric test the hearing assistance system 14 outputs a plurality of audible commands at differing frequencies and/or volume levels in which the user is requested to operate a vehicle system. For example, the system 14 may request that the user change a vehicle setting, e.g. open/close a window, change a radio station, operate a heating or ventilation control etc. The hearing assistance system 14 monitors the vehicle systems for changes therein elicited by the audible command. If there is a correlation between the audible command and a change in a vehicle system then the hearing assistance system 14 makes a determination that the user has heard the audible command and responded thereto. If there is no change in a vehicle system in response to an audible command then the hearing assistance system determines that the user has not heard the audible command and infers a hearing impairment in relation to the given frequency and/or volume level at which the audible command was output. In this manner an audiogram may be generated for each of the user’s ears. The audiogram is indicative of the user’s hearing acuity and/or impairment (specific to a particular ear) and is used by the system 14 to derive the user characteristics or settings to be implemented 212.
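The covert test could, purely illustratively, be probed one command at a time as sketched below, where a change in the monitored vehicle setting within a timeout is taken as evidence that the command was heard; the playback and state-reading callbacks are hypothetical stand-ins for real vehicle interfaces.

```python
# Illustrative covert-test probe: correlate an audible command with a vehicle-state change.
import time
from typing import Callable

def covert_probe(play_command: Callable[[int, int], None],
                 read_setting: Callable[[], object],
                 freq_hz: int, level_db: int,
                 timeout_s: float = 10.0, poll_s: float = 0.5) -> bool:
    """Return True if the monitored setting changes after the command is played."""
    before = read_setting()
    play_command(freq_hz, level_db)              # e.g. a spoken request to open a window
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if read_setting() != before:             # change detected: the user heard the command
            return True
        time.sleep(poll_s)
    return False                                 # no change: infer impairment at this freq/level

if __name__ == "__main__":
    state = {"window": "closed"}
    def fake_play(freq: int, level: int) -> None:
        if level >= 40:                          # simulated occupant only hears louder prompts
            state["window"] = "open"
    print(covert_probe(fake_play, lambda: state["window"], 2000, 30, timeout_s=1, poll_s=0.2))
    print(covert_probe(fake_play, lambda: state["window"], 2000, 50, timeout_s=1, poll_s=0.2))
```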
An additional or alternative hearing test may be performed for users wearing a hearing assistance device such as a hearing aid or headphones. In this embodiment a microphone is provided within the hearing assistance device to enable otoacoustic emission tracking (OAE). By way of explanation, when sound waves hit the eardrum, causing the bones of the middle ear to send signals to the cochlea, the vibrations create faint, low-intensity sounds that are sent back out through the ear canal.
In this embodiment a plurality of audible sounds having different frequencies and/or volume levels are output to the user’s ear from the hearing assistance device. The microphone within the hearing assistance device measures each of the plurality of audible sounds entering and returning from the ear and the hearing assistance device (optionally in combination with the hearing assistance system 14) determines how well each of the plurality of audible sounds was actually heard. In this manner an audiogram may be generated for each of the user’s ears. The audiogram is indicative of the user’s hearing acuity and/or impairment (specific to a particular ear) and is used by the hearing assistance system 14 to derive the user characteristics or settings to be implemented 212.
In other embodiments, the user may be asked to specify other users in the vehicle 10 and their level of hearing impairment, the system 14 then tailoring the settings within the interior 12 of the vehicle 10 to multiple users. In doing this, the system 14 will therefore be able to transmit differently tailored alerts and combinations of information to each individual hearing impaired user (who may or may not be using hearing assistance devices 88), or to users who do not have a hearing impairment.
In other embodiments, other screens may be incorporated into the user set-up procedure whereby the user is asked to select which alerts should be sent to the loudspeakers 40.
Examples of alerts or other sounds that may be transmitted to hearing assistance devices 88 or reinforced through the loudspeakers 40 by the system 14 are: indicator tones; seat belt warnings; parking aid tones to indicate the proximity of the vehicle 10 to exterior objects; instructions from the satellite navigation system 34; signals from the infotainment system 33 such as music or radio; operational vehicle tones for alerting the user to a particular fault within the engine; phone alerts and phone conversations; conversational assist, where the system 14 uses the microphones 76, a seat identification system (not shown) and beamforming techniques to identify a zone corresponding to a microphone 76 or microphones 76 in which a speaking occupant is positioned and to reduce ambient noise and increase the volume level of the speaking occupant’s speech; confirmation tones to alert the user that a certain setting has been activated; text narration; low fuel warnings; and low charge warnings for electric vehicles.
The system 14 may also continue to be connected to the user’s hearing assistance device 88 after the user has exited the vehicle 10 up to a certain time limit or distance from the vehicle 10. If the user walks 20m away from the vehicle 10, for example, the system 14 may disconnect from the hearing assistance device 88. By doing this, the system 14 is therefore able to continue to provide audio alerts to the user, such as that the vehicle 10 has been locked, that the lights have been left on, or that the alarm is sounding.
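A small, assumption-laden sketch of such a disconnect policy is given below; the 20 m figure echoes the example above, while the time limit and coordinate handling are invented for the illustration.

```python
# Illustrative post-exit disconnect policy based on distance from the vehicle and time.
import math
from typing import Tuple

def should_disconnect(user_xy: Tuple[float, float], vehicle_xy: Tuple[float, float],
                      seconds_since_exit: float,
                      max_distance_m: float = 20.0, max_seconds: float = 300.0) -> bool:
    distance = math.dist(user_xy, vehicle_xy)
    return distance > max_distance_m or seconds_since_exit > max_seconds

if __name__ == "__main__":
    print(should_disconnect((5.0, 3.0), (0.0, 0.0), 60.0))    # close to the vehicle: stay connected
    print(should_disconnect((25.0, 0.0), (0.0, 0.0), 60.0))   # beyond 20 m: disconnect
```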
Additionally, operative noises made by the vehicle 10 may be transmitted to the user’s hearing assistance device 88 or sounded through the loudspeakers 40 to aid in the user’s hearing of these sounds. For example, a digital reproduction of the engine noise may be transmitted to the hearing assistance device 88 to aid the user in operation of the vehicle 10 and to prevent the user from over-revving the engine or to prevent stalling. Not only does this improve the driving experience, but it also improves the driving skill of the user.
The user may connect a hearing assistance device 88 to the exchange module 26 via a mobile device such as a mobile phone. Optionally, the mobile phone may then be used in place of the HMI 20.
Alternatively, the system 14 may include variable audio alerts. The audio alerts may vary in volume, by altering the amplitude of the output signal, or may vary in the sound that is heard by the user, by altering the waveform of the output signal. An example of this is that a user may request that the alerts are spoken so that the reason for the alert is immediately obvious. This is made possible because the alert is transmitted directly to the hearing assistance device 88 of the user, ensuring that losses between the system 14 and the user are minimised.
Furthermore, the system 14 may be varied to further aid the user in hearing the alert. Some users do not have broadband impairment, but may have hearing loss that is particularly prevalent at certain frequencies or across frequency bands. Therefore, the user may not ordinarily be able to hear the alert as its audio component is in a band that is inaudible to the user. Even if this alert is then transmitted to a hearing assistance device 88, the user may not hear the alert clearly. Therefore, the system 14 is able to vary the frequency content or individual frequency components of the audio alerts to aid the user further. The user would have to specify this during the settings selection process.
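One possible, and purely illustrative, way to realise such frequency-dependent adjustment is to apply per-band gains derived from the stored audiogram, as sketched below using a plain FFT; the band edges, gain values and use of NumPy are assumptions, and a production system would use proper filter design.

```python
# Illustrative per-band boost of an alert in bands where the audiogram indicates loss.
import numpy as np

def adapt_alert(samples: np.ndarray, sample_rate: int, band_gains: dict) -> np.ndarray:
    """band_gains maps (low_hz, high_hz) -> linear gain derived from the audiogram."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    for (low, high), gain in band_gains.items():
        mask = (freqs >= low) & (freqs < high)
        spectrum[mask] *= gain                    # emphasise bands the user struggles to hear
    return np.fft.irfft(spectrum, n=len(samples))

if __name__ == "__main__":
    rate = 16000
    t = np.arange(rate) / rate
    alert = 0.5 * np.sin(2 * np.pi * 3000 * t)    # a 3 kHz alert tone
    boosted = adapt_alert(alert, rate, {(2000.0, 4000.0): 2.0})
    print(round(float(np.max(np.abs(boosted))), 2))   # roughly doubled amplitude (~1.0)
```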
In an alternative embodiment, the vehicle is operable in an autonomous mode, and therefore comprises an autonomous driving control module that controls the vehicle during the autonomous mode. When the vehicle is operating in the autonomous mode, it may be necessary for the autonomous driving control module to request that the user retake control of the vehicle. This may occur, for example, so that the user can guide the vehicle into and out of motorway driving or when an emergency vehicle is detected. Alternatively, a request for user input may occur following a malfunction of a vehicle component that is necessary for the operation of the vehicle in the autonomous mode. In this case, it is necessary for the system to be able to communicate the request to the user, and so electrical output signals may be generated and communicated to the user’s hearing assistance device in accordance with the request from the autonomous driving control module.
Many modifications may be made to the above examples without departing from the scope of the invention as defined in the accompanying claims.

Claims (23)

1. A system for a vehicle for providing improved audio for a hearing impaired user, the system comprising:
an input configured to receive electrical input signals from at least one vehicle mounted device, wherein the electrical input signals are representative of an operative state of the vehicle;
a user input module configured to enable user input of at least one user characteristic; and
a processor configured to generate electrical output signals corresponding to audio and to transmit the electrical output signals to at least one audio output device for outputting the audio to the user, wherein the generation of the electrical output signals is dependent upon the at least one user characteristic.
2. The system as claimed in claim 1, wherein the processor is configured to determine a hearing impairment level for the user.
3. The system as claimed in claim 2, wherein the processor is configured to determine the hearing impairment level automatically.
4. The system as claimed in claim 2 or claim 3, wherein the user input module is configured to enable user input of a hearing impairment level for the user.
5. The system as claimed in any of claims 2 to 4, wherein the hearing impairment level comprises a first hearing impairment level and a second hearing impairment level corresponding to a left ear and a right ear of the user, respectively.
6. The system as claimed in any of claims 2 to 5, wherein the processor is configured to generate electrical output signals in dependence on the hearing impairment level.
7. The system as claimed in any of claims 1 to 6, wherein the processor is configured to transmit the electrical output signals in dependence upon the user characteristic.
8. The system as claimed in claim 7, wherein the user input module is configured to enable user selection of the at least one audio output device.
9. The system as claimed in claim 7 or claim 8, wherein the processor is configured to transmit the electrical output signals to a hearing assistance device and/or a loudspeaker.
10. The system as claimed in any of claims 7 to 9, wherein the processor is configured to transmit the electrical output signals to at least two hearing assistance devices and/or at least two loudspeakers.
11. The system as claimed in any of claims 7 to 10, wherein the user input module is configured to enable user selection of a zone for the hearing impaired user, and wherein the system is configured to transmit electrical output signals in dependence on the selected zone.
12. The system as claimed in claim 11, wherein the user input module is configured to enable user selection of a zone for a passenger within the vehicle, and wherein the processor is configured to transmit electrical output signals corresponding to signals received from a microphone associated with the selected zone.
13. The system as claimed in any of claims 1 to 12, wherein the user characteristic relates to at least one feature of the electrical output signal.
14. The system as claimed in claim 13, wherein the feature comprises a waveform.
15. The system as claimed in claim 13 or claim 14, wherein the feature comprises an amplitude.
16. The system as claimed in any of claims 13 to 15, wherein the feature comprises frequency content.
17. The system as claimed in any of claims 1 to 16, wherein the user input module comprises an infotainment system of the vehicle.
18. A vehicle comprising the system as claimed in any of claims 1 to 17.
19. A processor for use in the system as claimed in any of claims 1 to 17, or for use in the vehicle as claimed in claim 18.
20. A method of providing improved audio for a hearing impaired user in a vehicle, the method comprising:
receiving from at least one vehicle mounted device electrical signals representative of an operative state of the vehicle;
receiving from a user input module electrical signals representative of at least one user characteristic;
processing the received electrical signals representative of the operative state of the vehicle in dependence on the received electrical signals representative of the at least one user characteristic;
generating electrical output signals corresponding to audio in dependence on said processed electrical signals;
transmitting the electrical output signals to at least one audio output device (24) for outputting the audio to the user.
21. A media device storing computer program code adapted, when loaded into or run on a computer or processor, to cause the computer or processor to become a system according to any of claims 1 to 17.
22. A media device storing computer program code adapted, when loaded into or run on a computer or processor, to cause the computer or processor to perform the method of claim 20.
23. A system, a processor, a method or a vehicle substantially as herein described, with reference to the accompanying figures.
GB201620115A 2016-11-29 2016-11-29 Improvements relating to hearing assistance in vehicles Active GB2557178B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
GB201620115A GB2557178B (en) 2016-11-29 2016-11-29 Improvements relating to hearing assistance in vehicles
PCT/EP2017/077978 WO2018099677A1 (en) 2016-11-29 2017-11-01 Improvements relating to hearing assistance in vehicles
US16/349,911 US20200066070A1 (en) 2016-11-29 2017-11-01 Improvements relating to hearing assistance in vehicles

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB201620115A GB2557178B (en) 2016-11-29 2016-11-29 Improvements relating to hearing assistance in vehicles

Publications (3)

Publication Number Publication Date
GB201620115D0 GB201620115D0 (en) 2017-01-11
GB2557178A true GB2557178A (en) 2018-06-20
GB2557178B GB2557178B (en) 2020-01-01

Family

ID=58073223

Family Applications (1)

Application Number Title Priority Date Filing Date
GB201620115A Active GB2557178B (en) 2016-11-29 2016-11-29 Improvements relating to hearing assistance in vehicles

Country Status (1)

Country Link
GB (1) GB2557178B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050086058A1 (en) * 2000-03-03 2005-04-21 Lemeson Medical, Education & Research System and method for enhancing speech intelligibility for the hearing impaired
WO2008015293A2 (en) * 2007-09-27 2008-02-07 Phonak Ag Method for operating a hearing device and corresponding hearing system and arrangement
US20150365771A1 (en) * 2014-06-11 2015-12-17 GM Global Technology Operations LLC Vehicle communiation with a hearing aid device
DE102014218065A1 (en) * 2014-09-10 2016-03-24 Volkswagen Aktiengesellschaft Adaptation of acoustic parameters of a driver assistance system of a motor vehicle

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019208206A1 (en) * 2019-06-05 2020-12-10 Audi Ag Method and system for providing an artificially generated driving noise of a motor vehicle
US11030863B2 (en) 2019-10-02 2021-06-08 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for providing audio information in a vehicle
WO2023179966A1 (en) * 2022-03-22 2023-09-28 Schuster Oliver Odysseus Audio system

Also Published As

Publication number Publication date
GB201620115D0 (en) 2017-01-11
GB2557178B (en) 2020-01-01

Similar Documents

Publication Publication Date Title
US20200066070A1 (en) Improvements relating to hearing assistance in vehicles
CN107396249B (en) System for providing occupant-specific acoustic functions in a transportation vehicle
RU2722106C2 (en) System and method of individual sound insulation in acoustic zones of vehicle
US9743213B2 (en) Enhanced auditory experience in shared acoustic space
EP2611213B1 (en) Sound system with individual playback zones
GB2557178A (en) Improvements relating to hearing assistance in vehicles
US20060023890A1 (en) Sound field controller and method for controlling sound field
JP2022516058A (en) Hybrid in-car speaker and headphone-based acoustic augmented reality system
CN115769601A (en) Method for outputting a user-specific acoustic signal with an output unit, computer program product and electronic signal processing system
CN114194128A (en) Vehicle volume control method, vehicle, and storage medium
GB2557177A (en) Improvements relating to hearing assistance in vehicles
KR101673787B1 (en) Sound output system for vehicle
KR101575437B1 (en) An alarm sound controlling method using the internal speakers of the vehicle and an apparatus for this
KR20210110599A (en) In-car Headphone Acoustic Augmented Reality System
US10552117B1 (en) Vehicle audio settings management
US20230254654A1 (en) Audio control in vehicle cabin
US11974103B2 (en) In-car headphone acoustical augmented reality system
US11950090B2 (en) In-car adaptive sound quality output method, device, storage medium and car audio system
EP4114043A1 (en) System and method for controlling output sound in a listening environment
KR20060030296A (en) Apparatus and method for automatically controlling balance of audio system for vehicle
JP2017147702A (en) Sound field controller, sound field control system, and sound field control method
CN115426585A (en) Sound alarm control method and system for automobile cabin
WO2024003927A1 (en) Vehicle status communication system and method
CN113496694A (en) Vehicle acoustic system, vehicle seat and vehicle
CN114767097A (en) Vehicle-mounted hearing test