EP3236673A1 - Adjusting a hearing aid based on user interaction scenarios - Google Patents

Adjusting a hearing aid based on user interaction scenarios

Info

Publication number
EP3236673A1
EP3236673A1 (application EP16165711.9A)
Authority
EP
European Patent Office
Prior art keywords
hearing aid
user
user interaction
hearing
scenario
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP16165711.9A
Other languages
German (de)
English (en)
Inventor
Philipp Schneider
Francois JULITA
Aliaksei TSITOVICH
Tim Roth
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sonova Holding AG
Original Assignee
Sonova AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sonova AG
Priority application: EP16165711.9A
Publication of EP3236673A1
Current legal status: Withdrawn

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04R - LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R 25/00 - Deaf-aid sets, i.e. electro-acoustic or electro-mechanical hearing aids; Electric tinnitus maskers providing an auditory perception
    • H04R 25/70 - Adaptation of deaf aid to hearing loss, e.g. initial electronic fitting
    • H04R 2225/00 - Details of deaf aids covered by H04R25/00, not provided for in any of its subgroups
    • H04R 2225/41 - Detection or adaptation of hearing aid parameters or programs to listening situation, e.g. pub, forest
    • H04R 2225/55 - Communication between hearing aids and external devices via a network for data exchange

Definitions

  • the invention relates to a method, a computer program and a computer-readable medium for adjusting a hearing aid. Furthermore, the invention relates to a fitting system for adjusting a hearing aid.
  • Conventionally, a hearing aid is fitted in fitting rooms located in a shop or in an audiologist's office.
  • the fitting process may be a lengthy, complex process, supported by specific tools, primarily designed for highly trained hearing care professionals.
  • WO 2015/009564 A1 relates to an online hearing and fitting system for non-expert users, who can interactively adjust a hearing aid based on acoustic test signals.
  • WO 2013/117214 A1 relates to a hearing aid fitting system adapted for remote fitting of a hearing aid system.
  • WO 00/22874 A1 relates to a fitting system for hearing devices with a mobile telephone as input device.
  • EP 1 353 530 A1 relates to a system for training of hearing aid users, which comprises training units and individual user profiles stored in a database.
  • Objectives of the invention are to simplify the fitting process of a hearing aid, to provide a fitting process that may be performed by an untrained user of a hearing aid, and to provide a fitting process that results in a more individual adjustment of the hearing aid.
  • a hearing aid may be a device, which is adapted for being carried by a user at least partially in or on the ear.
  • a hearing aid also may be a Cochlear implant device, with parts implanted inside the head.
  • the hearing aid may be adapted for processing sound signals based on sound processing parameters stored in the hearing aid, for example such that hearing deficiencies of a user of the hearing aid are compensated.
  • the sound signals may be generated by a microphone of the hearing aid and/or may be received via another input of the hearing aid such as a T-coil or other interface, like a radio receiver.
  • the sound processing parameters, which may be stored in a memory of the hearing aid, may be parameters for a frequency dependent amplifier of the hearing aid and/or may encode how the sound signal is converted into a signal provided to the hearing sense of the user.
  • the method comprises: creating a user profile of the hearing aid user in a server device, the user profile comprising at least a user identification and hearing aid information specifying the hearing aid; and preparing a list of user interaction scenarios in the server device, based on the user profile.
  • the user profile may be created by a hearing care professional with a fitting application executed in a fitting device. This may be performed during a first fitting process of the hearing aid, in which also the list of user interaction scenarios may be defined, which may be supported with a fitting application.
  • the user profile may be a data structure that may be stored in a database provided by the server device.
  • the fitting device may be interconnected via Internet with the server device and the fitting application may cause the server device to create (automatically) the user profile and/or the list of user interaction scenarios in the database.
  • the user profile may contain information about the user of the hearing aid and about the hearing aid itself. For example, a unique user identification may be used for identifying the data stored for the user in the database.
  • the hearing aid information may comprise a type number and/or serial number of the hearing aid, a list of features of the hearing aid, and/or configuration data of the hearing aid, such as the actual sound processing parameters currently stored in the hearing aid.
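As an illustration of such a user profile, the following sketch models it as a small data structure; the class and field names are hypothetical and not taken from the patent.

```python
# Hypothetical sketch of the user profile described above; all names
# are illustrative, not part of the patent.
from dataclasses import dataclass, field


@dataclass
class HearingAidInfo:
    type_number: str                      # e.g. a model identifier
    serial_number: str
    features: list                        # list of feature names
    sound_processing_parameters: dict = field(default_factory=dict)


@dataclass
class UserProfile:
    user_id: str                          # unique identification in the database
    hearing_aid: HearingAidInfo


profile = UserProfile(
    user_id="user-0001",
    hearing_aid=HearingAidInfo("HA-200", "SN12345", ["t_coil"]),
)
```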
  • a user interaction scenario also may be represented with a data structure stored in the database in the server device.
  • a user interaction scenario may comprise a data structure, in which information relating to a scenario and/or situation, in which the user may interact with the hearing aid, is stored.
  • a user interaction scenario may be seen as a container for information relating to a specific task the user should perform with the hearing aid, and/or for information about the functionality of the hearing aid that is used, changed, adjusted, etc. during this task.
  • a user interaction scenario may comprise information about a specific place, typical location and/or situation (such as a restaurant), how the hearing aid may be used there and/or how the hearing aid functionality may be improved there.
  • the list of user interaction scenarios may be stored in the server device. This list may be prepared with the fitting application and/or may be automatically generated by the server device and/or the fitting device. Every user may have an individual list of user interaction scenarios.
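A user interaction scenario could be sketched as a similar container; the fields below (situation, handling information, associated adjustment parameters) follow the description above, but their names and values are invented for illustration.

```python
# Hypothetical sketch of a user interaction scenario data structure;
# field names and the example content are illustrative only.
from dataclasses import dataclass, field


@dataclass
class UserInteractionScenario:
    scenario_id: str
    situation: str                  # e.g. "restaurant"
    handling_info: str              # guidance displayed to the user
    adjustment_parameters: list = field(default_factory=list)


# An individual list of scenarios prepared for one user:
scenario_list = [
    UserInteractionScenario(
        scenario_id="restaurant-noise",
        situation="restaurant",
        handling_info="Adjust directivity while facing your conversation partner.",
        adjustment_parameters=["directivity", "volume"],
    ),
]
```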
  • the server device need not be a single device but also may be provided by a system of devices, for example, a cloud computing facility.
  • the fitting device may be a PC or other computing device located at the site of the hearing aid professional.
  • the method comprises: selecting a user interaction scenario from the list of user interaction scenarios with a mobile device of the user communicatively interconnected with the server device; determining one or more hearing aid adjustment parameters based on the selected user interaction scenario; and displaying one or more input controls for adjusting the determined one or more hearing aid adjustment parameters with the mobile device.
  • Communicatively interconnected may mean that the respective two devices are adapted to exchange data with each other with a communication protocol.
  • the selection of the user interaction scenario and/or the adjustment of the one or more hearing aid adjustment parameters may be performed with an application executed in the mobile device and/or with a web browser.
  • the user may select the corresponding user interaction scenario manually. It also may be possible that the user interaction scenario is automatically selected, for example based on timing information, a timer and/or by automatically detecting the situation and/or place.
  • the "mobile" application may be downloaded from the server device or an app store into the mobile device. It also may be possible that the application is a web based application mainly executed in the server device, with only a user interface presented on the mobile device, for example in a web browser.
  • the mobile device may be connected via Internet with the server device and the application may communicate with the server device via Internet. For example, the application may send the user identification to the server device and the server device may send the list of user interaction scenarios to the application.
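This exchange can be sketched as follows; the in-memory dictionary stands in for the real server database and Internet connection, and all identifiers are invented.

```python
# Illustrative sketch of the exchange described above: the mobile
# application sends the user identification and the server replies
# with that user's list of user interaction scenarios. The dictionary
# below is a stand-in for the server-side database.
SERVER_DB = {
    "user-0001": ["restaurant-noise", "tv-streaming"],
}


def request_scenarios(user_id: str) -> list:
    """Mobile-side call; a real client would issue a network request."""
    return SERVER_DB.get(user_id, [])
```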
  • the mobile device determines hearing aid adjustment parameters associated with the user interaction scenario.
  • Hearing aid adjustment parameters have to be distinguished from the sound processing parameters.
  • Hearing aid adjustment parameters may encode how the user may adjust the hearing aid, whereas sound processing parameters encode how the hearing aid (and in particular its amplifier and/or further elements inside the hearing aid) processes the input sound data into output sound data.
  • the one or more hearing aid adjustment parameters of a specific user interaction scenario may be stored in the specific user interaction scenario.
  • the mobile device may display an input control for the hearing aid adjustment parameter.
  • Every hearing aid adjustment parameter may be of a specific type (such as yes/no, range, list of values, etc.) and based on this type, the input control (such as a toggle button, a slider, a selection list, etc.) may be selected and/or presented. It also may be possible that the type of input control to be used is stored together with the corresponding hearing aid adjustment parameter.
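The type-to-control mapping described above can be sketched directly; the concrete type and control names below follow the examples in the text but are otherwise illustrative.

```python
# Sketch of selecting an input control from the parameter type,
# following the yes/no -> toggle, range -> slider, list -> selection
# mapping described above (names are illustrative).
CONTROL_FOR_TYPE = {
    "yes_no": "toggle_button",
    "range": "slider",
    "list": "selection_list",
}


def input_control_for(parameter_type: str) -> str:
    """Return the UI control used to adjust a parameter of this type."""
    return CONTROL_FOR_TYPE[parameter_type]
```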
  • the user may have the option to adjust the one or more hearing aid adjustment parameters associated with a user interaction scenario, when he is in the situation and/or at the place, which is specified in the user interaction scenario.
  • the method comprises: after adjustment of at least one of the hearing aid adjustment parameters with the one or more input controls: deriving sound processing parameters for the hearing aid based on the adjusted one or more hearing aid adjustment parameters; and optionally transferring and/or applying the derived sound processing parameters in the hearing aid, such that the hearing aid is adapted for generating optimized sound signals based on the applied sound processing parameters.
  • sound processing parameters are derived from the changed/adjusted one or more hearing aid adjustment parameters.
  • the adjusted hearing aid parameters may be transmitted to the server device, which based on the hearing aid information in the user profile, determines the corresponding sound processing parameters.
  • the sound processing parameters are determined by the mobile device.
  • the sound processing parameters stored in the hearing aid may be retrieved from the hearing aid, altered based on the one or more adjusted hearing aid adjustment parameters (in the mobile device and/or the server device) and may be stored back to the hearing aid.
  • the one or more adjusted hearing aid adjustment parameters are sent to the hearing aid, which itself derives the corresponding sound processing parameters.
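A minimal sketch of this derive-and-apply step is given below; the rule mapping a "volume" adjustment to per-band gains is invented purely for illustration.

```python
# Minimal sketch of deriving sound processing parameters from adjusted
# hearing aid adjustment parameters and applying them; the derivation
# rule and all names are assumptions, not the patent's method.
def derive_sound_processing_parameters(adjustments: dict) -> dict:
    params = {}
    if "volume" in adjustments:
        # Assumed rule: one volume step raises every band gain by 2 dB.
        params["band_gains_db"] = [20 + 2 * adjustments["volume"]] * 4
    return params


def apply_to_hearing_aid(hearing_aid: dict, params: dict) -> None:
    # Stands in for transferring the parameters to the device,
    # e.g. via a wireless connection.
    hearing_aid["sound_processing_parameters"] = params


aid = {}
apply_to_hearing_aid(aid, derive_sound_processing_parameters({"volume": 2}))
```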
  • the hearing aid may be fitted by a user in specific situations and/or at specific places without the need to adjust complicated sound processing parameters.
  • the user simply may adjust some input controls associated with the specific situation and/or place and/or may be guided by the mobile device.
  • the user can directly influence the hearing aid and receives direct feedback about the changed/adjusted behaviour of the hearing aid in a specific situation and/or at a specific place.
  • the user profile comprises first additional information, namely diagnostic information specifying hearing deficiencies of the user.
  • diagnostic information may comprise, for example, a hearing profile of the user and/or characteristics of a tinnitus of the user.
  • the diagnostic information may be used for at least partially automatically creating the list of user interaction scenarios.
  • the user profile comprises second additional information, namely lifestyle information specifying hearing scenarios to which the user is exposed.
  • the lifestyle information may comprise information on the gender, the age, the family situation, etc. of the user.
  • the lifestyle information also may specify to which situations and/or places a user may be exposed (such as a church, a restaurant, the inside of a car, TV watching, etc.).
  • the lifestyle information may be input by the hearing care specialist into the fitting application, which then updates the user profile in the server device.
  • the lifestyle information may be used for at least partially automatically creating the list of user interaction scenarios.
  • user profiles of a plurality of users are stored in the server device.
  • one server device may administrate a plurality of users and their data.
  • the server device may be connected to a plurality of mobile devices and/or a plurality of fitting devices.
  • selecting the user interaction scenario from a list of possible user interaction scenarios is carried out automatically based on the user profile.
  • a list of possible user interaction scenarios may be stored in the server device and/or the preparation of the list of user interaction scenarios of a user may be based on automatically selecting user interaction scenarios from the list of possible user interaction scenarios.
  • the list of possible user interaction scenarios may be configured for a plurality of possible situations, places, events and/or hearing aid types. This list may be provided for all users.
  • the server device may select only those possible user interaction scenarios that fit the user profile.
  • a possible user interaction scenario is automatically selected for a user, when the user profile of the user comprises hearing aid information, diagnostic information and/or lifestyle information assigned with the possible user interaction scenario.
  • the possible user interaction scenarios may be associated/assigned with specific entries and/or values that may be set in user profiles. For example, user interaction scenarios for a specific feature of a hearing aid only may be selected for users having a hearing aid with that feature. Furthermore, user interaction scenarios relating to tinnitus only may be selected for users having tinnitus. These selections may be performed by the server device and/or the fitting device.
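The selection rule described above can be sketched as a simple filter; the profile fields and scenario requirements below are invented for illustration.

```python
# Hedged sketch of the automatic selection: a possible scenario is
# taken into a user's list only when the profile carries the entries
# the scenario is assigned to. Field names are illustrative.
def select_scenarios(possible, profile):
    selected = []
    for scenario in possible:
        required = scenario.get("requires", {})
        if required.get("feature") and required["feature"] not in profile["features"]:
            continue  # user's hearing aid lacks the required feature
        if required.get("diagnosis") and required["diagnosis"] not in profile["diagnoses"]:
            continue  # e.g. tinnitus scenarios only for tinnitus users
        selected.append(scenario["id"])
    return selected


profile = {"features": ["t_coil"], "diagnoses": []}
possible = [
    {"id": "t-coil-training", "requires": {"feature": "t_coil"}},
    {"id": "tinnitus-relief", "requires": {"diagnosis": "tinnitus"}},
]
```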
  • At least some of the user interaction scenarios are displayed by the mobile device to be selected by the user.
  • the list of user interaction scenarios or at least a part of this list may be presented to the user by a user interface of an application executed in the mobile device. The user then may select, with the user interface, the user interaction scenario he wishes to deal with.
  • handling information provided by a user interaction scenario is displayed by the mobile device.
  • an application of the mobile device also may provide further information to the user via its user interface, for example, how the hearing aid adjustment parameters may influence the hearing aid and/or how the hearing aid directly may be adjusted (for example by turning on and off specific features of the hearing aid with switches and/or knobs provided by the hearing aid).
  • the user interaction scenario is selected automatically based on timing information stored in the user interaction scenario.
  • a further possibility for selecting a user interaction scenario may be based on a date and/or time or time offset assigned to the user interaction scenario. For example, a specific user interaction scenario (such as a weekly or monthly hearing test) may be automatically provided to the user.
  • the user interaction scenario is selected automatically based on another associated user interaction scenario, which has been completed. It may be possible that there is a chain of user interaction scenarios that may be selected (and interacted with) one after the other.
  • the user interaction scenario is selected automatically based on a history of interaction scenarios, which have been completed.
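The three automatic triggers above (timing information, a completed predecessor in a chain, and the completion history) can be combined in one selection function; all field names and dates are invented for illustration.

```python
# Illustrative sketch combining the three automatic triggers described
# above: a due date, a completed predecessor scenario, and the history
# of completed scenarios. All names are assumptions.
import datetime


def next_scenario(scenarios, completed, today):
    for s in scenarios:
        if s["id"] in completed:                 # history: skip finished ones
            continue
        due = s.get("due")
        if due is not None and today < due:      # timing information
            continue
        prereq = s.get("after")
        if prereq is not None and prereq not in completed:  # chained scenario
            continue
        return s["id"]
    return None
```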
  • the adjustment of at least one of the hearing aid adjustment parameters is based on an ambient sound signal, for example from sound generated in the surroundings of the user.
  • the hearing aid may process an ambient sound signal.
  • An ambient sound signal may be a signal that is generated with a microphone of the hearing aid or the mobile device, i.e. which may be based on sound generated in the vicinity of the user.
  • the ambient sound signal may not be stored in the mobile device or the server device.
  • the user interaction scenario may be based on real sound and not on sound that was recorded before the user interaction scenario was selected.
  • the user interaction scenario is associated with one or more hearing scenarios; an ambient sound signal is analyzed for determining a hearing scenario, wherein the selection of the user interaction scenario is based on the determined hearing scenario.
  • An ambient sound signal may be analyzed for determining a hearing scenario, wherein the user interaction scenario is selected automatically based on the determined hearing scenario associated with the user interaction scenario. It may be possible that the hearing aid and/or the mobile device records ambient sound and that this sound is regularly analyzed as to whether a specific hearing scenario for the user is present. For example, it may be determined whether the user is in a room with strong echo, whether the user listens to a radio or to music, and/or whether the user is inside a car. Some user interaction scenarios may be associated with specific characteristics of ambient sound and may be selected when these characteristics are determined from the ambient sound signal. In other words, the user interaction scenario may be selected in dependence of the acoustic environment of the user.
  • the ambient sound signal is processed and/or recorded by the hearing aid and/or by the mobile device and/or the ambient sound signal is analyzed by the mobile device and/or the server device. It may be possible that the ambient sound signal is recorded by a microphone of the hearing aid, processed by the hearing aid and then sent to the mobile device or via the mobile device to the server device. On the server device or on the mobile device, the ambient sound signal may be analyzed as to whether it contains characteristics associated with a user interaction scenario.
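A very rough sketch of this analysis step follows; real systems would use proper acoustic scene classification, and the level thresholds and scenario names here are invented.

```python
# Rough sketch: classify an ambient sound signal into a hearing
# scenario by a simple level statistic, then pick the user interaction
# scenario associated with that hearing scenario. Thresholds and names
# are assumptions for illustration only.
def classify_hearing_scenario(samples):
    level = sum(abs(s) for s in samples) / len(samples)
    if level < 0.05:
        return "quiet"
    if level < 0.4:
        return "speech"
    return "noise"


def scenario_for_environment(samples, scenarios_by_hearing_scenario):
    # Returns None when no scenario is associated with the environment.
    return scenarios_by_hearing_scenario.get(classify_hearing_scenario(samples))
```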
  • the sound processing parameters are derived from the one or more hearing aid adjustment parameters in the mobile device. After the user has adjusted the one or more hearing aid adjustment parameters with the mobile device, the sound processing parameters may be directly determined in the mobile device and then sent to the hearing aid, where they are applied.
  • the sound processing parameters are derived from the one or more hearing aid adjustment parameters in the hearing aid. After the user has adjusted the one or more hearing aid adjustment parameters with the mobile device, the one or more hearing aid adjustment parameters may be sent to the hearing aid, where the sound processing parameters are determined and applied.
  • the sound processing parameters are derived from the one or more hearing aid adjustment parameters in the server device. After the user has adjusted the one or more hearing aid adjustment parameters with the mobile device, the one or more hearing aid adjustment parameters may be sent to the server device, where the sound processing parameters may be determined. The sound processing parameters then may be sent via the mobile device to the hearing aid, where they are applied.
  • an audible notification indicating an operation mode of the hearing aid is assigned with the user interaction scenario.
  • the hearing aid may generate specific audible notifications (such as specific beeps), which may indicate to the user that the hearing aid has switched into a specific mode or that it may be possible (or reasonable) for the hearing aid to switch into a specific mode.
  • the hearing aid may generate a beep, when the user enters into a magnetic field that may be processed by a T-coil.
  • Specific user interaction scenarios which may relate to scenarios, in which these operation modes of the hearing aid may be employed, may also be assigned with such audible indications.
  • An input control for triggering the audible notification in the hearing aid with the mobile device may be displayed by the mobile device, when such a user interaction scenario is selected. With such a trigger, a user may learn which audible notification relates to which operation mode.
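Such a trigger could be sketched as follows; the mapping between operation modes and notifications is entirely invented for illustration.

```python
# Sketch of a notification trigger: the mode-to-notification mapping
# is an assumption, not taken from the patent.
NOTIFICATION_FOR_MODE = {
    "t_coil": "double_beep",
    "music": "single_beep",
}


def trigger_notification(mode: str) -> str:
    """Return the notification the hearing aid would play for a mode.

    A real implementation would send a command to the hearing aid
    instead of returning a string.
    """
    return NOTIFICATION_FOR_MODE[mode]
```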
  • an input control for providing rating and/or further user input related to the user interaction scenario is displayed by the mobile device, wherein the rating and/or further user input is stored in the user profile.
  • the user may rate the user interaction scenario, for example, whether the user interaction scenario was helpful for him or not.
  • the user may input further information, such as a personal note. Rating and/or further information input to the mobile device, for example, the selection of yes and no, a selection from a list or a text input to the mobile device, may be sent from the mobile device to the server device and/or may be stored in the user profile.
  • the user may provide feedback to the server device, which also may store a history of feedback for the specific user and/or for a specific user interaction scenario with respect to different users.
  • the feedback or the rating of a user interaction scenario also may be provided by a so-called significant other (for example a relative of the user) and/or a hearing care professional.
  • the rating of specific user interaction scenarios by a hearing care professional also may be performed with an application executed in the fitting device.
  • the one or more hearing aid adjustment parameters comprise at least one of: volume, gain, clarity, sound input type (such as a microphone, T-coil for processing a signal from an induction loop, an audio stream from a TV, a telephone connection), directivity, tonal balance, dynamic compression, loudness, a frequency transposition parameter describing for example a frequency shift between an input signal and an output signal and/or a frequency compression rate, hearing program type (speech, speech in noise, music).
  • the hearing aid adjustment parameters may be defined as psycho-acoustic parameters.
  • the hearing aid parameters may be adjusted by selection from a list or by selection of a value from a range of values.
  • the sound processing parameters encode at least one of: an adjustment of a frequency dependent amplification of sound signals, a mixing of sound signals from more than one sound source, such as mixing of sound signals picked up by microphones and/or sound signals picked up through a wired or wireless connection.
  • the sound processing parameters may be defined as technical parameters or control parameters of the hearing aid, which may directly control the behavior of the hearing aid.
  • the sound processing parameters may be derived from the hearing aid adjustment parameters based on formulas and/or lookup tables.
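As an example of such a lookup-table derivation, a psycho-acoustic "tonal balance" adjustment could be mapped to technical per-band gain offsets; the table values below are invented.

```python
# Sketch of a lookup-table derivation: a psycho-acoustic tonal balance
# setting maps to per-band gain offsets in dB. The table is invented
# for illustration.
TONAL_BALANCE_TABLE = {
    -1: [+3, 0, -3],   # warmer: boost the low band, cut the high band
     0: [0, 0, 0],     # neutral
    +1: [-3, 0, +3],   # brighter
}


def gains_for_tonal_balance(setting: int, base_gains_db):
    """Apply the offsets for the chosen setting to the base band gains."""
    offsets = TONAL_BALANCE_TABLE[setting]
    return [g + o for g, o in zip(base_gains_db, offsets)]
```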
  • the method comprises: generating a user profile and/or modifying a user profile with a fitting application executed in a fitting device communicatively interconnected with the server device.
  • a hearing care professional may generate the user profile with a fitting application at his site, which also may be used for fitting the hearing aid in a more complicated way.
  • the sound processing parameters may be directly adjusted with the fitting application.
  • the user profile is modified with the fitting application, for example, during a further visit of the hearing aid user at the hearing care professional.
  • a modified user profile results in the generation of further user interaction scenarios for the list of the user.
  • the method comprises: defining and/or modifying a user interaction scenario in the list of user interaction scenarios for a user with the fitting application. Another possibility is that one or more user interaction scenarios are manually defined with the fitting application and stored in the list of the user. It also may be possible that such manually generated user interaction scenarios are copied into the list of possible user interaction scenarios.
  • the list of user interaction scenarios and/or one or more user interaction scenarios may be customized by a hearing care professional with the fitting application.
  • the computer program may have different parts run in the mobile device, the server device and/or the fitting device.
  • a computer-readable medium may be a floppy disk, a hard disk, a USB (Universal Serial Bus) storage device, a RAM (Random Access Memory), a ROM (Read Only Memory), an EPROM (Erasable Programmable Read Only Memory) or a FLASH memory.
  • a computer-readable medium may also be a data communication network, e.g. the Internet, which allows downloading a program code.
  • the computer-readable medium may be a non-transitory or transitory medium.
  • a further aspect of the invention relates to a fitting system for adjusting a hearing aid, the fitting system comprising a mobile device communicatively interconnected with the hearing aid and adapted for transferring sound processing parameters to the hearing aid, and/or a server device communicatively interconnected with the mobile device.
  • the fitting system may be adapted for performing the method as described in the above and in the following.
  • the fitting system further comprises a fitting device communicatively interconnected with the server device.
  • the fitting device may be adapted for defining a user profile and/or for defining user interaction scenarios for a specific user.
  • Fig. 1 schematically shows a hearing aid 10.
  • a hearing aid 10 may be a device that may be put at least partially into an ear of a user to at least partially compensate an auditory defect of the user.
  • a hearing aid 10 may be worn near the ear, at least partially in the ear canal and/or carried on the ear. It has to be noted that a hearing aid 10 may comprise two separate devices, one for each ear of the user. It also may be possible that the hearing aid 10 comprises a Cochlear implant, which may be inside the head of the user.
  • the hearing aid 10 comprises an input 12 for receiving sound data and an output 14 for generating signals stimulating the hearing sense of the user.
  • the input 12 may comprise a microphone 16 and a sender/receiver 18 for digital signals, which may be transferred via infrared and/or electromagnetic waves (such as radio waves, for example Bluetooth®).
  • a further receiver for electromagnetic waves also may be a so-called T-coil or telecoil 20.
  • the output 14 may comprise a loudspeaker 22 in the ear canal or a stimulation device inside the cochlea.
  • An analog signal from one of the inputs may be transformed by a corresponding transducer 24 into a digital signal.
  • the microphone 16 and/or the T-coil 20 may generate analog sound signals, which may be then transduced into digital sound signals or sound data that may be processed by an amplifier 26.
  • the amplifier transforms input sound data into optimized output sound data for the output 14.
  • the optimized sound data then may be transduced by a further transducer 24 into an analog output signal for the loudspeaker 22.
  • the amplifier 26 may comprise a processor (or at least a digital electronic circuit), which may perform the transformation of the sound data.
  • the hearing aid 10 furthermore comprises a controller 28 with a memory and a processor, which controls the operation of the amplifier 26. It may be possible that the amplifier 26 is a module of the controller 28.
  • sound processing parameters 30 are stored, which parametrize the control of the amplifier 26.
  • the sound processing parameters 30 may encode an adjustment of a frequency dependent amplification of sound data in the amplifier 26 and/or a mixing of sound data from more than one source 16, 18, 20.
  • the sound processing parameters also may control a processing of sound data inside the hearing aid 10.
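A toy model of this processing can make the two operations concrete; frequency dependent amplification is reduced to one gain per band and mixing to a weighted sum, with all numbers illustrative only.

```python
# Toy model of the processing controlled by the sound processing
# parameters 30: per-band amplification and mixing of two sources
# (e.g. microphone 16 and receiver 18). Numbers are illustrative.
def amplify(band_signals, band_gains_db):
    # Convert each band gain from dB to a linear factor and apply it.
    return [s * 10 ** (g / 20.0) for s, g in zip(band_signals, band_gains_db)]


def mix(mic, receiver, mic_weight=0.5):
    # Weighted sum of sound data from two sources, sample by sample.
    return [mic_weight * m + (1 - mic_weight) * r for m, r in zip(mic, receiver)]
```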
  • sound signals received by the receiver 18 (which, for example, may have been transferred via Bluetooth®) may be evaluated by the controller 28 of the hearing aid and input into the amplifier 26.
  • Via the receiver 18 (for example via Bluetooth®), the controller 28 also may receive control signals. For example, the controller 28 may receive modified sound processing parameters 30 or may receive commands to send the actually applied sound processing parameters via the receiver/sender 18 to a further device. Furthermore, the controller 28 may receive commands for modifying the sound processing parameters and/or for switching into another operation mode.
  • Fig. 2 shows a fitting system 32 for fitting the hearing aid 10.
  • the fitting system 32 comprises a server device 34 and a plurality of mobile devices 36, which are communicatively connected with the server device, for example via Internet 38. Every mobile device 36 may be associated with a user who carries a hearing aid 10.
  • the mobile device 36, which may be a Smartphone, a tablet computer or a similar device, may communicate with the hearing aid 10 via its sender/receiver 18, for example via Bluetooth®. Also, the mobile device 36 may communicate with the server device 34 via Internet. As will be described in detail below, the server device 34 may send the mobile device specific data structures called user interaction scenarios 40, which, inter alia, may comprise information about the usage of the hearing aid 10 in a specific scenario/situation and how the hearing aid 10 may be adjusted in such a situation. Based on the user interaction scenario 40, the mobile device 36 may generate modified sound processing parameters 30 and may send them to the hearing aid 10.
  • the visualization of the user interaction scenario 40 and the communication with the server device 34 and the hearing aid 10 may be performed by an application 42 running in the mobile device 36.
  • the application may be downloaded from the server device 34 (or another source) and/or may be a web application with parts running in the server device 34.
  • the server device 34 may be a virtual device, for example executed in a cloud computing facility.
  • a database 44 is provided, in which a plurality of user profiles and further data relating to the users and/or their hearing aids is stored. This database 44 also may store the user interaction scenarios 40 for each user.
  • the fitting system 32 furthermore may comprise one or more fitting devices 46.
  • Each fitting device 46 may be a computer, such as a PC or a tablet computer, situated in a shop/office of a hearing care specialist.
  • a fitting application 48 is executed that may be used for fitting a hearing aid 10.
  • the hearing care specialist may use the fitting device 46 and/or the fitting application 48 for directly adjusting the hearing aid 10 by modifying the sound processing parameters 30 (which, in this case, are not transferred via the mobile device 36, but via a direct connection).
  • the fitting device 46 may be communicatively connected with the server device 34, for example via Internet 38.
  • the hearing care specialist may input information about the user of the hearing aid, for example about his life situation, into the fitting application 48. The fitting application 48 then sends this information to the server device 34, which creates a user profile of the user in the database 44 and which may create user interaction scenarios 40 for the user based on this information.
  • the fitting process of the hearing aid 10 may be divided into parts that are performed at the store/office of the hearing care specialist (with the aid of the fitting application 48) and parts that may be performed with the application 42, for example by the user.
  • a first fitting may be performed by the hearing care specialist and further improvements of the fitting may be performed by the user in real-life situations at home or at other places. This real-life fitting is supported by the application 42.
  • the real-life fitting part performed with the application 42 may be divided into individual tasks or user interaction scenarios 40. With every user interaction scenario 40, the user may try to improve his hearing experience by adjusting the hearing aid 10 with his mobile device 36. It has to be noted that the term “user interaction scenario” may refer either to the situation/scenario in which the user performs the fitting of the hearing aid 10 himself, or to the data structure that supports this fitting and that may be stored in the server device 34, sent to the mobile device 36 and/or visualized by the application 42.
  • the user interaction scenarios 40 for a user may be defined by the hearing care professional, for example, together with the hearing aid user, and may be stored as a list in the server device 34 in the database 44.
  • the user interaction scenarios 40 of a user (or at least some of them) may be sent from the server device 34 to the mobile device 36, for example, when requested by the application 42.
  • Fig. 3 schematically shows a user interface 50 of the mobile device 36.
  • the application 42 may provide the user with the possibility to select a user interaction scenario 40 from a list, for example with a list input control 52 on the user interface 50.
  • a user interaction scenario 40 may relate to “family gathering”, “car driving”, “TV watching”, “church”, “T-coil usage”, etc.
  • These names 54 of the user interaction scenarios 40 may be displayed on the user interface 50 (for example with the list input control 52) and based upon this, the user may select one of the user interaction scenarios 40.
  • When a user interaction scenario 40 has been selected, the application 42 shows controls and/or information on its user interface 50 relating to the selected user interaction scenario 40.
  • the user interaction scenario 40 may comprise handling information 56, which is displayed on the user interface 50, for example in a textbox 58.
  • handling information 56 may explain how to switch on a TV-link of the hearing aid 10 in a user interaction scenario 40 relating to TV, or how to disable a phone option in a user interaction scenario 40 relating to a church visit.
  • an audible notification is associated with the user interaction scenario 40.
  • When the user enters an environment in which a T-coil 20 receives input signals, the hearing aid 10 usually generates a specific beep (audible notification), which may trigger the user to turn on the T-coil 20.
  • the user interface 50 may comprise an input control 60 for triggering the audible notification, such as a button.
  • the user may learn which audible notification relates to which operation mode/function of his hearing aid 10.
  • the user interaction scenario 40 is automatically selected by the application 42, which receives an indication from the hearing aid 10 that an area with T-coil reception has been entered.
  • the user interaction scenario 40 also may comprise one or more hearing aid adjustment parameters 64, which may be adjusted by the user for the user interaction scenario 40 with further input controls, such as a slider input control 62 or toggle input control 66.
  • a hearing aid adjustment parameter 64 may be volume, gain, clarity (which may be adjusted within a range of values) or may be a hearing program type (which may be selected from a list).
  • this hearing aid adjustment parameter 64 will be immediately used for deriving changed sound processing parameters 30 for the hearing aid 10 and for implementing them in the hearing aid 10. In such a way, the user may immediately sense the different behavior of the hearing aid 10, in particular in a real-life scenario/situation.
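The immediate round trip from an adjusted hearing aid adjustment parameter 64 to parameters applied in the hearing aid 10 could be sketched as below. The class and function names are hypothetical, and the one-to-one mapping stands in for the real, device-specific derivation of sound processing parameters 30.

```python
class HearingAid:
    """Minimal stand-in for hearing aid 10 (names are hypothetical)."""
    def __init__(self):
        self.applied = {}

    def apply(self, params):
        # in a real device this would reconfigure the sound processor 26
        self.applied = dict(params)

def on_adjustment_changed(hearing_aid, scenario_adjustments, name, value):
    """Called whenever the user actuates an input control 62, 66:
    the adjusted value is stored and immediately pushed to the device,
    so the user hears the change at once."""
    scenario_adjustments[name] = value
    hearing_aid.apply(scenario_adjustments)
```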
  • a user interaction scenario 40 also may be rated by the user.
  • the interface 50 may comprise an input control 68 for rating the user interaction scenario 40, which may be a simple toggle button for answering whether the user interaction scenario 40 has been helpful for the user.
  • Fig. 4 schematically shows a structure of the database 44 stored in the server device 34.
  • the database 44 comprises a list 70 of user profiles 72. Every user profile 72 is associated with a user of a hearing aid 10. For example, during the first fitting, the hearing care specialist may input information about the user into the fitting application and the server device 34 may create a user profile 72 for this user based on the input information.
  • a user profile 72 may comprise a user identification 74, which may be unique for the user, hearing aid information 76 specifying the hearing aid 10, diagnostic information 78 specifying hearing deficiencies of the user and/or lifestyle information 80 specifying hearing scenarios to which the user is exposed.
  • the user identification 74 may be used for interrelating the user profile 72 with a mobile device 36 that is requesting information from the server device 34. Further information about the user, such as his name, also may be stored in his user profile 72.
  • the hearing aid information 76 may comprise information about the hearing aid 10, such as the type of the hearing aid 10, features of the hearing aid 10 and/or operation modes of the hearing aid 10. It also may be possible that the sound processing parameters 30 currently applied in the hearing aid, or a history thereof, are stored in the hearing aid information 76.
  • the diagnostic information 78 may comprise information about the hearing deficiencies of the user.
  • the diagnostic information 78 may comprise a hearing curve recorded by the hearing care specialist and/or whether the user has tinnitus.
  • the lifestyle information 80 may comprise information about the user that may be relevant for fitting the hearing aid.
  • the lifestyle information 80 may comprise the gender and the age of the user.
  • the lifestyle information 80 may furthermore comprise whether the user is often in special places like restaurants, churches, cars, etc.
  • the user profile 72 also comprises a list 82 of user interaction scenarios 40 for the user.
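The profile fields listed above could, purely for illustration, be modeled as a simple record. The field names are assumptions made for this sketch, not names used in the disclosure.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class UserProfile:
    """Sketch of a user profile 72 as stored in database 44."""
    user_id: str                                          # user identification 74
    hearing_aid_info: dict = field(default_factory=dict)  # hearing aid information 76
    diagnostic_info: dict = field(default_factory=dict)   # diagnostic information 78
    lifestyle_info: dict = field(default_factory=dict)    # lifestyle information 80
    scenarios: List[dict] = field(default_factory=list)   # list 82 of user interaction scenarios 40
```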
  • a user interaction scenario 40 may comprise a name 54, handling information 56, one or more hearing aid adjustment parameters 64 and feedback information 83. With every hearing aid adjustment parameter 64, also its actual value and/or a history of adjusted values may be stored. Furthermore, a history of the feedback information may be stored in the user interaction scenario 40.
  • a user interaction scenario 40 also may comprise a link 84 to one or more other user interaction scenarios 40, for example user interaction scenarios 40 that may be selected when the user has completed the user interaction scenario 40 containing the link 84.
  • a user interaction scenario 40 also may comprise an identifier 85 for an audible notification.
  • For example, when the application 42 should display a button 60 as described above, the corresponding sound or an identifier for the sound may be stored in the user interaction scenario 40.
  • a user interaction scenario 40 also may comprise a time trigger 86.
  • the user interaction scenario may be automatically selected by the application 42 when a specific date and/or time has been reached or when a specific time duration has elapsed.
  • a user interaction scenario 40 also may comprise an ambient sound trigger 87. It may be possible that a user interaction scenario 40 is selected automatically, when a specific ambient sound situation is detected.
  • the ambient sound trigger 87 may specify that a specific user interaction scenario 40 is selected when music, car sound or a room with strong echo is detected.
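Taken together, the scenario fields described above might be sketched as the following record. All field names are assumptions made for this illustration.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class UserInteractionScenario:
    """Sketch of the data structure of a user interaction scenario 40."""
    name: str                                              # name 54
    handling_information: str = ""                         # handling information 56
    adjustments: dict = field(default_factory=dict)        # adjustment parameters 64 with current values
    adjustment_history: dict = field(default_factory=dict) # history of adjusted values
    feedback: List[str] = field(default_factory=list)      # feedback information 83 and its history
    links: List[str] = field(default_factory=list)         # links 84 to other scenarios
    notification_id: Optional[str] = None                  # identifier 85 for an audible notification
    time_trigger: Optional[str] = None                     # time trigger 86
    ambient_sound_trigger: Optional[str] = None            # ambient sound trigger 87
```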
  • the database 44 may comprise a list 88 of possible user interaction scenarios 90.
  • the possible user interaction scenarios 90 may have the same structure as the user interaction scenarios 40 of a user profile 72. Additionally, a possible user interaction scenario may be associated with specific hearing aid information 76, diagnostic information 78 and/or lifestyle information 80.
  • the list 82 of user interaction scenarios 40 of a specific user may be generated from the list 88 of possible user interaction scenarios 90 by selecting those possible user interaction scenarios 90 that fit the information 76, 78, 80 of the user profile 72.
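The selection step just described could look like the following sketch, in which each possible scenario 90 lists the profile entries it requires and is copied into the user's list 82 only when the profile contains all of them. The `requires` key and the subset rule are assumptions; the actual matching logic is not specified in this form.

```python
def prepare_scenario_list(profile, possible_scenarios):
    """Build list 82 for a user from list 88 of possible scenarios 90.

    profile: dict mapping a category ("hearing_aid_features",
    "lifestyle", ...) to a set of entries from information 76, 78, 80.
    """
    selected = []
    for scenario in possible_scenarios:
        requirements = scenario.get("requires", {})
        if all(set(values) <= profile.get(category, set())
               for category, values in requirements.items()):
            selected.append(dict(scenario))  # copy into the user's list
    return selected
```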
  • Fig. 5 shows a flow diagram for a method for adjusting the hearing aid 10 with the fitting system 32.
  • a user profile 72 of the hearing aid user is created in the server device 34.
  • the user profile 72 is generated with the fitting application 48.
  • a hearing care specialist may input information about the user into the fitting application 48, which then may trigger the server device 34 to generate a corresponding user profile 72. It also may be possible that the user profile 72 is modified with the fitting application, for example during a subsequent visit of the user to the hearing care specialist.
  • In step S12, the list 82 of user interaction scenarios 40 is prepared based on the user profile 72.
  • the list 82 is prepared automatically by the server device 34.
  • the preparation of the list 82 of user interaction scenarios 40 of a specific user may be based on automatically selecting (i.e. copying) user interaction scenarios 40 from the list 88 of possible user interaction scenarios 90.
  • a possible user interaction scenario 90 may be automatically selected for the specific user when the user profile 72 of the user comprises hearing aid information 76, diagnostic information 78 and/or lifestyle information 80 associated with the possible user interaction scenario 90.
  • user interaction scenarios 40 for a specific feature of a hearing aid 10 may only be copied when the hearing aid 10 of the specific user has this feature.
  • user interaction scenarios 40 in the list 82 of a specific user are defined and/or modified with the fitting application 48.
  • the hearing care specialist may customize the automatically prepared list 82 for the specific user.
  • the application 42 may be installed in the mobile device 36 and/or the application 42 may register in the server device 34.
  • the list 82 of user interaction scenarios 40 may then be loaded at least partially into the mobile device 36.
  • In step S14, a user interaction scenario 40 from the list 82 is selected with the application 42. After that, the user interaction scenario may be displayed as described with respect to Fig. 3.
  • the user interaction scenario 40 may be selected manually by the user with the application 42. Another possibility is that the user interaction scenario 40 is selected automatically based on a time trigger 86 stored in the user interaction scenario 40. For example, some user interaction scenarios 40 may be selected regularly or at specific times.
  • a user interaction scenario 40 also may be selected automatically based on another, associated user interaction scenario 40 that has been completed. For example, when the user has provided feedback for a user interaction scenario 40 he has just interacted with, and this user interaction scenario 40 has a link 84 to a further user interaction scenario 40, the further user interaction scenario 40 may be displayed.
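The link-based follow-up selection could be sketched as below; representing a link 84 as the name of the target scenario is an assumption made for this illustration.

```python
def follow_up_scenario(completed, scenarios_by_name):
    """After a scenario has been completed, return the first scenario
    reachable via its links 84 that exists in the user's list, if any."""
    for linked_name in completed.get("links", []):
        if linked_name in scenarios_by_name:
            return scenarios_by_name[linked_name]
    return None
```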
  • a user interaction scenario 40 is selected based upon an ambient sound situation of the user.
  • an ambient sound signal may be analysed for determining a hearing scenario.
  • the mobile device 36 (which also may comprise a microphone) and in particular the application may continuously record and analyse ambient sound. It also may be possible that the ambient sound signal is recorded by the hearing aid 10 and transmitted to the mobile device 36, where it is analysed. The ambient sound signal also may be transmitted to the server device 34, where it is analysed.
  • a user interaction scenario 40, which is associated with the hearing scenario via an ambient sound trigger 87, is selected automatically.
  • the ambient sound trigger may comprise an identifier for a hearing scenario, which may be selected from a list of hearing scenarios.
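Matching a detected hearing scenario against the ambient sound triggers 87 of the available scenarios could then be as simple as the following sketch (the identifier strings are illustrative):

```python
def select_by_hearing_scenario(scenarios, detected_scenario_id):
    """Return the first scenario whose ambient sound trigger 87 names
    the hearing scenario determined from the ambient sound analysis."""
    for scenario in scenarios:
        if scenario.get("ambient_sound_trigger") == detected_scenario_id:
            return scenario
    return None
```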
  • In step S16, when the user interaction scenario 40 has been selected, one or more hearing aid adjustment parameters 64 are determined based on the selected user interaction scenario 40.
  • the type and the actual value of the hearing aid adjustment parameters 64 may be stored in the selected user interaction scenario 40.
  • the application 42 may display one or more input controls 62, 66 for adjusting the determined one or more hearing aid adjustment parameters 64. Also the type of input control 62, 66 may be stored in the selected user interaction scenario 40.
  • handling information 56 may be displayed by the application 42. The user can then read the handling information 56 and actuate the input controls 62, 66 to adjust the one or more hearing aid adjustment parameters 64.
  • In step S18, after adjustment of at least one of the hearing aid adjustment parameters 64 with the one or more input controls 62, 66, sound processing parameters 30 for the hearing aid 10 are derived based on the adjusted one or more hearing aid adjustment parameters 64.
  • Sound processing parameters 30 may be calculated from the hearing aid adjustment parameters 64 based on an algorithm that is adapted to modify the currently applied sound processing parameters 30 in the hearing aid 10 correspondingly. Also, sound processing parameters 30 may be selected from predefined sound processing parameters 30, which, for example, may be stored in the server device 34 and/or the mobile device 36.
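One possible shape of such a calculation is sketched below: a user-facing "volume" adjustment is treated as a dB offset added to every frequency-band gain, and a selected hearing program replaces the current one. Both the parameter names and this particular mapping are assumptions for the sketch, not the disclosed algorithm.

```python
def derive_sound_processing_parameters(current, adjustments):
    """Derive modified sound processing parameters 30 from hearing aid
    adjustment parameters 64, starting from the currently applied set."""
    params = dict(current)
    # hypothetical rule: "volume" is a dB offset applied to all band gains
    offset = adjustments.get("volume", 0.0)
    params["band_gains"] = [g + offset for g in current.get("band_gains", [])]
    # hypothetical rule: a chosen hearing program type replaces the current one
    if "program" in adjustments:
        params["program"] = adjustments["program"]
    return params
```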
  • calculations/selections may be performed in the hearing aid 10, the mobile device 36 and/or the server device 34.
  • In step S20, the derived sound processing parameters 30 may be applied in the hearing aid 10, such that the hearing aid 10 is adapted for generating optimized sound signals based on the applied sound processing parameters 30.
  • Steps S18 and S20 may be performed directly every time the user has adjusted a hearing aid adjustment parameter 64. In such a way, the user can directly hear the difference of his adjustment. It has to be noted that before and/or after the adjustment of the hearing aid adjustment parameters 64 and the application of the sound processing parameters 30, the hearing aid 10 may process an ambient sound signal based on sound generated in the surroundings of the user. Thus, the user may adjust his hearing aid 10 directly in the scenario/situation to which the hearing aid 10 should be better adapted.
  • In step S22, when the user wants to finish the user interaction scenario 40, he may rate the user interaction scenario 40 with the application 42. For example, the user may write a text regarding his experience with the user interaction scenario.
  • the rating or feedback information 83 may be input into the mobile application 42 and then may be stored in the user profile 72.
EP16165711.9A 2016-04-18 2016-04-18 Ajustement d'une prothèse auditive basé sur des scénarios d'interaction d'utilisateur Withdrawn EP3236673A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP16165711.9A EP3236673A1 (fr) 2016-04-18 2016-04-18 Ajustement d'une prothèse auditive basé sur des scénarios d'interaction d'utilisateur


Publications (1)

Publication Number Publication Date
EP3236673A1 true EP3236673A1 (fr) 2017-10-25

Family

ID=55910735



Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000022874A2 (fr) 2000-02-18 2000-04-20 Phonak Ag Systeme d'adaptation
EP1353530A1 (fr) 2002-04-12 2003-10-15 Siemens Audiologische Technik GmbH Entraínement individuel de l'ouie pour des porteurs de prothèses auditives
US20110200214A1 (en) * 2010-02-12 2011-08-18 Audiotoniq, Inc. Hearing aid and computing device for providing audio labels
US20110235835A1 (en) * 2008-12-12 2011-09-29 Widex A/S Method for fine tuning a hearing aid
US20110293123A1 (en) * 2010-05-25 2011-12-01 Audiotoniq, Inc. Data Storage System, Hearing Aid, and Method of Selectively Applying Sound Filters
EP2549397A1 (fr) * 2012-07-02 2013-01-23 Oticon A/s Procédé de personnalisation d'une prothèse auditive
WO2013117214A1 (fr) 2012-02-07 2013-08-15 Widex A/S Système d'ajustement de prothèse auditive et procédé d'ajustement d'un système d'aide auditive
WO2015009564A1 (fr) 2013-07-16 2015-01-22 iHear Medical, Inc. Système et procédés de réglage en ligne de prothèse auditive destinés à un utilisateur non-expert
EP2876899A1 (fr) * 2013-11-22 2015-05-27 Oticon A/s Dispositif d'aide auditive réglable


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017106359A1 (de) 2017-03-24 2018-09-27 Sennheiser Electronic Gmbh & Co. Kg Vorrichtung und Verfahren zur Verarbeitung von Audiosignalen zur Verbesserung der Sprachverständlichkeit
CN111492672A (zh) * 2017-12-20 2020-08-04 索诺瓦公司 智能在线听力设备性能管理
CN111492672B (zh) * 2017-12-20 2022-10-21 索诺瓦公司 听力设备及其操作方法
WO2020081653A3 (fr) * 2018-10-19 2020-06-18 Bose Corporation Personnalisation de dispositifs audio d'aide à la conversation
US10795638B2 (en) 2018-10-19 2020-10-06 Bose Corporation Conversation assistance audio device personalization
US11089402B2 (en) 2018-10-19 2021-08-10 Bose Corporation Conversation assistance audio device control
US11809775B2 (en) 2018-10-19 2023-11-07 Bose Corporation Conversation assistance audio device personalization
CN113015073A (zh) * 2019-12-20 2021-06-22 西万拓私人有限公司 用于调整听力仪器的方法和相关的听力系统
EP3840418A1 (fr) * 2019-12-20 2021-06-23 Sivantos Pte. Ltd. Procédé d'ajustement d'un instrument auditif et système auditif associé
US11601765B2 (en) 2019-12-20 2023-03-07 Sivantos Pte. Ltd. Method for adapting a hearing instrument and hearing system therefor
EP4284022A1 (fr) * 2022-05-25 2023-11-29 Sonova AG Système auditif
EP4284023A1 (fr) * 2022-05-25 2023-11-29 Sonova AG Système de commande hiérarchique pour système auditif


Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20180426