EP3489919B1 - Apparatus, method and computer program for indicating an emergency situation - Google Patents

Apparatus, method and computer program for indicating an emergency situation

Info

Publication number
EP3489919B1
EP3489919B1 (application EP17203574.3A)
Authority
EP
European Patent Office
Prior art keywords
sound signal
user
emergency situation
environment
detected sound
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP17203574.3A
Other languages
German (de)
French (fr)
Other versions
EP3489919A1 (en)
Inventor
Güner ÖZTEKIN
Evren Gökhan YILMAZ
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vestel Elektronik Sanayi ve Ticaret AS
Original Assignee
Vestel Elektronik Sanayi ve Ticaret AS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vestel Elektronik Sanayi ve Ticaret AS filed Critical Vestel Elektronik Sanayi ve Ticaret AS
Priority to EP17203574.3A (EP3489919B1)
Priority to TR2017/20943A (TR201720943A2)
Publication of EP3489919A1
Application granted
Publication of EP3489919B1
Legal status: Active

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00: Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/009: Signalling of the alarm condition to a substation whose identity is signalled to a central station, e.g. relaying alarm signals in order to extend communication range
    • G08B29/00: Checking or monitoring of signalling or alarm systems; Prevention or correction of operating errors, e.g. preventing unauthorised operation
    • G08B29/18: Prevention or correction of operating errors
    • G08B29/185: Signal analysis techniques for reducing or preventing false alarms or for enhancing the reliability of the system
    • G08B3/00: Audible signalling systems; Audible personal calling systems
    • G08B3/10: Audible signalling systems; Audible personal calling systems using electric transmission; using electromagnetic transmission
    • G08B5/00: Visible signalling systems, e.g. personal calling systems, remote indication of seats occupied
    • G08B5/22: Visible signalling systems using electric transmission; using electromagnetic transmission
    • G08B5/36: Visible signalling systems using electric transmission, using visible light sources
    • G08B6/00: Tactile signalling systems, e.g. personal calling systems
    • G08B7/00: Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00
    • G08B7/06: Signalling systems according to more than one of groups G08B3/00 - G08B6/00 using electric transmission, e.g. involving audible and visible signalling through the use of sound and light sources


Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Alarm Systems (AREA)
  • Emergency Alarm Devices (AREA)

Description

    Technical Field
  • The present disclosure relates to an apparatus, a method and a computer program for indicating an emergency situation.
  • Background
  • In recent years, various solutions have been proposed for assisting people with hearing impairment in their everyday life. For example, headbands, glasses, helmets or other wearable apparatuses have been equipped with microphones and displays so that visual information can be output when audio information in the environment of a user is indicative of an emergency situation.
  • US4237449A describes a signalling device for making hard of hearing persons aware of an impending happening in his or her proximity.
  • US5651070A describes a listening device for alerting an individual unable to hear warning signals having a means for prerecording a warning signal, converting the signal into digital information, and storing the digital information.
  • US2014/253326A1 describes techniques for receiving an alarm sound including information related to an emergency event.
  • US2008/293453A1 describes a remote ring monitor for a mobile communications device comprising an audio sensor to detect a sound signal emitted by the monitored device and one or more indicators which may provide visual, auditory or tactile alert signals to the user.
  • Summary
  • The invention is defined by the appended claims. According to a first aspect disclosed herein, there is provided an apparatus for indicating an emergency situation, as claimed in claim 1.
  • The haptic signal may comprise a vibration signal.
  • The detected sound signal may comprise a name (e.g. name of the user or specified names), a siren (e.g. an ambulance siren, a police siren, a fire brigade siren), an alarm (e.g. a fire alarm, an earthquake alarm, a hurricane alarm, a flood alarm), a car horn, a voice (e.g. a human voice as opposed to a machine generated voice), a word in a specific language such as "look out", "help", "beware", or other.
  • The apparatus may comprise:
    a visual output unit configured to output a visual signal when the detected sound signal is indicative of an emergency situation in the environment of the user.
  • The visual signal may comprise a light signal, for example a flashing light signal.
  • The visual output unit may be configurable to output a light signal, when the detected sound signal is indicative of an emergency situation in the environment of the user, only when the light level in the environment of the user is below a threshold.
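  • By way of illustration only (and not forming part of the claimed subject-matter), the light-level gating described above can be sketched in Python as follows; the ambient-light reading and the 50 lux threshold are assumptions made for the sketch, the disclosure requiring only that the light level be "below a threshold".

    # Sketch of light-level gating for the visual output unit.
    # The 50 lux threshold and the ambient-light reading are illustrative
    # assumptions; the disclosure only requires "below a threshold".
    AMBIENT_LIGHT_THRESHOLD_LUX = 50.0

    def should_emit_light_signal(emergency_detected: bool,
                                 ambient_light_lux: float) -> bool:
        """Emit the light signal only in an emergency and only in low light."""
        return emergency_detected and ambient_light_lux < AMBIENT_LIGHT_THRESHOLD_LUX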
  • The processing unit is configured to:
    • determine whether the detected sound signal is associated with a predetermined sound signal stored on the apparatus; and
    • if the detected sound signal is associated with a predetermined sound signal stored on the apparatus, determine that the detected sound signal is indicative of an emergency situation in the environment of the user.
  • The predetermined sound signal stored on the apparatus may be programmed by the manufacturer during manufacture or by the user after purchase.
  • The user may program the apparatus to download a sound signal from another apparatus or to record a sound signal from the environment of the user (e.g. a sound signal generated by the user's pet such as a user's dog barking).
  • The apparatus may comprise:
    a communication unit configured to communicate with another apparatus via a communication link.
  • The other apparatus may be a smart phone, a desktop computer, a laptop computer, a tablet computer, a television, a server or other.
  • The communication link may be a wireless link.
  • The communication link may be a Bluetooth link.
  • The communication link may be an Internet link, for example a Wi-Fi link.
  • The processing unit is configured to: if the detected sound signal is not associated with a predetermined sound signal stored on the apparatus, communicate with another apparatus to query whether the detected sound signal is associated with a predetermined sound signal stored on the other apparatus and to receive a response to the query; and
    if the detected sound signal is associated with a predetermined sound signal stored on the other apparatus, determine that the detected sound signal is indicative of an emergency situation in the environment of the user.
  • The processing unit may be configured to:
    if the detected sound signal is indicative of an emergency situation in the environment of the user, instruct another apparatus to display an indication of an emergency situation in the environment of the user.
  • The indication may comprise a text and/or an image.
  • The processing unit is configured to:
    if the detected sound signal is indicative of an emergency situation in the environment of the user, instruct another apparatus to display information related to the emergency situation in the environment of the user.
  • The information may be retrieved by the other apparatus from a network. The information may comprise podcasts, articles, photographs, videos, etc.
  • The apparatus may be portable and preferably wearable.
  • The apparatus may be a watch.
  • The apparatus may be waterproof.
  • According to a second aspect disclosed herein, there is provided a method comprising:
    • detecting a sound signal in an environment of a user;
    • determining whether the detected sound signal is indicative of an emergency situation in the environment of the user; and
    • outputting a haptic signal when the detected sound signal is indicative of an emergency situation in the environment of the user.
  • The method may comprise:
    • determining whether the detected sound signal is associated with a predetermined sound signal stored on the apparatus; and
    • if the detected sound signal is associated with a predetermined sound signal stored on the apparatus, determining that the detected sound signal is indicative of an emergency situation in the environment of the user.
  • The method may comprise:
    • if the detected sound signal is not associated with a predetermined sound signal stored on the apparatus, instructing another apparatus to determine whether the detected sound signal is associated with a predetermined sound signal stored on the other apparatus; and
    • if the detected sound signal is associated with a predetermined sound signal stored on the other apparatus, determining that the detected sound signal is indicative of an emergency situation in the environment of the user.
  • The method may comprise:
    if the detected sound signal is indicative of an emergency situation in the environment of the user, instructing another apparatus to display an indication of an emergency situation in the environment of the user.
  • According to a third aspect disclosed herein, there is provided a computer program for an apparatus, comprising software code portions for performing the above method when said computer program is run on the apparatus.
  • Brief Description of the Drawings
  • To assist understanding of the present disclosure and to show how embodiments may be put into effect, reference is made by way of example to the accompanying drawings in which:
    • Figure 1 shows schematically an example of a system according to an embodiment described herein;
    • Figure 2 shows schematically an example of an apparatus which is part of the system of Figure 1 according to an embodiment described herein; and
    • Figure 3 shows schematically an example of a flow diagram of a method of operating the apparatus of Figure 2 according to an embodiment described herein.
    Detailed Description
  • Existing apparatus for assisting people with hearing impairment rely on visual information to indicate an emergency situation to a user. Accordingly, they may be ineffective in circumstances where visual information cannot be perceived by the user. For example, the user may have a visual impairment. The user may be asleep. The user may not be able to wear the apparatus because s/he is having a shower, going for a swim or practising another activity that is incompatible with wearing the apparatus.
  • Figure 1 shows schematically an example of a system according to an embodiment. The system comprises a watch 2, a television set 4, a tablet computer 6, a laptop computer 8, a mobile phone 10 and a server 12 connected to a network 14. The network 14 may be a local area network and/or a wide area network (e.g. the Internet). In some examples, not all of the television set 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 and the server 12 are required or used.
  • The watch 2 is indirectly connected to the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 and the server 12 via one or more routers 16. The watch 2 may communicate with the routers 16 over a medium range wireless communication link (e.g. Wi-Fi link).
  • The watch 2 is directly connected to the television 4, the tablet computer 6, the laptop computer 8 and the mobile phone 10. The watch 2 may communicate over a short range wireless communication link (e.g. Bluetooth link).
  • Figure 2 shows schematically an example of the watch 2 shown in Figure 1 according to an embodiment.
  • The watch 2 comprises a sound detection unit 20 for detecting a sound signal in the environment of a user. The sound detection unit 20 may comprise a microphone.
  • The watch 2 comprises a haptic output unit 22 for outputting a haptic signal. The haptic signal may comprise a vibration signal, a heat signal, a braille signal, an electric shock signal or other. The haptic output unit may comprise a vibration motor, a heating resistance, a braille display, an electrode or other.
  • The watch 2 comprises a visual output unit 24 for outputting a visual signal. The visual signal may comprise a text, an image, a light (the colour and/ or flashing frequency of which may be adjustable for example) or other. The visual output unit 24 may comprise a display, a light emitting diode or other.
  • The watch 2 comprises a communication unit 26 for communicating with the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 and the server 12. The communication unit 26 may comprise a Bluetooth communication unit, a Wi-Fi communication unit or other.
  • The watch 2 comprises a power unit 28, a processing unit 30 and a memory unit 32. The memory unit 32 contains instructions which when executed by the processing unit 30 allow the watch 2 to perform the method of Figure 3 (described in further detail below). The memory unit 32 further comprises a sound database storing predetermined (i.e. predefined) sound signals.
  • The detected sound signal may comprise a name (e.g. name of the user and/or specified names of one or more other people), a siren (e.g. an ambulance siren, a police siren, a fire brigade siren), an alarm (e.g. a fire alarm, an earthquake alarm, a hurricane alarm, a flood alarm), a car horn, a voice (e.g. a human voice as opposed to a machine generated voice), one or more words or phrases in a specific language such as "look out", "help", "beware", or other.
  • The sound database may be populated by the manufacturer during manufacture and/or by the user after purchase. For example, the user may program the watch 2 to download and store a predetermined sound signal stored on the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or the server 12. The user may also program the watch 2 to capture a sound (e.g. specific words or phrases, including for example specific names, the sound of a user's dog barking, etc.) and store the captured sound in the sound database. In this way, the sound database can be customized by the user.
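  • Purely as an illustration (the disclosure does not prescribe a storage format), the sound database held in the memory unit 32 might be represented as a list of labelled entries as sketched below; the field names and the fingerprint representation are assumptions.

    # Hypothetical in-memory form of the sound database in memory unit 32.
    # Field names and the fingerprint representation are illustrative only.
    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class SoundEntry:
        label: str               # e.g. "fire alarm" or "user's dog barking"
        emergency_type: str      # e.g. "fire", "vehicle", "name_called"
        fingerprint: np.ndarray  # spectral signature of the predetermined sound

    sound_database: list[SoundEntry] = []

    def add_predetermined_sound(label: str, emergency_type: str,
                                fingerprint: np.ndarray) -> None:
        """Population path: at manufacture, after download, or after capture."""
        sound_database.append(SoundEntry(label, emergency_type, fingerprint))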
  • The watch 2 comprises a housing. The housing may be waterproof.
  • Figure 3 shows schematically an example of a flow diagram of a method of operating the watch 2 according to an embodiment.
  • In step 302, the watch 2 is activated (i.e. the various units are powered on).
  • In step 304, the sound detection unit 20 determines whether a sound signal is detected. If a sound signal is detected the method goes to step 306. If no sound signal is detected the method loops back to step 304.
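  • A trivial form of the detection test of step 304, assuming digitised audio frames and an energy threshold (both assumptions made for the sketch), might be:

    # Hypothetical test for step 304: treat a frame as "sound detected"
    # when its RMS energy exceeds an assumed noise floor.
    import numpy as np

    def sound_detected(frame: np.ndarray, noise_floor_rms: float = 0.01) -> bool:
        return float(np.sqrt(np.mean(frame ** 2))) > noise_floor_rms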
  • In step 306, the processing unit 30 processes the detected sound signal to perform sound recognition. The processing may include analogue to digital conversion, time to frequency conversion, filtering, speech detection or other.
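  • The time-to-frequency conversion and filtering of step 306 could, for example, take the following form; the Hann window, the band limits and the normalised magnitude spectrum are assumptions, the disclosure leaving the concrete signal processing open.

    # Sketch of step 306: time-to-frequency conversion plus a crude band-pass
    # filter, assuming the sound detection unit delivers digitised samples.
    import numpy as np

    def spectral_fingerprint(samples: np.ndarray, rate_hz: int,
                             low_hz: float = 100.0,
                             high_hz: float = 4000.0) -> np.ndarray:
        """Unit-normalised magnitude spectrum restricted to a band of interest."""
        spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
        freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate_hz)
        band = spectrum[(freqs >= low_hz) & (freqs <= high_hz)]
        norm = np.linalg.norm(band)
        return band / norm if norm > 0.0 else band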
  • The processing unit 30 determines whether the detected sound signal is associated with (i.e. matches) a predetermined sound signal in the sound database in the memory unit 32. If the detected sound signal is associated with a predetermined sound signal in the sound database in the memory unit 32, the method goes to step 308. If the detected sound signal is not associated with a predetermined sound signal in the sound database, the method goes to step 318.
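  • Illustratively, the association test might be a similarity comparison such as the one sketched below; the dot-product score and the 0.8 threshold are assumptions, the disclosure requiring only that the detected signal is associated with (i.e. matches) a stored signal.

    # Hypothetical matching test: the database is a list of
    # (emergency_type, fingerprint) pairs with unit-normalised fingerprints;
    # the 0.8 threshold is an invented example value.
    import numpy as np

    def find_match(detected: np.ndarray, database: list, threshold: float = 0.8):
        """Return the emergency type of the best match, or None (go to step 318)."""
        best_type, best_score = None, threshold
        for emergency_type, stored in database:
            n = min(len(detected), len(stored))  # crude length alignment
            score = float(np.dot(detected[:n], stored[:n]))
            if score > best_score:
                best_type, best_score = emergency_type, score
        return best_type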
  • In step 308, the processing unit 30 identifies an emergency situation. For example, the processing unit 30 may determine that a car is coming when the detected sound signal matches an ambulance siren or a car horn. The processing unit may determine that a fire is ongoing when the detected sound signal matches a fire alarm or other alarm.
  • In step 310, the haptic output unit 22 outputs a haptic signal and/or the visual output unit 24 outputs a visual signal. The haptic signal and/or the visual signal may be based on the emergency situation. For example, the intensity or the frequency of the haptic signal and/or the visual signal may vary based on the type of emergency situation or the degree of emergency of the emergency situation.
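  • For instance, a hypothetical mapping from the identified emergency type to vibration intensity and pulse rate for step 310 could look like this (all values are invented for illustration):

    # Illustrative step 310 mapping; intensities and pulse rates are invented.
    HAPTIC_PATTERNS = {
        "fire":    {"intensity": 1.0, "pulses_per_second": 5},  # highest urgency
        "vehicle": {"intensity": 0.8, "pulses_per_second": 3},
        "name":    {"intensity": 0.4, "pulses_per_second": 1},
    }
    DEFAULT_PATTERN = {"intensity": 0.6, "pulses_per_second": 2}

    def haptic_pattern_for(emergency_type: str) -> dict:
        """Vary the haptic signal with the type or degree of the emergency."""
        return HAPTIC_PATTERNS.get(emergency_type, DEFAULT_PATTERN)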
  • In step 312, the communication unit 26 connects to the television 4, the tablet computer 6, the laptop computer 8 or the mobile phone 10.
  • In step 314, the communication unit 26 instructs the television 4, the tablet computer 6, the laptop computer 8 or the mobile phone 10 to remotely display an indication (i.e. a notification) of the identified emergency situation. Alternatively, the visual output unit 24 locally displays an indication of the identified emergency situation.
  • The indication may comprise a text (e.g. "Warning: ongoing fire!", "Warning: ongoing hurricane!", "Warning: ongoing flood!", "Warning: ongoing earthquake!", "Warning: ongoing tsunami!", "Warning: ongoing thunder!" or other), and/or an image (e.g. a flame icon, a hurricane icon, a flood icon, an earthquake icon, a tsunami icon, a thunder icon, or other).
  • In an implementation, the watch 2, the television 4, the tablet computer 6, the laptop computer 8 or the mobile phone 10 checks whether there is an actual emergency situation before displaying the indication. Checking can be done by connecting to the network 14 and analysing one or more of current or breaking news items, articles, videos, podcasts, images trending thereon, etc.
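  • One conceivable form of this check, with a wholly hypothetical news endpoint and response format (the disclosure does not specify how trending items are analysed), is:

    # Hypothetical plausibility check before displaying the indication: look
    # for the emergency keyword in headlines fetched over the network 14.
    # The URL and the JSON format are placeholders, not a real service.
    import json
    import urllib.request

    def emergency_in_news(keyword: str,
                          feed_url: str = "https://example.com/breaking-news.json") -> bool:
        try:
            with urllib.request.urlopen(feed_url, timeout=5) as resp:
                headlines = json.load(resp)  # assumed: a JSON array of strings
        except OSError:
            return False  # network failure: do not confirm the emergency
        return any(keyword.lower() in str(h).lower() for h in headlines)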
  • In step 316, the communication unit 26 instructs the television 4, the tablet computer 6, the laptop computer 8 or the mobile phone 10 to display further information related to the identified emergency situation. Alternatively, the visual output unit 24 locally displays further information related to the identified emergency situation.
  • The information may comprise a text (e.g. fire brigade phone number, flood related newspaper article, hurricane instructions, transcription of a podcast related to an ongoing earthquake) and/or an image (e.g. map of the building showing the location of closest exits, safe rooms or fire extinguishers).
  • The information may be stored locally on the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or remotely on the server 12.
  • The method then loops back to step 304.
  • In step 318 (i.e. for the case that the detected sound signal is not associated with a predetermined sound signal in the sound database of the watch 2), the communication unit 26 connects to some other apparatus, such as for example one or more of the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or the server 12. It will be understood that the communication unit 26 may connect to the same apparatus as, or a different apparatus from, the one it connects to in step 312.
  • In step 320, the communication unit 26 transmits the detected sound signal to the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or the server 12 to remotely determine whether the detected sound signal is associated with (e.g. matches) a predetermined sound signal on the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or the server 12. The communication unit 26 then receives a result of the determination.
  • Alternatively, the communication unit 26 receives a predetermined sound signal from the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or the server 12. The watch 2 can then locally determine whether the detected sound signal is associated with (e.g. matches) a predetermined sound signal stored on the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or the server 12.
  • If the detected sound signal is associated with (e.g. matches) a predetermined sound signal stored on the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or the server 12, the method goes to step 308. If the detected sound signal is not associated with (e.g. does not match) a predetermined sound signal stored on the television 4, the tablet computer 6, the laptop computer 8, the mobile phone 10 or the server 12, the method goes to step 322.
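  • A sketch of the remote query of steps 318 to 322, assuming a simple JSON request/response between the watch 2 and a companion apparatus (the endpoint address and the message format are invented for the sketch), might be:

    # Hypothetical steps 318-320: send the detected fingerprint to another
    # apparatus and receive the match verdict. The endpoint and message
    # format are invented; the disclosure does not define the protocol.
    import json
    import urllib.request
    import numpy as np

    def query_other_apparatus(fingerprint: np.ndarray,
                              endpoint: str = "http://companion.local/match"):
        """Return the matched emergency type, or None (step 322: no emergency)."""
        payload = json.dumps({"fingerprint": fingerprint.tolist()}).encode("utf-8")
        request = urllib.request.Request(endpoint, data=payload,
                                         headers={"Content-Type": "application/json"})
        try:
            with urllib.request.urlopen(request, timeout=5) as resp:
                verdict = json.load(resp)  # assumed: {"match": bool, "type": str}
        except OSError:
            return None
        return verdict.get("type") if verdict.get("match") else None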
  • In step 322, the processing unit 30 determines that no emergency situation is present, and the method loops back to step 304.
  • An advantage of the watch 2 over existing apparatus for assisting people with hearing impairment is that it does not only rely on visual information to indicate an emergency situation to a user. Accordingly, the user can be notified of an emergency situation even where visual information cannot be perceived by the user.
  • Moreover, it does not only rely on sound signals stored on the watch 2 to identify an emergency situation. Accordingly, the identification may be more accurate.
  • It will be understood that although the watch 2 has been described as an example of an embodiment, other embodiments may encompass other wearable or non-wearable apparatus.
  • The watch 2 as shown in Figure 2 is represented as a schematic block diagram for the purposes of explaining the functionality of the watch 2 only. Hence, it is understood that each unit of the watch is a functional block for performing the functionality ascribed to it herein. Each unit may be implemented in hardware, software, firmware, or a combination thereof.
  • It will be understood that the processing unit referred to herein may in practice be provided by a single chip or integrated circuit or plural chips or integrated circuits, optionally provided as a chipset, an application-specific integrated circuit (ASIC), field-programmable gate array (FPGA), digital signal processor (DSP), graphics processing unit (GPU), etc. The chip or chips may comprise circuitry (as well as possibly firmware) for embodying at least one or more of a data processor or processors, a digital signal processor or processors, baseband circuitry and radio frequency circuitry, which are configurable so as to operate in accordance with the exemplary embodiments. In this regard, the exemplary embodiments may be implemented at least in part by computer software stored in (non-transitory) memory and executable by the processor, or by hardware, or by a combination of tangibly stored software and hardware (and tangibly stored firmware).
  • Reference is made herein to a memory unit. This may be provided by a single device or by plural devices. Suitable devices include for example a hard disk and non-volatile semiconductor memory.
  • Although at least some aspects of the embodiments described herein with reference to the drawings comprise computer processes performed in processing systems or processors, the invention also extends to computer programs, particularly computer programs on or in a carrier, adapted for putting the invention into practice. The program may be in the form of non-transitory source code, object code, a code intermediate source and object code such as in partially compiled form, or in any other non-transitory form suitable for use in the implementation of processes according to the invention. The carrier may be any entity or device capable of carrying the program. For example, the carrier may comprise a storage medium, such as a solid-state drive (SSD) or other semiconductor-based RAM; a ROM, for example a CD ROM or a semiconductor ROM; a magnetic recording medium, for example a floppy disk or hard disk; optical memory devices in general; etc.
  • The examples described herein are to be understood as illustrative examples of embodiments of the invention. Further embodiments and examples are envisaged. Any feature described in relation to any one example or embodiment may be used alone or in combination with other features. In addition, any feature described in relation to any one example or embodiment may also be used in combination with one or more features of any other of the examples or embodiments, or any combination of any other of the examples or embodiments. Furthermore, equivalents and modifications not described herein may also be employed within the scope of the invention, which is defined in the claims.
  • In other aspects, there may also be provided an apparatus comprising:
    • a communication unit configured to communicate to another apparatus via a communication link;
    • a sound detection unit configured to detect a sound signal in an environment of a user;
    • a processing unit configured to:
      • if the detected sound signal is not associated with a predetermined sound signal on the apparatus, instruct the other apparatus to determine whether the detected sound signal is associated with a predetermined sound signal on the other apparatus; and
      • if the detected sound signal is associated with a predetermined sound signal on the other apparatus, determine that the detected sound signal is indicative of an emergency situation in the environment of the user.

Claims (13)

  1. An apparatus (2) for indicating an emergency situation, the apparatus (2) comprising:
    a sound detection unit (20) configured to detect a sound signal in an environment of a user;
    a processing unit (30) configured to determine whether the detected sound signal is indicative of an emergency situation in the environment of the user; and
    a haptic output unit (22) configured to output a haptic signal when the detected sound signal is indicative of an emergency situation in the environment of the user,
    characterized in that the processing unit (30) is configured to:
    determine whether the detected sound signal is associated with a predetermined sound signal stored on the apparatus;
    if the detected sound signal is not associated with a predetermined sound signal stored on the apparatus (2), communicate with another apparatus (4, 6, 8, 10) to query whether the detected sound signal is associated with a predetermined sound signal stored on the other apparatus (4, 6, 8, 10) and to receive a response to the query; and
    if the detected sound signal is associated with a predetermined sound signal stored on the other apparatus (4, 6, 8, 10), determine that the detected sound signal is indicative of an emergency situation in the environment of the user.
  2. An apparatus (2) according to claim 1, comprising:
    a visual output unit (24) configured to output a visual signal when the detected sound signal is indicative of an emergency situation in the environment of the user.
  3. An apparatus (2) according to claim 2, wherein the visual output unit (24) is configurable to output a light signal, when the detected sound signal is indicative of an emergency situation in the environment of the user, only when the light level in the environment of the user is below a threshold.
  4. An apparatus (2) according to any of claims 1 to 3, wherein the processing unit (30) is configured to:
    if the detected sound signal is associated with a predetermined sound signal stored on the apparatus (2), determine that the detected sound signal is indicative of an emergency situation in the environment of the user.
  5. An apparatus (2) according to claim 4, wherein the predetermined sound signal on the apparatus (2) is stored by the manufacturer or by the user.
  6. An apparatus (2) according to any of claims 1 to 5, wherein the processing unit (30) is configured to:
    if the detected sound signal is indicative of an emergency situation in the environment of the user, instruct another apparatus (4, 6, 8, 10) to display an indication of an emergency situation in the environment of the user.
  7. An apparatus (2) according to any of claims 1 to 6, wherein the processing unit (30) is configured to:
    if the detected sound signal is indicative of an emergency situation in the environment of the user, instruct another apparatus (4, 6, 8, 10) to display information related to the emergency situation in the environment of the user.
  8. An apparatus (2) according to any of claims 1 to 7, wherein the apparatus (2) is a wearable apparatus.
  9. A method comprising:
    detecting (304) a sound signal in an environment of a user;
    determining (306) whether the detected sound signal is indicative of an emergency situation in the environment of the user; and
    outputting (310) a haptic signal when the detected sound signal is indicative of an emergency situation in the environment of the user;
    characterized by:
    determining whether the detected sound signal is associated with a predetermined sound signal stored on an apparatus;
    if the detected sound signal is not associated with a predetermined sound signal stored on the apparatus, instructing (320) another apparatus to determine whether the detected sound signal is associated with a predetermined sound signal stored on the other apparatus; and
    if the detected sound signal is associated with a predetermined sound signal stored on the other apparatus, determining (308) that the detected sound signal is indicative of an emergency situation in the environment of the user.
  10. A method according to claim 9, comprising:
    if the detected sound signal is associated with a predetermined sound signal stored on the apparatus, determining that the detected sound signal is indicative of an emergency situation in the environment of the user.
  11. A method according to any of claims 9 to 10, comprising:
    if the detected sound signal is indicative of an emergency situation in the environment of the user, instructing (314) another apparatus to display an indication of an emergency situation in the environment of the user.
  12. A method according to any of claims 9 to 11, comprising:
    if the detected sound signal is indicative of an emergency situation in the environment of the user, instructing (316) another apparatus to display information related to the emergency situation in the environment of the user.
  13. A computer program for an apparatus, comprising software code portions for performing the method of any of claims 9 to 12 when said computer program is run on the apparatus.
EP17203574.3A 2017-11-24 2017-11-24 Apparatus, method and computer program for indicating an emergency situation Active EP3489919B1 (en)

Priority Applications (2)

Application Number | Publication | Priority Date | Filing Date | Title
EP17203574.3A | EP3489919B1 (en) | 2017-11-24 | 2017-11-24 | Apparatus, method and computer program for indicating an emergency situation
TR2017/20943A | TR201720943A2 (en) | 2017-11-24 | 2017-12-20 | APPARATUS, METHOD AND COMPUTER PROGRAM FOR INDICATING AN EMERGENCY

Applications Claiming Priority (1)

Application Number | Publication | Priority Date | Filing Date | Title
EP17203574.3A | EP3489919B1 (en) | 2017-11-24 | 2017-11-24 | Apparatus, method and computer program for indicating an emergency situation

Publications (2)

Publication Number | Publication Date
EP3489919A1 (en) | 2019-05-29
EP3489919B1 (en) | 2021-07-07

Family

Family ID: 60569600

Family Applications (1)

Application Number | Status | Publication | Priority Date | Filing Date | Title
EP17203574.3A | Active | EP3489919B1 (en) | 2017-11-24 | 2017-11-24 | Apparatus, method and computer program for indicating an emergency situation

Country Status (2)

Country | Link
EP (1) | EP3489919B1 (en)
TR (1) | TR201720943A2 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20120306631A1 * | 2011-06-03 | 2012-12-06 | Apple Inc. | Audio Conversion To Vibration Patterns

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US4237449A * | 1978-06-16 | 1980-12-02 | Zibell J Scott | Signalling device for hard of hearing persons
US5651070A * | 1995-04-12 | 1997-07-22 | Blunt; Thomas O. | Warning device programmable to be sensitive to preselected sound frequencies
US20080293453A1 * | 2007-05-25 | 2008-11-27 | Scott J. Atlas | Method and apparatus for an audio-linked remote indicator for a wireless communication device
US9171450B2 * | 2013-03-08 | 2015-10-27 | Qualcomm Incorporated | Emergency handling system using informative alarm sound


Also Published As

Publication number | Publication date
EP3489919A1 (en) | 2019-05-29
TR201720943A2 (en) | 2019-06-21

Similar Documents

Publication Publication Date Title
US10506411B1 (en) Portable home and hotel security system
US20210056981A1 (en) Systems and methods for managing an emergency situation
US20210287522A1 (en) Systems and methods for managing an emergency situation
US9721457B2 (en) Global positioning system equipped with hazard detector and a system for providing hazard alerts thereby
GB2528996A (en) Motion monitoring method and device
WO2011000113A1 (en) Multiple sound and voice detector for hearing- impaired or deaf person
AU2016205895B2 (en) Determining entry into or exit from a place while a tracking device is in the place
US20180336382A1 (en) Systems and Methods for Generating Alert to Avoid Misplacement of a Device or Item
EP3489919B1 (en) Apparatus, method and computer program for indicating an emergency situation
WO2016113697A1 (en) Rescue sensor device and method
EP2983148B1 (en) Voice-sensitive emergency alert system
JP7071375B2 (en) A system for monitoring the physical condition of at least one user, and a method for monitoring the physical condition of the user.
US9280914B2 (en) Vision-aided hearing assisting device
US11722813B2 (en) Situational awareness, communication, and safety for hearing protection devices
KR20160131678A (en) Portable device for protecting having camera and method for taking picture
US11645902B2 (en) Headphone loss prevention alarm
CN110031976A (en) A kind of glasses and its control method with warning function
KR20170143239A (en) Smart band apparatus for alarm
US11176799B2 (en) Global positioning system equipped with hazard detector and a system for providing hazard alerts thereby
US20160117914A1 (en) Security system, security arrangement and method therfore
JP2005107895A (en) Security system and security method
WO2017190803A1 (en) Ambient sound monitoring and visualizing system for hearing impaired persons
WO2019034244A1 (en) Image display device provided with sound prioritization function
KR101572807B1 (en) Method, apparatus and system for transmitting image signal by wearable device
US11967221B2 (en) Emergency device

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20191031

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20200929

RIN1 Information on inventor provided before grant (corrected)

Inventor name: YILMAZ, EVREN GOEKHAN

Inventor name: OEZTEKIN, GUENER

GRAJ Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted

Free format text: ORIGINAL CODE: EPIDOSDIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

INTC Intention to grant announced (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20210305

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1409340

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210715

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602017041561

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20210707

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1409340

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210707

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211108

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211007

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211007

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211008

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602017041561

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

26N No opposition filed

Effective date: 20220408

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211124

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211130

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20211130

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211124

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211130

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220630

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20171124

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220630

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20231123

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: TR

Payment date: 20231122

Year of fee payment: 7

Ref country code: DE

Payment date: 20231121

Year of fee payment: 7

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707