EP3673469A1 - Automatisiertes detektieren einer notsituation einer oder mehrerer personen - Google Patents
Automatisiertes detektieren einer notsituation einer oder mehrerer personen
- Publication number
- EP3673469A1 EP3673469A1 EP18803885.5A EP18803885A EP3673469A1 EP 3673469 A1 EP3673469 A1 EP 3673469A1 EP 18803885 A EP18803885 A EP 18803885A EP 3673469 A1 EP3673469 A1 EP 3673469A1
- Authority
- EP
- European Patent Office
- Prior art keywords
- phrase
- emotions
- emergency situation
- amd
- alerting
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/016—Personal emergency signalling and security systems
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/04—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/04—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
- G08B21/0407—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
- G08B21/043—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting an emergency event, e.g. a fall
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/04—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
- G08B21/0438—Sensor means for detecting
- G08B21/0469—Presence detectors to detect unsafe condition, e.g. infrared sensor, microphone
Definitions
- the invention relates to a method for automatically detecting an emergency situation of one or more persons in a surveillance area. Furthermore, the invention relates to an alerting method. Furthermore, the invention relates to a monitoring system. Moreover, the invention relates to an alarm system.
- an emergency number, for example 112
- an emergency brake can be triggered.
- Emergency call pillars and intercom systems are often installed on platforms and in public buildings. For smartphones, there are applications with which people in distress can make an emergency call. Furthermore, random witnesses can also trigger an alarm in the manner described.
- the said alarming methods require the active triggering of the alarm by the person concerned or a corresponding witness.
- victims are physically and mentally unable to press an alarm button or even dial a phone number.
- US 2007/0 183 604 A1 describes a method and a system which serve to monitor an environment. Acoustic data is collected via microphones and noise sources are identified from this data. Based on the comparison of parameter values, a situation is classified as normal or as deviant. In the latter case, a response to resolve the anomaly is initiated.
- US 5,666,157 describes a surveillance system in which video-related movements of persons are compared with reference movements characteristic of criminal acts and intentions. The extent of criminal intentions is determined and an appropriate alarm signal is generated.
- US 2015/0 070 166 A1 describes a system and a method for detecting the use of a firearm in a building. Microphones hidden in telephones are used to record acoustic data. The collected data is processed centrally. Based on the detection, a building control system can also be controlled.
- WO 01/33134 A1 describes a method and an apparatus for detecting a danger emanating from a living environment. Physiological signals emanating from a living being are recorded and evaluated in order to detect a hazard and, if appropriate, to automatically trigger measures to counteract the danger at the location of the living being.
- data of a monitoring system are transmitted by means of cameras which are directed to sensors.
- the sensors have a modulatable light source whose light is captured in images taken by the cameras.
- a classification of an event in a monitored area is performed on the basis of sensor signals.
- This object is achieved by a method for detecting an emergency situation of at least one person according to claim 1, an alerting method according to claim 7, a monitoring system according to claim 8 and an alarm system according to claim 10.
- acoustic measurement data are continuously acquired from the monitoring area. Furthermore, the acquired acoustic measurement data are automatically searched for a predetermined alerting phrase in real time.
- the alerting phrase includes a short spoken phrase that is publicly displayed in the surveillance area.
- emotions are determined on the basis of the acoustic measurement data, preferably at the same time. In other words, emotions, or patterns to be assigned to emotions, are searched for in the measurement data. In the further course of the method, it is determined whether there is a temporal correlation between a found alerting phrase and determined emotions attributable to an emergency situation.
- this process can also take place sequentially. For example, at first one type of the two mentioned information to be identified is determined and subsequently only the other type. In this variant, evaluating both types of information may be limited to a subset of the data in which at least one of the two types of information is present. Finally, an emergency situation is established in the event that there is a temporal correlation between a found alerting phrase and detected emotions.
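The correlation check described above can be illustrated with a minimal sketch. All names here (the `Detection` record, `temporally_correlated`, the symmetric 10-second window) are illustrative assumptions, not the claimed implementation:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str   # the recognised alerting phrase or an emotion name
    time: float  # time of occurrence in seconds

def temporally_correlated(phrase_hits, emotion_hits, window_s=10.0):
    """Return True if a found alerting phrase and a detected emergency-related
    emotion occur within the same time window (temporal correlation)."""
    for phrase in phrase_hits:
        for emotion in emotion_hits:
            if abs(emotion.time - phrase.time) <= window_s / 2:
                return True
    return False
```

In the sequential variant, either list can be computed first, and the other search restricted to the data around its hits.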
- a potential victim does not have to actively operate any technical devices to indicate an emergency situation, so that it is easier to detect such an emergency situation.
- time is saved, which would otherwise be required for operating an alarm device.
- the exact location of the event can be determined on the basis of the position of a sensor unit necessary for acquiring the acoustic measurement data. Due to the combination of the alerting phrase with the detection, evaluation and assignment of emotions, a real emergency situation can be distinguished from a harmless situation with great certainty. For example, in a joking use of the alerting phrase, no emotion associated with an emergency situation is detected. By contrast, in a real emergency situation, an emotion characteristic of an emergency situation is detected in the acoustic measurement data, so that abuse of the alerting phrase can be prevented.
- the method according to the invention for automatically detecting an emergency situation of one or more persons in a surveillance area is initially carried out. Moreover, there is an automatic triggering of an alarm in the event that an emergency situation has been detected.
- an alarm can comprise, for example, a visual or audible signal
- a signal light can be generated, which confirms to the victim that the alarm has been triggered, and causes a possibly present offender to abort his act, since the monitoring by a monitoring system and the triggering of the alarm are thereby made clear.
- the alerting method shares the advantages of the inventive method for automatically detecting an emergency situation of one or more persons in a surveillance area.
- the monitoring system according to the invention comprises an acoustic sensor unit for continuous sensor-based acquisition of acoustic measurement data from the monitoring area.
- Part of the monitoring system according to the invention is also an evaluation unit for automatically searching the acquired acoustic measurement data for a predetermined alerting phrase in real time.
- the alerting phrase includes a short spoken sentence that is publicly displayed in the surveillance area.
- the monitoring system according to the invention has an emotion determination unit for automatically determining emotions on the basis of the measurement data.
- Part of the monitoring system according to the invention is also a correlation determination unit, which is set up to determine whether there is a temporal correlation between a found alarming phrase and detected emotions.
- furthermore, the monitoring system has an emergency situation detection unit.
- the emergency situation detection unit is set up to determine an emergency situation in the event that there is a temporal correlation between a found alarming phrase and detected emotions that are associated with an emergency situation. That is, the emergency situation detection unit conveys the information that there is an emergency situation when it receives the message that there is a temporal correlation between a found alarming phrase and detected emotions that are associated with an emergency situation.
- a short sentence can be recognized much more securely than a single keyword. Instead of a single sentence, several sentences or other meaningful sequences of words may be used.
- the monitoring system according to the invention shares the advantages of the method according to the invention for the automated detection of an emergency situation of one or more persons in a surveillance area.
- the alarm system according to the invention comprises the monitoring system according to the invention and an alarm unit for automa ticated triggering of an alarm in response to an emergency situation detected by the monitoring system.
- the alarm system according to the invention shares the advantages of the monitoring system according to the invention.
- Some components of the monitoring system according to the invention may, after adding an acoustic sensor, be implemented for the most part in the form of software components.
- This relates in particular to parts of the evaluation unit, the emotion determination unit, the correlation determination unit and the emergency situation detection unit.
- However, these components can also be realized in part, in particular when very fast calculations are involved, in the form of software-supported hardware, such as FPGAs or the like.
- the required interfaces can, for example when only a transfer of data from other software components is involved, be designed as software interfaces. However, they can also be configured as hardware-based interfaces, which are controlled by suitable software.
- a largely software-based implementation has the advantage that computer systems already used for monitoring an area can, after a possible addition of extra hardware elements, such as the acoustic sensor unit, easily be retrofitted by a software update to work in the manner according to the invention.
- the object is also achieved by a corresponding computer program product with a computer program which can be loaded directly into a memory device of such a computer system, with program sections for executing all the steps of the method according to the invention when the computer program is executed in the computer system.
- Such a computer program product may, in addition to the computer program, optionally contain additional components, such as documentation, and/or hardware components, such as hardware keys (dongles, etc.) for using the software.
- a computer-readable medium, such as a memory stick, a hard disk or another portable or permanently installed data carrier, can serve this purpose; on it, the program sections of the computer program that are readable and executable by a computer unit are stored.
- for this purpose, the computer unit may have, for example, one or more cooperating microprocessors or the like.
- determining the emotions and determining a temporal correlation between a found alerting phrase and a detected emotion comprises automatically determining a time of occurrence of the alerting phrase in the event that an alerting phrase has been detected, and automatically determining emotions on the basis of the acoustic measurement data in a time window which includes the time of occurrence of the alerting phrase.
- the determination of a temporal correlation between the alerting phrase and emotions occurs by searching a time window around the time at which the alerting phrase occurs. The size of the time window can be determined, for example, on the basis of empirical values. An existing database can be used by learning methods, such as machine learning, to establish such parameters. If, in the said time window, emotions are found that can be attributed to an emergency situation, it is determined that an emergency situation exists.
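Restricting the emotion search to such a time window can be sketched as follows; the function name, the symmetric window, and the representation of the data as timestamped samples are assumptions for illustration:

```python
def emotions_in_window(samples, t_phrase, classify, window_s=10.0):
    """Classify emotions only for timestamped samples inside the window
    F_t = [t_phrase - window_s/2, t_phrase + window_s/2] around the
    occurrence of the alerting phrase.

    samples  -- iterable of (time, audio_frame) pairs
    classify -- any emotion classifier applied per frame
    """
    lo, hi = t_phrase - window_s / 2, t_phrase + window_s / 2
    return [classify(frame) for t, frame in samples if lo <= t <= hi]
```

Only the frames inside F_t are passed to the (comparatively expensive) emotion classifier, which is what accelerates the evaluation.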
- the detection of emotions comprises the detection of at least one of the following characteristics of the measured data:
- the alerting phrase is determined on the basis of characteristics of the chronological progression of a recorded sound amplitude of the acquired acoustic measurement data. In this case, a comparison with a known pattern of a sound amplitude, which is assigned to a particular alerting phrase, can serve to detect this alerting phrase.
- the emotions are determined on the basis of characteristics of the temporal course of a recorded sound amplitude of the acquired acoustic measurement data.
- a pattern of a specific temporal course of a sound amplitude, or a pattern of a specific frequency spectrum derivable by Fourier transform from the temporal course of a sound amplitude, can serve as a comparison pattern for comparison with the recorded measurement data.
- different patterns associated with different emotions can be read from a database and compared with the measurement data to interpret the acquired measurement data and to determine the type of emotions currently occurring. On the basis of the type of emotions, it is then possible, in combination with the alerting phrase, to infer the occurrence of an emergency situation and possibly even to determine the nature of the emergency situation. For example, on the basis of the frequency spectrum, the emotions anger and fear can be inferred. This combination of emotions can be associated with an act of violence. If, on the other hand, only the emotion fear is detected, an accident situation is more likely.
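A spectrum comparison of this kind can be sketched as follows. This is a deliberately simplified illustration: a plain dict stands in for the pattern database, cosine similarity of normalised magnitude spectra stands in for whatever matching the real system would use, and all names are assumptions:

```python
import numpy as np

def spectrum(frame):
    """Normalised magnitude spectrum of an audio frame, obtained by a
    Fourier transform of the amplitude-time curve."""
    mag = np.abs(np.fft.rfft(frame))
    return mag / (np.linalg.norm(mag) + 1e-12)

def classify_emotion(frame, reference_spectra):
    """Compare the frame's spectrum against reference patterns (here a dict
    mapping emotion label -> reference spectrum) and return the best match
    by cosine similarity."""
    s = spectrum(frame)
    return max(reference_spectra,
               key=lambda label: float(np.dot(s, reference_spectra[label])))
```

The reference spectra in the test below are built from pure sine tones only so that the matching behaviour is easy to verify; real emotion patterns would of course be learned from recorded speech.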
- the correlation determination unit has a time determination unit for automatically determining a time of occurrence of the alarming phrase in the event that an alarming phrase has been detected.
- the correlation determination unit includes a time-slot setting unit for setting a time window that includes the time of occurrence of the alerting phrase.
- the correlation determination unit comprises a detection unit for detecting emotions in the specified time window.
- the emotion determination unit is thus set up to automatically determine emotions in the specified time window.
- the search for emotion patterns in the audio data can be limited to time windows oriented on the appearance of alerting phrases, so that the evaluation of the acquired monitoring data is accelerated.
- FIG. 1 shows a flow chart, which illustrates a method for automated detection of an emergency situation of one or more persons in a monitoring area and for triggering an alarm according to an embodiment of the invention
- FIG. 2 shows a flowchart which illustrates a method for automated detection of an emergency situation of one or more persons in a monitoring area and for triggering an alarm according to a second embodiment of the invention
- FIG. 3 shows a block diagram illustrating an alarm system with a monitoring system according to an embodiment of the invention
- In FIG 1, a flowchart 100 is shown, which illustrates a method for the automated detection of an emergency situation of one or more persons in a monitoring area ÜB (see FIG 3) and for triggering an alarm according to an embodiment of the invention.
- the monitoring area ÜB may include, for example, a waiting area of a train station, for example a platform, on which people are staying for a certain time.
- acoustic measurement data AMD are continuously acquired from the monitoring area ÜB.
- the measurement data reproduce the noises occurring or perceptible in the monitoring area ÜB, for example in the form of a sonogram or an amplitude-time curve.
- the amplitude-time curve is then examined automatically in step 1.II for sections which correspond to a sonogram of a predetermined alerting phrase AP.
- Such an alerting phrase AP may be, for example, "I need help" and be clearly visible in the monitoring area ÜB.
- If an alerting phrase AP has been found, which is marked "j" in FIG 1, the method proceeds to step 1.III. If no alerting phrase AP is found, which is marked "n" in FIG 1, the method returns to step 1.I.
- In step 1.III, an automated determination of a time t_A at which the alerting phrase AP was found takes place. That is, the position of a curve section associated with the alerting phrase is determined in an amplitude-time diagram. On the basis of this time t_A, a time window F_t is set in step 1.IV, which is subsequently, in step 1.V, automatically searched for occurring emotions E or for sounds to be assigned to certain emotions E.
- In the event that, in step 1.V, portions of the amplitude-time diagram were found in the time window F_t which can be associated with an emotion E connected with an emergency situation, which is marked "j" in FIG 1, the method proceeds to step 1.VI, in which it is determined that an emergency situation NS exists. In the event that, in step 1.V, no emotion E that could be connected with an emergency situation was determined, which is marked "n" in FIG 1, the method returns to step 1.I and the monitoring process is continued.
- In step 1.VII, an alarm message AL is triggered on the basis of the determined emergency situation NS. For example, the alarm message AL can be transmitted to an aid team or aid authority, such as the police or a private security service, which can provide timely assistance to the person concerned.
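The sequential procedure of flowchart 100 can be sketched as a single monitoring pass; the callback-based decomposition and all function names are illustrative assumptions, not the claimed implementation:

```python
def monitoring_pass(acquire, find_phrase, find_emotions, alarm, window_s=10.0):
    """One pass through flowchart 100: acquire AMD (1.I), search for the
    alerting phrase AP (1.II/1.III), set the time window F_t (1.IV), search
    it for emergency-related emotions E (1.V) and, if any are found,
    establish the emergency situation NS and trigger the alarm AL
    (1.VI/1.VII). Returns True when an alarm was triggered."""
    amd = acquire()                       # step 1.I: acoustic measurement data
    t_a = find_phrase(amd)                # time t_A of the phrase, or None ("n")
    if t_a is None:
        return False                      # back to step 1.I
    f_t = (t_a - window_s / 2, t_a + window_s / 2)   # step 1.IV: window F_t
    emotions = find_emotions(amd, f_t)    # step 1.V
    if not emotions:
        return False                      # no emergency-related emotion ("n")
    alarm(emotions)                       # steps 1.VI and 1.VII
    return True
```

In continuous operation, this pass would simply be repeated in a loop over the incoming audio stream.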
- In FIG 2, a flowchart 200 is shown, which illustrates a method for the automated detection of an emergency situation of one or more persons in a monitoring area ÜB (see FIG 3) and for triggering an alarm according to a second embodiment of the invention.
- In contrast to the first exemplary embodiment shown in FIG. 1, in the second exemplary embodiment a parallel detection of the alerting phrase AP and the emotions E takes place.
- In step 2.I, as in step 1.I, acoustic measurement data AMD are continuously acquired from the monitoring area ÜB. Unlike in the first embodiment illustrated in FIG. 1, however, the detection and evaluation of the alerting phrase AP and of the emotions E take place in parallel in time in the second exemplary embodiment, wherein steps 2.II to 2.IV correspond to steps 1.II to 1.IV; i.e., the acoustic measurement data AMD are searched for an alerting phrase AP, and a time window F_t is set for emotions E occurring synchronously with the alerting phrase AP. Furthermore, in parallel with this, emotions E are determined in step 2.V on the basis of the acoustic measurement data AMD.
- In step 2.VI, it is further determined whether detected emotions E in the fixed time window F_t are related to the alerting phrase AP. In the event that emotions E are detected in the time window F_t, which is marked "j" in FIG. 2, the method proceeds to step 2.VII, in which it is determined that an emergency situation NS exists. If appropriate, the nature of the emergency situation is also deduced on the basis of the type of emotions E, so that a more exact message about the nature of the emergency situation can be forwarded. In the event that no emotions E to be associated with an emergency situation could be determined in the time window F_t, which is marked "n" in FIG. 2, the method returns to step 2.I and the monitoring process is continued.
- In step 2.VIII, an alarm message AL is triggered on the basis of the determined emergency situation NS.
- the alarm message AL can be transmitted to an aid team or aid authority, such as the police or a private security service, which can promptly provide help to the person concerned.
- In FIG 3, a block diagram is shown, which illustrates an alarm system 1 with a monitoring system 10 according to an embodiment of the invention.
- Part of the alarm system 1 is, in addition to the mentioned monitoring system 10, also an alerting unit 16, with which auxiliary forces can be informed of an emergency situation NS in a monitoring area ÜB.
- the monitoring system 10 also includes an acoustic sensor unit 11 with a sound-recording microphone, which serves to continuously record acoustic measurement data AMD from the monitoring area ÜB.
- the acoustic measurement data AMD are transmitted to an evaluation unit 12, which automatically searches the acoustic measurement data AMD in real time for a predetermined alerting phrase AP.
- the acoustic measurement data AMD are also transmitted to an emotion determination unit 13, which searches the acoustic measurement data AMD for emotions E in the automated manner described above.
- the correlation determination unit 14 comprises a time determination unit 14a, which automatically determines a time t_A of the occurrence of the alerting phrase AP. On the basis of the determined time t_A, a time window F_t is automatically determined by a time window determination unit 14b, which is also part of the correlation determination unit 14. The time window F_t is searched by a detection unit 14c, which is also part of the correlation determination unit 14, for the occurrence of emotions E assigned to an emergency situation. The detection unit 14c receives the emotions E from the emotion determination unit 13.
- this result EE is transmitted to an emergency situation detection unit 15, which determines on the basis of this result EE that an emergency situation NS is present.
- This information NS is finally forwarded by the monitoring system 10 to the alerting unit 16 which, as already mentioned, alerts suitable assistants.
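The division of the correlation determination unit 14 into sub-units 14a to 14c can be sketched as a small class; the class and method names and the dict-based data representation are assumptions for illustration only:

```python
class CorrelationDeterminationUnit:
    """Sketch of unit 14: time determination (14a), time window setting (14b)
    and detection of emotions inside the window (14c)."""

    def __init__(self, window_s=10.0):
        self.window_s = window_s

    def time_of_phrase(self, phrase_hit):
        """Sub-unit 14a: time t_A at which the alerting phrase AP occurred."""
        return phrase_hit["time"]

    def set_window(self, t_a):
        """Sub-unit 14b: time window F_t around t_A."""
        return (t_a - self.window_s / 2, t_a + self.window_s / 2)

    def detect(self, emotions, f_t):
        """Sub-unit 14c: emotions E from unit 13 that fall inside F_t."""
        lo, hi = f_t
        return [e for e in emotions if lo <= e["time"] <= hi]
```

A non-empty result from `detect` would correspond to the result EE passed on to the emergency situation detection unit 15.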
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- Emergency Management (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Gerontology & Geriatric Medicine (AREA)
- Computer Security & Cryptography (AREA)
- Engineering & Computer Science (AREA)
- Psychiatry (AREA)
- Psychology (AREA)
- Social Psychology (AREA)
- Alarm Systems (AREA)
- Emergency Alarm Devices (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102017221819 | 2017-12-04 | ||
PCT/EP2018/080146 WO2019110215A1 (de) | 2017-12-04 | 2018-11-05 | Automatisiertes detektieren einer notsituation einer oder mehrerer personen |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3673469A1 true EP3673469A1 (de) | 2020-07-01 |
Family
ID=64332009
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP18803885.5A Withdrawn EP3673469A1 (de) | 2017-12-04 | 2018-11-05 | Automatisiertes detektieren einer notsituation einer oder mehrerer personen |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP3673469A1 (de) |
WO (1) | WO2019110215A1 (de) |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5666157A (en) | 1995-01-03 | 1997-09-09 | Arc Incorporated | Abnormality detection and surveillance system |
US6275806B1 (en) | 1999-08-31 | 2001-08-14 | Andersen Consulting, Llp | System method and article of manufacture for detecting emotion in voice signals by utilizing statistics for voice signal parameters |
WO2001033134A1 (de) | 1999-11-02 | 2001-05-10 | Helmut Ehrlich | Verfahren und vorrichtung zum erkennen von gefahrensituationen |
DE60108373T2 (de) * | 2001-08-02 | 2005-12-22 | Sony International (Europe) Gmbh | Verfahren zur Detektion von Emotionen in Sprachsignalen unter Verwendung von Sprecheridentifikation |
US20070183604A1 (en) | 2006-02-09 | 2007-08-09 | St-Infonox | Response to anomalous acoustic environments |
SG178563A1 (en) | 2009-08-24 | 2012-03-29 | Agency Science Tech & Res | Method and system for event detection |
US20150070166A1 (en) | 2013-09-09 | 2015-03-12 | Elwha Llc | System and method for gunshot detection within a building |
WO2015164224A1 (en) * | 2014-04-21 | 2015-10-29 | Desoyza Erangi | Wristband and application to allow one person to monitor another |
DE202015102253U1 (de) | 2015-05-04 | 2015-06-01 | Samsung Electronics Co., Ltd. | Vorrichtung zum Koppeln eines am Körper tragbaren Geräts und eines Smart-Geräts |
US9965680B2 (en) | 2016-03-22 | 2018-05-08 | Sensormatic Electronics, LLC | Method and system for conveying data from monitored scene via surveillance cameras |
US10628682B2 (en) | 2016-04-29 | 2020-04-21 | International Business Machines Corporation | Augmenting gesture based security technology using mobile devices |
-
2018
- 2018-11-05 EP EP18803885.5A patent/EP3673469A1/de not_active Withdrawn
- 2018-11-05 WO PCT/EP2018/080146 patent/WO2019110215A1/de unknown
Also Published As
Publication number | Publication date |
---|---|
WO2019110215A1 (de) | 2019-06-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3279700B1 (de) | Zentralisiertes sicherheitsinspektionverwaltungssystem | |
DE60215049T2 (de) | Überwachungssystem mit verdächtigem verhaltensdetektion | |
DE60208166T2 (de) | Automatisches system zur überwachung von unabhängigen personen, die gelegentlich hilfe brauchen | |
WO2017001611A1 (de) | Verfahren und vorrichtung zum zuordnen von geräuschen und zum analysieren | |
CN111063162A (zh) | 静默报警方法、装置、计算机设备和存储介质 | |
DE102014115223A1 (de) | Verfahren und Vorrichtung zur Bewegungsüberwachung | |
CN109493555A (zh) | 一种基于智能监控技术的校园宿舍楼安防监控系统 | |
CN109508736A (zh) | 一种基于深度学习的监狱异常情况监测方法及监测系统 | |
CN108038808A (zh) | 一种监控方法、装置、终端设备及存储介质 | |
DE102014219692A1 (de) | Verfahren zum Erkennen einer Gefahrensituation in einem Fahrzeuginnenraum eines Fahrzeugs und Gefahrenerkennungssystem | |
CN108109331A (zh) | 监控方法及监控系统 | |
AT513101B1 (de) | Überwachungssystem, Freiflächenüberwachung sowie Verfahren zur Überwachung eines Überwachungsbereichs | |
CN109584907A (zh) | 一种异常报警的方法和装置 | |
EP3499473A1 (de) | Automatisiertes detektieren von gefahrensituationen | |
EP3516636B1 (de) | Audioüberwachungssystem | |
DE202013101354U1 (de) | Vorrichtung zur Überwachung der momentanen Mobilität von Personen in privaten oder öffentlichen Räumen | |
CN106448055A (zh) | 一种监控报警的方法及装置 | |
EP3673469A1 (de) | Automatisiertes detektieren einer notsituation einer oder mehrerer personen | |
EP3493171A1 (de) | Detektion von aggressivem verhalten in öffentlichen transportmitteln | |
CN111210602A (zh) | 一种夜间风险防控方法、系统及存储介质 | |
DE102014105937A1 (de) | Überwachungssystem und Überwachungsverfahren | |
CN108022585A (zh) | 信息处理方法、装置及电子设备 | |
EP3273418A1 (de) | Mehrstufiges totmann-alarmsystem und -verfahren | |
DE102016211049A1 (de) | Verfahren und Vorrichtung zum eine Vorrichtung und ein Verfahren zur Ausgabe wenigstens eines Alarmsignals | |
US10380097B1 (en) | Physiological-based detection and tagging in communications data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20200325 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20201218 |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20210429 |