WO2022035403A1 - Assembly for determining the situational awareness of first responders and method of operating this assembly - Google Patents

Assembly for determining the situational awareness of first responders and method of operating this assembly

Info

Publication number
WO2022035403A1
WO2022035403A1 PCT/TR2021/050783 TR2021050783W
Authority
WO
WIPO (PCT)
Prior art keywords
data
user
unit
sensor unit
localisation
Prior art date
Application number
PCT/TR2021/050783
Other languages
English (en)
Inventor
Tolga SONMEZ
Idil GOKALP
Serdar KOSE
Caglar AKMAN
Mesut GOZUTOK
Original Assignee
Havelsan Hava Elektronik San. Ve Tic. A.S.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Havelsan Hava Elektronik San. Ve Tic. A.S.
Publication of WO2022035403A1

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/683 Means for maintaining contact with the body
    • A61B5/6831 Straps, bands or harnesses
    • A61B5/6813 Specially adapted to be attached to a specific body part
    • A61B5/6814 Head
    • A61B5/6823 Trunk, e.g., chest, back, abdomen, hip
    • A61B5/6824 Arm or wrist
    • A61B5/6828 Leg
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02438 Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1112 Global tracking of patients, e.g. by using GPS
    • A61B5/1118 Determining activity level
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • A61B5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A61B5/0022 Monitoring a patient using a global network, e.g. telephone networks, internet
    • A61B2560/00 Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/02 Operational features
    • A61B2560/0242 Operational features adapted to measure environmental factors, e.g. temperature, pollution

Definitions

  • The invention is a method comprising a hardware assembly for monitoring the current status of first responders from a center in real time and software developed to run on this hardware. In this respect, it offers a new and complete solution in the field of compiling, receiving, processing, and reporting sensor data with a group of sensors to be placed on or worn by a person. The invention also presents an innovative solution regarding the classification of the body movements and general health status of the first responder during data processing.
  • The European patent application EP2025178 mentions a system for the localisation of first responder teams: three-dimensional acceleration is measured with a sensor attached to the response team member, and the location is determined from it. However, that system contains no element for situational awareness monitoring covering health, body position, and anomaly detection. It therefore cannot monitor whether the first responders perform the necessary movements (for example, staying low in case of fire) or their stress and fatigue levels, health conditions, body position forms, and body position anomalies. It was not considered similar to the subject of the invention because it lacks the elements needed to solve the technical problem addressed here.
  • U.S. patent US10415975 mentions sensors placed at certain locations on the human body and a method for recording their data in a database. There, the data obtained from the body-worn sensors are matched to 3D motion against a database; the training of deep learning and artificial neural networks and the classification of movements and body shape are not covered. In addition, a person skilled in the art applying this system cannot obtain the situational awareness of first responders. This study was therefore not considered similar to the art specified in the invention.
  • Patent application US20110046915 refers to a system that operates by reading sensors placed on the user via remote stationary readers. Such a system cannot be used by first responders because such an installation is impossible in the field; even a pre-installed system could not provide this communication properly. The subject of the invention was not considered similar, since from this point of view alone that document cannot solve the technical problem solved by the product.
  • U.S. patent application US20160339293 mentions a method that uses an IMU sensor to determine the head position.
  • The head position alone is not sufficient for the technical problem solved here. Apart from this, the document is not considered similar since it contains no information about team monitoring and visualisation.
  • The method of the invention comprises, for each user (100), a motion sensor unit (101), a health sensor unit (102), an optional localisation unit (103), a gateway unit (104), and an augmented reality headset (401).
  • Motion sensor units (101) function by being fixed on the body of each user (100) at certain points (head, right/left upper/lower arm, right/left upper/lower leg, and body).
  • The motion sensor unit (101) includes a triaxial magnetometer (200), a triaxial accelerometer (201), a triaxial gyroscope (202), a processor (203), and a wireless communication unit (204), as well as a battery. It is essential that the motion sensor units (101) are placed on the body of each user (100) in the body sections matched with their numbers.
  • The health sensor unit (102) includes a temperature sensor (205), a pulse sensor (206), a processor (203), and a wireless communication unit (204), as well as a battery.
  • The localisation units (103) include an ultrawideband localisation unit (207), a processor (203), and a wireless communication unit (204), as well as a battery.
  • The localisation units (103) are used by the user (100) for high-precision positioning by dropping them at random points along the route.
  • For each sensor, the instantaneous and the calibrated readings are distinguished: the instantaneous and calibrated accelerometer (201) data, the instantaneous and calibrated gyroscope (202) data, and the instantaneous and calibrated magnetometer (200) data are each denoted by separate symbols.
  • The angle information of the body parts to which the motion sensor units (101) are attached is obtained after the data collected on the motion sensors (101) are calibrated with the calibration equations (1-6) and combined through fusion algorithms and filtering processes.
  • The quaternion angle information obtained for each part is combined and used to model all body movements.
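The patent text does not reproduce its fusion equations, but the combination step can be sketched as quaternion multiplication along a kinematic chain, a common technique for body-segment models; the function names and the chain ordering below are illustrative assumptions, not the patented formulas.

```python
import numpy as np

def quat_multiply(q1, q2):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def chain_orientation(segment_quats):
    """Compose per-segment orientations along a kinematic chain
    (e.g. chest -> upper arm -> lower arm) into one absolute orientation."""
    q = np.array([1.0, 0.0, 0.0, 0.0])  # identity rotation
    for seg in segment_quats:
        q = quat_multiply(q, seg)
    return q / np.linalg.norm(q)  # renormalise to counter numerical drift
```

Composing the per-segment quaternions this way yields one absolute orientation per limb end, which is what a whole-body model consumes.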
  • Body temperature information and pulse information are measured on the health sensor unit (102).
  • The pulse information can be measured by the health sensor unit (102) attached to the wrist or by the health sensor unit (102) attached under the motion sensor unit (101).
  • The localisation units (103) are carried as a set on the user (100); one set contains at least five localisation units (103). The user (100) drops at least four of these units at random points along the route travelled. The positions of the subsequently released localisation units (103) are recorded in the system by the user (100) relative to the first released localisation unit (103), which serves as the (0, 0, 0) reference for the localisation phase. One of the localisation units (103) is carried on the user (100) to track the position of the user (100).
  • The distances between the localisation units (103) dropped in the field, whose locations are known, and the localisation unit (103) carried on the user (100) are measured via propagating ultrawideband signals; the location of the user (100) is estimated and tracked from these distances by the multilateration method.
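As an illustration of the multilateration step (a standard linearised least-squares formulation, not necessarily the patent's exact estimator), the position can be recovered from the known coordinates of the dropped localisation units (103) and the measured ranges:

```python
import numpy as np

def multilaterate(anchors, distances):
    """Estimate a position from known anchor coordinates and measured UWB
    ranges.  Subtracting the first anchor's range equation from the others
    removes the quadratic term, leaving a linear least-squares problem."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    p0, d0 = anchors[0], d[0]
    A = 2.0 * (anchors[1:] - p0)
    b = (d0 ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(p0 ** 2))
    est, *_ = np.linalg.lstsq(A, b, rcond=None)
    return est
```

With four non-coplanar anchors this gives a unique 3D solution; extra anchors simply over-determine the system and are averaged by the least-squares solve.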
  • Each motion sensor unit (101), health sensor unit (102), and localisation unit (103) is connected to a gateway (104) via BLE and instantly transmits the received data to the gateway (104).
  • When the semi-processed data reach the gateway (104), time information is prepended to them, and the resulting data are transferred by the gateway (104) to the server (105).
  • Data are collected on the server (105) through data standardization and the data acquisition module (301); the collected data are combined in the data processing module (302) and converted into input suitable for the deep learning module (304). This input is fed to the deep learning module (304), classified, and transformed into situational awareness information.
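The conversion of the combined streams into suitable model input is not specified in detail; one plausible sketch is slicing the timestamped streams into fixed-length overlapping windows, the usual input format for activity classifiers (the window and step sizes below are invented for illustration):

```python
import numpy as np

def make_windows(samples, window, step):
    """Slice a timestamped sample stream of shape (N, channels) into
    fixed-length, overlapping windows suitable as classifier input."""
    return np.array([samples[i:i + window]
                     for i in range(0, len(samples) - window + 1, step)])
```

For example, a stream of 10 samples with 2 channels, a window of 4, and a step of 2 yields four windows of shape (4, 2).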
  • The timestamped versions of the data on the server (105) are also fed into a three-dimensional animation engine (303).
  • This three-dimensional animation engine (303) can run on the server (105) and transfer images to the clients (106), or run directly on the client (106) and be shown to the user (100) via user (100) interfaces.
  • This three-dimensional animation engine (303) has the capability of simultaneously processing the information of one or more users (100).
  • Information of the person who is using the sensors is recorded on the server (105) via the user (100) interface, or the previously recorded values are collected from the user (100) database.
  • The data generated for situational awareness on the server (105) are transmitted to the clients (106) via various user (100) interfaces.
  • An augmented reality headset (401) or a tablet-based user (100) interface is offered for the first responder teams in the field.
  • A user interface is offered via the operator interface (402) for the operators located in the center.
  • An important step in the operation of the product subject to the invention is the 3D model calibration, performed through the mathematical operations shown in equations (7) and (8). Accordingly, the sensor data of the person standing in the calibration position (a predetermined stationary position) are collected, and the calibration matrix is calculated by multiplying these data by the reference stationary position. The obtained calibration matrix is then multiplied by all incoming instantaneous data, thus yielding the calibrated data.
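Equations (7) and (8) are not reproduced in this text. A minimal sketch consistent with the description (a calibration matrix obtained from the data captured in the known stationary pose, then applied to every incoming sample) is given below, under the assumption that orientations are expressed as 3x3 rotation matrices:

```python
import numpy as np

def calibration_matrix(r_measured, r_reference):
    """Calibration matrix C such that C @ r_measured == r_reference.
    r_measured: orientation read while the person holds the calibration pose;
    r_reference: the nominal orientation of that predetermined pose."""
    return r_reference @ r_measured.T  # inverse of a rotation is its transpose

def apply_calibration(c, r_instant):
    """Apply the calibration matrix to an instantaneous orientation sample."""
    return c @ r_instant
```

Multiplying every subsequent sample by the same fixed matrix removes the constant mounting offset of each body-worn sensor.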
  • The motion sensor unit (101), the health sensor unit (102), and the localisation unit (103) mentioned in the description of the product in question are groups of battery-powered sensors designed to measure acceleration, rotation, temperature, heartbeat, and magnetic field. These are the minimum requirements for the healthy operation of the product subject to the invention, and new ones can be added as needed.
  • The sensor module referred to as an acceleration sensor is a triple sensor array that measures acceleration in three mutually perpendicular planes, each element sensing only along the normal vector of its plane. Acceleration, G-force exposure, and similar calculations on the three planes can easily be made on the server (105) in this way. The gyroscope measures rotation around three mutually perpendicular axes.
  • The magnetometer (200) is used to measure the magnetic field values to which it is exposed.
  • The accelerometer (201) and magnetometer (200) are treated as absolute sensors and, in this context, are used in sensor fusion to reduce the error in the gyroscope data.
  • The temperature sensor (205) is the general name given to sensors that measure the temperature of the environment, and the pulse sensor (206) is the general name given to sensors that measure the heart rate of the person.
  • The ultrawideband localisation units (207) propagate an ultrawideband signal between receiver and transmitter, calculate the distance between the units from the signal's time of flight in the air, and perform localisation based on these distances. It is sufficient for these sensors to generate data for the proper operation of the product subject to the invention; there are no additional technical requirements or special sensor preferences.
  • The gateway (104) is a network device that can be placed by the first responder teams during their deployment to the site, carried on them, or placed by another team before or during the deployment. Gateways comprise modems designed to form a wireless network, media-converter antennas, and other equipment (batteries, etc.) necessary to maintain their operation.
  • The sensor data collected from the team members can be transferred to any server (105) in this way. Meanwhile, a location estimate can be made through the gateways (104) by determining the distance of a team member to other team members or to the gateway (104).
  • The deep learning module (304) is a platform that learns the various movements made by the user (100) over time by processing the collected data. By entering the previously collected user (100) data and the user (100) tags of those data into this platform, a model is generated that automatically makes sense of the signal received over time.
  • User (100) conditions such as activity, fatigue, anomaly (high stress, etc.), and authentication can be detected in the real-time sensor data during operation in this way, developing intra-team awareness.
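The architecture of the deep learning module (304) is not disclosed. As a minimal stand-in for the labelled-training idea, the sketch below fits a logistic-regression classifier on synthetic, invented feature windows for two hypothetical activity tags:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical labelled windows: 4-dimensional feature vectors extracted from
# previously collected user data; labels 0 and 1 stand for two activity tags.
X = np.vstack([rng.normal(-1.0, 0.3, (50, 4)), rng.normal(1.0, 0.3, (50, 4))])
y = np.array([0] * 50 + [1] * 50)

# Gradient-descent training of a logistic-regression stand-in for module (304).
w, b = np.zeros(4), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probabilities
    w -= 0.5 * (X.T @ (p - y)) / len(y)      # log-loss gradient step
    b -= 0.5 * np.mean(p - y)

pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
```

A production system would replace this with the trained deep model, but the loop shows the essential contract: previously collected, tagged data in, a signal-to-label mapping out.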
  • The number of motion sensor units (101) mentioned herein is ten for each team member, the number of health sensor units (102) is at least one, and the number of localisation units (103) is at least five. These numbers can be increased or decreased; however, our studies have shown that the numbers sufficient to obtain complete data on a person's movement are 10 for the motion sensor units (101), 1 for the health sensor unit (102), and 5 for the localisation units (103).
  • Motion sensor units (101) should be considered as a set consisting of ten sensors in total: lower leg (2), upper leg (2), chest (1), head (1), upper arm (2), and lower arm (2).
  • The health sensor unit (102) should be considered as a wristband attached to the wrist and a unit attached under the motion sensor unit (101) on the chest.
  • Localisation units (103) should be considered to be carried on the user (100) and placed in the surroundings when necessary. All these sensor units can be connected to a single gateway (104), or each can be connected individually and transfer its data separately.
  • The option to increase or decrease the number of sensor modules allows the system to be expanded horizontally. Meanwhile, various health sensors (102) can easily be integrated into the system if deemed necessary.
  • The user (100) interface element may be a computer, a tablet, or an augmented reality headset (401). Augmented reality glasses (401) or tablets can be used by the teams in the field.
  • A user (100) interface is also available for the operators in the operation center. This interface can easily be changed according to the preferences of the user (100), or more than one interface can be selected from this list if the team is managed by more than one person. For example, data can be transferred to a headquarters employee's desktop computer via a network connection while it is transferred to the unit supervisor through the augmented reality headset (401). There is no obstacle to using the same interface more than once; multiple interface displays can easily be provided by a person skilled in the art.
  • Figure 1 is a schematic illustration of the distribution of motion sensor units (101), health sensor units (102), localisation units (103), and gateway units (104) placed on a person. Accordingly, motion sensor (101) and health sensor units (102) are fixed on the person as shown.
  • Figure 2 is a schematic illustration of the working principle of the method of the invention. Accordingly, the data collected from the sensor units are transferred to the server (105) via the BLE/Wi-Fi element.
  • The data collected by the data collection module (301) are combined in the data processing module (302).
  • The data are introduced into the three-dimensional animation engine (303); meanwhile, the deep learning module (304) can classify them as an activity if a model has already been created.
  • Outputs are transferred to the user (100) via the desired user (100) interface. This interface is designed to be accessible via any AR/VR, tablet, or computer.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Physiology (AREA)
  • Artificial Intelligence (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Cardiology (AREA)
  • Evolutionary Computation (AREA)
  • Mathematical Physics (AREA)
  • Signal Processing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Psychiatry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Fuzzy Systems (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Emergency Alarm Devices (AREA)

Abstract

The invention relates to a method comprising a hardware assembly for monitoring the current status of first responder teams from a center in real time and software developed to run on this hardware. In this respect, it offers a new and complete solution in the field of compiling, receiving, processing, and reporting sensor data with a group of sensors to be placed on or worn by a person. The invention also presents an innovative solution concerning the classification of the body movements and general health status of the first responder team member during data processing.
PCT/TR2021/050783 2020-08-11 2021-08-10 Assembly for determining the situational awareness of first responders and method of operating this assembly WO2022035403A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TR2020/12579 2020-08-11
TR202012579 2020-08-11

Publications (1)

Publication Number Publication Date
WO2022035403A1 (fr) 2022-02-17

Family

ID=77615204

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/TR2021/050783 WO2022035403A1 (fr) 2020-08-11 2021-08-10 Assembly for determining the situational awareness of first responders and method of operating this assembly

Country Status (2)

Country Link
TR (1) TR202106597A2 (fr)
WO (1) WO2022035403A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009021068A1 (fr) * 2007-08-06 2009-02-12 Trx Systems, Inc. Locating, tracking, and/or monitoring personnel and/or assets both indoors and outdoors
US20180293866A1 (en) * 2015-11-04 2018-10-11 Avante International Technology, Inc. Personnel tracking and monitoring system and method employing protective gear including a personnel electronic monitor device
US20200034501A1 (en) * 2017-02-22 2020-01-30 Middle Chart, LLC Method and apparatus for position based query with augmented reality headgear


Also Published As

Publication number Publication date
TR202106597A2 (tr) 2021-06-21

Similar Documents

Publication Publication Date Title
CN109579853B (zh) BP neural network-based inertial navigation indoor positioning method
US8744765B2 Personal navigation system and associated methods
US8825435B2 Inertial tracking system with provision for position correction
CN105938189B (zh) Multi-person collaborative floor positioning method and system
CN105832315A (zh) Remote monitoring system unaffected by individual motion state and environmental location
KR101101003B1 (ko) System and method for sensing body movement and balance using sensor nodes
US20200205698A1 Systems and methods to assess balance
US20050033200A1 Human motion identification and measurement system and method
FR2930421A1 (fr) Device for acquiring and processing physiological data of an animal or a human during physical activity
Chapman et al. Assessing shoulder biomechanics of healthy elderly individuals during activities of daily living using inertial measurement units: high maximum elevation is achievable but rarely used
CN106725445B (zh) Portable EEG-controlled human motion injury monitoring system and method
Yi et al. Wearable sensor data fusion for remote health assessment and fall detection
CN103889325A (zh) Device for monitoring a user and method for calibrating the device
CN107330240A (zh) Intelligent remote special-care monitoring system and method based on dual wristband sensors
Hamdi et al. Lower limb motion tracking using IMU sensor network
CN106650300B (zh) Elderly monitoring system and method based on an extreme learning machine
Wu et al. A multi-sensor fall detection system based on multivariate statistical process analysis
CN109816951A (zh) Examination room anti-cheating monitoring system
Brzostowski Toward the unaided estimation of human walking speed based on sparse modeling
Madrigal et al. 3D motion tracking of the shoulder joint with respect to the thorax using MARG sensors and data fusion algorithm
WO2022035403A1 (fr) Assembly for determining the situational awareness of first responders and method of operating this assembly
Martin Real time patient's gait monitoring through wireless accelerometers with the wavelet transform
Zhao et al. Design of an IoT-based mountaineering team management device using Kalman filter algorithm
Carvalho et al. Instrumented vest for postural reeducation
TR2021006597A2 (tr) An assembly for determining the situational awareness of emergency teams and a method for operating this assembly

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21856364

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21856364

Country of ref document: EP

Kind code of ref document: A1