WO2021115064A1 - Fitness exercise recognition method based on wearable sensors - Google Patents
Fitness exercise recognition method based on wearable sensors
- Publication number
- WO2021115064A1 (PCT/CN2020/129525)
- Authority
- WO
- WIPO (PCT)
Classifications
- G01C23/00—Combined instruments indicating more than one navigational value, e.g. for aircraft; combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F2218/02—Preprocessing (pattern recognition specially adapted for signal processing)
- G06F2218/08—Feature extraction (pattern recognition specially adapted for signal processing)
- G06F2218/12—Classification; Matching (pattern recognition specially adapted for signal processing)
Definitions
- The invention belongs to the field of computer technology and applications, and relates to a fitness exercise recognition method based on a wearable sensor.
- Human behavior recognition methods based on video equipment acquire image information of human motion and analyze the image sequences.
- Video acquisition equipment is required to obtain the target video information.
- Such equipment is usually large, fixed in location, power-hungry, and computationally expensive, and has poor anti-interference ability, so unforeseen environmental factors can corrupt the data. This makes it suitable only for fixed locations and unable to support long-term, continuous recording of human behavior.
- Alternatively, sensors can be integrated into sports equipment, and some manufacturers have introduced such devices: the user scans a QR code to activate the sensor, which records the motion data and sends the movement information to the user's mobile phone.
- In Human Activity Recognition (HAR), environment-based sensors are generally used in fixed scenes such as homes and gyms: sensors are placed on objects and are activated to record the user's behavior information when the user interacts with those objects. Although environment-sensor-based methods have been used in some scenarios, such a layout still requires a large number of sensors; the equipment costs more than ordinary equipment; the need for mains or built-in power introduces certain safety risks; the user's outdoor behavior cannot be monitored; and it is difficult to distinguish specific users.
- Mobile wearable sensing devices, by contrast, can monitor the user's indoor and outdoor behavior at any time through sensor nodes worn on the body. Because the sensor equipment is personally owned, there is little concern about privacy leakage, and wearable sensors can integrate many kinds of components to collect different signals, allowing different body movements and the wearer's health status to be analyzed simultaneously.
- A Chinese patent (title: a method for user behavior recognition based on smart mobile device sensors, application number: CN201910347816.4) discloses a method that obtains acceleration and angular velocity data and processes it as images; however, the recognized actions are relatively simple, which is not enough to meet current practical needs.
- Another Chinese patent (title: human behavior recognition method based on convolutional neural network and recurrent neural network, application number: CN201910580116.X) discloses a human behavior recognition method based on convolutional and recurrent neural networks.
- In addition to sensor data, that method also collects RGB video of the scene, which is easily affected by lighting, obstacles, and other factors during collection, so an ideal environment is required to collect data.
- The purpose of the present invention is to provide a wearable-sensor-based fitness exercise recognition method that uses the acceleration and angular velocity signals collected during an individual's fitness exercise by the motion sensor integrated in the wearable device to extract exercise information, and thereby recognize fitness exercises.
- Step 1: obtain the inertial sensor signal during the fitness exercise and preprocess it;
- Step 2: perform window segmentation on the signal preprocessed in step 1;
- Step 3: perform feature extraction on the signal segmented in step 2;
- Step 4: normalize the feature data extracted in step 3;
- Step 5: perform feature dimensionality reduction on the features processed in step 4;
- Step 6: identify the features processed in step 5.
- The acquisition process of the fitness exercise signals in step 1 is: use the accelerometer and gyroscope built into the wearable device to acquire the acceleration signals and angular velocity signals of the fitness exercise;
- The signal preprocessing process is: linear interpolation and denoising of the collected acceleration and angular velocity signals.
- In step 2, the window segmentation methods include sliding-window-based segmentation, event-based window segmentation, and action-based window segmentation.
- Each window overlaps the next by 50%.
- The features extracted in step 3 include the maximum, minimum, mean, variance, skewness, kurtosis, the largest peaks of the discrete Fourier transform spectrum, and the energy;
- μ represents the mean;
- σ² represents the variance;
- S_max represents the maximum value in the vector;
- S_min represents the minimum value in the vector;
- ske represents the skewness;
- kur represents the kurtosis;
- S_DFT(k) represents the k-th element of the discrete Fourier transform;
- E represents the energy.
- The specific process of step 4 is as follows: the extracted features are formed into a feature vector, and the vector is normalized to the interval [0, 1] for training the classifier;
- In step 5, principal component analysis is used for dimensionality reduction.
- In step 6, a hierarchical method based on neural networks is used to recognize the behavior of people exercising in a gym.
- The beneficial effect of the present invention is that the method uses the accelerometer and gyroscope integrated in the wearable sensor to separately collect the acceleration and angular velocity signals of the human body during fitness exercise. Through preprocessing, window segmentation, and feature extraction of these signals, followed by dimensionality reduction and classification modeling of the extracted features, the recognition of fitness exercises is realized.
- Fig. 1 is a flowchart of a fitness exercise recognition method based on a wearable sensor of the present invention.
- the fitness exercise recognition method based on the wearable sensor of the present invention specifically includes the following steps:
- Step 1: obtain the inertial sensor signal during the fitness exercise and preprocess it.
- Linear interpolation: because the acceleration sensor built into a wearable device performs worse than a standalone acceleration sensor, its sampling clock is unstable, resulting in unequal time intervals between consecutive acceleration samples. To solve this problem, linear interpolation is used to ensure that the time interval between any two sample points is fixed.
- Denoising: the motion signals captured by the accelerometer and gyroscope contain substantial noise, which may be caused by an unfixed sensor position, repositioning between exercises, or body shaking when overcoming weights.
- The moving average filter is a low-pass filter that can effectively reduce the influence of random interference. The present invention therefore applies a fifth-order moving-average filter algorithm to eliminate noise introduced by the collection environment.
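The two preprocessing operations above (resampling to a uniform rate via linear interpolation, then a fifth-order moving average) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the 50 Hz sampling rate and the function names are assumptions.

```python
import numpy as np

def resample_uniform(t, x, fs=50.0):
    """Linearly interpolate irregularly timed samples x(t) onto a
    fixed-rate time grid so consecutive samples are equally spaced."""
    t_uniform = np.arange(t[0], t[-1], 1.0 / fs)
    return t_uniform, np.interp(t_uniform, t, x)

def moving_average(x, order=5):
    """Fifth-order (5-point) moving-average low-pass filter."""
    kernel = np.ones(order) / order
    return np.convolve(x, kernel, mode="same")
```

Applying `moving_average` after `resample_uniform`, once per sensor axis, yields an evenly sampled, smoothed signal ready for window segmentation.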
- Step 2: perform window segmentation on the signal preprocessed in step 1. Apply sliding-window segmentation to the preprocessed data stream, with the window length set to 2 seconds and a 50% overlap between adjacent windows.
- Segmentation techniques mainly include three methods: sliding-window-based segmentation, event-based window segmentation, and action-based window segmentation.
- Event-based window segmentation divides the data stream according to different events: the start and end of each window represent the beginning and end of an event. Because this method needs additional algorithms to determine event boundaries, it is rarely used in action recognition.
- Window segmentation based on action definitions divides the data stream into windows of different lengths according to the action type, with each window representing one action. Because segmentation relies on differences between action signals, it is difficult to apply in real-time action recognition systems.
- Sliding-window segmentation uses fixed-length windows to divide the data stream.
- The window length is typically set to 1 second, 3 seconds, 6 seconds, 12 seconds, etc.
- The data of adjacent windows can partially overlap or be completely disjoint.
- The purpose of overlapping data is to handle the transitions between actions more accurately.
- The present invention uses sliding-window segmentation with a window length of 2 seconds and a 50% signal overlap between adjacent windows, which effectively avoids the loss of information.
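A sliding-window split as described (2-second windows with 50% overlap) might look like this sketch; the 50 Hz rate and the function name are assumptions, not the patent's stated values.

```python
import numpy as np

def sliding_windows(stream, fs=50, win_sec=2.0, overlap=0.5):
    """Split a (n_samples, n_channels) data stream into fixed-length
    windows; adjacent windows share `overlap` of their samples."""
    win = int(win_sec * fs)             # samples per window (100 at 50 Hz)
    step = int(win * (1.0 - overlap))   # hop size; 50% overlap -> 50
    n = (len(stream) - win) // step + 1
    return np.stack([stream[i * step : i * step + win] for i in range(n)])
```

Each returned window then goes through feature extraction independently; the 50% hop means every sample (except at the stream edges) appears in two windows.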
- Step 3 Perform feature extraction on the signal segmented in Step 2;
- Feature extraction is divided into two parts: the bottom layer (ankle, thigh) and the top layer (waist, wrist, arm).
- For the bottom layer, the mean, variance, and energy are extracted as features.
- The movements of the upper body are more complex and more similar to one another, so more features need to be extracted; the maximum, minimum, mean, variance, skewness, kurtosis, the 5 largest peaks of the discrete Fourier transform spectrum, and the energy are finally selected as features.
- the extraction of data features in the present invention includes two parts: time domain and frequency domain features.
- the time domain features include maximum, minimum, mean, variance, skewness, and kurtosis; the frequency domain features mainly select the five maximum peaks and energy of the discrete Fourier transform spectrum.
- μ represents the mean;
- σ² represents the variance;
- S_max represents the maximum value in the vector;
- S_min represents the minimum value in the vector;
- ske represents the skewness;
- kur represents the kurtosis;
- S_DFT(k) represents the k-th element of the discrete Fourier transform (DFT);
- E represents the energy.
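Under those definitions, a single-channel window's feature vector (6 time-domain values, the 5 largest DFT magnitudes, and the energy, i.e. 12 features) could be computed as in this sketch. The function name and the exact peak-selection and energy conventions are assumptions.

```python
import numpy as np

def window_features(s, n_peaks=5):
    """Time-domain features (max, min, mean, variance, skewness, kurtosis)
    plus frequency-domain features (5 largest DFT magnitudes and energy)
    for one single-channel window s -> 12-dimensional feature vector."""
    mu, var = s.mean(), s.var()
    sd = np.sqrt(var) + 1e-12
    ske = np.mean((s - mu) ** 3) / sd ** 3          # skewness
    kur = np.mean((s - mu) ** 4) / sd ** 4          # kurtosis
    mag = np.abs(np.fft.rfft(s))                    # DFT magnitude spectrum
    peaks = np.sort(mag)[-n_peaks:][::-1]           # 5 largest magnitudes
    energy = np.sum(mag ** 2) / len(s)              # spectral energy
    return np.concatenate([[s.max(), s.min(), mu, var, ske, kur],
                           peaks, [energy]])
```

For the bottom layer, only the mean, variance, and energy entries of this vector would be kept.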
- Step 4 normalize the feature data extracted in step 3;
- The extracted features are formed into a feature vector, and the vector is normalized to the interval [0, 1] through a formula for training the classifier.
- The present invention uses min-max normalization to process the data so that it is limited to the range [0, 1].
- The formula is as follows: x' = (x − S_min) / (S_max − S_min).
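A column-wise min-max normalization over a feature matrix can be sketched as below; the small epsilon guarding against constant columns is an addition for numerical safety, not part of the patent.

```python
import numpy as np

def minmax_normalize(X):
    """Rescale each feature column of X (n_samples, n_features) to [0, 1]
    via x' = (x - x_min) / (x_max - x_min)."""
    xmin, xmax = X.min(axis=0), X.max(axis=0)
    return (X - xmin) / (xmax - xmin + 1e-12)
```

In practice the per-column minima and maxima would be computed on the training set and reused for test samples, so that train and test features share one scale.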
- Step 5 Perform feature dimensionality reduction on the features processed in step 4;
- The dimensionality of the composed feature vector is high, so it is necessary to reduce the dimensionality of the features.
- The present invention uses principal component analysis (PCA) to reduce the dimensionality of the feature vector obtained above.
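PCA can be realized by a singular value decomposition of the centered feature matrix; this numpy sketch stands in for whichever PCA implementation the authors used, and the number of retained components is an assumption.

```python
import numpy as np

def pca_reduce(X, n_components=6):
    """Project feature vectors X (n_samples, n_features) onto the top
    principal components, found via SVD of the centered data."""
    Xc = X - X.mean(axis=0)                     # center each feature column
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T             # scores on top components
```

The rows of `Vt` are ordered by decreasing explained variance, so taking the first `n_components` keeps the directions that carry most of the signal.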
- Step 6 Identify the features processed in step 5.
- A back-propagation (BP) neural network is used to train and classify the samples.
- At the bottom layer, the features (mean, variance, energy) are extracted from the ankle and thigh node data.
- Only one BP neural network is trained to classify the four lower-body states, so that different concurrent actions can be divided into four groups, effectively reducing the complexity of the decision boundary.
- At the top layer of the system, the features (12 features) of the wrist, arm, and waist node data are extracted and reduced in dimensionality, and a top-level neural network is designed for each lower-limb state to identify upper-body movements and infer the final fitness movement.
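As a sketch of the BP classifiers used at both levels of the hierarchy, here is a minimal single-hidden-layer back-propagation network in numpy. The network size, learning rate, and training loop are illustrative assumptions, not the patent's settings: the bottom network would classify the four lower-body states from the bottom-layer features, and one such network per state would classify upper-body movements from the reduced top-layer features.

```python
import numpy as np

class TinyBP:
    """Minimal single-hidden-layer back-propagation network (a stand-in
    for the BP classifiers at both levels of the hierarchy)."""
    def __init__(self, n_in, n_hidden, n_out, lr=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    @staticmethod
    def _sig(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(self, X):
        self.h = self._sig(X @ self.W1 + self.b1)   # hidden activations
        return self._sig(self.h @ self.W2 + self.b2)

    def fit(self, X, Y, epochs=2000):
        """Batch gradient descent on MSE loss; Y is one-hot (n, n_out)."""
        for _ in range(epochs):
            out = self.forward(X)
            d2 = (out - Y) * out * (1 - out)            # output-layer delta
            d1 = (d2 @ self.W2.T) * self.h * (1 - self.h)
            self.W2 -= self.lr * self.h.T @ d2 / len(X)
            self.b2 -= self.lr * d2.mean(axis=0)
            self.W1 -= self.lr * X.T @ d1 / len(X)
            self.b1 -= self.lr * d1.mean(axis=0)

    def predict(self, X):
        return self.forward(X).argmax(axis=1)
```

Hierarchical prediction would then route each sample: the bottom network's predicted lower-body state selects which state-specific top network classifies the upper-body movement, e.g. `top_nets[bottom_net.predict(x_bottom)[0]]` (where `top_nets` is a hypothetical dict of four trained networks).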
- The fitness exercise recognition method based on the wearable sensor of the present invention is characterized by using the accelerometer and gyroscope integrated in the wearable sensor to separately collect the acceleration and angular velocity signals of the human body during fitness exercise.
- Through window segmentation and feature extraction of the acceleration and angular velocity signals, followed by dimensionality reduction and classification modeling of the extracted features, the recognition of fitness exercises is realized.
- The main contributions include: an effective fixed-length sliding-window segmentation method is proposed to divide the sensor data stream.
- The window length is 2 seconds, with a 50% signal overlap between adjacent windows, which effectively avoids the loss of information;
- An effective neural-network-based hierarchical recognition method realizes the recognition of concurrent upper- and lower-body actions during fitness;
- An effective feature extraction method is proposed that selects different time-domain and frequency-domain features of the exercise cycle for each layer, better reflecting the characteristics of the movement. Finally, several common classification techniques, including least squares, naive Bayes, and the k-nearest-neighbor algorithm, are tested and compared to determine the most effective method for fitness exercise recognition.
Claims (8)
- 1. A fitness exercise recognition method based on wearable sensors, characterized in that it comprises the following steps: step 1, obtaining the inertial sensor signal during the fitness exercise and preprocessing it; step 2, performing window segmentation on the signal preprocessed in step 1; step 3, performing feature extraction on the signal segmented in step 2; step 4, normalizing the feature data extracted in step 3; step 5, performing feature dimensionality reduction on the features processed in step 4; step 6, recognizing the features processed in step 5.
- 2. The fitness exercise recognition method based on wearable sensors according to claim 1, characterized in that: in step 1, the fitness exercise signals are acquired by using the accelerometer and gyroscope built into the wearable device to obtain the acceleration signals and angular velocity signals of the fitness exercise; the signal preprocessing consists of linear interpolation and denoising of the collected acceleration and angular velocity signals.
- 3. The fitness exercise recognition method based on wearable sensors according to claim 2, characterized in that: the specific process of step 2 is that the window segmentation methods include sliding-window-based segmentation, event-based window segmentation, and action-based window segmentation.
- 4. The fitness exercise recognition method based on wearable sensors according to claim 2, characterized in that: in step 2, each window has a 50% overlap of information.
- 5. The fitness exercise recognition method based on wearable sensors according to claim 1, characterized in that: in step 5, principal component analysis is used for dimensionality reduction.
- 6. The fitness exercise recognition method based on wearable sensors according to claim 1, characterized in that: in step 6, a hierarchical method based on neural networks is used to recognize the behavior of people exercising in a gym.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN201911257652.2 | 2019-12-10 | |
CN201911257652.2A (CN111089604B) | 2019-12-10 | 2019-12-10 | Fitness exercise recognition method based on wearable sensors
Publications (1)
Publication Number | Publication Date
---|---
WO2021115064A1 | 2021-06-17
Family
ID=70394962
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
PCT/CN2020/129525 (WO2021115064A1) | Fitness exercise recognition method based on wearable sensors | 2019-12-10 | 2020-11-17
Country Status (2)
Country | Link
---|---
CN | CN111089604B
WO | WO2021115064A1
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN111089604B | 2019-12-10 | 2021-09-07 | Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences | Fitness exercise recognition method based on wearable sensors
CN112633467A | 2020-11-25 | 2021-04-09 | Chaoyue Technology Co., Ltd. | Human behavior recognition method based on an LSTM model improved with cat-eye connections
CN113591552A | 2021-06-18 | 2021-11-02 | Xinyi Health Technology Co., Ltd. | Identity recognition method and system based on gait acceleration data
CN117084671B | 2023-10-19 | 2024-04-02 | Xuanwu Hospital, Capital Medical University | Motion assessment system based on gyroscope signals
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010008900A1 * | 2008-06-24 | 2010-01-21 | DP Technologies, Inc. | Program setting adjustments based on activity identification
CN109933202A * | 2019-03-20 | 2019-06-25 | Shenzhen University | Intelligent input method and system based on bone conduction
WO2019122168A1 * | 2017-12-21 | 2019-06-27 | Yoti Holding Limited | Biometric user authentication
CN110245718A * | 2019-06-21 | 2019-09-17 | Nanjing University of Information Science and Technology | Human behavior recognition method based on joint time-domain and frequency-domain features
CN110334573A * | 2019-04-09 | 2019-10-15 | Beihang University | Human motion state discrimination method based on densely connected convolutional neural networks
CN110532898A * | 2019-08-09 | 2019-12-03 | Beijing University of Technology | Human activity recognition method based on smartphone multi-sensor fusion
CN111089604A * | 2019-12-10 | 2020-05-01 | Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences | Fitness exercise recognition method based on wearable sensors
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102387378B1 | 2014-10-07 | 2022-04-15 | Samsung Electronics Co., Ltd. | Apparatus and method for recognizing gait motion
US10477354B2 | 2015-02-20 | 2019-11-12 | Mc10, Inc. | Automated detection and configuration of wearable devices based on on-body status, location, and/or orientation
CN108549856B | 2018-04-02 | 2021-04-30 | University of Shanghai for Science and Technology | Human action and road condition recognition method
CN108764282A | 2018-04-19 | 2018-11-06 | Institute of Computing Technology, Chinese Academy of Sciences | Class-incremental behavior recognition method and system
CN109086667A | 2018-07-02 | 2018-12-25 | Nanjing University of Posts and Telecommunications | Similar activity recognition method based on intelligent terminals
- 2019-12-10: CN application CN201911257652.2A filed; granted as CN111089604B (status: Active)
- 2020-11-17: PCT application PCT/CN2020/129525 filed as WO2021115064A1 (status: Application Filing)
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN117574133A | 2024-01-11 | 2024-02-20 | Hunan University of Technology and Business | Unsafe production behavior recognition method and related equipment
CN117574133B | 2024-01-11 | 2024-04-02 | Hunan University of Technology and Business | Unsafe production behavior recognition method and related equipment
Also Published As
Publication number | Publication date
---|---
CN111089604A | 2020-05-01
CN111089604B | 2021-09-07
Legal Events

- 121: The EPO has been informed by WIPO that EP was designated in this application. Ref document number: 20899467; Country of ref document: EP; Kind code of ref document: A1.
- NENP: Non-entry into the national phase. Ref country code: DE.
- 122: PCT application non-entry in European phase. Ref document number: 20899467; Country of ref document: EP; Kind code of ref document: A1.
- 32PN: Public notification in the EP bulletin as the address of the addressee cannot be established. Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 15/02/2023).