WO2022100187A1 - Mobile terminal-based method for identifying and monitoring emotions of user - Google Patents

Mobile terminal-based method for identifying and monitoring emotions of user

Info

Publication number
WO2022100187A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
user
heart rate
emotion
mobile terminal
Prior art date
2020-11-11
Application number
PCT/CN2021/113012
Other languages
French (fr)
Chinese (zh)
Inventor
李志刚
李斯羽
问静波
齐振翮
张娜娜
Original Assignee
西北工业大学 (Northwestern Polytechnical University)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
2021-08-17
Publication date
2022-05-19
Application filed by 西北工业大学 (Northwestern Polytechnical University)
Publication of WO2022100187A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165: Evaluating the state of mind, e.g. depression, anxiety

Definitions

  • the present invention relates to the field of data processing, in particular to a method for identifying and monitoring a user's emotion based on a mobile terminal.
  • user emotion recognition is currently divided into two main categories: user emotion recognition based on traditional methods (face-based, text-based, voice-based, and physiological-feature-based user emotion recognition) and user emotion recognition based on a novel method (posture- and behavior-based user emotion recognition).
  • traditional emotion recognition methods are well suited to capturing a user's instantaneous emotion in research settings, but in real life a user's emotion may change from moment to moment, and rather than an analysis of instantaneous emotion, users may prefer the average emotion over a period of time.
  • User emotion recognition based on traditional methods only describes user emotion from a single perspective, which reduces the accuracy of emotion recognition.
  • user emotion recognition based on traditional methods sometimes requires the user to wear complex equipment for data collection; for example, emotion analysis based on the user's brain waves requires the user to wear a complex EEG collection device to gather physiological features.
  • current traditional-method emotion recognition trains its networks on open-source data sets, whose data may suffer from low naturalness and poor soundness, directly affecting the accuracy of emotion recognition.
  • the present invention provides a method for identifying and monitoring users' emotions based on a mobile terminal.
  • the present invention collects the diverse daily data generated by mobile terminal devices such as smartphones and smart bracelets, extracts feature values, and builds a model to describe the user's emotions as accurately as possible, from multiple perspectives, without affecting the user's daily life and work, while providing the analysis results to other programs that need them.
  • the technical scheme of the present invention is: a method for identifying and monitoring the emotion of a user based on a mobile terminal, the method comprising the following steps:
  • Step 1: collect user data to generate a data set; the data include heart rate data and emotion data; the data set includes a heart rate data set and an emotion data set;
  • Step 2: preliminary data processing, which includes supplementing missing feature values for some data in the data set, integrating the data, and extracting the corresponding feature values;
  • Step 3 Perform emotion recognition and select a model for recognition.
  • the heart rate data are divided into daytime heart rate data and nighttime heart rate data; the daytime heart rate data are obtained by a heart rate sensor, and the nighttime heart rate data are obtained by a triaxial acceleration sensor and a heart rate sensor.
  • in the method for identifying and monitoring a user's emotion based on a mobile terminal, the emotion data include a positive class, a general class, and a negative class.
  • missing feature values are supplemented by an automatic strategy that uses the average of other known values.
  • in the method for identifying and monitoring the user's emotion based on the mobile terminal, the corresponding feature values are extracted by classifying the data collected by the APP, observing it through graphical visualization, and then integrating part of the data.
  • in the method for identifying and monitoring a user's emotion based on a mobile terminal, the emotion features involve an APP index, a physiological index, a sleep index, and a communication index.
  • the model includes an SVM model and a neural network model.
  • the invention collects the user's daily data through a smartphone and a smart bracelet and identifies feature data for network training, thereby obtaining the mapping between the user's externally observable data and internal emotions and recognizing the user's internal emotion. Collecting users' daily data with mobile devices can solve the problems of low data naturalness and poor soundness in traditional emotion recognition.
  • first, the high penetration of smartphones in China ensures the quantity of data that can be collected; second, different people use their phones differently, which ensures the specificity of the data and the soundness of the experimental results; finally, because mobile phones are portable, this new type of smartphone-based emotion recognition, unlike traditional emotion recognition, can conveniently obtain the data a user generates every day without requiring the user to wear an additional data collection device.
  • the present invention is a method for identifying and monitoring the user's emotion based on the mobile terminal, which is different from the traditional single prediction based on expression, behavior and language.
  • various data generated by the user, such as communication, sleep, and APP usage, are collected to build a data set that covers the user's life, enabling accurate perception of the user's emotions.
  • the invention simplifies the collection process, improves user-friendliness, and describes the user's emotional features from a multi-dimensional perspective, improving the accuracy of the recognition results; in the data processing stage it extracts key features for modeling, reducing the complexity of the subsequent machine learning algorithm and shortening its running time.
  • Figure 1 is a flowchart of mobile-terminal-based user emotion recognition and monitoring.
  • a method for identifying and monitoring a user's emotion based on a mobile terminal includes the following steps.
  • Step 1: collect user data to generate a data set; the data include heart rate data and emotion data; the data set includes a heart rate data set and an emotion data set.
  • the heart rate data is divided into daytime heart rate data and nighttime heart rate data, the daytime heart rate data is obtained by a heart rate sensor; the nighttime heart rate data is obtained by a triaxial acceleration sensor and a heart rate sensor.
  • the emotion data includes positive class, general class, and negative class.
  • a mobile APP is designed to collect the data users generate every day. Through UsageStats, the APP records the usage time of α kinds of applications, yielding a set of per-app durations.
  • the pedometer function is implemented with Android's STEP_DETECTOR and STEP_COUNTER sensors, recording the user's daily step count s and movement distance d; the user's total communication duration ct and number of communications cf are collected through PhoneStateListener.
  • Step 2: preliminary data processing, which includes supplementing missing feature values for some data in the data set, integrating the data, and extracting the corresponding feature values. Missing values are supplemented by an automatic strategy that uses the average of other known values; the corresponding feature values are extracted by classifying and graphically visualizing the data collected by the APP and then integrating part of the data. The emotion features involve APP indicators, physiological indicators, sleep indicators, and communication indicators.
  • the specific method is as follows: the collected data are first managed in three parts: first, where some data in the data set have missing feature values, an automatic strategy supplements them with the average of other known values; second, data integration is performed after the data collected by the APP are classified and graphically visualized; third, the corresponding feature values are extracted from the integrated data.
  • the APP indicators include the type of APP, the usage time of the APP, and the number of times the APP is used. Considering the types of APPs on the market and their impact on users' emotions, APPs are currently divided into only three categories: work, entertainment, and social. From the APP types, the user's total daily work time, total daily entertainment time, and total daily social time can be obtained.
  • the APP index is obtained by weighting the total durations of the different APP categories.
  • the three-axis acceleration data set is processed to obtain a parameter value for each nighttime interval; from the data set corresponding to each time interval at night, it is first determined whether the user is asleep or awake.
  • the decision is led by the three-axis acceleration sensor, with heart rate assisting: when the computed acceleration weight is less than the heart rate value, the user is judged to be asleep, and summing the sleeping intervals gives the user's total sleep time.
  • the sleep state can be further classified, with heart rate as the leading factor and the acceleration sensor assisting, dividing sleep time into deep sleep time and light sleep time.
  • in the communication data, ct denotes the total communication duration and cf the number of communications; the average communication duration is obtained as ct/cf.
  • h_mε denotes the maximum daytime heart rate of the i-th user in the ε-th time interval on the j-th day.
  • Step 3 Perform emotion recognition and select a model for recognition.
  • the models include SVM models and neural network models.
  • SVM is a very common and effective classifier that constructs a maximum-margin hyperplane by mapping vectors into a higher-dimensional space.
  • the present invention uses the default Graph of TensorFlow to create ops, uses placeholder() as the input container of data, and instructs TensorFlow to run the corresponding nodes of the graph through sess.run(). Through continuous training with the training data collected, a usable classification model is obtained.
  • the main steps of the experiment are as follows: parse the data set and cross-validate, preprocess the data, draw the data flow graph, define the Gaussian kernel function, create the dual loss function, create the classification kernel function, create the classification function, set up the optimizer and initialize, and train and evaluate the model.
  • the neural network model, also known as an artificial neural network, uses mathematical methods to simulate the working principles of neurons in the human brain, and consists of an input layer, hidden layers, and an output layer.
  • the input layer takes the feature data, the number of hidden layers and the number of neurons in each layer are defined, and the output layer outputs the results.
  • the neural network model constructed in this project is trained using the data set generated by the user's daily life collected on the mobile terminal. Through continuous optimization, an available emotion recognition model is trained to recognize and monitor the user's daily emotion.
  • two high-level TensorFlow APIs, Estimator and Dataset, are mainly used here to build the neural network model. The collected data are divided into two parts: one serves as training samples and the other is used for evaluation. TensorBoard is used to visualize the training process.
  • the experiment is divided into the following steps: import and parse the dataset, describe the data, define the model type, define the training input function, train the model, define the test model, and evaluate the model.

Abstract

A mobile terminal-based method for identifying and monitoring the emotions of a user. Unlike traditional approaches that make a single prediction purely on the basis of expression, behavior, or language, the method collects the various data generated by a user, such as communication, sleep, and APP usage, by means of the mobile phone and bracelet that the user carries daily, builds a data set that covers the user's life to a high degree, and then completes an accurate perception of the user's emotions. During data collection, the collection process is simplified and user-friendliness is improved, and the emotional features of the user are described from a multi-dimensional perspective to improve the accuracy of the recognition results; meanwhile, in the data processing stage, key features are extracted for modeling, reducing the complexity of the subsequent machine learning algorithm and thus shortening its running time.

Description

A method for identifying and monitoring users' emotions based on a mobile terminal

Technical Field
The present invention relates to the field of data processing, and in particular to a method for identifying and monitoring a user's emotion based on a mobile terminal.
Background Art
At present, user emotion recognition is divided into two main categories: user emotion recognition based on traditional methods (face-based, text-based, voice-based, and physiological-feature-based user emotion recognition) and user emotion recognition based on a novel method (posture- and behavior-based user emotion recognition).
Traditional emotion recognition methods are well suited to capturing a user's instantaneous emotion in research settings, but in real life a user's emotion may change from moment to moment, and rather than an analysis of instantaneous emotion, users may prefer the average emotion over a period of time. User emotion recognition based on traditional methods describes user emotion from only a single perspective, which reduces the accuracy of emotion recognition. Such methods also sometimes require the user to wear complex equipment for data collection; for example, emotion analysis based on the user's brain waves requires the user to wear a complex EEG collection device to gather physiological features. In addition, current traditional-method emotion recognition trains its networks on open-source data sets, whose data may suffer from low naturalness and poor soundness, which directly affects the accuracy of emotion recognition.
Summary of the Invention
In view of the above defects, the present invention provides a method for identifying and monitoring users' emotions based on a mobile terminal. The invention collects the diverse daily data generated by mobile terminal devices such as smartphones and smart bracelets, extracts feature values, and builds a model to describe the user's emotions as accurately as possible, from multiple perspectives, without affecting the user's daily life and work, while providing the analysis results to other programs that need them.
The technical scheme of the present invention is a method for identifying and monitoring the emotion of a user based on a mobile terminal, the method comprising the following steps:
Step 1: collect user data to generate a data set; the data include heart rate data and emotion data; the data set includes a heart rate data set and an emotion data set.
Step 2: preliminary data processing, which includes supplementing missing feature values for some data in the data set, integrating the data, and extracting the corresponding feature values.
Step 3: perform emotion recognition, selecting a model for recognition.
Further, the heart rate data are divided into daytime heart rate data and nighttime heart rate data; the daytime heart rate data are obtained by a heart rate sensor, and the nighttime heart rate data are obtained by a triaxial acceleration sensor and a heart rate sensor.
Further, the emotion data include a positive class, a general class, and a negative class.
Further, missing feature values are supplemented by an automatic strategy that uses the average of other known values.
Further, the corresponding feature values are extracted by classifying the data collected by the APP, observing it through graphical visualization, and then integrating part of the data.
Further, the emotion features involve an APP index, a physiological index, a sleep index, and a communication index. The emotion feature is described as [formula image PCTCN2021113012-appb-000001], where [PCTCN2021113012-appb-000002] is the APP index, [PCTCN2021113012-appb-000003] the physiological index, [PCTCN2021113012-appb-000004] the sleep index, and [PCTCN2021113012-appb-000005] the communication index.
Further, the model includes an SVM model and a neural network model.
The invention collects the user's daily data through a smartphone and a smart bracelet and identifies feature data for network training, thereby obtaining the mapping between the user's externally observable data and internal emotions and recognizing the user's internal emotion. Collecting users' daily data with mobile devices can solve the problems of low data naturalness and poor soundness in traditional emotion recognition. First, the high penetration of smartphones in China ensures the quantity of data that can be collected; second, different people use their phones differently, which ensures the specificity of the data and the soundness of the experimental results; finally, because mobile phones are portable, this new type of smartphone-based emotion recognition, unlike traditional emotion recognition, can conveniently obtain the data a user generates every day without requiring the user to wear an additional data collection device.
The present invention is a method for identifying and monitoring the user's emotion based on a mobile terminal. Unlike the traditional single prediction based purely on expression, behavior, and language, it collects the various data generated by the user, such as communication, sleep, and APP usage, through the phone and bracelet the user carries daily, builds a data set with high coverage of the user's life, and thereby achieves accurate perception of the user's emotions. The invention simplifies the collection process and improves user-friendliness, describes the user's emotional features from a multi-dimensional perspective to improve the accuracy of the recognition results, and, in the data processing stage, extracts key features for modeling, reducing the complexity of the subsequent machine learning algorithm and shortening its running time.
Description of Drawings
Figure 1 is a flowchart of mobile-terminal-based user emotion recognition and monitoring.
Detailed Description
The technical solution of the present invention is further described below with reference to the accompanying drawing. As shown in Figure 1, a method for identifying and monitoring a user's emotion based on a mobile terminal includes the following steps.
Step 1: collect user data to generate a data set; the data include heart rate data and emotion data; the data set includes a heart rate data set and an emotion data set.
To obtain more accurate results, the heart rate data are divided into daytime heart rate data and nighttime heart rate data; the daytime heart rate data are obtained by a heart rate sensor, and the nighttime heart rate data are obtained by a triaxial acceleration sensor and a heart rate sensor. The emotion data include a positive class, a general class, and a negative class.
Specifically, a mobile phone APP is first designed to collect the data the user generates every day. Through UsageStats, the APP records the usage time of α kinds of applications, yielding the set [formula image PCTCN2021113012-appb-000006]. The pedometer function is implemented with Android's STEP_DETECTOR and STEP_COUNTER sensors, recording the user's daily step count s and movement distance d, yielding the set [formula image PCTCN2021113012-appb-000007]. The user's total communication duration ct and number of communications cf are collected through PhoneStateListener, yielding the data [formula image PCTCN2021113012-appb-000008].
Next, a bracelet APP is designed. The heart rate sensor on the bracelet records the daytime heart rate data [formula image PCTCN2021113012-appb-000009]. With time nodes set, the triaxial acceleration sensor and heart rate sensor on the bracelet record the nighttime heart rate data [formula images PCTCN2021113012-appb-000010 and -000011] and the triaxial acceleration data [formula images PCTCN2021113012-appb-000012 and -000013]. The user's daily emotion data are also collected, with emotions divided into three classes (1: positive, 0: general, -1: negative), giving the user's daily emotion data set [formula image PCTCN2021113012-appb-000014]. Matching the data generated by the mobile terminal each day to the user's daily emotional changes yields the data set [formula image PCTCN2021113012-appb-000015].
Step 2: preliminary data processing, which includes supplementing missing feature values for some data in the data set, integrating the data, and extracting the corresponding feature values. Missing feature values are supplemented by an automatic strategy that uses the average of other known values; the corresponding feature values are extracted by classifying the data collected by the APP, visualizing it graphically, and then integrating part of the data. The emotion features involve APP indicators, physiological indicators, sleep indicators, and communication indicators.
The specific method is as follows. The collected data are first managed in three parts: first, where some data in the data set have missing feature values, an automatic strategy supplements them with the average of other known values; second, data integration is performed after the data collected by the APP are classified and graphically visualized; third, the corresponding feature values are extracted from the integrated data.
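A minimal sketch of the first part, the mean-based supplementation, assuming the daily records are held in a pandas DataFrame; the column names are placeholders.

```python
import pandas as pd

def fill_missing_with_mean(df: pd.DataFrame) -> pd.DataFrame:
    """Supplement missing feature values with the mean of the other known values."""
    return df.fillna(df.mean(numeric_only=True))

# A missing 'steps' reading is replaced by the average of the known readings.
df = pd.DataFrame({"steps": [5200, None, 7400], "call_count": [3, 5, None]})
print(fill_missing_with_mean(df))
```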
APP indicators

The APP indicators include the type of APP, the usage time of the APP, and the number of times the APP is used. Considering the types of APPs on the market and their impact on users' emotions, APPs are currently divided into only three categories: work, entertainment, and social. From the APP types, the user's total daily work time [formula image PCTCN2021113012-appb-000016], total daily entertainment time [formula image PCTCN2021113012-appb-000017], and total daily social time [formula image PCTCN2021113012-appb-000018] can be obtained.

Since the usage time of different APP categories influences user emotion to different degrees, the APP index is obtained by weighting the total durations of the different APP categories: [formula image PCTCN2021113012-appb-000019].
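Since the weighting formula itself appears only as an image, the sketch below shows the general shape of such a weighted APP index; the weights are illustrative placeholders, not the patent's values.

```python
def app_index(work_min: float, fun_min: float, social_min: float,
              weights: tuple = (0.5, 0.25, 0.25)) -> float:
    """Weighted sum of the daily work / entertainment / social APP durations."""
    w_work, w_fun, w_social = weights
    return w_work * work_min + w_fun * fun_min + w_social * social_min

print(app_index(300.0, 90.0, 45.0))  # 0.5*300 + 0.25*90 + 0.25*45 = 183.75
```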
Physiological indicators

First, the user's daily step count s and movement distance d are obtained through the mobile phone APP, giving the set [formula image PCTCN2021113012-appb-000020]. Next, through the heart rate sensor on the bracelet, with time nodes set, heart rate data and triaxial acceleration are collected once every hour, giving the set of daytime heart-rate maxima [formula image PCTCN2021113012-appb-000021]. Taking the variance of the daytime heart rate data gives the daytime heart rate abnormality value [formula image PCTCN2021113012-appb-000022], from which the physiological indicator is obtained: [formula image PCTCN2021113012-appb-000023].
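A minimal numpy sketch of the variance step, assuming the hourly daytime heart-rate maxima are available as a plain list.

```python
import numpy as np

def daytime_hr_abnormality(hourly_hr_max: list) -> float:
    """Variance of the hourly daytime heart-rate maxima, used as the abnormality value."""
    return float(np.var(hourly_hr_max))

hr = [72, 75, 110, 74, 73, 71]  # a single spike raises the variance sharply
print(daytime_hr_abnormality(hr))
```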
Sleep indicators

With time nodes set, heart rate data and triaxial acceleration are collected once every 30 minutes during the night, giving the set of nighttime heart-rate maxima [formula images PCTCN2021113012-appb-000024 and -000025]. The bracelet's triaxial acceleration data for the same nighttime intervals are collected at the same time, giving the data set [formula images PCTCN2021113012-appb-000026 and -000027]. The triaxial acceleration data set is processed as in [formula image PCTCN2021113012-appb-000029] to obtain the parameter value [formula image PCTCN2021113012-appb-000028], giving the data set corresponding to each nighttime interval [formula image PCTCN2021113012-appb-000030].

It is first determined whether the user is asleep or awake, with the triaxial acceleration sensor taking the lead and the heart rate assisting: when the computed acceleration weight is less than the heart rate value [formula images PCTCN2021113012-appb-000031 and -000032], the user is in a sleep state, and summing the corresponding time gives the user's total sleep time [formula image PCTCN2021113012-appb-000033]. The sleep state can be further classified, with heart rate as the leading factor and the acceleration sensor assisting, dividing sleep time into deep sleep time and light sleep time: [formula image PCTCN2021113012-appb-000034]. From this the sleep indicator is obtained: [formula image PCTCN2021113012-appb-000035].
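The actual decision formulas are rendered only as images, so the sketch below implements the stated rule shape with illustrative thresholds: the user is judged asleep when the acceleration weight falls below the heart-rate value, and sleeping intervals are split into deep and light by a heart-rate cutoff.

```python
def classify_interval(accel_weight: float, hr: float, deep_hr_cutoff: float = 55.0) -> str:
    """Accelerometer-led sleep/awake decision with a heart-rate-led depth split."""
    if accel_weight >= hr:                    # too much movement: awake
        return "awake"
    return "deep" if hr < deep_hr_cutoff else "light"

# 30-minute nighttime intervals as (acceleration weight, heart rate) pairs.
intervals = [(80.0, 70.0), (10.0, 62.0), (5.0, 52.0), (4.0, 50.0)]
states = [classify_interval(a, h) for a, h in intervals]
total_sleep_h = 0.5 * sum(s != "awake" for s in states)
deep_sleep_h = 0.5 * sum(s == "deep" for s in states)
print(states, total_sleep_h, deep_sleep_h)  # ['awake', 'light', 'deep', 'deep'] 1.5 1.0
```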
Communication indicators

In the communication data, ct denotes the total communication duration and cf the number of communications; the average communication duration is obtained as ct/cf [formula images PCTCN2021113012-appb-000036 to -000039].
Combining the above indicators, the emotion feature is described as [formula image PCTCN2021113012-appb-000040].

The symbols above (rendered as images PCTCN2021113012-appb-000041 to -000059 in the original) have the following meanings:
- the total duration for which the i-th user used application α on the j-th day;
- the total step count of the i-th user on the j-th day;
- the movement distance of the i-th user on the j-th day;
- the total communication duration of the i-th user on the j-th day;
- the total number of communications of the i-th user on the j-th day;
- the heart rate data of the i-th user in the ε-th time interval on the j-th day;
- the daytime heart rate data set of the i-th user on the j-th day;
- the heart rate data set of the i-th user on the j-th day;
- the nighttime triaxial acceleration of the i-th user in the ε-th time interval on the j-th day;
- the nighttime triaxial acceleration data set of the i-th user on the j-th day;
- the emotion data of the i-th user on the j-th day;
- the total time the i-th user spent on work APPs on the j-th day;
- the total time the i-th user spent on entertainment APPs on the j-th day;
- the total time the i-th user spent on social APPs on the j-th day;
- h_mε: the maximum daytime heart rate of the i-th user in the ε-th time interval on the j-th day;
- the set of daytime heart-rate maxima of the i-th user on the j-th day;
- the data set of step count s and movement distance d of the i-th user on the j-th day;
- the daytime heart rate abnormality value of the i-th user on the j-th day;
- the physiological indicator value of the i-th user on the j-th day;
- the maximum nighttime heart rate of the i-th user on the j-th day.
Step 3: perform emotion recognition, selecting a model for recognition. The models include an SVM model and a neural network model.
Model selection

To select the better model, several classification methods capable of identifying and monitoring user emotion are compared.

SVM-based model

SVM is a very common and effective classifier that constructs a maximum-margin hyperplane by mapping vectors into a higher-dimensional space.
The present invention uses TensorFlow's default Graph to create ops, uses placeholder() as the input container for data, and instructs TensorFlow to run the corresponding nodes of the graph through sess.run(). Continuous training with the collected training data yields a usable classification model.
The main steps of the experiment are as follows: parse the data set and cross-validate, preprocess the data, draw the data flow graph, define the Gaussian kernel function, create the dual loss function, create the classification kernel function, create the classification function, set up the optimizer and initialize, and train and evaluate the model.
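As a compact stand-in for the TensorFlow 1.x graph-and-session implementation described above (placeholder(), sess.run(), a hand-built Gaussian kernel), the sketch below runs the same kind of Gaussian-kernel SVM with cross-validation using scikit-learn; the data are synthetic stand-ins, with the four-column input mirroring the four emotion indices.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))        # stand-in features: APP, physiological, sleep, communication
y = rng.integers(-1, 2, size=300)    # stand-in labels: -1 negative, 0 general, 1 positive

# RBF kernel == Gaussian kernel; 5-fold cross-validation matches the experiment steps.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
print(cross_val_score(model, X, y, cv=5).mean())
```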
Neural-network-based model

The neural network model, also known as an artificial neural network, uses mathematical methods to simulate the working principles of neurons in the human brain, and consists of an input layer, hidden layers, and an output layer. The input layer takes the feature data; the number of hidden layers and the number of neurons in each layer are defined; and the output layer outputs the results.

The neural network model constructed in this project is trained on the data set of the user's daily life collected by the mobile terminal; through continuous optimization, a usable emotion recognition model is trained to identify and monitor the user's daily emotion. Two high-level TensorFlow APIs, Estimator and Dataset, are mainly used here to build the neural network model. The collected data are divided into two parts, one used as training samples and the other for evaluation, and TensorBoard is used to visualize the training process.

The experiment is divided into the following steps: import and parse the data set, describe the data, define the model type, define the training input function, train the model, define the test model, and evaluate the model.
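The Estimator API named above has since been deprecated in TensorFlow 2.x; as a hedged modern equivalent, the sketch below builds the same kind of small classifier with tf.keras and tf.data.Dataset, splitting synthetic stand-in data into training and evaluation parts as described.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4)).astype("float32")  # stand-in: four emotion indices per user-day
y = rng.integers(0, 3, size=300)                 # stand-in classes: negative / general / positive

train_ds = tf.data.Dataset.from_tensor_slices((X[:240], y[:240])).shuffle(240).batch(32)
eval_ds = tf.data.Dataset.from_tensor_slices((X[240:], y[240:])).batch(32)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation="relu"),    # hidden layers
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),  # output layer: three emotion classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=20, verbose=0)
print(model.evaluate(eval_ds, verbose=0))            # [loss, accuracy] on the held-out part
```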

Claims (7)

  1. A method for identifying and monitoring users' emotions based on a mobile terminal, characterized in that the method includes the following steps:
    Step 1: collect user data to generate a data set; the data include heart rate data and emotion data; the data set includes a heart rate data set and an emotion data set;
    Step 2: preliminary data processing, which includes supplementing missing feature values for some data in the data set, integrating the data, and extracting the corresponding feature values;
    Step 3: perform emotion recognition, selecting a model for recognition.
  2. The method for identifying and monitoring a user's emotions based on a mobile terminal according to claim 1, characterized in that the heart rate data are divided into daytime heart rate data and nighttime heart rate data; the daytime heart rate data are obtained by a heart rate sensor, and the nighttime heart rate data are obtained by a triaxial acceleration sensor and a heart rate sensor.
  3. The method for identifying and monitoring a user's emotions based on a mobile terminal according to claim 1, characterized in that the emotion data include a positive class, a general class, and a negative class.
  4. The method for identifying and monitoring a user's emotions based on a mobile terminal according to claim 1, characterized in that missing feature values are supplemented by an automatic strategy that uses the average of other known values.
  5. The method for identifying and monitoring a user's emotions based on a mobile terminal according to claim 1, characterized in that the corresponding feature values are extracted by classifying the data collected by the APP, observing it through graphical visualization, and then integrating part of the data.
  6. The method for identifying and monitoring a user's emotions based on a mobile terminal according to claim 5, characterized in that the emotion features involve an APP index, a physiological index, a sleep index, and a communication index; the emotion feature is described as [formula image PCTCN2021113012-appb-100001], whose components [PCTCN2021113012-appb-100002 to -100005] are the APP index, the physiological index, the sleep index, and the communication index.
  7. The method for identifying and monitoring a user's emotions based on a mobile terminal according to claim 1, characterized in that the model includes an SVM model and a neural network model.
PCT/CN2021/113012 2020-11-11 2021-08-17 Mobile terminal-based method for identifying and monitoring emotions of user WO2022100187A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011256341.7A CN112370058A (en) 2020-11-11 2020-11-11 Method for identifying and monitoring emotion of user based on mobile terminal
CN202011256341.7 2020-11-11

Publications (1)

Publication Number Publication Date
WO2022100187A1 2022-05-19

Family

ID=74582794

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/113012 WO2022100187A1 (en) 2020-11-11 2021-08-17 Mobile terminal-based method for identifying and monitoring emotions of user

Country Status (2)

Country Link
CN (1) CN112370058A (en)
WO (1) WO2022100187A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112370058A (en) * 2020-11-11 2021-02-19 西北工业大学 Method for identifying and monitoring emotion of user based on mobile terminal
CN116631628A (en) * 2023-07-21 2023-08-22 北京中科心研科技有限公司 Method and device for identifying dysthymia and wearable equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180005137A1 (en) * 2016-06-30 2018-01-04 Cal-Comp Electronics & Communications Company Limited Emotion analysis method and electronic apparatus thereof
CN108216254A (en) * 2018-01-10 2018-06-29 山东大学 The road anger Emotion identification method merged based on face-image with pulse information
CN109670406A (en) * 2018-11-25 2019-04-23 华南理工大学 A kind of contactless emotion identification method of combination heart rate and facial expression object game user
CN110507335A (en) * 2019-08-23 2019-11-29 山东大学 Inmate's psychological health states appraisal procedure and system based on multi-modal information
CN110909876A (en) * 2019-11-27 2020-03-24 上海交通大学 Sign information monitoring method and system based on multiple physiological parameters and CNN
CN111259895A (en) * 2020-02-21 2020-06-09 天津工业大学 Emotion classification method and system based on facial blood flow distribution
CN111444863A (en) * 2020-03-30 2020-07-24 华南理工大学 Camera-based 5G vehicle-mounted network cloud-assisted driver emotion recognition method
CN112370058A (en) * 2020-11-11 2021-02-19 西北工业大学 Method for identifying and monitoring emotion of user based on mobile terminal

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012025622A2 (en) * 2010-08-27 2012-03-01 Smartex S.R.L. Monitoring method and system for assessment of prediction of mood trends
US9241664B2 (en) * 2012-08-16 2016-01-26 Samsung Electronics Co., Ltd. Using physical sensory input to determine human response to multimedia content displayed on a mobile device
CN105306703A (en) * 2015-09-30 2016-02-03 西安沧海网络科技有限公司 Emotion recognition wearable device based on smartphone
CN106037749A (en) * 2016-05-18 2016-10-26 武汉大学 Old people falling monitoring method based on smart mobile phone and wearable device
CN106510658B (en) * 2016-10-25 2019-08-02 广东乐源数字技术有限公司 A kind of human body emotion judgment method based on bracelet
CN106725382A (en) * 2016-12-28 2017-05-31 天津众阳科技有限公司 Sleep state judgement system and method based on action and HRV measurements

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180005137A1 (en) * 2016-06-30 2018-01-04 Cal-Comp Electronics & Communications Company Limited Emotion analysis method and electronic apparatus thereof
CN108216254A (en) * 2018-01-10 2018-06-29 山东大学 The road anger Emotion identification method merged based on face-image with pulse information
CN109670406A (en) * 2018-11-25 2019-04-23 华南理工大学 A kind of contactless emotion identification method of combination heart rate and facial expression object game user
CN110507335A (en) * 2019-08-23 2019-11-29 山东大学 Inmate's psychological health states appraisal procedure and system based on multi-modal information
CN110909876A (en) * 2019-11-27 2020-03-24 上海交通大学 Sign information monitoring method and system based on multiple physiological parameters and CNN
CN111259895A (en) * 2020-02-21 2020-06-09 天津工业大学 Emotion classification method and system based on facial blood flow distribution
CN111444863A (en) * 2020-03-30 2020-07-24 华南理工大学 Camera-based 5G vehicle-mounted network cloud-assisted driver emotion recognition method
CN112370058A (en) * 2020-11-11 2021-02-19 西北工业大学 Method for identifying and monitoring emotion of user based on mobile terminal

Also Published As

Publication number Publication date
CN112370058A (en) 2021-02-19

Similar Documents

Publication Publication Date Title
CN108806792B (en) Deep learning face diagnosis system
CN105877766B (en) A kind of state of mind detection system and method based on the fusion of more physiological signals
WO2022100187A1 (en) Mobile terminal-based method for identifying and monitoring emotions of user
CN110353673B (en) Electroencephalogram channel selection method based on standard mutual information
CN108304917A (en) A kind of P300 signal detecting methods based on LSTM networks
CN107220591A (en) Multi-modal intelligent mood sensing system
Dureha An accurate algorithm for generating a music playlist based on facial expressions
CN104484644B (en) A kind of gesture identification method and device
CN106228200A (en) A kind of action identification method not relying on action message collecting device
CN108937973A (en) A kind of robotic diagnostic human body indignation mood method and device
CN109508755B (en) Psychological assessment method based on image cognition
Bu Human motion gesture recognition algorithm in video based on convolutional neural features of training images
CN108958482B (en) Similarity action recognition device and method based on convolutional neural network
CN110472649A (en) Brain electricity sensibility classification method and system based on multiscale analysis and integrated tree-model
CN110974219A (en) Human brain idea recognition system based on invasive BCI
CN110025322A (en) Multi-modal physiological signal sensibility classification method based on filtering with integrated classifier
CN109765996A (en) Insensitive gesture detection system and method are deviated to wearing position based on FMG armband
CN108256579A (en) A kind of multi-modal sense of national identity quantization measuring method based on priori
CN111920420A (en) Patient behavior multi-modal analysis and prediction system based on statistical learning
CN107582077A (en) A kind of human body state of mind analysis method that behavior is touched based on mobile phone
CN107292296A (en) A kind of human emotion wake-up degree classifying identification method of use EEG signals
CN108717548B (en) Behavior recognition model updating method and system for dynamic increase of sensors
Cui et al. Emotion detection from natural walking
CN111753683A (en) Human body posture identification method based on multi-expert convolutional neural network
CN112668486A (en) Method, device and carrier for identifying facial expressions of pre-activated residual depth separable convolutional network

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21890716

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21890716

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 13.11.2023)