CN109543659B - Risk behavior monitoring and early warning method and system suitable for old users - Google Patents


Info

Publication number
CN109543659B
Authority
CN
China
Prior art keywords
risk
emotion
early warning
data
classifier
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811590111.7A
Other languages
Chinese (zh)
Other versions
CN109543659A (en)
Inventor
马皑
宋业臻
方秋兰
孙晓
王方兵
刘晓倩
林振林
赵一洋
舒志
陈奕帆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Xinfa Technology Co.,Ltd.
Original Assignee
Beijing Xinfa Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xinfa Technology Co Ltd filed Critical Beijing Xinfa Technology Co Ltd
Priority to CN201811590111.7A priority Critical patent/CN109543659B/en
Publication of CN109543659A publication Critical patent/CN109543659A/en
Application granted granted Critical
Publication of CN109543659B publication Critical patent/CN109543659B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174Facial expression recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/048Activation functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Computing Systems (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Alarm Systems (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention provides a risk behavior monitoring and early warning method and system suitable for elderly users. The method comprises the following steps: acquiring daily behavior data, social interaction data and facial activity monitoring data of an elderly user; inputting the daily behavior data and the social interaction data into a risk classifier for risk classification, and inputting the facial activity monitoring data into a pre-trained emotion classifier for emotion classification; and determining whether to issue an early warning according to the emotion type and risk type of the elderly user. By determining both the emotion type and the risk type of the elderly user, the early warning result becomes more accurate. In addition, the emotion of each individual can be monitored dynamically in real time, so that real-time monitoring is achieved and monitoring efficiency is improved.

Description

Risk behavior monitoring and early warning method and system suitable for old users
Technical Field
The invention relates to the technical field of early warning, in particular to a risk behavior monitoring and early warning method and system suitable for an old user.
Background
At present, early warning for special populations mainly takes the following forms. In one, a manager regularly observes users' behavior and conducts individual interviews, then judges from personal experience whether abnormal behavior exists in the population. In another, a manager designs qualitative or quantitative questions based on psychological theory, checks their reliability and validity to form a questionnaire, and infers the possibility of behavioral abnormality from how members of the population fill in the questionnaire.
However, the existing early warning method has the following disadvantages:
First, the observation and interview methods are highly subjective and inaccurate, and it is difficult to unify their measurement standards, so accurate judgment and identification are hard to achieve.
Second, questionnaires and scales take a long time to fill out, so evaluation efficiency is poor.
Third, the evaluation cycle is long, so real-time monitoring cannot be achieved.
Disclosure of Invention
In view of the above defects in the prior art, the present invention provides a risk behavior monitoring and early warning method and system suitable for an elderly user, to solve the technical problems in the related art.
In a first aspect, an embodiment of the present invention provides a method for monitoring and warning a risk behavior of an elderly user, where the method includes:
acquiring daily behavior data, social interaction data and facial activity monitoring data of an old user;
inputting the daily behavior data and the social interaction data into a risk classifier for risk classification and inputting the facial activity monitoring data into a pre-trained emotion classifier for emotion classification;
and determining whether to perform early warning or not according to the emotion type and risk type of the old user.
Optionally, the daily behavior data comprise the average daily number of meals, the average daily number of exercise sessions and the average daily number of rests; the social interaction data comprise the number of contacts with others and the number of conversations with others; the facial activity monitoring data comprise changes in the area of facial activity units, changes in facial temperature, and changes in respiration and heart rate.
Optionally, the pre-trained emotion classifier is trained in the following manner:
(1) constructing a convolutional neural network for grayscale images with an input size of 32 x 32, comprising 3 convolution and max-pooling layers, 1 fully connected layer and, connected behind the fully connected layer, a dropout layer with p = 0.5 and a Softmax layer;
(2) setting 9 different regions of interest (ROI) according to the structure of the face, to actively guide the neural network to focus on expression-related regions;
(3) extracting from the Internet 900 face pictures for each of the 4 classes happiness, sadness, anger and surprise, and 900 neutral-emotion ID-photo pictures, as training data (4,500 images in total), and obtaining 40,500 training pictures through ROI processing; the test data consist of 300 pictures of the 5 classes happiness, sadness, anger, surprise and neutral emotion downloaded from the Internet;
(4) training and testing to obtain a classifier with an accuracy of more than 98%;
(5) inputting the facial features of the elderly user and training and testing according to steps (1) to (4) to obtain the final emotion classifier.
Optionally, the 3 convolutional layers are respectively: CNN-64: [32, 32, 64, 64]; CNN-96: [48, 48, 96, 200]; CNN-128: [64, 64, 128, 300];
except for the Softmax layer, the activation function of the remaining layers is: ReLU(x) = max(0, x);
and,
the weight W is initialized, following Krizhevsky, with zero mean and a constant standard deviation; the constant standard deviation of each layer is: [0.0001, 0.001, 0.001, 0.01, 0.1].
Optionally, inputting the daily behavior data and the social interaction data into a risk classifier for risk classification comprises:
the risk classifier respectively calculates average values, standard deviations and differences between the average values and the standard deviations of daily average eating times, daily average exercise times, daily average rest times, contact times with other people and conversation times with other people;
the risk classifier determines whether there is at least one of an average number of meals per day, an average number of exercises per day, an average number of breaks per day, a number of contacts with others, and a number of conversations with others that is greater than or less than a difference between the average and standard deviation thereof;
if so, the risk classifier determines the risk classification as a possible risk.
Optionally, determining whether to perform early warning according to the emotion type and risk type of the elderly user includes:
recording the emotion types and risk types of all individuals in the old users within a preset time period;
and calling a preset early warning strategy, determining whether the emotion type and the risk type meet the preset early warning strategy, and if so, giving an alarm.
In a second aspect, an embodiment of the present invention provides a risk behavior monitoring and early warning system suitable for an elderly user, where the system includes:
the monitoring data acquisition module is used for acquiring daily behavior data, social interaction data and facial activity monitoring data of the old user;
the classification module is used for inputting the daily behavior data and the social interaction data into a risk classifier for risk classification and inputting the facial activity monitoring data into a pre-trained emotion classifier for emotion classification;
and the early warning determining module is used for determining whether to carry out early warning according to the emotion type and the risk type of the old user.
Optionally, the daily behavior data comprise the average daily number of meals, the average daily number of exercise sessions and the average daily number of rests; the social interaction data comprise the number of contacts with others and the number of conversations with others; the facial activity monitoring data comprise changes in the area of facial activity units, changes in facial temperature, and changes in respiration and heart rate.
Optionally, the pre-trained emotion classifier is trained in the following manner:
(1) constructing a convolutional neural network for grayscale images with an input size of 32 x 32, comprising 3 convolution and max-pooling layers, 1 fully connected layer and, connected behind the fully connected layer, a dropout layer with p = 0.5 and a Softmax layer;
(2) setting 9 different regions of interest (ROI) according to the structure of the face, to actively guide the neural network to focus on expression-related regions;
(3) extracting from the Internet 900 face pictures for each of the 4 classes happiness, sadness, anger and surprise, and 900 neutral-emotion ID-photo pictures, as training data (4,500 images in total), and obtaining 40,500 training pictures through ROI processing; the test data consist of 300 pictures of the 5 classes happiness, sadness, anger, surprise and neutral emotion downloaded from the Internet;
(4) training and testing to obtain a classifier with an accuracy of more than 98%;
(5) inputting facial features of the elderly user, training and testing according to the steps (1) to (4) to obtain a final emotion classifier; wherein
The 3 convolutional layers are respectively: CNN-64: [32, 32, 64, 64]; CNN-96: [48, 48, 96, 200]; CNN-128: [64, 64, 128, 300];
except for the Softmax layer, the activation function of the remaining layers is: ReLU(x) = max(0, x);
and,
the weight W is initialized, following Krizhevsky, with zero mean and a constant standard deviation; the constant standard deviation of each layer is: [0.0001, 0.001, 0.001, 0.01, 0.1].
Optionally, the early warning determination module includes:
the emotion type recording unit is used for recording the emotion types and risk types of all individuals in the old user within a preset time period;
and the emotion early warning unit is used for calling a preset early warning strategy, determining whether the emotion type and the risk type meet the preset early warning strategy or not, and giving an alarm if the emotion type and the risk type meet the preset early warning strategy.
According to the above technical solution, the facial activity monitoring data of the elderly user are acquired; the recorded facial activity features of the elderly user are then input into a pre-trained emotion classifier for emotion classification; and finally, whether to issue an early warning is determined according to the emotion type and risk type of the elderly user. The emotional state is thus identified and analyzed from facial images, which makes the early warning result more accurate. In addition, the emotion of each individual can be monitored dynamically in real time, so that real-time monitoring is achieved and monitoring efficiency is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flowchart of a risk behavior monitoring and early warning method suitable for an elderly user according to an embodiment of the present invention;
fig. 2 and fig. 3 are block diagrams of a risk behavior monitoring and early warning system suitable for an elderly user according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings. It is obvious that the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort fall within the protection scope of the present invention.
At present, early warning for special populations mainly takes the following forms. In one, a manager regularly observes users' behavior and conducts individual interviews, then judges from personal experience whether abnormal behavior exists in the population. In another, a manager designs qualitative or quantitative questions based on psychological theory, checks their reliability and validity to form a questionnaire, and infers the possibility of behavioral abnormality from how members of the population fill in the questionnaire.
However, the existing early warning method has the following disadvantages:
First, the observation and interview methods are highly subjective and inaccurate, and it is difficult to unify their measurement standards, so accurate judgment and identification are hard to achieve.
Second, questionnaires and scales take a long time to fill out, so evaluation efficiency is poor.
Third, the evaluation cycle is long, so real-time monitoring cannot be achieved.
The inventors of the present application observed the following: when an individual has difficulty adapting to the environment, sad or depressed emotional states often occur, and when an individual perceives a threat in the environment, an angry emotional state typically occurs. Moreover, when an individual remains in a certain emotional state that deviates greatly from his or her normal emotional state, especially anger or sadness, the individual is highly likely to be provoked into dangerous behavior such as attacking others or suicide. The emotional state of human beings affects autonomic nervous activity and thereby produces a series of external physiological manifestations, the most obvious of which include changes in facial expression (facial activity units), changes in facial temperature, and changes in breathing.
Therefore, an embodiment of the present invention provides a risk behavior monitoring and early warning method suitable for an elderly user. Fig. 1 is a schematic flowchart of the method according to an embodiment of the present invention; the method can be applied to electronic devices such as smart devices, personal computers and servers. Referring to fig. 1, the method includes:
101, acquiring daily behavior data, social interaction data and facial activity monitoring data of an old user;
102, inputting the daily behavior data and the social interaction data into a risk classifier for risk classification, and inputting the facial activity monitoring data into a pre-trained emotion classifier for emotion classification;
103, determining whether to perform early warning according to the emotion type and the risk type of the old user.
The steps of the risk behavior monitoring and early warning method for the elderly user are described in detail below with reference to fig. 1 and fig. 2.
First, step 101 is described: acquiring daily behavior data, social interaction data and facial activity monitoring data of an elderly user.
In this embodiment, the recorded daily behavior data include: (1) the average number of meals per day: C(eat); (2) the average number of exercise sessions per day: C(move); (3) the average number of rests per day: C(rest).
The recorded social interaction data include: (1) the number of contacts with others: C(connect); (2) the number of conversations with others: C(speech).
The facial activity monitoring data includes changes in the area of the facial activity unit, changes in facial temperature, and changes in breathing and heart rate. Wherein:
(1) Changes in the area of facial activity units. The facial activity units are labeled as 18-20 active point locations (landmarks), each described by a pair of coordinate values:
Dn(Xn, Yn);
where D denotes an active point location, n is its serial number, Xn is the abscissa of the n-th active point location, and Yn is its ordinate.
(2) Changes in facial temperature. A change in facial temperature appears on the video image as a color difference of the facial image; through image enhancement, the change in facial temperature is described as:
ΔC = C(n+1) - Cn;
where C denotes the image color, C(n+1) denotes the image color value at second n+1, and Cn denotes the image color value at second n.
(3) Changes in respiration and heart rate. Changes in respiration and heart rate are reflected in color changes of specific facial regions and can be calculated with a preset formula. A small computational sketch of these monitored quantities follows.
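To make these quantities concrete, the following Python sketch shows one way they could be computed. The patent itself only specifies the landmark representation Dn(Xn, Yn) and the color difference ΔC = C(n+1) - Cn; the shoelace area formula and the reduction of a face region to its mean pixel value are illustrative assumptions, and the respiration/heart-rate formula is deliberately left out because it is not disclosed.

```python
# Minimal sketch of the facial monitoring quantities (landmark area, temperature-related
# color change). The shoelace formula and mean-pixel reduction are assumptions.
from typing import List, Tuple

Landmark = Tuple[float, float]  # one active point location Dn = (Xn, Yn)

def activity_unit_area(points: List[Landmark]) -> float:
    """Area of the polygon spanned by the landmarks of one facial activity unit
    (shoelace formula; an assumption, not specified in the patent)."""
    n = len(points)
    acc = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        acc += x1 * y2 - x2 * y1
    return abs(acc) / 2.0

def area_change(prev: List[Landmark], curr: List[Landmark]) -> float:
    """Change in the area of a facial activity unit between two observations."""
    return activity_unit_area(curr) - activity_unit_area(prev)

def _mean_pixel(img: List[List[float]]) -> float:
    return sum(map(sum, img)) / (len(img) * len(img[0]))

def color_change(frame_n: List[List[float]], frame_n1: List[List[float]]) -> float:
    """ΔC = C(n+1) - Cn: color difference of the face region between second n and
    second n+1, here reduced to the mean pixel value (an assumption)."""
    return _mean_pixel(frame_n1) - _mean_pixel(frame_n)
```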
Next, step 102 is described: inputting the daily behavior data and the social interaction data into a risk classifier for risk classification, and inputting the facial activity monitoring data into a pre-trained emotion classifier for emotion classification.
In this embodiment, the risk classifier may be preset. The (1) average number of meals per day C(eat), (2) average number of exercise sessions per day C(move), (3) average number of rests per day C(rest), (4) number of contacts with others C(connect) and (5) number of conversations with others C(speech) are input into the risk classifier. For each of the indicators (1)-(5), the risk classifier calculates the average value U and the standard deviation S, and the difference Δ = U - S.
When, on a certain day, at least one of the C values in (1)-(5) of the elderly user is greater than or less than U ± Δ (i.e., lies outside the range from U - Δ to U + Δ), the risk classifier outputs the classification: "there may be a risk".
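Purely as an illustration, the risk rule above could be sketched as follows; the per-indicator daily history is assumed, and the bounds follow the description's "U ± Δ" wording with Δ = U - S, which is one reading of the patent's somewhat ambiguous threshold.

```python
# Sketch of the preset risk classifier: flag a day on which any indicator leaves U ± Δ,
# with Δ = U - S computed from that indicator's history (one reading of the description).
from statistics import mean, pstdev
from typing import Dict, List

def classify_risk(history: Dict[str, List[float]], today: Dict[str, float]) -> str:
    """history: past daily counts per indicator C(eat)..C(speech); today: current counts."""
    for name, values in history.items():
        u = mean(values)       # average value U
        s = pstdev(values)     # standard deviation S
        delta = u - s          # Δ = U - S as defined above
        if today[name] > u + delta or today[name] < u - delta:
            return "there may be a risk"
    return "no risk"

# Example with hypothetical counts: today's exercise count falls below the lower bound
history = {"eat": [3, 3, 2, 3, 3], "move": [2, 1, 2, 2, 2], "rest": [4, 4, 5, 4, 4],
           "connect": [5, 6, 5, 4, 5], "speech": [7, 8, 6, 7, 7]}
print(classify_risk(history, {"eat": 3, "move": 0, "rest": 4, "connect": 5, "speech": 7}))
```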
In this embodiment, the facial activity features are input into a pre-trained emotion classifier for emotion classification. The pre-trained emotion classifier is trained as follows:
(1) constructing a convolutional neural network for grayscale images with an input size of 32 x 32, comprising 3 convolution and max-pooling layers, 1 fully connected layer and, connected behind the fully connected layer, a dropout layer with p = 0.5 and a Softmax layer; the 3 convolutional layers are respectively:
CNN-64:[32,32,64,64];
CNN-96:[48,48,96,200];
CNN-128:[64,64,128,300];
except for the Softmax layer, the activation functions of the remaining layers are:
ReLU(x)=max(0,x);
the weight W is initialized, following Krizhevsky, with zero mean and a constant standard deviation; the constant standard deviation of each layer is: [0.0001, 0.001, 0.001, 0.01, 0.1] (a code sketch of this architecture is given after step (5) below).
(2) Setting 9 different regions of interest (ROI) according to the structure of the face, to actively guide the neural network to focus on expression-related regions.
(3) Extracting from the Internet 900 face pictures for each of the 4 classes happiness, sadness, anger and surprise, and 900 neutral-emotion ID-photo pictures, as training data (4,500 images in total), and obtaining 40,500 training pictures through ROI processing; the test data consist of 300 pictures of the 5 classes happiness, sadness, anger, surprise and neutral emotion downloaded from the Internet.
It is understood that the amount of training or test data may be adjusted according to the specific scenario and is not limited here (a data-preparation sketch is given below, after this procedure).
(4) Through training and testing, a classifier with an accuracy of more than 98% is obtained.
(5) Inputting the facial features of the elderly user and training and testing according to steps (1) to (4) to obtain the final emotion classifier.
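For reference, the following is a minimal PyTorch sketch of the network from step (1), reading the CNN-128 configuration [64, 64, 128, 300] as three convolutional channel counts plus the fully-connected width; the kernel sizes, padding and the application of the softmax only at inference time are assumptions that the description does not fix.

```python
# Sketch of the emotion CNN: 3 conv + max-pool blocks, 1 fully connected layer,
# dropout p = 0.5, and a softmax output over the 5 emotion classes.
import torch
import torch.nn as nn

LAYER_STDS = [0.0001, 0.001, 0.001, 0.01, 0.1]  # constant per-layer std for weight init

class EmotionCNN(nn.Module):
    def __init__(self, num_classes: int = 5):
        super().__init__()
        # 3 convolution + max-pooling layers for a 1-channel 32x32 grayscale input
        self.features = nn.Sequential(
            nn.Conv2d(1, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),    # 32 -> 16
            nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 16 -> 8
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 8 -> 4
        )
        # 1 fully connected layer, dropout with p = 0.5, then the output (Softmax) layer
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 4 * 4, 300), nn.ReLU(),
            nn.Dropout(p=0.5),
            nn.Linear(300, num_classes),  # softmax applied by the loss / at inference
        )
        self._init_weights()

    def _init_weights(self) -> None:
        # Zero-mean normal initialization with a constant std per layer (Krizhevsky-style)
        layers = [m for m in self.modules() if isinstance(m, (nn.Conv2d, nn.Linear))]
        for layer, std in zip(layers, LAYER_STDS):
            nn.init.normal_(layer.weight, mean=0.0, std=std)
            nn.init.zeros_(layer.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Example: class probabilities for a batch of 32x32 grayscale ROI crops
model = EmotionCNN()
probs = torch.softmax(model(torch.randn(8, 1, 32, 32)), dim=1)
```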
In this embodiment, facial features of the elderly user captured by a camera are input into the emotion classifier obtained by training and testing according to steps (1) to (4). The emotion classifier then yields the emotion type of individual X at that moment and on that day, for example: (number X, sad).
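For completeness, the ROI expansion of step (3) and the accuracy check of step (4) might look roughly as follows; the 9 ROI boxes, the 128 x 128 source size and the optimizer are placeholders, since the patent only fixes the number of ROIs, the 40,500-crop count and the 98% target.

```python
# Sketch of the data preparation and evaluation around the EmotionCNN above.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader

# 9 assumed (top, left, height, width) ROI boxes on a 128x128 face image
ROI_BOXES = [(r, c, 48, 48) for r in (0, 40, 80) for c in (0, 40, 80)]

def expand_with_rois(faces: torch.Tensor, labels: torch.Tensor):
    """faces: (N, 1, 128, 128) grayscale faces -> (N*9, 1, 32, 32) ROI crops.
    With N = 4,500 base images this yields the 40,500 training pictures."""
    crops, crop_labels = [], []
    for top, left, h, w in ROI_BOXES:
        roi = faces[:, :, top:top + h, left:left + w]
        crops.append(F.interpolate(roi, size=(32, 32), mode="bilinear", align_corners=False))
        crop_labels.append(labels)
    return torch.cat(crops), torch.cat(crop_labels)

def train_epoch(model: nn.Module, loader: DataLoader, optimizer: torch.optim.Optimizer) -> None:
    loss_fn = nn.CrossEntropyLoss()  # cross-entropy over the 5 emotion classes
    model.train()
    for x, y in loader:
        optimizer.zero_grad()
        loss_fn(model(x), y).backward()
        optimizer.step()

def accuracy(model: nn.Module, loader: DataLoader) -> float:
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in loader:
            correct += (model(x).argmax(dim=1) == y).sum().item()
            total += y.numel()
    return correct / total  # training is repeated until this exceeds 0.98 on the test data
```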
Finally, step 103 is described: determining whether to perform early warning according to the emotion type and risk type of the elderly user.
In this embodiment, the emotion types and risk types of each individual among the monitored elderly users within a preset time period are recorded. A preset early warning strategy is then invoked to determine whether the emotion type and risk type satisfy the strategy; if so, an alarm is issued.
The preset early warning strategy can be set in advance. For example, when an individual shows sadness or anger in three observations within one day and is also classified by the risk classifier as possibly at risk, an early warning is determined.
As another example, when an individual's emotion type in three consecutive daily observations differs markedly from that of the other individuals and the individual is classified as possibly at risk, an early warning is determined. For instance, if on a certain day 85% of the individuals in the whole population show a happy emotion three times in a row while individual X shows an angry emotion three times in a row, an early warning is determined; or if individual Y shows a neutral emotion three times in a row, preparation for an early warning is determined.
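The two example strategies above can be sketched as follows; the data shapes, the use of exactly three observations per day and the 85% majority figure simply mirror the examples and are not a complete specification of the preset early warning strategy.

```python
# Sketch of the preset early-warning rules: (1) three sad/angry observations in one day
# plus a "possible risk" classification; (2) three identical observations that differ
# from the emotion shown by the large majority (here >= 85%) of the other individuals.
from collections import Counter
from typing import Dict, List

def should_warn(day_emotions: Dict[str, List[str]], risk: Dict[str, str], person: str) -> bool:
    obs = day_emotions[person]                      # e.g. ["angry", "angry", "angry"]
    at_risk = risk.get(person) == "there may be a risk"
    if len(obs) < 3:
        return False

    # Rule 1: three observations of sadness or anger within the day, plus risk classification
    if at_risk and all(e in ("sad", "angry") for e in obs[-3:]):
        return True

    # Rule 2: the individual's three consecutive emotions deviate from the dominant pattern
    others = [tuple(v[-3:]) for k, v in day_emotions.items() if k != person and len(v) >= 3]
    if at_risk and others:
        dominant, count = Counter(others).most_common(1)[0]
        if count / len(others) >= 0.85 and tuple(obs[-3:]) != dominant and len(set(obs[-3:])) == 1:
            return True
    return False

# Example: 85%+ of the group is happy three times in a row, individual X is angry three times
emotions = {f"P{i}": ["happy"] * 3 for i in range(17)}
emotions["X"] = ["angry", "angry", "angry"]
print(should_warn(emotions, {"X": "there may be a risk"}, "X"))  # True
```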
Therefore, in this embodiment, determining both the emotion type and the risk type of the elderly user makes the early warning result more accurate. In addition, the emotion of each individual can be monitored dynamically in real time, so that real-time monitoring is achieved and monitoring efficiency is improved.
In a second aspect, an embodiment of the present invention provides a risk behavior monitoring and early warning system suitable for an elderly user, and referring to fig. 2, the system includes:
a monitoring data acquisition module 201, configured to acquire daily behavior data, social interaction data, and facial activity monitoring data of an elderly user;
an emotion classification module 202, configured to input the daily behavior data and the social interaction data into a risk classifier for risk classification, and input the facial activity monitoring data into a pre-trained emotion classifier for emotion classification;
and the early warning determining module 203 is used for determining whether to perform early warning according to the emotion type and the risk type of the old user.
In some embodiments, the facial activity monitoring data includes changes in area of facial activity units, changes in facial temperature, and changes in breathing and heart rate.
In some embodiments, the pre-trained emotion classifier is trained as follows:
(1) constructing a convolutional neural network for grayscale images with an input size of 32 x 32, comprising 3 convolution and max-pooling layers, 1 fully connected layer and, connected behind the fully connected layer, a dropout layer with p = 0.5 and a Softmax layer;
(2) setting 9 different regions of interest (ROI) according to the structure of the face, to actively guide the neural network to focus on expression-related regions;
(3) extracting from the Internet 900 face pictures for each of the 4 classes happiness, sadness, anger and surprise, and 900 neutral-emotion ID-photo pictures, as training data (4,500 images in total), and obtaining 40,500 training pictures through ROI processing; the test data consist of 300 pictures of the 5 classes happiness, sadness, anger, surprise and neutral emotion downloaded from the Internet;
(4) training and testing to obtain a classifier with an accuracy of more than 98%;
(5) inputting the facial features of the elderly user and training and testing according to steps (1) to (4) to obtain the final emotion classifier.
In some embodiments, the 3 convolutional layers are respectively: CNN-64: [32, 32, 64, 64]; CNN-96: [48, 48, 96, 200]; CNN-128: [64, 64, 128, 300];
except for the Softmax layer, the activation function of the remaining layers is: ReLU(x) = max(0, x);
and,
the weight W is initialized, following Krizhevsky, with zero mean and a constant standard deviation; the constant standard deviation of each layer is: [0.0001, 0.001, 0.001, 0.01, 0.1].
In some embodiments, referring to fig. 3, the early warning determination module 203 comprises:
the emotion type recording unit 301 is used for recording the emotion types and risk types of all individuals in the old user within a preset time period;
and an emotion early warning unit 302, configured to invoke a preset early warning policy, determine whether the emotion type and the risk type meet the preset early warning policy, and alarm if yes.
It should be noted that the risk behavior monitoring and early warning system suitable for an elderly user according to the embodiment of the present invention corresponds one-to-one with the above method; the implementation details of the method also apply to the system, and the system is therefore not described again in detail.
In the description of the present invention, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the present invention and shall be construed as falling within the scope of the claims and description.

Claims (5)

1. A risk behavior monitoring and early warning method suitable for an elderly user is characterized by comprising the following steps:
acquiring daily behavior data, social interaction data and facial activity monitoring data of an old user;
inputting the daily behavior data and the social interaction data into a risk classifier for risk classification and inputting the facial activity monitoring data into a pre-trained emotion classifier for emotion classification;
determining whether to perform early warning or not according to the emotion type and risk type of the old user;
the daily behavior data comprises average daily eating times, average daily exercise times and average daily rest times; the social interaction data comprises contact times with other people and conversation times with other people; the facial activity monitoring data includes regional variations of facial activity units, facial temperature variations, and respiratory and heart rate variations;
inputting the daily behavioral data and the social interaction data into a risk classifier for risk classification includes:
the risk classifier respectively calculates average values, standard deviations and differences between the average values and the standard deviations of daily average eating times, daily average exercise times, daily average rest times, contact times with other people and conversation times with other people;
the risk classifier determines whether there is at least one of an average number of meals per day, an average number of exercises per day, an average number of breaks per day, a number of contacts with others, and a number of conversations with others that is greater than or less than a difference between the average and standard deviation thereof;
if so, the risk classifier determines the risk classification as a possible risk;
the pre-trained emotion classifier training mode is as follows:
(1) constructing a convolutional neural network for grayscale images with an input size of 32 x 32, comprising 3 convolution and max-pooling layers, 1 fully connected layer and, connected behind the fully connected layer, a dropout layer with p = 0.5 and a Softmax layer;
(2) setting 9 different regions of interest (ROI) according to the structure of the face, to actively guide the neural network to focus on expression-related regions;
(3) extracting from the Internet 900 face pictures for each of the 4 classes happiness, sadness, anger and surprise, and 900 neutral-emotion ID-photo pictures, as training data (4,500 images in total), and obtaining 40,500 training pictures through ROI processing; the test data consist of 300 pictures of the 5 classes happiness, sadness, anger, surprise and neutral emotion downloaded from the Internet;
(4) training and testing to obtain a classifier with an accuracy of more than 98%;
(5) inputting the facial features of the elderly user and training and testing according to steps (1) to (4) to obtain the final emotion classifier.
2. The risk behavior monitoring and early warning method suitable for the elderly user according to claim 1, wherein the 3 convolutional layers are respectively: CNN-64: [32, 32, 64, 64]; CNN-96: [48, 48, 96, 200]; CNN-128: [64, 64, 128, 300];
except for the Softmax layer, the activation function of the remaining layers is: ReLU(x) = max(0, x);
and,
the weight W is initialized, following Krizhevsky, with zero mean and a constant standard deviation; the constant standard deviation of each layer is: [0.0001, 0.001, 0.001, 0.01, 0.1].
3. The method for monitoring and warning the risky behaviors of the elderly user according to claim 1, wherein determining whether to perform warning according to the emotion type and the risk type of the elderly user comprises:
recording the emotion types and risk types of all individuals in the old users within a preset time period;
and calling a preset early warning strategy, determining whether the emotion type and the risk type meet the preset early warning strategy, and if so, giving an alarm.
4. A risk behavior monitoring and early warning system suitable for elderly users, the system comprising:
the monitoring data acquisition module is used for acquiring daily behavior data, social interaction data and facial activity monitoring data of the old user;
the classification module is used for inputting the daily behavior data and the social interaction data into a risk classifier for risk classification and inputting the facial activity monitoring data into a pre-trained emotion classifier for emotion classification;
the early warning determining module is used for determining whether to carry out early warning according to the emotion type and the risk type of the old user;
the daily behavior data comprises average daily eating times, average daily exercise times and average daily rest times; the social interaction data comprises contact times with other people and conversation times with other people; the facial activity monitoring data includes regional variations of facial activity units, facial temperature variations, and respiratory and heart rate variations;
inputting the daily behavioral data and the social interaction data into a risk classifier for risk classification includes:
the risk classifier respectively calculates average values, standard deviations and differences between the average values and the standard deviations of daily average eating times, daily average exercise times, daily average rest times, contact times with other people and conversation times with other people;
the risk classifier determines whether there is at least one of an average number of meals per day, an average number of exercises per day, an average number of breaks per day, a number of contacts with others, and a number of conversations with others that is greater than or less than a difference between the average and standard deviation thereof;
if so, the risk classifier determines the risk classification as a possible risk;
the pre-trained emotion classifier training mode is as follows:
(1) constructing a convolutional neural network for grayscale images with an input size of 32 x 32, comprising 3 convolution and max-pooling layers, 1 fully connected layer and, connected behind the fully connected layer, a dropout layer with p = 0.5 and a Softmax layer;
(2) setting 9 different regions of interest (ROI) according to the structure of the face, to actively guide the neural network to focus on expression-related regions;
(3) extracting from the Internet 900 face pictures for each of the 4 classes happiness, sadness, anger and surprise, and 900 neutral-emotion ID-photo pictures, as training data (4,500 images in total), and obtaining 40,500 training pictures through ROI processing; the test data consist of 300 pictures of the 5 classes happiness, sadness, anger, surprise and neutral emotion downloaded from the Internet;
(4) training and testing to obtain a classifier with an accuracy of more than 98%;
(5) inputting facial features of the elderly user, training and testing according to the steps (1) to (4) to obtain a final emotion classifier; wherein
The 3 convolutional layers are respectively: CNN-64: [32, 32, 64, 64]; CNN-96: [48, 48, 96, 200]; CNN-128: [64, 64, 128, 300];
except for the Softmax layer, the activation function of the remaining layers is: ReLU(x) = max(0, x);
and,
the weight W is initialized, following Krizhevsky, with zero mean and a constant standard deviation; the constant standard deviation of each layer is: [0.0001, 0.001, 0.001, 0.01, 0.1].
5. The system of claim 4, wherein the early warning determination module comprises:
the emotion type recording unit is used for recording the emotion types and risk types of all individuals in the old user within a preset time period;
and the emotion early warning unit is used for calling a preset early warning strategy, determining whether the emotion type and the risk type meet the preset early warning strategy or not, and giving an alarm if the emotion type and the risk type meet the preset early warning strategy.
CN201811590111.7A 2018-12-25 2018-12-25 Risk behavior monitoring and early warning method and system suitable for old users Active CN109543659B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811590111.7A CN109543659B (en) 2018-12-25 2018-12-25 Risk behavior monitoring and early warning method and system suitable for old users

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811590111.7A CN109543659B (en) 2018-12-25 2018-12-25 Risk behavior monitoring and early warning method and system suitable for old users

Publications (2)

Publication Number Publication Date
CN109543659A CN109543659A (en) 2019-03-29
CN109543659B true CN109543659B (en) 2020-03-31

Family

ID=65858085

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811590111.7A Active CN109543659B (en) 2018-12-25 2018-12-25 Risk behavior monitoring and early warning method and system suitable for old users

Country Status (1)

Country Link
CN (1) CN109543659B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110264668A * 2019-07-10 2019-09-20 四川长虹电器股份有限公司 Multi-strategy elderly care method based on machine vision technology
CN115662631B (en) * 2022-10-26 2023-11-17 上海柚纯数字科技有限公司 Nursing home management system based on AI intelligent discrimination

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104994335A (en) * 2015-06-11 2015-10-21 广东欧珀移动通信有限公司 Alarm method and terminal
CN106027978A (en) * 2016-06-21 2016-10-12 南京工业大学 Video monitoring abnormal behavior system and method for smart home old people care
CN208227195U * 2018-04-13 2018-12-11 西安科技大学 Monitoring device for elderly people living alone based on expression recognition

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100280579A1 (en) * 2009-04-30 2010-11-04 Medtronic, Inc. Posture state detection
KR20100137175A (en) * 2009-06-22 2010-12-30 삼성전자주식회사 Device and method of automatically recognizing emotion and intention of user
CN106650621A (en) * 2016-11-18 2017-05-10 广东技术师范学院 Deep learning-based emotion recognition method and system
CN106777954B * 2016-12-09 2019-08-23 电子科技大学 Intelligent guarding system and method for the health of empty-nest elderly people
CN107320090A * 2017-06-28 2017-11-07 广东数相智能科技有限公司 Sudden disease monitoring system and method
CN107491726B (en) * 2017-07-04 2020-08-04 重庆邮电大学 Real-time expression recognition method based on multichannel parallel convolutional neural network

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104994335A (en) * 2015-06-11 2015-10-21 广东欧珀移动通信有限公司 Alarm method and terminal
CN106027978A (en) * 2016-06-21 2016-10-12 南京工业大学 Video monitoring abnormal behavior system and method for smart home old people care
CN208227195U * 2018-04-13 2018-12-11 西安科技大学 Monitoring device for elderly people living alone based on expression recognition

Also Published As

Publication number Publication date
CN109543659A (en) 2019-03-29

Similar Documents

Publication Publication Date Title
JP7083809B2 (en) Systems and methods for identifying and / or identifying and / or pain, fatigue, mood, and intent with privacy protection
CN110291478B (en) Driver Monitoring and Response System
JP4401079B2 (en) Subject behavior analysis
WO2018168095A1 (en) Person trend recording device, person trend recording method, and program
US11700420B2 (en) Media manipulation using cognitive state metric analysis
US20150287054A1 (en) Automatic analysis of rapport
JP7392492B2 (en) Method, server and program for detecting cognitive and speech disorders based on temporal and visual facial features
JP6930277B2 (en) Presentation device, presentation method, communication control device, communication control method and communication control system
CN109543659B (en) Risk behavior monitoring and early warning method and system suitable for old users
CN111513732A (en) Intelligent psychological stress assessment early warning system for various groups of people under epidemic disease condition
WO2017136931A1 (en) System and method for conducting online market research
CN110705428A (en) Facial age recognition system and method based on impulse neural network
Setyadi et al. Human character recognition application based on facial feature using face detection
Celiktutan et al. Continuous prediction of perceived traits and social dimensions in space and time
Islam A deep learning based framework for detecting and reducing onset of cybersickness
EP3799407A1 (en) Initiating communication between first and second users
CN109635778B (en) Risk behavior monitoring and early warning method and system suitable for special population
JP7306152B2 (en) Emotion estimation device, emotion estimation method, program, information presentation device, information presentation method, and emotion estimation system
US9530049B2 (en) Kinetic-based tool for biometric identification, verification, validation and profiling
CN111723869A (en) Special personnel-oriented intelligent behavior risk early warning method and system
TWI646438B (en) Emotion detection system and method
Rolff et al. When do saccades begin? prediction of saccades as a time-to-event problem
JP2022152500A (en) Awakening level estimation method, awakening level estimation device and awakening level estimation program
NL2020989B1 (en) Monitoring and analyzing body language with machine learning, using artificial intelligence systems for improving interaction between humans, and humans and robots.
Puteri et al. Micro-sleep detection using combination of haar cascade and convolutional neural network

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20220310

Address after: 266000 room 504, floor 5, building a, Shinan Software Park, No. 288, Ningxia road, Shinan District, Qingdao, Shandong Province

Patentee after: Shandong Xinfa Technology Co.,Ltd.

Address before: 100094 No. 1008-1105, first floor, block a, building 1, yard 2, Yongcheng North Road, Haidian District, Beijing

Patentee before: BEIJING XINFA TECHNOLOGY Co.,Ltd.