CN111797817B - Emotion recognition method, emotion recognition device, computer equipment and computer readable storage medium - Google Patents

Emotion recognition method, emotion recognition device, computer equipment and computer readable storage medium

Info

Publication number
CN111797817B
CN111797817B, CN202010752926.1A, CN202010752926A
Authority
CN
China
Prior art keywords
emotion
emotional state
preset
fingertip
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010752926.1A
Other languages
Chinese (zh)
Other versions
CN111797817A (en)
Inventor
黄晓君
庄伯金
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN202010752926.1A priority Critical patent/CN111797817B/en
Publication of CN111797817A publication Critical patent/CN111797817A/en
Priority to PCT/CN2020/122418 priority patent/WO2021139310A1/en
Application granted granted Critical
Publication of CN111797817B publication Critical patent/CN111797817B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F 2218/12 Classification; Matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/70 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F 2218/08 Feature extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Molecular Biology (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Social Psychology (AREA)
  • Public Health (AREA)
  • Primary Health Care (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Signal Processing (AREA)
  • Epidemiology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Psychology (AREA)
  • Psychiatry (AREA)
  • Hospice & Palliative Care (AREA)
  • Developmental Disabilities (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The embodiment of the application provides an emotion recognition method, an emotion recognition device, computer equipment and a computer readable storage medium, and belongs to the technical field of image processing. The method comprises the steps of: obtaining fingertip images corresponding to a plurality of finger tips; converting the format of the fingertip images to obtain the PPG signals corresponding to the fingertip images; extracting heart rate features and heart rate variability features from the PPG signals through a preset neural network model; obtaining an emotional state index corresponding to the PPG signals according to the heart rate features and heart rate variability features; and obtaining the emotional state corresponding to the emotional state index according to the index and a preset matching relation between emotional state indexes and emotional states.

Description

Emotion recognition method, emotion recognition device, computer equipment and computer readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and apparatus for emotion recognition based on a neural network model, a computer device, and a computer readable storage medium.
Background
A person's negative emotions can cause abnormal states such as insomnia and even depression. If an emotional state cannot be relieved and released in time over a long period, it leads to a series of discomforts such as headache, insomnia and anxiety, and increases the risk of chronic conditions such as cardiovascular and cerebrovascular disease, diabetes and cancer. Emotion serves as a first line of defense for human immunity, and good emotion management can effectively help a user maintain good health.
Emotion management is generally divided into emotional state assessment and emotion regulation. The traditional method of assessing emotional state is subjective evaluation: the subject is scored on a specially designed psychological scale according to a description of his or her own feelings. Although this method is simple and common, it is not very objective; it is better suited to statistical analysis of large samples, and is difficult for an individual to apply repeatedly over a long period. Consequently, in the conventional art, the efficiency of recognizing or assessing emotion is low.
Disclosure of Invention
The embodiment of the application provides a method, a device, computer equipment and a computer readable storage medium for emotion recognition based on a neural network model, which can solve the problem of low emotion recognition efficiency in the traditional technology.
In a first aspect, an embodiment of the present application provides a method for identifying emotion based on a neural network model, where the method includes: acquiring fingertip images corresponding to a plurality of finger tips; converting a format corresponding to the fingertip image to obtain a PPG signal corresponding to the fingertip image; extracting heart rate characteristics and heart rate variability characteristics from the PPG signal through a preset neural network model; acquiring an emotional state index corresponding to the PPG signal according to the heart rate characteristics and the heart rate variability characteristics; and acquiring the emotion state corresponding to the emotion state index according to the emotion state index and a preset matching relation between the emotion state index and the emotion state.
In a second aspect, an embodiment of the present application further provides an emotion recognition device based on a neural network model, including: the first acquisition unit is used for acquiring fingertip images corresponding to a plurality of finger fingertips; the conversion unit is used for converting the format corresponding to the fingertip image so as to obtain a PPG signal corresponding to the fingertip image; the extraction unit is used for extracting heart rate characteristics and heart rate variability characteristics from the PPG signals through a preset neural network model; the second acquisition unit is used for acquiring an emotional state index corresponding to the PPG signal according to the heart rate characteristics and the heart rate variability characteristics; the third obtaining unit is used for obtaining the emotion state corresponding to the emotion state index according to the emotion state index and the preset matching relation between the emotion state index and the emotion state.
In a third aspect, an embodiment of the present application further provides a computer device, which includes a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the emotion recognition method based on the neural network model when executing the computer program.
In a fourth aspect, embodiments of the present application further provide a computer readable storage medium storing a computer program, which when executed by a processor causes the processor to perform the steps of the emotion recognition method based on a neural network model.
The embodiment of the application provides a method and a device for identifying emotion based on a neural network model, computer equipment and a computer readable storage medium. According to the embodiment of the application, fingertip images corresponding to a plurality of finger tips are acquired, the format of the fingertip images is converted to obtain the corresponding PPG signals, heart rate features and heart rate variability features are extracted from the PPG signals through a preset neural network model, an emotional state index corresponding to the PPG signals is acquired according to those features, and the emotional state corresponding to the index is acquired according to the index and a preset matching relation between emotional state indexes and emotional states, thereby improving the efficiency of emotion recognition.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of an emotion recognition method based on a neural network model according to an embodiment of the present application;
Fig. 2 is a schematic diagram of an emotion state calculation mode of an emotion recognition method based on a neural network model according to an embodiment of the present application;
Fig. 3 is a schematic flow chart of a sub-process in an emotion recognition method based on a neural network model according to an embodiment of the present application;
Fig. 4 is a schematic block diagram of an emotion recognition device based on a neural network model according to an embodiment of the present application; and
Fig. 5 is a schematic block diagram of a computer device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
It should be understood that the terms "comprises" and "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Referring to fig. 1, fig. 1 is a schematic flow chart of an emotion recognition method based on a neural network model according to an embodiment of the application. As shown in fig. 1, the method includes the following steps S101-S105:
S101, acquiring fingertip images corresponding to a plurality of finger tips.
Specifically, the fingertip images may be captured by an image acquisition device, for example the camera of a mobile terminal. Multiple images within a preset time may be captured continuously in burst mode, so that the continuous fingertip condition within the preset time is better described and the emotional state is reflected more accurately. Alternatively, a fingertip video may be recorded through the camera of the mobile terminal, and each frame of the video extracted to obtain a plurality of fingertip images. Because of the continuity of video, the extracted frames are themselves continuous and can accurately represent the fingertip condition within the preset time, which improves the accuracy of emotional state recognition when emotion recognition is performed on the fingertip images.
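As an illustrative sketch (not part of the patent), the frame-sampling arithmetic for the video route above can be written as follows; the target sampling rate and function name are assumptions for illustration:

```python
def frame_indices(video_fps: float, duration_s: float, target_fps: float):
    """Return the indices of frames to keep from a fingertip video.

    A PPG waveform needs a steady sampling rate; assuming the clip was
    recorded at `video_fps`, we keep frames at approximately `target_fps`.
    """
    total = int(video_fps * duration_s)
    step = video_fps / target_fps
    return [int(i * step) for i in range(int(total / step))]

# e.g. a 5-second clip at 30 fps, downsampled to 15 fps -> 75 frames
idx = frame_indices(30.0, 5.0, 15.0)
```

Each retained index would then be used to pull the corresponding frame from the decoded video before channel separation.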
S102, converting a format corresponding to the fingertip image to obtain a PPG signal corresponding to the fingertip image.
Photoplethysmography (PPG), also called a photoplethysmogram, is a detection method that uses photoelectric means to detect changes in blood volume in living tissue.
Specifically, when a light beam of a certain wavelength irradiates the skin surface of the fingertip, the contraction and expansion of blood vessels affect the transmission of light (for example, in transmissive PPG, the light passing through the fingertip). As the light passes through skin tissue and is reflected to the photosensitive sensor, it is attenuated to some extent. Absorption by muscle, bone, veins and other connective tissue is essentially constant (provided the measurement site does not move significantly), but arteries are different: because of the pulsation of blood in the arteries, their absorption of light naturally changes. This causes the fingertip image, and hence its pixel values, to change. Therefore, analyzing the pixel changes contained in the fingertip image can reflect the characteristics of blood flow, and blood flow in turn reflects heart rate characteristics.
Therefore, after the fingertip image is obtained, its format is converted: the fingertip image can be described by converting it into the three RGB channels, and the channels, for example the green or red channel, are then analyzed to obtain the PPG signal corresponding to the fingertip image. An RGB three-channel image is also called a full-color image; the three channels are R (red), G (green) and B (blue). For example, a Halcon program can be used to interpret RGB images and gray values with Halcon's own image operators, and the three channels of a color image can be separated with OpenCV or Matlab. Note that OpenCV and Matlab store the channels in different orders when processing color images: Matlab uses the order R, G, B, while OpenCV uses B, G, R.
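The channel-order caveat can be made concrete with a small numpy sketch (OpenCV itself is not required here; only the array layout matters, and the pixel values are invented):

```python
import numpy as np

# A toy 2x2 "frame". OpenCV stores color images in B, G, R channel order,
# while Matlab uses R, G, B, so the red plane sits at a different index
# depending on which tool produced the array.
frame_bgr = np.zeros((2, 2, 3), dtype=np.uint8)
frame_bgr[..., 0] = 10    # blue plane (index 0 under BGR)
frame_bgr[..., 1] = 20    # green plane
frame_bgr[..., 2] = 200   # red plane, dominant in a flash-lit fingertip image

red_from_bgr = frame_bgr[..., 2]    # red = index 2 in OpenCV's BGR order
frame_rgb = frame_bgr[..., ::-1]    # reverse the channel axis: BGR -> RGB
red_from_rgb = frame_rgb[..., 0]    # red = index 0 in Matlab-style RGB order

assert (red_from_bgr == red_from_rgb).all()   # same plane either way
```

Getting the index wrong would silently feed the blue plane into the PPG extraction, which is why the ordering difference is worth stating explicitly.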
S103, extracting heart rate characteristics and heart rate variability characteristics from the PPG signal through a preset neural network model.
Here heart rate variability (HRV) refers to the small differences between successive beat-to-beat intervals.
Specifically, in the embodiment of the present application, a preset neural network model, for example a Recurrent Neural Network (RNN), is pre-trained in advance. Because a neural network model has automatic learning capability, after training samples comprising sample PPG signals and their corresponding heart rate features and heart rate variability features are input to the preset neural network, the model learns automatically from the sample PPG signals and their associated features, so that when a PPG signal is obtained later, the heart rate feature and heart rate variability feature can be extracted from it. After training, the training effect of the preset neural network model can be checked against a validation sample, and if the model passes validation, it is deployed to the production environment.
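The patent does not disclose the RNN's architecture or weights. As a hedged sketch only, the forward pass of a generic vanilla RNN mapping a PPG sequence to a small feature vector looks like this (all shapes and the random weights are placeholders):

```python
import numpy as np

def rnn_forward(x, W_xh, W_hh, W_hy, b_h, b_y):
    """One forward pass of a vanilla RNN over a PPG sequence.

    x: (T, D) sequence of PPG samples or windows. Returns the final
    output vector, standing in for extracted features such as heart
    rate and HRV features. A real deployment would use trained weights.
    """
    h = np.zeros(W_hh.shape[0])
    for t in range(x.shape[0]):
        h = np.tanh(x[t] @ W_xh + h @ W_hh + b_h)   # recurrent update
    return h @ W_hy + b_y                            # read-out layer

rng = np.random.default_rng(0)
T, D, H, O = 16, 1, 8, 2   # sequence length, input dim, hidden size, outputs
x = rng.standard_normal((T, D))
out = rnn_forward(x,
                  rng.standard_normal((D, H)),
                  rng.standard_normal((H, H)),
                  rng.standard_normal((H, O)),
                  np.zeros(H), np.zeros(O))
```

The point of the recurrence is that the hidden state `h` accumulates information across the whole PPG window, which is what lets a sequence model capture beat-to-beat variability rather than a single instantaneous value.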
Since the morphology of the PPG signal is similar to the arterial blood pressure (ABP) waveform, and heart rate is closely related to blood pressure, a person's emotional state directly affects heart rate and blood pressure, and blood pressure and heart rate in turn reflect the emotional state. This makes the PPG signal a non-invasive heart rate monitoring tool: by analyzing the PPG signal, a person's heart rate and blood pressure can be measured and the emotional state identified. Since the periodicity of the PPG signal corresponds to the heart rhythm, the heart rate can be estimated from the PPG signal.
The heart rate variability feature may be obtained by frequency domain analysis and time domain analysis of the PPG signal.
In time domain analysis, HRV quantitatively describes the variation characteristics of the cardiac cycle by various statistical methods, for example by measuring and calculating the average RR interval, the difference or ratio between the longest and shortest RR intervals, and the standard deviation of all RR intervals within a certain period. After filtering the original PPG signal, the number of peaks within a certain period can be counted and the heart rate calculated from it: for example, if sampling is continuous for 5 seconds and the number of peaks within those 5 seconds is N, the heart rate is N × 12 beats per minute.
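The peak-counting arithmetic above (N peaks in 5 seconds gives N × 12 bpm) can be sketched as follows. The simple local-maximum detector is an illustration, not the patent's filter, and the synthetic sine trace stands in for a filtered PPG signal:

```python
import math

def count_peaks(signal, min_prominence=0.0):
    """Count local maxima in a pre-filtered PPG trace."""
    peaks = 0
    for i in range(1, len(signal) - 1):
        if signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]:
            # require the peak to stand out from both neighbours
            if signal[i] - min(signal[i - 1], signal[i + 1]) > min_prominence:
                peaks += 1
    return peaks

def heart_rate_bpm(signal, window_s=5.0):
    # N peaks in `window_s` seconds -> N * (60 / window_s) beats per minute
    return count_peaks(signal) * (60.0 / window_s)

# A toy 5-second trace sampled at 50 Hz with a 1.2 Hz pulse: 6 peaks -> 72 bpm
trace = [math.sin(2 * math.pi * 1.2 * t / 50) for t in range(250)]
```

A production pipeline would apply band-pass filtering and a prominence threshold before counting, since raw fingertip video is noisy.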
Frequency domain analysis, also called spectrum analysis, uses a special calculation method to decompose the heart rate fluctuation curve, which changes over time, into a sum of sinusoids of different frequencies and amplitudes, obtaining the frequency spectrum of the HRV, for example by applying an FFT to the PPG signal. Its advantage is that the periodic components of cardiac activity can be quantified; the human HRV power spectrum is divided into four regions: the high frequency band, low frequency band, very low frequency band, and ultra low frequency band.
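A minimal band-power decomposition along these lines can be sketched with numpy's FFT. The band edges below are the commonly cited short-term HRV conventions, used here as assumed placeholders (the ultra low band is omitted because it needs much longer recordings):

```python
import numpy as np

# Assumed HRV band edges in Hz (VLF 0.003-0.04, LF 0.04-0.15, HF 0.15-0.4)
BANDS = {"vlf": (0.003, 0.04), "lf": (0.04, 0.15), "hf": (0.15, 0.4)}

def band_powers(fluctuation, fs):
    """Decompose an evenly resampled heart-rate fluctuation curve into
    frequency bands and return the spectral power in each band."""
    x = np.asarray(fluctuation) - np.mean(fluctuation)   # remove DC offset
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return {name: float(spectrum[(freqs >= lo) & (freqs < hi)].sum())
            for name, (lo, hi) in BANDS.items()}

# A 300-second fluctuation, resampled at 4 Hz, dominated by a 0.1 Hz
# (low frequency band) oscillation
fs = 4.0
t = np.arange(0, 300, 1 / fs)
powers = band_powers(np.sin(2 * np.pi * 0.1 * t), fs)
```

Because the synthetic oscillation sits at 0.1 Hz, essentially all of its power lands in the low frequency band, which is how the decomposition attributes autonomic activity to specific frequency ranges.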
After the pre-trained preset neural network model is deployed to the production environment and a PPG signal is received, heart rate features and heart rate variability features can be extracted from the PPG signal by performing time domain analysis and frequency domain analysis on it.
S104, acquiring an emotion state index corresponding to the PPG signal according to the heart rate characteristics and the heart rate variability characteristics.
Specifically, after the heart rate feature and the heart rate variability feature are extracted, the emotional state index corresponding to the PPG signal is acquired from them. Referring to fig. 2, fig. 2 is a schematic diagram of the emotional state calculation of an emotion recognition method based on a neural network model according to an embodiment of the present application. In this embodiment, emotional state indexes such as a vitality index, a stress index and a fatigue index corresponding to the PPG signal are obtained from the heart rate and heart rate variability features, so as to quantitatively evaluate the activity and balance of the autonomic nervous system and assess the fatigue and stress states of the human body, as shown in fig. 2.
For example, based on big data analysis of existing data, statistical relations between the heart rate and heart rate variability features and the emotional state indexes can be obtained by means of the training samples used in pre-training; for instance, the statistical relations between these features and the corresponding fatigue, stress and vitality indexes. After the heart rate and heart rate variability features corresponding to a PPG signal are obtained, the fatigue index, stress index and vitality index corresponding to the PPG signal are obtained according to these statistical relations, yielding the emotional state index corresponding to the PPG signal. In one example, the statistical relations between the features and the emotional state indexes are shown in Tables 1 to 3 below, where Table 1 is a fatigue index rating and evaluation table, Table 2 is a stress index rating and evaluation table, and Table 3 is a vitality index rating and evaluation table:
Table 1
Table 2
Table 3
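Tables 1 to 3 are not reproduced in this text, so the thresholds below are purely invented placeholders; the sketch only illustrates the lookup structure a rating table implies:

```python
# Hypothetical stress-index rating bands (NOT the patent's Table 2 values):
# each entry is (inclusive upper bound, rating label), in ascending order.
STRESS_RATINGS = [
    (25, "low stress"),
    (50, "moderate stress"),
    (75, "elevated stress"),
    (100, "high stress"),
]

def rate(index: float, table) -> str:
    """Map an emotional state index onto its rating band."""
    for upper, label in table:
        if index <= upper:
            return label
    return table[-1][1]   # clamp anything above the last bound
```

Analogous tables for the fatigue and vitality indexes would use the same lookup, differing only in bounds and labels.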
S105, acquiring the emotion state corresponding to the emotion state index according to the emotion state index and a preset matching relation between the emotion state index and the emotion state.
Specifically, based on prior knowledge and accumulated data, the emotional states are classified to form a preset matching relation between emotional state indexes and emotional states; after an emotional state index is obtained, the corresponding emotional state is acquired according to the index and the preset matching relation. For example, referring to Tables 1 to 3, the emotional state corresponding to the emotion index can be obtained, realizing identification of the emotional state. When the embodiment of the application is used to quantify emotional health, a smart mobile device performs the emotion quantification: the device performs fingertip detection, collects the photoplethysmographic pulse wave (PPG) and, combined with a neural network RNN algorithm, extracts heart rate and heart rate variability (HRV) features to detect the user's vitality, stress and fatigue states. This objectively quantifies the current and long-term emotional states, lowers the threshold for emotion recognition based on a neural network model, and obtains the user's physical and mental health indexes without additional external equipment.
Referring to fig. 3, fig. 3 is a schematic flow chart of a sub-process in an emotion recognition method based on a neural network model according to an embodiment of the present application. As shown in fig. 3, in this embodiment, the step of acquiring fingertip images corresponding to a plurality of finger tips includes:
S301, in response to an instruction to collect a fingertip video, turning on the flash of the mobile terminal camera;
S302, prompting the user to cover the camera with a fingertip;
S303, judging whether the camera is covered by the fingertip;
S304, if the camera is covered by the fingertip, recording the fingertip through the camera to obtain a fingertip video; if the camera is not covered by the fingertip, returning to the step of prompting the user to cover the camera with a fingertip;
S305, extracting frames from the fingertip video to obtain fingertip images.
Specifically, when emotion recognition and emotion management are implemented through a smart mobile terminal, fingertip detection can be performed through the terminal's camera and the emotional state then evaluated. When the photoplethysmographic pulse wave (PPG) signal needs to be collected through the camera, for example when the user opens the emotion management application to perform emotion recognition, the mobile terminal responds to the user's operation of the preset function with an instruction to collect a fingertip video. It then turns on the camera flash and, in coordination with the flash, prompts the user to cover the camera with a fingertip so that the user does so. Whether the camera is covered by a fingertip is judged through image recognition, for example by checking whether the image features extracted from the occluding object match preset fingertip image features. If they match, the camera is judged to be covered by the fingertip and video of the fingertip is recorded; if not, the step of prompting the user to cover the camera with a fingertip is executed again. Because the blood flow at the fingertip fluctuates periodically with the change in blood vessel volume, the pixel values of the recorded video also change periodically, so the frames contained in the recorded fingertip video can reflect the user's heart rate, and through the heart rate the user's emotional state.
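The occlusion check described above is feature matching; a much simpler stand-in, shown purely as an illustrative sketch, exploits the fact that a flash-lit covered lens yields a strongly red-dominant frame. The 0.8 ratio is an assumed placeholder, not a value from the patent:

```python
def fingertip_covers_camera(frame_pixels, red_ratio=0.8):
    """Heuristic check that the lens is blocked by a flash-lit fingertip.

    `frame_pixels` is an iterable of (r, g, b) tuples. With the flash on
    and a finger pressed over the lens, light transmitted through tissue
    makes nearly every pixel red-dominant.
    """
    pixels = list(frame_pixels)
    reddish = sum(1 for r, g, b in pixels if r > g and r > b)
    return len(pixels) > 0 and reddish / len(pixels) >= red_ratio
```

If the check fails, the app would loop back to the prompting step (S302) rather than start recording.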
In one embodiment, the step of converting the format corresponding to the fingertip image to obtain the PPG signal corresponding to the fingertip image includes:
RGB classification is carried out on each frame of fingertip image according to a preset separation method so as to obtain RGB three channels corresponding to each frame of fingertip image;
separating a red channel from the RGB three channels;
calculating the red channel to extract a PPG signal corresponding to the red channel;
And combining the PPG signals corresponding to all the red channels to obtain the PPG signals corresponding to the fingertip image.
Specifically, RGB classification is performed on each frame of the fingertip image according to a preset separation method. The preset separation method includes using a Halcon program to interpret RGB images and gray values with Halcon's own image operators, or separating the three channels of the color image with OpenCV or Matlab, to obtain the RGB three channels corresponding to each frame. The pixel data related to heart rate, such as the red channel or green channel, is then separated from the three channels; the PPG signal corresponding to each red channel is extracted, and the PPG signals corresponding to all the red channels are combined to obtain the PPG signal corresponding to the fingertip image. This realizes the format conversion of each video frame into a PPG signal, from which a preset neural network model, such as a neural network RNN algorithm, then extracts the heart rate and heart rate variability (HRV) features.
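One common, simple way to realize "one PPG sample per frame" is to average the red channel of each frame and concatenate the results; the sketch below assumes frames already in R, G, B order and uses invented pixel values:

```python
import numpy as np

def frames_to_ppg(frames_rgb):
    """Collapse each fingertip frame to one PPG sample.

    frames_rgb: sequence of (H, W, 3) arrays in R, G, B channel order.
    The mean intensity of the red channel per frame serves as that
    frame's PPG sample; concatenating over frames yields the PPG
    signal for the whole clip.
    """
    return np.array([frame[..., 0].mean() for frame in frames_rgb])

# Two dummy frames: the pulse shows up as a red-channel brightness change
f1 = np.full((4, 4, 3), 180, dtype=np.uint8)
f2 = np.full((4, 4, 3), 180, dtype=np.uint8)
f2[..., 0] = 200                 # brighter red plane in the second frame
ppg = frames_to_ppg([f1, f2])
```

Spatial averaging also suppresses per-pixel sensor noise, which is why a single scalar per frame is usually enough for heart-rate work.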
In one embodiment, after the step of obtaining the emotional state index corresponding to the PPG signal, the method further includes:
Acquiring all the emotion state indexes in a preset time period;
Calculating the average value of the emotional state indexes corresponding to all the emotional state indexes;
the step of obtaining the emotional state corresponding to the emotional state index according to the emotional state index and the preset matching relation between the emotional state index and the emotional state comprises the following steps:
and acquiring the emotional state corresponding to the average value of the emotional state indexes according to the average value of the emotional state indexes and a preset matching relation between the emotional state indexes and the emotional state.
Specifically, the emotional state results detected from each fingertip measurement, such as the vitality index, stress index and fatigue index, are recorded; the person's health trend is analyzed according to these emotional state indexes, the values detected each day are recorded as a curve, and the trend of emotional change is monitored. All emotional state indexes within a preset time period can be obtained and their average calculated, and the emotional state corresponding to the average is acquired according to the average and the preset matching relation between emotional state indexes and emotional states. For example, the average of the emotional state results over 7 days can be calculated and used as a Baseline (reference index), giving an intuitive, comparative view of the current emotional state. For users who keep using emotion detection for 7 days, the 7-day average is calculated and a health assessment and advice are provided; the scoring systems for the vitality, stress and fatigue indexes are as shown in Tables 1 to 3 above. Measuring the emotional state within a preset time period by the average of the emotional state indexes yields a more accurate picture of the emotional state over a period of time and measures its stability. By monitoring the user's emotions over a preset time period, the health trend of long-term users can be analyzed and changes in emotional state displayed digitally, so that mental health risks are identified, the user's mental health is quantitatively assessed, and the risk of psychological illness is recognized in time.
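The 7-day Baseline computation reduces to a windowed mean; the daily values below are invented for illustration:

```python
def baseline(index_history, window_days=7):
    """Average the most recent `window_days` daily index values as a
    reference (Baseline) against which today's value is compared."""
    recent = index_history[-window_days:]
    return sum(recent) / len(recent)

history = [62, 60, 65, 63, 61, 64, 66, 58]   # hypothetical daily vitality indexes
ref = baseline(history)          # mean of the last 7 readings
delta = history[-1] - ref        # negative -> below one's own baseline
```

Comparing against one's own baseline rather than a population norm is what makes the assessment personal: the same absolute index can be normal for one user and a warning sign for another.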
In one embodiment, after the step of obtaining the emotional state corresponding to the average value of the emotional state indexes, the method further includes:
drawing all the emotional state indexes into an emotional trend schematic diagram;
and outputting the emotional state corresponding to the emotional trend schematic diagram and the average value of the emotional state indexes to display the emotional state.
Specifically, all the emotional state indexes within the preset time period are drawn into an emotion trend diagram, and the emotion trend diagram and the emotional state corresponding to the average value of the emotional state indexes are output for emotional state display, so that the user can intuitively understand his or her own emotional state, improving the efficiency of emotion management.
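The trend-diagram step above can be sketched without any plotting library. The text chart below is purely illustrative: the 0-100 index scale and the one-character-per-5-points bar scaling are assumptions, not part of the specification.

```python
def ascii_trend(indexes):
    """Render daily emotional-state indexes as a simple text trend chart,
    one row per day, with bar length proportional to the index (0-100)."""
    lines = []
    for day, value in enumerate(indexes, start=1):
        bar = "#" * round(value / 5)  # 1 character per 5 index points
        lines.append(f"day {day:2d} | {bar} {value}")
    return "\n".join(lines)

# Seven days of readings drawn as a trend chart
chart = ascii_trend([72, 65, 70, 68, 75, 71, 69])
```

A real implementation on a mobile terminal would of course draw a curve in the app's UI; the point is only that the stored daily indexes are the single data source for both the diagram and the average.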
Specifically, after the step of obtaining the emotional state corresponding to the emotional state index, the method further includes:
judging whether the emotion state index is smaller than or equal to a preset emotion state index threshold value;
outputting a preset emotion guiding prompt if the emotion state index is smaller than or equal to the preset emotion state index threshold;
judging whether an instruction corresponding to the operation of agreeing to carry out emotion guiding by the user is received or not;
And if an instruction corresponding to the operation of agreeing to carry out emotion guiding by the user is received, guiding the user to carry out emotion adjustment according to a preset emotion guiding mode.
Specifically, if the user's emotional state is poor, the user may be guided through a preset emotion guiding manner to improve it. For example, it is determined whether the emotional state index is less than or equal to a preset emotional state index threshold; if so, the user's emotional state is determined to be poor and a preset emotion guiding prompt is output to remind the user that the emotional state needs improvement, since a prolonged poor emotional state may affect physical and mental health. After the user is prompted, it is determined whether an instruction corresponding to the user's operation of agreeing to emotion guiding is received; if such an instruction is received, the user is guided to perform emotion adjustment according to the preset emotion guiding manner. In this way, the embodiment of the application realizes quantitative emotion monitoring of the user on a smart mobile device, so that mental health risks are recognized, and when the user's emotion is in a poor state, a short-duration emotion adjustment service is provided, reducing the user's risk of mental illness and realizing an auxiliary mental health adjustment function.
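The threshold-and-consent flow just described can be sketched as follows. The threshold value of 60 and the action strings are hypothetical; the specification only requires that guiding happens when the index is at or below a preset threshold and the user agrees.

```python
THRESHOLD = 60  # hypothetical preset emotional-state index threshold

def check_and_guide(index, user_agrees):
    """Return the actions taken: output a guiding prompt when the index is
    at or below the threshold, then start guided emotion adjustment only
    if the user consents (user_agrees is a callable returning a bool)."""
    actions = []
    if index <= THRESHOLD:
        actions.append("output emotion guiding prompt")
        if user_agrees():
            actions.append("guide emotion adjustment")
    return actions

# A poor reading with a consenting user triggers both steps
actions = check_and_guide(55, user_agrees=lambda: True)
```

Note that consent is checked only after the prompt, mirroring the order of the claimed steps.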
Further, the step of guiding the user to perform emotion adjustment according to the preset emotion guiding mode includes:
Acquiring the emotional state index;
Acquiring preset type healing music corresponding to the emotion state index according to the emotion state index;
Combining the preset type of healing music with preset positive training to guide the user to carry out emotion adjustment.
Specifically, meditation music is composed by AI: through machine learning, works in specific styles of therapeutic music are studied, and a multi-layer sequence model together with a high-dimensional music feature extraction method is used to compose the main melody of the therapeutic music. The music beat and tone are adjusted according to the breathing rhythms of professional breathing methods such as the balanced breathing method and the abdominal breathing method, and white noise such as wind, rain and water-drop sounds is added according to different theme scenes, finally forming the preset types of healing music. A matching relation is formed between the preset types of healing music and the emotional state indexes, i.e., different emotional state indexes correspond to different types of healing music, so that different emotional states receive targeted auxiliary adjustment, improving the effect of adjusting the emotional state.
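The matching relation between emotional state indexes and healing-music types can be sketched as a simple band lookup. The bands and theme names below are invented placeholders, since the patent does not publish its concrete mapping:

```python
# Hypothetical matching relation; the real mapping is a preset of the system.
MUSIC_BY_BAND = {
    "low":    "deep-relaxation theme (rain and water-drop white noise)",
    "medium": "abdominal-breathing theme",
    "high":   "light ambient theme",
}

def healing_music_for(index):
    """Pick the preset type of healing music for an emotional-state index."""
    if index < 40:
        band = "low"
    elif index < 70:
        band = "medium"
    else:
        band = "high"
    return MUSIC_BY_BAND[band]

track = healing_music_for(35)
```

Whatever the concrete bands, the key property is that the lookup is deterministic: the same index always selects the same preset type, which is what "preset matching relation" implies.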
Then, the preset healing music corresponding to the AI-composed meditation music is combined with a preset positive training method, providing users whose health assessment is "needs attention" with an effective emotion adjustment service on the mobile device, thereby combining music therapy with positive training and, given that a mobile device is available anytime and anywhere, providing a short-duration emotion adjustment effect. Because traditional positive training generally requires the user to continue for more than 30 minutes, the short-duration emotion adjustment provided by the embodiment of the application offers flexible guided-voice sessions on the mobile device, for example 8 guided-voice sessions of 5-10 minutes each, so that emotion adjustment services can be recommended on the mobile device for the user's different life scenes. The recommended content covers emotional first aid, mindful breathing, body scanning, pre-sleep preparation, calming and sleep, anxiety relief, accepting anxiety, attention improvement and the like.
The user selects the guided-voice session needed according to his or her own emotional state and, together with relaxing meditation music and breathing rhythm adjustment, can relax effectively, slowly shifting from a "thinking" mode to a "feeling" mode and returning thought and attention to the body, so that emotions are effectively calmed. This provides an effective and convenient emotion adjustment method for people who live long-term in anxious, high-pressure environments. By combining the positive training method with AI-composed meditation music, a set of short-duration emotion adjustment services is provided on the smart mobile device, so the user can use fragmented time to adjust emotions, effectively relieving problems such as anxiety, tension and stress and improving the efficiency of emotion adjustment.
Through the above manner, the embodiment of the application realizes emotion management based on a smart mobile device: by combining the AI health fingertip detection technology, the AI meditation music composition technology and the positive training method, a closed-loop system of emotion detection, analysis, quantitative evaluation and emotion adjustment is constructed, providing effective emotion management for users.
It should be noted that, according to the emotion recognition method based on the neural network model in the foregoing embodiments, the technical features included in different embodiments may be recombined according to needs to obtain a combined embodiment, which is within the scope of protection claimed by the present application.
Referring to fig. 4, fig. 4 is a schematic block diagram of an emotion recognition device based on a neural network model according to an embodiment of the present application. Corresponding to the emotion recognition method based on the neural network model, the embodiment of the application also provides an emotion recognition device based on the neural network model. As shown in fig. 4, the emotion recognition device based on the neural network model includes a unit for performing the above emotion recognition method based on the neural network model, and the emotion recognition device based on the neural network model may be configured in a computer device such as a mobile terminal. Specifically, referring to fig. 4, the emotion recognition device 400 based on the neural network model includes a first obtaining unit 401, a converting unit 402, an extracting unit 403, a second obtaining unit 404, and a third obtaining unit 405.
The first acquiring unit 401 is configured to acquire fingertip images corresponding to a plurality of finger tips;
a conversion unit 402, configured to convert a format corresponding to the fingertip image, so as to obtain a PPG signal corresponding to the fingertip image;
An extracting unit 403, configured to extract heart rate features and heart rate variability features from the PPG signal through a preset neural network model;
A second obtaining unit 404, configured to obtain an emotional state index corresponding to the PPG signal according to the heart rate characteristic and the heart rate variability characteristic;
a third obtaining unit 405, configured to obtain an emotional state corresponding to the emotional state index according to the emotional state index and a preset matching relationship between the emotional state index and the emotional state.
In one embodiment, the first obtaining unit 401 includes:
The response subunit is used for responding to the instruction of collecting the fingertip video and opening a flash lamp of a camera of the mobile terminal;
the prompting subunit is used for prompting a user to shield the camera through finger tips;
a judging subunit, configured to judge whether the camera is blocked by the fingertip of the finger;
A recording subunit, configured to record, if the camera is blocked by the finger tip, the finger tip through the camera, so as to obtain a fingertip video;
and the extraction subunit is used for extracting the image of the fingertip video to obtain a fingertip image.
In one embodiment, the emotion recognition device 400 based on the neural network model further includes:
a fourth obtaining unit, configured to obtain all the emotional state indexes in a preset time period;
a calculating unit, configured to calculate an average value of all emotional state indexes corresponding to the emotional state indexes;
the third obtaining unit 405 is configured to obtain an emotional state corresponding to the average value of the emotional state indexes according to the average value of the emotional state indexes and a preset matching relationship between the emotional state indexes and the emotional state.
In one embodiment, the emotion recognition device 400 based on the neural network model further includes:
the drawing unit is used for drawing all the emotion state indexes into an emotion trend schematic diagram;
And the display unit is used for outputting the emotional state corresponding to the emotional trend schematic diagram and the average value of the emotional state indexes so as to display the emotional state.
In one embodiment, the emotion recognition device 400 based on the neural network model further includes:
the first judging unit is used for judging whether the emotion state index is smaller than or equal to a preset emotion state index threshold value;
The prompting unit is used for outputting a preset emotion guiding prompt if the emotion state index is smaller than or equal to the preset emotion state index threshold value;
The second judging unit is used for judging whether an instruction corresponding to the operation of agreeing to carry out emotion guiding by the user is received or not;
And the guiding unit is used for guiding the user to carry out emotion adjustment according to a preset emotion guiding mode if receiving an instruction corresponding to the operation of the user agreeing to carry out emotion guiding.
In one embodiment, the guide unit includes:
a first acquisition subunit configured to acquire the emotional state index;
The second acquisition subunit is used for acquiring the preset type healing music corresponding to the emotion state index according to the emotion state index;
And the guiding subunit is used for combining the preset type healing music with preset positive training to guide the user to carry out emotion adjustment.
It should be noted that, as those skilled in the art can clearly understand, the specific implementation process of the emotion recognition device and each unit based on the neural network model can refer to the corresponding description in the foregoing method embodiment, and for convenience and brevity of description, the description is omitted here.
Meanwhile, the above-mentioned dividing and connecting modes of each unit in the emotion recognition device based on the neural network model are only used for illustration, in other embodiments, the emotion recognition device based on the neural network model may be divided into different units according to the needs, and different connecting sequences and modes may be adopted for each unit in the emotion recognition device based on the neural network model, so as to complete all or part of functions of the emotion recognition device based on the neural network model.
The above emotion recognition apparatus based on the neural network model may be implemented in the form of a computer program that can be run on a computer device as shown in fig. 5.
Referring to fig. 5, fig. 5 is a schematic block diagram of a computer device according to an embodiment of the present application. The computer device 500 may be a computer device such as a desktop computer or a server, or may be a component or part of another device.
With reference to FIG. 5, the computer device 500 includes a processor 502, memory, and a network interface 505 connected by a system bus 501, where the memory may include a non-volatile storage medium 503 and an internal memory 504.
The non-volatile storage medium 503 may store an operating system 5031 and a computer program 5032. The computer program 5032, when executed, may cause the processor 502 to perform a neural network model-based emotion recognition method as described above.
The processor 502 is used to provide computing and control capabilities to support the operation of the overall computer device 500.
The internal memory 504 provides an environment for the execution of a computer program 5032 in the non-volatile storage medium 503, which computer program 5032, when executed by the processor 502, causes the processor 502 to perform a neural network model-based emotion recognition method as described above.
The network interface 505 is used for network communication with other devices. It will be appreciated by those skilled in the art that the architecture shown in fig. 5 is merely a block diagram of part of the architecture relevant to the solution of the present application and does not limit the computer device 500 to which the solution is applied; a particular computer device 500 may include more or fewer components than shown, combine certain components, or have a different arrangement of components. For example, in some embodiments the computer device may include only a memory and a processor; in such embodiments the structure and function of the memory and processor are consistent with the embodiment shown in fig. 5 and are not described again.
Wherein the processor 502 is configured to execute a computer program 5032 stored in a memory to implement the steps of: acquiring fingertip images corresponding to a plurality of finger tips; converting a format corresponding to the fingertip image to obtain a PPG signal corresponding to the fingertip image; extracting heart rate characteristics and heart rate variability characteristics from the PPG signal through a preset neural network model; acquiring an emotional state index corresponding to the PPG signal according to the heart rate characteristics and the heart rate variability characteristics; and acquiring the emotion state corresponding to the emotion state index according to the emotion state index and a preset matching relation between the emotion state index and the emotion state.
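A common way to obtain a raw PPG signal from fingertip video, consistent with the format-conversion step above, is to average the red channel of each frame, since blood-volume changes under the flash modulate the red brightness. The sketch below assumes frames arrive as nested lists of (R, G, B) pixels; a real implementation would read camera frames through the platform's video API.

```python
def frames_to_ppg(frames):
    """Convert fingertip video frames to a raw PPG signal by averaging the
    red channel of each frame. `frames` is a list of HxW grids of
    (R, G, B) pixel tuples."""
    signal = []
    for frame in frames:
        reds = [pixel[0] for row in frame for pixel in row]
        signal.append(sum(reds) / len(reds))
    return signal

# Two tiny 2x2 synthetic frames: the second is slightly darker,
# as it would be at a different point in the pulse cycle.
frames = [
    [[(200, 10, 10), (202, 10, 10)], [(198, 10, 10), (200, 10, 10)]],
    [[(190, 10, 10), (192, 10, 10)], [(188, 10, 10), (190, 10, 10)]],
]
ppg = frames_to_ppg(frames)
```

Heart rate and heart rate variability features would then be extracted from this per-frame time series by the preset neural network model.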
In an embodiment, when the step of acquiring the fingertip images corresponding to the plurality of finger tips is implemented by the processor 502, the following steps are specifically implemented:
Responding to an instruction for acquiring a fingertip video, and opening a flash lamp of a camera of the mobile terminal;
prompting a user to shield the camera through a finger tip;
Judging whether the camera is shielded by the finger tip of the finger or not;
If the camera is shielded by the finger tip, recording the finger tip through the camera to obtain a fingertip video;
and extracting the fingertip video to obtain a fingertip image.
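The judgment of whether the camera is blocked by the fingertip, used in the steps above, can be approximated by a red-dominance heuristic: with the flash on, a fingertip over the lens produces strongly red frames. The 0.6 ratio below is an assumed tuning value, not a figure from the specification.

```python
def fingertip_covers_camera(frame, red_ratio=0.6):
    """Heuristic occlusion check: return True when the red channel accounts
    for at least `red_ratio` of the frame's total intensity, as happens
    when a flash-lit fingertip covers the lens."""
    totals = [0, 0, 0]
    for row in frame:
        for r, g, b in row:
            totals[0] += r
            totals[1] += g
            totals[2] += b
    overall = sum(totals) or 1  # avoid division by zero on a black frame
    return totals[0] / overall >= red_ratio

# A reddish flash-lit frame passes the check
covered = fingertip_covers_camera([[(210, 30, 25), (205, 28, 24)]])
```

Recording would begin only once this check holds for the incoming frames, after which the fingertip video is captured and frames are extracted as fingertip images.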
In an embodiment, after implementing the step of obtaining the emotional state index corresponding to the PPG signal, the processor 502 further implements the following steps:
Acquiring all the emotion state indexes in a preset time period;
Calculating the average value of the emotional state indexes corresponding to all the emotional state indexes;
the step of obtaining the emotional state corresponding to the emotional state index according to the emotional state index and the preset matching relation between the emotional state index and the emotional state comprises the following steps:
and acquiring the emotional state corresponding to the average value of the emotional state indexes according to the average value of the emotional state indexes and a preset matching relation between the emotional state indexes and the emotional state.
In an embodiment, after implementing the step of obtaining the emotional state corresponding to the average value of the emotional state indexes, the processor 502 further implements the following steps:
drawing all the emotional state indexes into an emotional trend schematic diagram;
and outputting the emotional state corresponding to the emotional trend schematic diagram and the average value of the emotional state indexes to display the emotional state.
In an embodiment, after implementing the step of acquiring the emotional state corresponding to the emotional state index, the processor 502 further implements the following steps:
judging whether the emotion state index is smaller than or equal to a preset emotion state index threshold value;
outputting a preset emotion guiding prompt if the emotion state index is smaller than or equal to the preset emotion state index threshold;
judging whether an instruction corresponding to the operation of agreeing to carry out emotion guiding by the user is received or not;
And if an instruction corresponding to the operation of agreeing to carry out emotion guiding by the user is received, guiding the user to carry out emotion adjustment according to a preset emotion guiding mode.
In an embodiment, when the step of guiding the user to perform emotion adjustment according to the preset emotion guiding manner is implemented by the processor 502, the following steps are specifically implemented:
Acquiring the emotional state index;
Acquiring preset type healing music corresponding to the emotion state index according to the emotion state index;
Combining the preset type of healing music with preset positive training to guide the user to carry out emotion adjustment.
It should be appreciated that, in embodiments of the present application, the processor 502 may be a central processing unit (Central Processing Unit, CPU); the processor 502 may also be another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor, etc.
It will be appreciated by those skilled in the art that all or part of the flow of the method of the above embodiments may be implemented by a computer program, which may be stored on a computer readable storage medium. The computer program is executed by at least one processor in the computer system to implement the flow steps of the embodiments of the method described above.
Accordingly, the present application also provides a computer-readable storage medium. The computer-readable storage medium may be a non-volatile computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of the neural network model-based emotion recognition method described in the above embodiments.
The computer readable storage medium may be an internal storage unit of the aforementioned device, such as a hard disk or a memory of the device. The computer readable storage medium may also be an external storage device of the device, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card or a flash card (Flash Card) provided on the device. Further, the computer readable storage medium may also include both an internal storage unit and an external storage device of the device.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the apparatus, device and unit described above may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
The storage medium is a physical, non-transitory storage medium, and may be, for example, a U-disk, a removable hard disk, a Read-only memory (ROM), a magnetic disk, or an optical disk.
Those of ordinary skill in the art will appreciate that the elements and algorithm steps described in connection with the embodiments disclosed herein may be embodied in electronic hardware, in computer software, or in a combination of the two, and that the elements and steps of the examples have been generally described in terms of function in the foregoing description to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the device embodiments described above are merely illustrative. For example, the division of each unit is only one logic function division, and there may be another division manner in actual implementation. For example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed.
The steps in the method of the embodiment of the application can be sequentially adjusted, combined and deleted according to actual needs. The units in the device of the embodiment of the application can be combined, divided and deleted according to actual needs. In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The integrated unit may be stored in a storage medium if implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present application is essentially or a part contributing to the prior art, or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing an electronic device (which may be a personal computer, a terminal, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application.
While the application has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and equivalent substitutions may be made without departing from the scope of the application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (9)

1. A method for emotion recognition based on a neural network model, the method comprising:
Acquiring fingertip images corresponding to a plurality of finger tips;
Converting a format corresponding to the fingertip image to obtain a PPG signal corresponding to the fingertip image;
Extracting heart rate characteristics and heart rate variability characteristics from the PPG signal through a preset neural network model;
Acquiring an emotional state index corresponding to the PPG signal according to the heart rate characteristics and the heart rate variability characteristics;
Acquiring an emotional state corresponding to the emotional state index according to the emotional state index and a preset matching relation between the emotional state index and the emotional state;
the preset neural network model at least comprises a cyclic neural network;
After the step of obtaining the emotional state corresponding to the emotional state index, the method further includes:
judging whether the emotion state index is smaller than or equal to a preset emotion state index threshold value;
outputting a preset emotion guiding prompt if the emotion state index is smaller than or equal to the preset emotion state index threshold;
judging whether an instruction corresponding to the operation of agreeing to carry out emotion guiding by the user is received or not;
If an instruction corresponding to the operation of agreeing to carry out emotion guiding by the user is received, guiding the user to carry out emotion adjustment according to a preset emotion guiding mode;
Wherein the preset emotion guiding prompt is used for prompting a user to improve an emotional state.
2. The emotion recognition method based on the neural network model according to claim 1, wherein the step of acquiring fingertip images corresponding to a plurality of finger fingertips comprises:
Responding to an instruction for acquiring a fingertip video, and opening a flash lamp of a camera of the mobile terminal;
prompting a user to shield the camera through a finger tip;
Judging whether the camera is shielded by the finger tip of the finger or not;
If the camera is shielded by the finger tip, recording the finger tip through the camera to obtain a fingertip video;
and extracting the fingertip video to obtain a fingertip image.
3. The emotion recognition method based on the neural network model according to claim 1, wherein after the step of obtaining the emotion state index corresponding to the PPG signal, further comprises:
Acquiring all the emotion state indexes in a preset time period;
Calculating the average value of the emotional state indexes corresponding to all the emotional state indexes;
the step of obtaining the emotional state corresponding to the emotional state index according to the emotional state index and the preset matching relation between the emotional state index and the emotional state comprises the following steps:
and acquiring the emotional state corresponding to the average value of the emotional state indexes according to the average value of the emotional state indexes and a preset matching relation between the emotional state indexes and the emotional state.
4. The emotion recognition method based on neural network model of claim 3, further comprising, after the step of obtaining the emotional state corresponding to the average value of the emotional state indexes:
drawing all the emotional state indexes into an emotional trend schematic diagram;
and outputting the emotional state corresponding to the emotional trend schematic diagram and the average value of the emotional state indexes to display the emotional state.
5. The emotion recognition method based on a neural network model according to claim 1, wherein the guiding the user to perform emotion adjustment according to a preset emotion guiding manner comprises:
Acquiring the emotional state index;
Acquiring preset type healing music corresponding to the emotion state index according to the emotion state index;
Combining the preset type of healing music with preset positive training to guide the user to carry out emotion adjustment.
6. An emotion recognition device based on a neural network model, comprising:
The first acquisition unit is used for acquiring fingertip images corresponding to a plurality of finger fingertips;
The conversion unit is used for converting the format corresponding to the fingertip image so as to obtain a PPG signal corresponding to the fingertip image;
The extraction unit is used for extracting heart rate characteristics and heart rate variability characteristics from the PPG signals through a preset neural network model;
The second acquisition unit is used for acquiring an emotional state index corresponding to the PPG signal according to the heart rate characteristics and the heart rate variability characteristics;
a third obtaining unit, configured to obtain an emotional state corresponding to the emotional state index according to the emotional state index and a preset matching relationship between the emotional state index and the emotional state;
the preset neural network model at least comprises a cyclic neural network;
the emotion recognition device based on the neural network model further comprises:
the first judging unit is used for judging whether the emotion state index is smaller than or equal to a preset emotion state index threshold value;
The prompting unit is used for outputting a preset emotion guiding prompt if the emotion state index is smaller than or equal to the preset emotion state index threshold value;
The second judging unit is used for judging whether an instruction corresponding to the operation of agreeing to carry out emotion guiding by the user is received or not;
The guiding unit is used for guiding the user to carry out emotion adjustment according to a preset emotion guiding mode if receiving an instruction corresponding to the operation of the user agreeing to carry out emotion guiding;
Wherein the preset emotion guiding prompt is used for prompting a user to improve an emotional state.
7. The emotion recognition device based on a neural network model of claim 6, wherein the first acquisition unit includes:
The response subunit is used for responding to the instruction of collecting the fingertip video and opening a flash lamp of a camera of the mobile terminal;
the prompting subunit is used for prompting a user to shield the camera through finger tips;
a judging subunit, configured to judge whether the camera is blocked by the fingertip of the finger;
A recording subunit, configured to record, if the camera is blocked by the finger tip, the finger tip through the camera, so as to obtain a fingertip video;
and the extraction subunit is used for extracting the image of the fingertip video to obtain a fingertip image.
8. A computer device comprising a memory and a processor coupled to the memory; the memory is used for storing a computer program; the processor being adapted to run the computer program to perform the steps of the method according to any of claims 1-5.
9. A computer-readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the steps of the method according to any of claims 1-5.
CN202010752926.1A 2020-07-30 2020-07-30 Emotion recognition method, emotion recognition device, computer equipment and computer readable storage medium Active CN111797817B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010752926.1A CN111797817B (en) 2020-07-30 2020-07-30 Emotion recognition method, emotion recognition device, computer equipment and computer readable storage medium
PCT/CN2020/122418 WO2021139310A1 (en) 2020-07-30 2020-10-21 Emotion recognition method, apparatus, computer device, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010752926.1A CN111797817B (en) 2020-07-30 2020-07-30 Emotion recognition method, emotion recognition device, computer equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN111797817A CN111797817A (en) 2020-10-20
CN111797817B true CN111797817B (en) 2024-04-19

Family

ID=72828073

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010752926.1A Active CN111797817B (en) 2020-07-30 2020-07-30 Emotion recognition method, emotion recognition device, computer equipment and computer readable storage medium

Country Status (2)

Country Link
CN (1) CN111797817B (en)
WO (1) WO2021139310A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111797817B (en) * 2020-07-30 2024-04-19 平安科技(深圳)有限公司 Emotion recognition method, emotion recognition device, computer equipment and computer readable storage medium
CN112370057A (en) * 2020-11-09 2021-02-19 平安科技(深圳)有限公司 Pressure evaluation method and device, computer equipment and storage medium
CN112364329A (en) * 2020-12-09 2021-02-12 山西三友和智慧信息技术股份有限公司 Face authentication system and method combining heart rate detection
CN112716469B (en) * 2020-12-29 2022-07-19 厦门大学 Real-time heart rate extraction method and device based on fingertip video
CN113842145B (en) * 2021-10-11 2023-10-03 北京工业大学 Method, device and system for calculating emotion index based on pupil wave
CN114241719B (en) * 2021-12-03 2023-10-31 广州宏途数字科技有限公司 Visual fatigue state monitoring method, device and storage medium in student learning
CN115886815A (en) * 2022-11-10 2023-04-04 研祥智慧物联科技有限公司 Emotional pressure monitoring method and device and intelligent wearable device
CN117731288B (en) * 2024-01-18 2024-09-06 深圳谨启科技有限公司 AI psychological consultation method and system

Citations (3)

Publication number Priority date Publication date Assignee Title
CN109793509A (en) * 2019-03-15 2019-05-24 北京科技大学 A kind of nuclear radiation detection and method for measuring heart rate and device
CN109846496A (en) * 2017-11-30 2019-06-07 昆山光微电子有限公司 The hardware implementation method and combination of intelligent wearable device mood sensing function
CN109993068A (en) * 2019-03-11 2019-07-09 华南理工大学 A kind of contactless human emotion's recognition methods based on heart rate and facial characteristics

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
US10517521B2 (en) * 2010-06-07 2019-12-31 Affectiva, Inc. Mental state mood analysis using heart rate collection based on video imagery
US11334066B2 (en) * 2013-03-27 2022-05-17 Pixart Imaging Inc. Safety monitoring apparatus and method thereof for human-driven vehicle
CN111797817B (en) * 2020-07-30 2024-04-19 平安科技(深圳)有限公司 Emotion recognition method, emotion recognition device, computer equipment and computer readable storage medium

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN109846496A (en) * 2017-11-30 2019-06-07 昆山光微电子有限公司 The hardware implementation method and combination of intelligent wearable device mood sensing function
CN109993068A (en) * 2019-03-11 2019-07-09 华南理工大学 A kind of contactless human emotion's recognition methods based on heart rate and facial characteristics
CN109793509A (en) * 2019-03-15 2019-05-24 北京科技大学 A kind of nuclear radiation detection and method for measuring heart rate and device

Also Published As

Publication number Publication date
CN111797817A (en) 2020-10-20
WO2021139310A1 (en) 2021-07-15

Similar Documents

Publication Publication Date Title
CN111797817B (en) Emotion recognition method, emotion recognition device, computer equipment and computer readable storage medium
Gaurav et al. Cuff-less PPG based continuous blood pressure monitoring—A smartphone based approach
CN107920763B (en) Processing biological data
CN106580301B (en) A kind of monitoring method of physiological parameter, device and handheld device
US10004410B2 (en) System and methods for measuring physiological parameters
US10448900B2 (en) Method and apparatus for physiological monitoring
Lovisotto et al. Seeing red: PPG biometrics using smartphone cameras
CN107874750B (en) Pulse rate variability and sleep quality fused psychological pressure monitoring method and device
US11304662B2 (en) Lung-sound signal processing method, processing device, and readable storage medium
KR20220013559A (en) System for monitoring physiological parameters
CN114781465A (en) rPPG-based non-contact fatigue detection system and method
EP2874539A1 (en) A method and system for determining the state of a person
CN112294272A (en) Monitor and irregular pulse rate identification method thereof
WO2021164350A1 (en) Method and device for generating photoplethysmography signal
CN114027842B (en) Objective screening system, method and device for depression
US20240215841A1 (en) Method and apparatus for hypertension classification
CN109036552A (en) Tcm diagnosis terminal and its storage medium
CN115054209A (en) Multi-parameter physiological information detection system and method based on intelligent mobile device
CN112294271A (en) Monitor and irregular pulse rate identification method thereof
US20240153304A1 (en) Method, Computer Software, Non-Transitory Storage Medium, Apparatus and System For Performing A Measurement Of A Physiological Parameter Of A Person From A Series Of Images
Park Predict Daily Life Stress based on Heart Rate Variability
Müller et al. Machine Learning-Based Detection of Acute Psychosocial Stress from Digital Biomarkers
Talukdar et al. The Evaluation of Remote Monitoring Technology Across Participants With Different Skin Tones
Talukdar et al. Evaluation of Remote Monitoring Technology across different skin tone participants
Khessro et al. PPG-based Vital Signs Measurement Using SmartPhone Camera

Legal Events

Date Code Title Description
PB01 Publication
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40032035

Country of ref document: HK

SE01 Entry into force of request for substantive examination
GR01 Patent grant