CN111553311A - Micro-expression recognition robot and control method thereof - Google Patents

Micro-expression recognition robot and control method thereof

Info

Publication number
CN111553311A
CN111553311A
Authority
CN
China
Prior art keywords
expression
micro
person
state
tested person
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010399822.7A
Other languages
Chinese (zh)
Inventor
田佳
王彬
方健
李炜
张光娜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jilin Teachers Institute of Engineering and Technology
Original Assignee
Jilin Teachers Institute of Engineering and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jilin Teachers Institute of Engineering and Technology filed Critical Jilin Teachers Institute of Engineering and Technology
Priority to CN202010399822.7A priority Critical patent/CN111553311A/en
Publication of CN111553311A publication Critical patent/CN111553311A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174Facial expression recognition
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/70ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Abstract

The invention discloses a micro-expression recognition robot comprising: an expression collector for collecting and determining expression images of the tested person; a micro-expression recognizer, connected with the expression collector, for recognizing the facial-feature (five sense organ) baseline data of the tested person from the person's real-time stress expression information and expression data; and a micro-expression analyzer, connected with the micro-expression recognizer, for classifying the facial-feature baseline data of the tested person to obtain a micro-expression recognition result. The invention further provides a micro-expression recognition method that collects and determines expression images of the tested person and determines the person's micro-expression state with a BP neural network, thereby simplifying the matching process and enabling rapid micro-expression recognition.

Description

Micro-expression recognition robot and control method thereof
Technical Field
The invention relates to the field of micro expression recognition, in particular to a micro expression recognition robot and a control method thereof.
Background
A micro-expression is a very brief facial expression that cannot be consciously controlled and is revealed when people try to suppress or hide their real feelings. It is an effective cue for lie detection and is mainly applied in fields such as security, judicial expertise, clinical practice, and education.
At present, micro-expressions are recognized by presetting a micro-expression library in which each entry is labeled with an expression category, and then computing a distance value between the micro-expression image to be recognized and each preset micro-expression in the library; when the distance value for a preset micro-expression exceeds a preset threshold, that micro-expression's category is taken as the category of the image to be recognized.
However, in the application stage this method can only handle a single detected target and recognizes it slowly, whereas the education field, and networked-course settings in particular, require recognizing and detecting the expressions of groups of students; the computational workload is therefore large, recognition efficiency is low, and recognition results cannot be fed back in time.
Disclosure of Invention
The invention designs and develops a micro-expression recognition robot in which a micro-expression recognizer is connected with an expression collector and recognizes the facial-feature baseline data of the tested person from the person's real-time stress expression information and expression data, giving a good recognition effect.
Another object of the invention is to provide a micro-expression recognition method that collects and determines expression images of the tested person and determines the person's micro-expression state with a BP neural network, thereby simplifying the matching process and enabling rapid micro-expression recognition.
The technical scheme provided by the invention is as follows:
A micro-expression recognition robot, comprising:
an expression collector for collecting and determining expression images of the tested person;
a micro-expression recognizer, connected with the expression collector, for recognizing the facial-feature baseline data of the tested person from the person's real-time stress expression information and expression data; and
a micro-expression analyzer, connected with the micro-expression recognizer, for classifying the facial-feature baseline data of the tested person to obtain a micro-expression recognition result.
Preferably, the facial-feature baseline data include: eyeball rotation baseline data, eyebrow movement baseline data, and mouth-corner movement baseline data.
Preferably, the eyeball rotation baseline data include: the baseline direction and baseline frequency of eyeball rotation.
Preferably, the mouth-corner movement baseline data include: the rising angle of the mouth corner and the opening size of the mouth corner.
A micro-expression recognition method comprises: collecting the facial-feature baseline data of the tested person and recognizing the person's micro-expression with a BP neural network, specifically through the following steps:
step one, measuring the eyeball rotation baseline data, eyebrow movement baseline data, and mouth-corner movement baseline data of the tested person in each detection period;
step two, determining the input-layer neuron vector $\{x_1, x_2, x_3, x_4, x_5, x_6\}$ of the three-layer BP neural network, where $x_1$ is the baseline direction of eyeball rotation, $x_2$ is the baseline frequency of eyeball rotation, $x_3$ is the rising angle of the mouth corner, $x_4$ is the opening size of the tested person's mouth corner, $x_5$ is the baseline direction of eyebrow movement, and $x_6$ is the baseline frequency of eyebrow movement;
step three, mapping the input-layer vector to a hidden layer, the hidden layer having $m$ neurons;
step four, obtaining the output-layer neuron vector $o = \{o_1, o_2, o_3\}$, where $o_1$ is the eyeball state value, $o_2$ is the mouth-corner state value, and $o_3$ is the eyebrow state value.
Preferably, the method further comprises: normalizing the eyeball state value, mouth-corner state value, and eyebrow state value to obtain the eyeball state coefficient, mouth-corner state coefficient, and eyebrow state coefficient of the tested person:

$$\bar{o}_j = \frac{o_j - o_{j\min}}{o_{j\max} - o_{j\min}}, \quad j = 1, 2, 3$$

where $\bar{o}_j$ is the $j$-th state coefficient of the tested person, the $o_j$ are the output parameters, and $o_{j\max}$ and $o_{j\min}$ are the set maximum and minimum of the corresponding output parameter.
Preferably, the method further comprises obtaining the comprehensive micro-expression evaluation probability of the tested person:

[evaluation formula shown only as an equation image in the original]

where $P$ is the comprehensive micro-expression evaluation probability of the tested person, $P_0$ is the standard value for the comprehensive micro-expression evaluation of the tested person, $e$ is the base of the natural logarithm, and the remaining symbols (also shown only as images) are a correction coefficient, the eyeball state coefficient of the tested person and its threshold, the mouth-corner state coefficient and its threshold, and the eyebrow state coefficient and its maximum threshold.
Preferably, the method further comprises a micro-expression matching recognition process:
when $P \geq 90$, the micro-expression of the tested person is judged to be happiness;
when $85 \leq P < 90$, the micro-expression of the tested person is judged to be surprise;
when $75 \leq P < 85$, the micro-expression of the tested person is judged to be sadness;
when $65 \leq P < 75$, the micro-expression of the tested person is judged to be fear;
when $55 \leq P < 65$, the micro-expression of the tested person is judged to be anger;
when $45 \leq P < 55$, the micro-expression of the tested person is judged to be disgust.
Preferably, the number of neurons in the hidden layer is 5, and the excitation functions of the hidden layer and the output layer are the sigmoid function $f_j(x) = 1/(1 + e^{-x})$.
Advantages of the invention
The invention designs and develops a micro-expression recognition robot in which a micro-expression recognizer is connected with an expression collector and recognizes the facial-feature baseline data of the tested person from the person's real-time stress expression information and expression data, giving a good recognition effect.
Another object of the invention is to provide a micro-expression recognition method that collects and determines expression images of the tested person and determines the person's micro-expression state with a BP neural network, thereby simplifying the matching process and enabling rapid micro-expression recognition.
The micro-expression recognition method provided by the invention can be applied in the medical field to recognize patients' micro-expressions; it can assist in psychological treatment, and it can aid diagnosis by performing micro-expression recognition on patients with impaired verbal communication and expression.
Detailed Description
The present invention is described in further detail below so that those skilled in the art can practice it with reference to the description.
The invention provides a micro-expression recognition robot comprising: an expression collector for collecting and determining expression images of the tested person; a micro-expression recognizer, connected with the expression collector, for recognizing the facial-feature baseline data of the tested person from the person's real-time stress expression information and expression data; and a micro-expression analyzer, connected with the micro-expression recognizer, for classifying the facial-feature baseline data of the tested person to obtain a micro-expression recognition result.
The facial-feature baseline data include eyeball rotation baseline data, eyebrow movement baseline data, and mouth-corner movement baseline data. The eyeball rotation baseline data include the baseline direction and baseline frequency of eyeball rotation. The mouth-corner movement baseline data include the rising angle of the mouth corner and the opening size of the mouth corner.
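As an illustration only, the collected baseline data can be held in a simple record per detection period; a minimal Python sketch, with hypothetical field names chosen to mirror the six quantities listed above:

    from dataclasses import dataclass

    @dataclass
    class FacialBaseline:
        """Facial-feature baseline data for one detection period (field names hypothetical)."""
        eye_direction: float    # baseline direction of eyeball rotation (x1)
        eye_frequency: float    # baseline frequency of eyeball rotation (x2)
        mouth_angle: float      # rising angle of the mouth corner (x3)
        mouth_opening: float    # opening size of the mouth corner (x4)
        brow_direction: float   # baseline direction of eyebrow movement (x5)
        brow_frequency: float   # baseline frequency of eyebrow movement (x6)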
The invention also provides a micro-expression recognition method comprising: collecting the facial-feature baseline data of the tested person and recognizing the person's micro-expression with a BP neural network, specifically through the following steps:
step one, measuring the eyeball rotation baseline data, eyebrow movement baseline data, and mouth-corner movement baseline data of the tested person in each detection period;
step two, determining the input-layer neuron vector $\{x_1, x_2, x_3, x_4, x_5, x_6\}$ of the three-layer BP neural network, where $x_1$ is the baseline direction of eyeball rotation, $x_2$ is the baseline frequency of eyeball rotation, $x_3$ is the rising angle of the mouth corner, $x_4$ is the opening size of the tested person's mouth corner, $x_5$ is the baseline direction of eyebrow movement, and $x_6$ is the baseline frequency of eyebrow movement;
step three, mapping the input-layer vector to a hidden layer, the hidden layer having $m$ neurons;
step four, obtaining the output-layer neuron vector $o = \{o_1, o_2, o_3\}$, where $o_1$ is the eyeball state value, $o_2$ is the mouth-corner state value, and $o_3$ is the eyebrow state value.
Preferably, the method further comprises: normalizing the eyeball state value, mouth-corner state value, and eyebrow state value to obtain the eyeball state coefficient, mouth-corner state coefficient, and eyebrow state coefficient of the tested person:

$$\bar{o}_j = \frac{o_j - o_{j\min}}{o_{j\max} - o_{j\min}}, \quad j = 1, 2, 3$$

where $\bar{o}_j$ is the $j$-th state coefficient of the tested person, the $o_j$ are the output parameters, and $o_{j\max}$ and $o_{j\min}$ are the set maximum and minimum of the corresponding output parameter.
Preferably, the method further comprises obtaining the comprehensive micro-expression evaluation probability of the tested person:

[evaluation formula shown only as an equation image in the original]

where $P$ is the comprehensive micro-expression evaluation probability of the tested person, $P_0$ is the standard value for the comprehensive micro-expression evaluation of the tested person, $e$ is the base of the natural logarithm, and the remaining symbols (also shown only as images) are a correction coefficient, the eyeball state coefficient of the tested person and its threshold, the mouth-corner state coefficient and its threshold, and the eyebrow state coefficient and its maximum threshold.
Preferably, the method further comprises a micro-expression matching recognition process:
when $P \geq 90$, the micro-expression of the tested person is judged to be happiness;
when $85 \leq P < 90$, the micro-expression of the tested person is judged to be surprise;
when $75 \leq P < 85$, the micro-expression of the tested person is judged to be sadness;
when $65 \leq P < 75$, the micro-expression of the tested person is judged to be fear;
when $55 \leq P < 65$, the micro-expression of the tested person is judged to be anger;
when $45 \leq P < 55$, the micro-expression of the tested person is judged to be disgust.
Preferably, the number of neurons in the hidden layer is 5, and the excitation functions of the hidden layer and the output layer are the sigmoid function $f_j(x) = 1/(1 + e^{-x})$.
The micro-expression recognition method, which collects the facial-feature baseline data of the tested person and recognizes the person's micro-expression with a BP neural network, is implemented in detail through the following steps:
step one, establishing a BP neural network model.
In the BP model, the neurons of adjacent layers are fully interconnected, neurons within the same layer are not connected, and the output of each input-layer neuron equals its input, i.e., $o_i = x_i$. The operating characteristics of the neurons of the intermediate hidden layer and the output layer are

$$\mathrm{net}_{pj} = \sum_i \omega_{ji} o_{pi}, \qquad o_{pj} = f_j(\mathrm{net}_{pj})$$

where $p$ denotes the current input sample, $\omega_{ji}$ is the connection weight from neuron $i$ to neuron $j$, $o_{pi}$ is the current input of neuron $j$, and $o_{pj}$ is its output; $f_j$ is a nonlinear, differentiable, non-decreasing function, generally taken as the sigmoid function $f_j(x) = 1/(1 + e^{-x})$.
The BP neural network adopted by the invention has three layers. The first layer is the input layer, with $n$ nodes in total corresponding to the $n$ detection signals of the micro-expression robot; the signal parameters are supplied by a data preprocessing module. The second layer is the hidden layer, with $m$ nodes determined adaptively by the training process of the network. The third layer is the output layer, with $p$ nodes in total, determined by the response actually required from the system.
The mathematical model of the network is:
input vector: $x = (x_1, x_2, \ldots, x_n)^T$;
intermediate-layer vector: $y = (y_1, y_2, \ldots, y_m)^T$;
output vector: $o = (o_1, o_2, \ldots, o_p)^T$.
In the invention, the number of nodes of an input layer is 6, the number of nodes of an output layer is 3, and the number of nodes of a hidden layer is 5.
The six input-layer parameters are: $x_1$, the baseline direction of eyeball rotation; $x_2$, the baseline frequency of eyeball rotation; $x_3$, the rising angle of the mouth corner; $x_4$, the opening size of the tested person's mouth corner; $x_5$, the baseline direction of eyebrow movement; and $x_6$, the baseline frequency of eyebrow movement.
The three output-layer parameters are: $o_1$, the eyeball state value; $o_2$, the mouth-corner state value; and $o_3$, the eyebrow state value.
And step two, training the BP neural network.
After the BP neural network node model is established, the network can be trained. Training samples are obtained from historical empirical data, and initial values are given for the connection weights between input node $i$ and hidden-layer node $j$ and between hidden-layer node $j$ and output-layer node $k$.
(1) Training method
Each subnet is trained separately. During training, a set of training samples is first provided, each consisting of an input sample paired with its ideal output; training ends when all actual outputs of the network agree with its ideal outputs, and otherwise the weights are corrected until they do.
(2) Training algorithm
The BP network is trained with the error back-propagation algorithm, whose steps can be summarized as follows:
the first step is as follows: and selecting a network with a reasonable structure, and setting initial values of all node thresholds and connection weights.
The second step is that: for each input sample, the following calculations are made:
(a) Forward calculation. For unit $j$ of layer $l$,

$$v_j^l(n) = \sum_{i=0}^{n_{l-1}} w_{ji}^l(n)\, y_i^{l-1}(n)$$

where $v_j^l(n)$ is the weighted information sum of unit $j$ of layer $l$ at the $n$-th calculation, $w_{ji}^l(n)$ is the connection weight between unit $j$ of layer $l$ and unit $i$ of the previous layer (layer $l-1$), and $y_i^{l-1}(n)$ is the operating signal sent by unit $i$ of the previous layer (layer $l-1$, with $n_{l-1}$ nodes); when $i = 0$, $y_0^{l-1}(n) = -1$ is set, and $w_{j0}^l(n) = \theta_j^l(n)$ is the threshold of unit $j$ of layer $l$.

If the activation function of unit $j$ is a sigmoid function, then

$$y_j^l(n) = \frac{1}{1 + e^{-v_j^l(n)}} \qquad \text{and} \qquad \varphi_j'\big(v_j^l(n)\big) = y_j^l(n)\big[1 - y_j^l(n)\big].$$

If neuron $j$ belongs to the output layer, then $o_j(n) = y_j^L(n)$ and $e_j(n) = x_j(n) - o_j(n)$, where $x_j(n)$ is the desired output.

(b) Backward error calculation. For an output unit,

$$\delta_j^L(n) = e_j(n)\, o_j(n)\big[1 - o_j(n)\big];$$

for a hidden unit,

$$\delta_j^l(n) = y_j^l(n)\big[1 - y_j^l(n)\big] \sum_k \delta_k^{l+1}(n)\, w_{kj}^{l+1}(n).$$

(c) Weight correction:

$$w_{ji}^l(n+1) = w_{ji}^l(n) + \eta\, \delta_j^l(n)\, y_i^{l-1}(n)$$

where $\eta$ is the learning rate.
Third step: input a new sample (or a new epoch of samples) until the network converges; during training, the presentation order of the samples is randomly reshuffled after each epoch.
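For concreteness, a minimal sketch of one such training update under the rules above, reusing the sigmoid and NumPy setup of the earlier forward-pass sketch (the learning rate and convergence handling are hypothetical simplifications, and thresholds are folded into bias vectors):

    def train_step(x, target, w_hidden, b_hidden, w_out, b_out, eta=0.5):
        """One BP iteration: forward pass, backward error, weight correction (in place)."""
        # (a) Forward calculation
        y = sigmoid(w_hidden @ x + b_hidden)
        o = sigmoid(w_out @ y + b_out)
        e = target - o                                         # e_j(n) = x_j(n) - o_j(n)
        # (b) Backward error calculation
        delta_out = e * o * (1.0 - o)                          # output units
        delta_hidden = y * (1.0 - y) * (w_out.T @ delta_out)   # hidden units
        # (c) Weight correction: w(n+1) = w(n) + eta * delta * incoming signal
        w_out += eta * np.outer(delta_out, y)
        b_out += eta * delta_out
        w_hidden += eta * np.outer(delta_hidden, x)
        b_hidden += eta * delta_hidden
        return float(np.sum(e ** 2))                           # squared error, for convergence checks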
The BP algorithm uses gradient descent to seek the extremum of a nonlinear function and therefore suffers from local minima, slow convergence, and similar problems. A more effective algorithm is the Levenberg-Marquardt optimization algorithm, which shortens the network learning time and effectively suppresses convergence to local minima. Its weight adjustment rule is

$$\Delta\omega = (J^T J + \mu I)^{-1} J^T e$$

where $J$ is the Jacobian matrix of the derivatives of the errors with respect to the weights, $I$ is the identity matrix, $e$ is the error vector, and $\mu$ is an adaptively adjusted scalar that determines whether learning proceeds according to Newton's method or the gradient method.
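A minimal sketch of this update, continuing the NumPy examples above (how $J$ is obtained and the schedule for $\mu$ are left out as assumptions):

    def lm_update(jacobian, error, mu):
        """Levenberg-Marquardt step: delta_w = (J^T J + mu*I)^(-1) J^T e."""
        jtj = jacobian.T @ jacobian
        identity = np.eye(jtj.shape[0])
        # Large mu -> small gradient-descent-like steps; small mu -> Gauss-Newton-like steps.
        return np.linalg.solve(jtj + mu * identity, jacobian.T @ error)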
As designed, the system model is initially only an untrained network; its weights must be learned and adjusted from the data samples obtained during use, so the system is given a self-learning function. With learning samples and their quantity specified, the system can learn on its own and thereby continuously improve network performance.
Step three, normalize the eyeball state value, mouth-corner state value, and eyebrow state value to obtain the eyeball state coefficient, mouth-corner state coefficient, and eyebrow state coefficient of the tested person:

$$\bar{o}_j = \frac{o_j - o_{j\min}}{o_{j\max} - o_{j\min}}, \quad j = 1, 2, 3$$

where $\bar{o}_j$ is the $j$-th state coefficient of the tested person, the $o_j$ are the output parameters, and $o_{j\max}$ and $o_{j\min}$ are the set maximum and minimum of the corresponding output parameter.
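A one-line sketch of this normalization, assuming the standard min-max form implied by the definitions above (the example extrema are hypothetical):

    def state_coefficient(o, o_min, o_max):
        """Normalize an output state value to a coefficient in [0, 1]."""
        return (o - o_min) / (o_max - o_min)

    # e.g., an eyeball state value of 0.62 with set extrema 0.1 and 0.9
    k_eye = state_coefficient(0.62, 0.1, 0.9)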
Step four, obtain the comprehensive micro-expression evaluation probability of the tested person:

[evaluation formula shown only as an equation image in the original]

where $P$ is the comprehensive micro-expression evaluation probability of the tested person, $P_0$ is the standard value for the comprehensive micro-expression evaluation of the tested person, $e$ is the base of the natural logarithm, and the remaining symbols (also shown only as images) are a correction coefficient, the eyeball state coefficient of the tested person and its threshold, the mouth-corner state coefficient and its threshold, and the eyebrow state coefficient and its maximum threshold.
The method also includes a micro-expression matching recognition process:
when $P \geq 90$, the micro-expression of the tested person is judged to be happiness;
when $85 \leq P < 90$, the micro-expression of the tested person is judged to be surprise;
when $75 \leq P < 85$, the micro-expression of the tested person is judged to be sadness;
when $65 \leq P < 75$, the micro-expression of the tested person is judged to be fear;
when $55 \leq P < 65$, the micro-expression of the tested person is judged to be anger;
when $45 \leq P < 55$, the micro-expression of the tested person is judged to be disgust.
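A minimal sketch of this matching step as a lookup over the P bands above (P itself would come from the evaluation formula, which the original gives only as an image):

    def match_expression(p):
        """Map the comprehensive evaluation probability P to a micro-expression label."""
        bands = [(90, "happiness"), (85, "surprise"), (75, "sadness"),
                 (65, "fear"), (55, "anger"), (45, "disgust")]
        for lower, label in bands:
            if p >= lower:
                return label
        return "unrecognized"   # P < 45 is not covered by the stated bands

    print(match_expression(88))   # -> "surprise"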
The invention designs and develops a micro-expression recognition robot in which a micro-expression recognizer is connected with an expression collector and recognizes the facial-feature baseline data of the tested person from the person's real-time stress expression information and expression data, giving a good recognition effect.
The invention also provides a micro-expression recognition method that collects and determines expression images of the tested person and determines the person's micro-expression state with a BP neural network, thereby simplifying the matching process and enabling rapid micro-expression recognition.
The micro-expression recognition method provided by the invention can be applied in the medical field to recognize patients' micro-expressions; it can assist in psychological treatment, and it can aid diagnosis by performing micro-expression recognition on patients with impaired verbal communication and expression.
Although embodiments of the invention have been described above, the invention is not limited to the applications set forth in the description and the embodiments: it is fully applicable in various other fields, and additional modifications will readily occur to those skilled in the art. The invention is therefore not limited to the details given herein or to the embodiments shown and described, so long as there is no departure from the generic concept defined by the claims and their equivalents.

Claims (9)

1. A micro-expression recognition robot, comprising:
an expression collector for collecting and determining expression images of the tested person;
a micro-expression recognizer, connected with the expression collector, for recognizing the facial-feature baseline data of the tested person from the person's real-time stress expression information and expression data; and
a micro-expression analyzer, connected with the micro-expression recognizer, for classifying the facial-feature baseline data of the tested person to obtain a micro-expression recognition result.
2. The micro-expression recognition robot of claim 1, wherein the facial-feature baseline data comprise: eyeball rotation baseline data, eyebrow movement baseline data, and mouth-corner movement baseline data.
3. The micro-expression recognition robot of claim 2, wherein the eyeball rotation baseline data include the baseline direction and baseline frequency of eyeball rotation.
4. The micro-expression recognition robot of claim 2, wherein the mouth-corner movement baseline data comprise: the rising angle of the mouth corner and the opening size of the mouth corner.
5. A micro-expression recognition method, characterized by comprising: collecting the facial-feature baseline data of the tested person and recognizing the person's micro-expression with a BP neural network, specifically comprising the following steps:
step one, measuring the eyeball rotation baseline data, eyebrow movement baseline data, and mouth-corner movement baseline data of the tested person in each detection period;
step two, determining the input-layer neuron vector $\{x_1, x_2, x_3, x_4, x_5, x_6\}$ of the three-layer BP neural network, where $x_1$ is the baseline direction of eyeball rotation, $x_2$ is the baseline frequency of eyeball rotation, $x_3$ is the rising angle of the mouth corner, $x_4$ is the opening size of the tested person's mouth corner, $x_5$ is the baseline direction of eyebrow movement, and $x_6$ is the baseline frequency of eyebrow movement;
step three, mapping the input-layer vector to a hidden layer, the hidden layer having $m$ neurons;
step four, obtaining the output-layer neuron vector $o = \{o_1, o_2, o_3\}$, where $o_1$ is the eyeball state value, $o_2$ is the mouth-corner state value, and $o_3$ is the eyebrow state value.
6. The micro-expression recognition method of claim 5, further comprising: normalizing the eyeball state value, mouth-corner state value, and eyebrow state value to obtain the eyeball state coefficient, mouth-corner state coefficient, and eyebrow state coefficient of the tested person:

$$\bar{o}_j = \frac{o_j - o_{j\min}}{o_{j\max} - o_{j\min}}, \quad j = 1, 2, 3$$

where $\bar{o}_j$ is the $j$-th state coefficient of the tested person, the $o_j$ are the output parameters, and $o_{j\max}$ and $o_{j\min}$ are the set maximum and minimum of the corresponding output parameter.
7. The micro-expression recognition method of claim 6, further comprising obtaining the comprehensive micro-expression evaluation probability of the tested person:

[evaluation formula shown only as an equation image in the original]

where $P$ is the comprehensive micro-expression evaluation probability of the tested person, $P_0$ is the standard value for the comprehensive micro-expression evaluation of the tested person, $e$ is the base of the natural logarithm, and the remaining symbols (also shown only as images) are a correction coefficient, the eyeball state coefficient of the tested person and its threshold, the mouth-corner state coefficient and its threshold, and the eyebrow state coefficient and its maximum threshold.
8. The micro-expression recognition method of claim 7, further comprising a micro-expression matching recognition process, comprising:
when $P \geq 90$, judging the micro-expression of the tested person to be happiness;
when $85 \leq P < 90$, judging the micro-expression of the tested person to be surprise;
when $75 \leq P < 85$, judging the micro-expression of the tested person to be sadness;
when $65 \leq P < 75$, judging the micro-expression of the tested person to be fear;
when $55 \leq P < 65$, judging the micro-expression of the tested person to be anger;
when $45 \leq P < 55$, judging the micro-expression of the tested person to be disgust.
9. The micro-expression recognition method of any one of claims 5 to 8, wherein the number of neurons of the hidden layer is 5, and the excitation functions of the hidden layer and the output layer are the sigmoid function $f_j(x) = 1/(1 + e^{-x})$.
CN202010399822.7A 2020-05-13 2020-05-13 Micro-expression recognition robot and control method thereof Pending CN111553311A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010399822.7A CN111553311A (en) 2020-05-13 2020-05-13 Micro-expression recognition robot and control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010399822.7A CN111553311A (en) 2020-05-13 2020-05-13 Micro-expression recognition robot and control method thereof

Publications (1)

Publication Number Publication Date
CN111553311A true CN111553311A (en) 2020-08-18

Family

ID=72008080

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010399822.7A Pending CN111553311A (en) 2020-05-13 2020-05-13 Micro-expression recognition robot and control method thereof

Country Status (1)

Country Link
CN (1) CN111553311A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112487980A (en) * 2020-11-30 2021-03-12 深圳市广信安科技股份有限公司 Micro-expression-based treatment method, device, system and computer-readable storage medium
CN113160629A (en) * 2021-05-06 2021-07-23 吉林工程技术师范学院 Man-machine cooperation learning education robot with emotion recognition function

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110301440A1 (en) * 2010-06-07 2011-12-08 Riley Carl W Apparatus for supporting and monitoring a person
CN105913046A (en) * 2016-05-06 2016-08-31 姜振宇 Micro-expression identification device and method
CN106599800A (en) * 2016-11-25 2017-04-26 哈尔滨工程大学 Face micro-expression recognition method based on deep learning
CN107273845A (en) * 2017-06-12 2017-10-20 大连海事大学 A kind of facial expression recognizing method based on confidence region and multiple features Weighted Fusion
US20180108440A1 (en) * 2016-10-17 2018-04-19 Jeffrey Stevens Systems and methods for medical diagnosis and biomarker identification using physiological sensors and machine learning
CN109284713A (en) * 2018-09-21 2019-01-29 上海健坤教育科技有限公司 A kind of Emotion identification analysis system based on camera acquisition expression data
CN109359599A (en) * 2018-10-19 2019-02-19 昆山杜克大学 Human facial expression recognition method based on combination learning identity and emotion information
CN109409296A (en) * 2018-10-30 2019-03-01 河北工业大学 The video feeling recognition methods that facial expression recognition and speech emotion recognition are merged
CN109447001A (en) * 2018-10-31 2019-03-08 深圳市安视宝科技有限公司 A kind of dynamic Emotion identification method
CN109472206A (en) * 2018-10-11 2019-03-15 平安科技(深圳)有限公司 Methods of risk assessment, device, equipment and medium based on micro- expression
CN110969073A (en) * 2019-08-23 2020-04-07 贵州大学 Facial expression recognition method based on feature fusion and BP neural network

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110301440A1 (en) * 2010-06-07 2011-12-08 Riley Carl W Apparatus for supporting and monitoring a person
CN105913046A (en) * 2016-05-06 2016-08-31 姜振宇 Micro-expression identification device and method
US20180108440A1 (en) * 2016-10-17 2018-04-19 Jeffrey Stevens Systems and methods for medical diagnosis and biomarker identification using physiological sensors and machine learning
CN106599800A (en) * 2016-11-25 2017-04-26 哈尔滨工程大学 Face micro-expression recognition method based on deep learning
CN107273845A (en) * 2017-06-12 2017-10-20 大连海事大学 A kind of facial expression recognizing method based on confidence region and multiple features Weighted Fusion
CN109284713A (en) * 2018-09-21 2019-01-29 上海健坤教育科技有限公司 A kind of Emotion identification analysis system based on camera acquisition expression data
CN109472206A (en) * 2018-10-11 2019-03-15 平安科技(深圳)有限公司 Methods of risk assessment, device, equipment and medium based on micro- expression
CN109359599A (en) * 2018-10-19 2019-02-19 昆山杜克大学 Human facial expression recognition method based on combination learning identity and emotion information
CN109409296A (en) * 2018-10-30 2019-03-01 河北工业大学 The video feeling recognition methods that facial expression recognition and speech emotion recognition are merged
CN109447001A (en) * 2018-10-31 2019-03-08 深圳市安视宝科技有限公司 A kind of dynamic Emotion identification method
CN110969073A (en) * 2019-08-23 2020-04-07 贵州大学 Facial expression recognition method based on feature fusion and BP neural network

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wang Zhi (王志): "A facial expression recognition method based on a multi-layer BP neural network" *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112487980A (en) * 2020-11-30 2021-03-12 深圳市广信安科技股份有限公司 Micro-expression-based treatment method, device, system and computer-readable storage medium
CN113160629A (en) * 2021-05-06 2021-07-23 吉林工程技术师范学院 Man-machine cooperation learning education robot with emotion recognition function

Similar Documents

Publication Publication Date Title
CN108284442B (en) Mechanical arm flexible joint control method based on fuzzy neural network
CN107392255A (en) Generation method, device, computing device and the storage medium of minority class picture sample
CN107102727A (en) Dynamic gesture study and recognition methods based on ELM neutral nets
CN110188794B (en) Deep learning model training method, device, equipment and storage medium
CN111553311A (en) Micro-expression recognition robot and control method thereof
CN113344479B (en) Online classroom-oriented learning participation intelligent assessment method and device
CN106909938A (en) Viewing angle independence Activity recognition method based on deep learning network
CN110163098A (en) Based on the facial expression recognition model construction of depth of seam division network and recognition methods
CN110786855B (en) Sputum induction device and control method thereof
Hafez et al. Improving robot dual-system motor learning with intrinsically motivated meta-control and latent-space experience imagination
CN110909621A (en) Body-building guidance system based on vision
CN108538301A (en) A kind of intelligent digital musical instrument based on neural network Audiotechnica
CN111027215B (en) Character training system and method for virtual person
CN114504777B (en) Exercise intensity calculation system and method based on neural network and fuzzy comprehensive evaluation
JP3816762B2 (en) Neural network, neural network system, and neural network processing program
CN114952791A (en) Control method and device for musculoskeletal robot
CN114742292A (en) Knowledge tracking process-oriented two-state co-evolution method for predicting future performance of students
CN109635942B (en) Brain excitation state and inhibition state imitation working state neural network circuit structure and method
Wang et al. Bio-inspired computing: A deep learning algorithm with the spike-frequency adaptation
Sawaragi et al. Self-reflective segmentation of human bodily motions using recurrent neural networks
JPH02231670A (en) Learning device for neural network
Wang et al. TCM pulse-condition classification method based on BP neural network
CN113723594B (en) Pulse neural network target identification method
Leng Research on Optimizing Facial Expression Recognition Based on Convolutional Neural Network
JP3085312B2 (en) Character recognition learning method and character recognition device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination