CN111553311A - Micro-expression recognition robot and control method thereof - Google Patents
- Publication number
- CN111553311A (application number CN202010399822.7A)
- Authority
- CN
- China
- Prior art keywords
- expression
- micro
- person
- state
- tested person
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/70—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
Abstract
The invention discloses a micro-expression recognition robot, comprising: an expression collector, used for collecting and determining expression images of the person under test; a micro-expression recognizer, connected with the expression collector and used for recognizing the facial-feature baseline data of the person under test according to the real-time stress expression information and expression data of the person under test; and a micro-expression analyzer, connected with the micro-expression recognizer, which classifies the facial-feature baseline data of the person under test to obtain a micro-expression recognition result. The invention further provides a micro-expression recognition method that collects and determines expression images of the person under test and determines the micro-expression state of the user with a BP neural network, thereby simplifying the matching process and achieving rapid recognition of micro-expressions.
Description
Technical Field
The invention relates to the field of micro expression recognition, in particular to a micro expression recognition robot and a control method thereof.
Background
A micro-expression is a very brief, involuntary facial expression revealed when people try to suppress or hide their real feelings. It is an effective cue for lie detection and is mainly applied in the fields of security, judicial expertise, clinical practice, and education.
At present, micro-expression recognition is typically performed by presetting a micro-expression library labeled with each expression category, then determining a distance value between the micro-expression image to be recognized and each preset micro-expression in the library; when the distance value meets a preset threshold, the expression category of the corresponding preset micro-expression is determined as the category of the micro-expression image to be recognized.
However, in application this method can only handle a single target to be detected, and its recognition rate is low. In the education field, especially in online-course settings, the expressions of groups of students need to be recognized and detected, so the calculation workload is large, the recognition efficiency is low, and the recognition results cannot be fed back in time.
Disclosure of Invention
The invention designs and develops a micro-expression recognition robot in which a micro-expression recognizer is connected with an expression collector and used for recognizing the facial-feature baseline data of the person under test according to the real-time stress expression information and expression data of the person under test, giving a good recognition effect.
The invention also aims to provide a micro expression recognition method, which can collect and determine expression images of a detected person, and determine the micro expression state of a user based on the BP neural network, thereby simplifying the matching process and realizing the rapid recognition of micro expressions.
The technical scheme provided by the invention is as follows:
a micro-expression recognition robot, comprising:
an expression collector, used for collecting and determining expression images of the person under test;
a micro-expression recognizer, connected with the expression collector and used for recognizing the facial-feature baseline data of the person under test according to the real-time stress expression information and expression data of the person under test;
a micro-expression analyzer, connected with the micro-expression recognizer, which classifies the facial-feature baseline data of the person under test to obtain a micro-expression recognition result.
preferably, the five sense organ baseline data includes: eye rotation baseline data, eyebrow movement baseline data, and mouth corner movement baseline data.
Preferably, the baseline data of eye rotation comprises: baseline direction and baseline frequency of eye rotation.
Preferably, the mouth angle activity baseline data comprises: the rising angle of the mouth angle and the opening and closing size of the mouth angle.
A micro-expression recognition method, comprising: collecting the facial-feature baseline data of the person under test and recognizing the micro-expression of the person under test based on a BP neural network, specifically through the following steps:
Step one, measuring the eyeball rotation baseline data, eyebrow movement baseline data, and mouth-corner movement baseline data of the person under test according to a detection period;
Step two, determining the input-layer neuron vector {x1, x2, x3, x4, x5, x6} of the three-layer BP neural network, where x1 is the baseline direction of eyeball rotation, x2 is the baseline frequency of eyeball rotation, x3 is the rising angle of the mouth corner, x4 is the opening and closing size of the mouth corner of the person under test, x5 is the baseline direction of eyebrow movement, and x6 is the baseline frequency of eyebrow movement;
Step three, mapping the input-layer vector to the hidden layer, where the number of hidden-layer neurons is m;
Step four, obtaining the output-layer neuron vector o = {o1, o2, o3}, where o1 is the eyeball state value, o2 is the mouth-corner state value, and o3 is the eyebrow state value.
Preferably, the method further comprises the following steps: normalizing the eyeball state value, the mouth corner state value and the eyebrow state value to obtain an eyeball state coefficient, a mouth corner state coefficient and an eyebrow state coefficient of the detected person;
kj = (oj − oj,min) / (oj,max − oj,min), j = 1, 2, 3
where kj is the j-th state coefficient of the person under test, oj are the output parameters, and oj,max and oj,min are the set maximum and minimum values of the corresponding output parameter.
Preferably, the method further comprises obtaining the comprehensive micro-expression evaluation probability of the person under test:
where P is the comprehensive micro-expression evaluation probability of the person under test; P0 is the comprehensive micro-expression evaluation standard value; e is the base of the natural logarithm; a correction coefficient is applied; and the eyeball state coefficient, mouth-corner state coefficient, and eyebrow state coefficient of the person under test are each compared against their respective set threshold values.
Preferably, the method further comprises a micro-expression matching recognition process, including:
when P ≥ 90, the micro-expression of the person under test is judged to be happy;
when 85 ≤ P < 90, the micro-expression of the person under test is judged to be surprised;
when 75 ≤ P < 85, the micro-expression of the person under test is judged to be sad;
when 65 ≤ P < 75, the micro-expression of the person under test is judged to be fear;
when 55 ≤ P < 65, the micro-expression of the person under test is judged to be angry;
and when 45 ≤ P < 55, the micro-expression of the person under test is judged to be aversion.
Preferably, the number of neurons in the hidden layer is 5, and the excitation functions of the hidden layer and the output layer adopt the sigmoid function f_j(x) = 1/(1 + e^(−x)).
The invention has the following advantages:
The invention designs and develops a micro expression recognition robot, wherein a micro expression recognizer is connected with an expression collector and used for recognizing the base line data of the five sense organs of a detected person according to the real-time stress expression information and the expression data of the detected person, and the recognition effect is good.
The invention also aims to provide a micro expression recognition method, which can collect and determine expression images of a detected person, and determine the micro expression state of a user based on the BP neural network, thereby simplifying the matching process and realizing the rapid recognition process of micro expressions.
The micro-expression recognition method provided by the invention can be applied in the medical field to recognize the micro-expressions of patients; it can assist the psychological treatment process and, by performing micro-expression recognition on patients with language communication or expression disorders, can aid the diagnosis of their condition.
Detailed Description
The present invention is described in further detail below, with reference to the description, to enable those skilled in the art to practice the invention.
The invention provides a micro-expression recognition robot, comprising: an expression collector, used for collecting and determining expression images of the person under test; a micro-expression recognizer, connected with the expression collector and used for recognizing the facial-feature baseline data of the person under test according to the real-time stress expression information and expression data of the person under test; and a micro-expression analyzer, connected with the micro-expression recognizer, which classifies the facial-feature baseline data of the person under test to obtain a micro-expression recognition result.
The facial-feature baseline data comprise: eyeball rotation baseline data, eyebrow movement baseline data, and mouth-corner movement baseline data. The eyeball rotation baseline data include the baseline direction and baseline frequency of eyeball rotation. The mouth-corner movement baseline data include the rising angle of the mouth corner and the opening and closing size of the mouth corner.
The invention also provides a micro-expression recognition method, comprising: collecting the facial-feature baseline data of the person under test and recognizing the micro-expression of the person under test based on a BP neural network, specifically through the following steps:
Step one, measuring the eyeball rotation baseline data, eyebrow movement baseline data, and mouth-corner movement baseline data of the person under test according to a detection period;
Step two, determining the input-layer neuron vector {x1, x2, x3, x4, x5, x6} of the three-layer BP neural network, where x1 is the baseline direction of eyeball rotation, x2 is the baseline frequency of eyeball rotation, x3 is the rising angle of the mouth corner, x4 is the opening and closing size of the mouth corner of the person under test, x5 is the baseline direction of eyebrow movement, and x6 is the baseline frequency of eyebrow movement;
Step three, mapping the input-layer vector to the hidden layer, where the number of hidden-layer neurons is m;
Step four, obtaining the output-layer neuron vector o = {o1, o2, o3}, where o1 is the eyeball state value, o2 is the mouth-corner state value, and o3 is the eyebrow state value.
Preferably, the method further comprises the following steps: normalizing the eyeball state value, the mouth corner state value and the eyebrow state value to obtain an eyeball state coefficient, a mouth corner state coefficient and an eyebrow state coefficient of the detected person;
kj = (oj − oj,min) / (oj,max − oj,min), j = 1, 2, 3
where kj is the j-th state coefficient of the person under test, oj are the output parameters, and oj,max and oj,min are the set maximum and minimum values of the corresponding output parameter.
Preferably, the method further comprises obtaining the comprehensive micro-expression evaluation probability of the person under test:
where P is the comprehensive micro-expression evaluation probability of the person under test; P0 is the comprehensive micro-expression evaluation standard value; e is the base of the natural logarithm; a correction coefficient is applied; and the eyeball state coefficient, mouth-corner state coefficient, and eyebrow state coefficient of the person under test are each compared against their respective set threshold values.
Preferably, the method further comprises a micro-expression matching recognition process, including:
when P ≥ 90, the micro-expression of the person under test is judged to be happy;
when 85 ≤ P < 90, the micro-expression of the person under test is judged to be surprised;
when 75 ≤ P < 85, the micro-expression of the person under test is judged to be sad;
when 65 ≤ P < 75, the micro-expression of the person under test is judged to be fear;
when 55 ≤ P < 65, the micro-expression of the person under test is judged to be angry;
and when 45 ≤ P < 55, the micro-expression of the person under test is judged to be aversion.
Preferably, the number of neurons in the hidden layer is 5, and the excitation functions of the hidden layer and the output layer adopt the sigmoid function f_j(x) = 1/(1 + e^(−x)).
The invention also provides a micro-expression recognition method, comprising: collecting the facial-feature baseline data of the person under test and recognizing the micro-expression of the person under test based on a BP neural network, specifically through the following steps:
step one, establishing a BP neural network model.
In the BP model, the neurons of adjacent layers are fully interconnected, neurons within the same layer are not connected, and the output of each input-layer neuron equals its input, i.e. o_i = x_i. The operating characteristics of the neurons in the hidden and output layers are:
net_pj = Σ_i ω_ji · o_pi,  o_pj = f_j(net_pj)
where p denotes the current input sample, ω_ji is the connection weight from neuron i to neuron j, o_pi is the current input to neuron j, and o_pj is its output; f_j is a nonlinear, differentiable, non-decreasing function, generally taken as the sigmoid function f_j(x) = 1/(1 + e^(−x)).
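The operating characteristic above, a weighted input sum passed through the sigmoid excitation function, can be sketched as follows (a minimal illustration in Python; the weight and input values are arbitrary placeholders, not values from the invention):

```python
import math

def sigmoid(x):
    # S-shaped excitation function f(x) = 1 / (1 + e^-x)
    return 1.0 / (1.0 + math.exp(-x))

def neuron_output(weights, inputs):
    # net_pj = sum_i w_ji * o_pi ;  o_pj = f(net_pj)
    net = sum(w * o for w, o in zip(weights, inputs))
    return sigmoid(net)

# Example: one hidden neuron with three placeholder weights and inputs
out = neuron_output([0.5, -0.2, 0.1], [1.0, 0.5, 0.25])
```

Because the sigmoid is bounded in (0, 1), every neuron output stays in that range regardless of the weighted sum.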
The BP neural network adopted by the invention has three layers. The first layer is the input layer, with n nodes in total, corresponding to the n detection signals of the micro-expression robot; the signal parameters are supplied by a data preprocessing module. The second layer is the hidden layer, with m nodes, determined adaptively by the training process of the network. The third layer is the output layer, with p nodes in total, determined by the response actually required from the system.
The mathematical model of the network is:
Input vector: x = (x1, x2, …, xn)^T
Hidden-layer vector: y = (y1, y2, …, ym)^T
Output vector: o = (o1, o2, …, op)^T
In the invention, the number of nodes of an input layer is 6, the number of nodes of an output layer is 3, and the number of nodes of a hidden layer is 5.
The 6 parameters of the input layer are: x1, the baseline direction of eyeball rotation; x2, the baseline frequency of eyeball rotation; x3, the rising angle of the mouth corner; x4, the opening and closing size of the mouth corner of the person under test; x5, the baseline direction of eyebrow movement; and x6, the baseline frequency of eyebrow movement.
The 3 parameters of the output layer are: o1, the eyeball state value; o2, the mouth-corner state value; and o3, the eyebrow state value.
Step two, training the BP neural network.
After the BP neural network node model is established, training of the BP neural network can be carried out. Training samples are obtained from historical empirical data, and initial values are given for the connection weights between input node i and hidden-layer node j, and between hidden-layer node j and output-layer node k.
(1) Training method
Each subnet adopts a separate training method. During training, a group of training samples is first provided, each sample consisting of an input sample and an ideal output pair. Training is finished when all actual outputs of the network are consistent with its ideal outputs; otherwise, the weights are corrected until the actual outputs are consistent with the ideal outputs.
(2) Training algorithm
The BP network is trained with the error back-propagation algorithm; the steps can be summarized as follows:
the first step is as follows: and selecting a network with a reasonable structure, and setting initial values of all node thresholds and connection weights.
The second step is that: for each input sample, the following calculations are made:
(a) Forward calculation. For unit j of layer l:
net_j^l(n) = Σ_{i=0}^{n_(l−1)} ω_ji^l(n) · y_i^(l−1)(n)
where net_j^l(n) is the weighted input sum of unit j of layer l at the n-th calculation, ω_ji^l(n) is the connection weight between unit j of layer l and unit i of the previous layer (layer l−1, which has n_(l−1) nodes), and y_i^(l−1)(n) is the operating signal sent by unit i of the previous layer; when i = 0, y_0^(l−1)(n) = −1 is taken, so that ω_j0^l(n) is the threshold of unit j of layer l.
If the activation function of unit j is the sigmoid function, then
y_j^l(n) = 1 / (1 + exp(−net_j^l(n)))
If neuron j belongs to the output layer, then y_j^L(n) = o_j(n) and the error is e_j(n) = d_j(n) − o_j(n), where d_j(n) is the ideal output.
(b) Backward error calculation. For an output unit:
δ_j^L(n) = e_j(n) · o_j(n) · (1 − o_j(n))
For a hidden unit:
δ_j^l(n) = y_j^l(n) · (1 − y_j^l(n)) · Σ_k δ_k^(l+1)(n) · ω_kj^(l+1)(n)
(c) Weight correction:
ω_ji^l(n+1) = ω_ji^l(n) + η · δ_j^l(n) · y_i^(l−1)(n)
where η is the learning-rate step size.
Third step: input a new sample (or a sample of a new period) until the network converges; during training, the input order of the samples is re-randomized in each period.
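The three per-sample operations above (forward calculation, backward error calculation, weight correction) can be sketched for the 6-5-3 network as follows. This is an illustrative training step only: the sample, target, and learning rate are arbitrary placeholders, and a real run would iterate over a full training set.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bp_step(W1, W2, x, d, eta=0.5):
    """One back-propagation step on the 6-5-3 network.
    x: input sample, d: ideal (target) output, eta: learning rate.
    Returns the squared error before the weight update."""
    # (a) forward calculation
    y = sigmoid(W1 @ x)          # hidden outputs
    o = sigmoid(W2 @ y)          # actual outputs
    e = d - o                    # error vector
    # (b) backward error calculation: local gradients (delta terms)
    delta_o = e * o * (1 - o)                  # output units
    delta_h = (W2.T @ delta_o) * y * (1 - y)   # hidden units
    # (c) weight correction
    W2 += eta * np.outer(delta_o, y)
    W1 += eta * np.outer(delta_h, x)
    return float(np.sum(e ** 2))

rng = np.random.default_rng(1)
W1 = rng.standard_normal((5, 6)) * 0.1
W2 = rng.standard_normal((3, 5)) * 0.1
x = np.array([0.2, 0.5, 0.1, 0.3, 0.4, 0.6])   # placeholder baseline inputs
d = np.array([0.9, 0.1, 0.5])                  # placeholder ideal outputs
errs = [bp_step(W1, W2, x, d) for _ in range(200)]
```

Repeating the step drives the squared error on the sample down, which is the convergence criterion described in the text.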
The BP algorithm seeks the extremum of a nonlinear function by gradient descent and therefore suffers from local minima and slow convergence. A more effective algorithm is the Levenberg-Marquardt optimization algorithm, which shortens network learning time and effectively suppresses trapping in local minima. Its weight adjustment is
Δω = (J^T·J + μI)^(−1) · J^T · e
where J is the Jacobian matrix of the differentiation of the error with respect to the weights, I is the identity matrix, e is the error vector, and μ is an adaptively adjusted scalar that determines whether learning proceeds according to Newton's method or the gradient method.
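The Levenberg-Marquardt weight adjustment can be illustrated directly from the formula. A toy Jacobian and error vector are used here; in the real network, J would be computed from the current weights and training errors.

```python
import numpy as np

def lm_update(J, e, mu):
    """Levenberg-Marquardt weight increment:
    delta_w = (J^T J + mu*I)^-1 J^T e
    J: Jacobian of the errors w.r.t. the weights, e: error vector,
    mu: adaptively adjusted damping scalar (large mu -> small,
    gradient-like step; small mu -> Gauss-Newton-like step)."""
    n = J.shape[1]
    return np.linalg.solve(J.T @ J + mu * np.eye(n), J.T @ e)

J = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])          # toy Jacobian (3 errors, 2 weights)
e = np.array([0.5, -0.2, 0.1])      # toy error vector
dw = lm_update(J, e, mu=0.01)
```

Note the damping behavior: increasing μ shrinks the step toward a small gradient-descent move, which is how the adaptive scalar switches between Newton-like and gradient-like learning.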
When the system is designed, the system model is merely an initialized network; the weights must be learned and adjusted from data samples obtained during use, so a self-learning function is designed into the system. Given specified learning samples and quantities, the system can self-learn and thereby continuously improve network performance.
Step three, normalizing the eyeball state value, the mouth corner state value and the eyebrow state value to obtain an eyeball state coefficient, a mouth corner state coefficient and an eyebrow state coefficient of the detected person;
kj = (oj − oj,min) / (oj,max − oj,min), j = 1, 2, 3
where kj is the j-th state coefficient of the person under test, oj are the output parameters, and oj,max and oj,min are the set maximum and minimum values of the corresponding output parameter.
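Assuming the normalization is the usual min-max form suggested by the set maximum and minimum values, the state coefficients can be computed as follows (the numeric values are illustrative only):

```python
def state_coefficient(o, o_min, o_max):
    # Min-max normalization of an output state value into [0, 1]:
    # k = (o - o_min) / (o_max - o_min)
    return (o - o_min) / (o_max - o_min)

# Illustrative values for the eyeball state coefficient
k_eye = state_coefficient(0.7, 0.0, 1.0)
```

The same function would be applied to the mouth-corner and eyebrow state values with their own set extremes.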
Step four, obtaining the micro-expression comprehensive evaluation probability of the tested person:
where P is the comprehensive micro-expression evaluation probability of the person under test; P0 is the comprehensive micro-expression evaluation standard value; e is the base of the natural logarithm; a correction coefficient is applied; and the eyeball state coefficient, mouth-corner state coefficient, and eyebrow state coefficient of the person under test are each compared against their respective set threshold values.
also include the micro-expression matching recognition process, including:
when P ≥ 90, the micro-expression of the person under test is judged to be happy;
when 85 ≤ P < 90, the micro-expression of the person under test is judged to be surprised;
when 75 ≤ P < 85, the micro-expression of the person under test is judged to be sad;
when 65 ≤ P < 75, the micro-expression of the person under test is judged to be fear;
when 55 ≤ P < 65, the micro-expression of the person under test is judged to be angry;
and when 45 ≤ P < 55, the micro-expression of the person under test is judged to be aversion.
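The matching rules above map the comprehensive evaluation probability P to an expression label and can be sketched directly (values below 45 fall outside the listed ranges, so a sentinel is returned):

```python
def match_expression(P):
    """Map the comprehensive evaluation probability P to a
    micro-expression label using the thresholds in the text."""
    if P >= 90:
        return "happy"
    if P >= 85:
        return "surprised"
    if P >= 75:
        return "sad"
    if P >= 65:
        return "fear"
    if P >= 55:
        return "angry"
    if P >= 45:
        return "aversion"
    return None  # below all listed ranges

label = match_expression(88)
```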
The invention designs and develops a micro expression recognition robot, wherein a micro expression recognizer is connected with an expression collector and used for recognizing the base line data of the five sense organs of a detected person according to the real-time stress expression information and the expression data of the detected person, and the recognition effect is good.
The invention also aims to provide a micro expression recognition method, which can collect and determine expression images of a detected person, and determine the micro expression state of a user based on the BP neural network, thereby simplifying the matching process and realizing the rapid recognition process of micro expressions.
The micro-expression recognition method provided by the invention can be applied in the medical field to recognize the micro-expressions of patients; it can assist the psychological treatment process and, by performing micro-expression recognition on patients with language communication or expression disorders, can aid the diagnosis of their condition.
While embodiments of the invention have been described above, the invention is not limited to the applications set forth in the description and the embodiments; it is fully applicable to various fields suited to it, and additional modifications will readily occur to those skilled in the art. The invention is therefore not limited to the details given herein or to the embodiments shown and described, provided the general concept defined by the claims and their equivalents is not departed from.
Claims (9)
1. A micro-expression recognition robot, comprising:
the expression collector is used for collecting and determining expression images of the person to be detected;
the micro expression recognizer is connected with the expression collector and is used for recognizing the facial organ baseline data of the detected person according to the real-time stress expression information and the expression data of the detected person;
and the micro-expression analyzer is connected with the micro-expression recognizer and classifies the five sense organ baseline data of the tested person to obtain a micro-expression recognition result.
2. The micro expression recognition robot of claim 1, wherein the facial features baseline data comprises: eye rotation baseline data, eyebrow movement baseline data, and mouth corner movement baseline data.
3. The micro expression recognition robot of claim 2, wherein the baseline data for eye rotation includes a baseline direction and a baseline frequency of eye rotation.
4. The micro-expression recognition robot of claim 2, wherein the mouth-corner movement baseline data comprise: the rising angle of the mouth corner and the opening and closing size of the mouth corner.
5. A micro-expression recognition method, characterized by comprising: collecting the facial-feature baseline data of a person under test and recognizing the micro-expression of the person under test based on a BP neural network, specifically through the following steps:
Step one, measuring the eyeball rotation baseline data, eyebrow movement baseline data, and mouth-corner movement baseline data of the person under test according to a detection period;
Step two, determining the input-layer neuron vector {x1, x2, x3, x4, x5, x6} of the three-layer BP neural network, where x1 is the baseline direction of eyeball rotation, x2 is the baseline frequency of eyeball rotation, x3 is the rising angle of the mouth corner, x4 is the opening and closing size of the mouth corner of the person under test, x5 is the baseline direction of eyebrow movement, and x6 is the baseline frequency of eyebrow movement;
Step three, mapping the input-layer vector to the hidden layer, where the number of hidden-layer neurons is m;
Step four, obtaining the output-layer neuron vector o = {o1, o2, o3}, where o1 is the eyeball state value, o2 is the mouth-corner state value, and o3 is the eyebrow state value.
6. The micro expression recognition method of claim 5, further comprising: normalizing the eyeball state value, the mouth corner state value and the eyebrow state value to obtain an eyeball state coefficient, a mouth corner state coefficient and an eyebrow state coefficient of the detected person;
7. The micro-expression recognition method according to claim 6, further comprising obtaining a micro-expression comprehensive evaluation probability of the tested person:
[formula given as an image in the original publication]
wherein P is the micro-expression comprehensive evaluation probability of the tested person, P0 is the micro-expression comprehensive evaluation standard value of the tested person, e is the base of the natural logarithm, and the remaining symbols are a correction coefficient, the eyeball state coefficient of the tested person and its threshold, the mouth corner state coefficient and its threshold, and the eyebrow state coefficient and its threshold.
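The formula of claim 7 appears only as an image in the original publication and is not reproduced in this text. Purely as a hypothetical illustration of how a standard value P0, a correction coefficient k, and the three coefficient/threshold pairs could combine into a 0-100 probability, one logistic-style form (not the patent's actual equation) is:

```python
import math

def evaluation_probability(p0, k, coeffs, thresholds):
    """HYPOTHETICAL formula: scale P0 by a logistic of the summed deviations
    of each state coefficient from its threshold. Not the patented equation."""
    deviation = sum(c - t for c, t in zip(coeffs, thresholds))
    return p0 * (1.0 / (1.0 + math.exp(-k * deviation)))

P = evaluation_probability(100.0, 2.0,
                           coeffs=[0.8, 0.7, 0.6],       # eyeball, mouth, eyebrow
                           thresholds=[0.5, 0.5, 0.5])   # illustrative thresholds
```

Any such form keeps P within (0, P0], which is consistent with the 0-100 intervals used in claim 8.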
8. The micro-expression recognition method of claim 7, further comprising a micro-expression matching recognition process comprising:
when P ≥ 90, determining that the micro-expression of the tested person is happiness;
when 85 ≤ P < 90, determining that the micro-expression of the tested person is surprise;
when 75 ≤ P < 85, determining that the micro-expression of the tested person is sadness;
when 65 ≤ P < 75, determining that the micro-expression of the tested person is fear;
when 55 ≤ P < 65, determining that the micro-expression of the tested person is anger;
and when 45 ≤ P < 55, determining that the micro-expression of the tested person is disgust.
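The matching process of claim 8 is a direct interval lookup on P; a minimal sketch:

```python
def match_expression(p):
    """Map the comprehensive evaluation probability P onto the six
    micro-expression labels via the intervals stated in claim 8."""
    if p >= 90:
        return "happiness"
    if p >= 85:
        return "surprise"
    if p >= 75:
        return "sadness"
    if p >= 65:
        return "fear"
    if p >= 55:
        return "anger"
    if p >= 45:
        return "disgust"
    return None  # below 45: no match is defined by the claim

print(match_expression(92), match_expression(70), match_expression(50))
```

Note the claim leaves P < 45 unmapped; the sketch returns None in that case.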
9. The micro-expression recognition method according to any one of claims 5 to 8, wherein the number of hidden-layer neurons is 5, and the excitation functions of the hidden layer and the output layer are the sigmoid function fj(x) = 1/(1 + e^(-x)).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010399822.7A CN111553311A (en) | 2020-05-13 | 2020-05-13 | Micro-expression recognition robot and control method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111553311A true CN111553311A (en) | 2020-08-18 |
Family
ID=72008080
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010399822.7A Pending CN111553311A (en) | 2020-05-13 | 2020-05-13 | Micro-expression recognition robot and control method thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111553311A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112487980A (en) * | 2020-11-30 | 2021-03-12 | 深圳市广信安科技股份有限公司 | Micro-expression-based treatment method, device, system and computer-readable storage medium |
CN113160629A (en) * | 2021-05-06 | 2021-07-23 | 吉林工程技术师范学院 | Man-machine cooperation learning education robot with emotion recognition function |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110301440A1 (en) * | 2010-06-07 | 2011-12-08 | Riley Carl W | Apparatus for supporting and monitoring a person |
CN105913046A (en) * | 2016-05-06 | 2016-08-31 | 姜振宇 | Micro-expression identification device and method |
CN106599800A (en) * | 2016-11-25 | 2017-04-26 | 哈尔滨工程大学 | Face micro-expression recognition method based on deep learning |
CN107273845A (en) * | 2017-06-12 | 2017-10-20 | 大连海事大学 | A kind of facial expression recognizing method based on confidence region and multiple features Weighted Fusion |
US20180108440A1 (en) * | 2016-10-17 | 2018-04-19 | Jeffrey Stevens | Systems and methods for medical diagnosis and biomarker identification using physiological sensors and machine learning |
CN109284713A (en) * | 2018-09-21 | 2019-01-29 | 上海健坤教育科技有限公司 | A kind of Emotion identification analysis system based on camera acquisition expression data |
CN109359599A (en) * | 2018-10-19 | 2019-02-19 | 昆山杜克大学 | Human facial expression recognition method based on combination learning identity and emotion information |
CN109409296A (en) * | 2018-10-30 | 2019-03-01 | 河北工业大学 | The video feeling recognition methods that facial expression recognition and speech emotion recognition are merged |
CN109447001A (en) * | 2018-10-31 | 2019-03-08 | 深圳市安视宝科技有限公司 | A kind of dynamic Emotion identification method |
CN109472206A (en) * | 2018-10-11 | 2019-03-15 | 平安科技(深圳)有限公司 | Methods of risk assessment, device, equipment and medium based on micro- expression |
CN110969073A (en) * | 2019-08-23 | 2020-04-07 | 贵州大学 | Facial expression recognition method based on feature fusion and BP neural network |
Non-Patent Citations (1)
Title |
---|
Wang Zhi: "A facial expression recognition method based on a multilayer BP neural network" * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108284442B (en) | Mechanical arm flexible joint control method based on fuzzy neural network | |
Karlik et al. | A fuzzy clustering neural network architecture for multifunction upper-limb prosthesis | |
CN109389207A (en) | A kind of adaptive neural network learning method and nerve network system | |
CN107102727A (en) | Dynamic gesture learning and recognition method based on ELM neural networks | |
CN108229268A (en) | Expression Recognition and convolutional neural networks model training method, device and electronic equipment | |
CN109952581A (en) | Study for machine learning system is trained | |
CN110188794B (en) | Deep learning model training method, device, equipment and storage medium | |
CN111553311A (en) | Micro-expression recognition robot and control method thereof | |
CN106909938A (en) | Viewing angle independence Activity recognition method based on deep learning network | |
WO2020224915A1 (en) | Apparatus for machine learning-based visual equipment selection | |
Setiawan et al. | Transfer learning with multiple pre-trained network for fundus classification | |
CN110786855B (en) | Sputum induction device and control method thereof | |
Mascioli et al. | A constructive approach to neuro-fuzzy networks | |
CN108538301A (en) | A kind of intelligent digital musical instrument based on neural network Audiotechnica | |
CN111027215B (en) | Character training system and method for virtual person | |
CN114504777B (en) | Exercise intensity calculation system and method based on neural network and fuzzy comprehensive evaluation | |
JP3816762B2 (en) | Neural network, neural network system, and neural network processing program | |
CN113723594B (en) | Pulse neural network target identification method | |
CN114952791A (en) | Control method and device for musculoskeletal robot | |
CN114742292A (en) | Knowledge tracking process-oriented two-state co-evolution method for predicting future performance of students | |
Sawaragi et al. | Self-reflective segmentation of human bodily motions using recurrent neural networks | |
De Jongh, PJ & De Wet | An introduction to neural networks | |
Wang et al. | TCM pulse-condition classification method based on BP neural network | |
Anantama et al. | Comparison of Deep Learning and MOORA Performance Methods in Multi Criteria Decision Making with Case Studies Best public health center | |
JP3085312B2 (en) | Character recognition learning method and character recognition device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||