CN109697421A - Evaluation method, device, computer equipment and storage medium based on micro-expression - Google Patents
- Publication number
- CN109697421A CN109697421A CN201811549282.5A CN201811549282A CN109697421A CN 109697421 A CN109697421 A CN 109697421A CN 201811549282 A CN201811549282 A CN 201811549282A CN 109697421 A CN109697421 A CN 109697421A
- Authority
- CN
- China
- Prior art keywords
- micro
- time interval
- expression
- prefixed time
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
Abstract
This application relates to machine learning technology, and provides an evaluation method, apparatus, computer device and storage medium based on micro-expressions. The method comprises: acquiring a first target image and a second target image in each of multiple preset time intervals; obtaining corresponding first expression feature information and second expression feature information from the first target image and the second target image in each preset time interval, respectively; comparing the first expression feature information and the second expression feature information in each preset time interval against a preset micro-expression library, to obtain a first comparison result and a second comparison result for each preset time interval; determining a corresponding first micro-expression type and second micro-expression type according to the first comparison result and the second comparison result in each preset time interval; and determining an evaluation result according to the first micro-expression type and the second micro-expression type in each preset time interval. The solution provided by this application can reflect user satisfaction in a timely manner.
Description
Technical field
This application relates to the field of computer technology, and in particular to an evaluation method, apparatus, computer device and storage medium based on micro-expressions.
Background technique
With the development of computer technology, micro-expression recognition has emerged. Compared with biometric technologies such as face recognition and iris recognition, micro-expression recognition is newer: through techniques such as deep learning and big-data analysis, it gives machines the AI capability to capture the subtle facial changes that occur on a person's face within a fraction of a second, and from those subtle facial changes to infer the person's emotional state at that moment.
However, in current service-evaluation schemes the customer completes the evaluation by manually pressing a physical button, and the position of the physical button is fixed, so the interaction is not friendly enough and the maintenance cost is high. Moreover, the traditional evaluation scheme requires the evaluation to be made only after the service transaction has finished, so the customer's satisfaction during the service process cannot be learned in time.
Summary of the invention
Accordingly, in view of the above technical problems, it is necessary to provide a micro-expression-based evaluation method, apparatus, computer device and storage medium that can reflect user satisfaction in a timely manner.
A micro-expression-based evaluation method, the method comprising:
acquiring a first target image and a second target image in each of multiple preset time intervals, wherein the target objects of the first target image and the second target image are different;
obtaining corresponding first expression feature information and second expression feature information from the first target image and the second target image in each preset time interval, respectively;
comparing the first expression feature information and the second expression feature information in each preset time interval against a preset micro-expression library, to obtain a first comparison result and a second comparison result for each preset time interval;
determining a corresponding first micro-expression type and second micro-expression type according to the first comparison result and the second comparison result in each preset time interval; and
determining an evaluation result according to the first micro-expression type and the second micro-expression type in each preset time interval.
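The claimed steps can be sketched end to end as follows. This is a minimal illustrative sketch, not the patent's implementation: it assumes the feature extraction and library comparison have already been reduced to per-interval similarity scores, and the names (`micro_expression_type`, `evaluate`, the "satisfied"/"unsatisfied" labels, the majority rule, and treating a below-threshold interval as negative) are all this sketch's own assumptions.

```python
from typing import Dict, List, Tuple

POSITIVE, NEGATIVE = "positive", "negative"

def micro_expression_type(comparison: Dict[str, float], threshold: float) -> str:
    # Pick the library entry with the highest similarity; an interval with no
    # confident match is counted as negative here (a simplifying assumption).
    best = max(comparison, key=comparison.get)
    return best if comparison[best] >= threshold else NEGATIVE

def evaluate(intervals: List[Tuple[Dict[str, float], Dict[str, float]]],
             threshold: float = 0.8) -> str:
    # intervals: one (first_comparison, second_comparison) pair per preset
    # time interval; each dict maps micro-expression type -> similarity.
    first = [micro_expression_type(a, threshold) for a, b in intervals]
    second = [micro_expression_type(b, threshold) for a, b in intervals]
    # A simple majority vote over all intervals stands in for
    # "determine evaluation result".
    positives = (first + second).count(POSITIVE)
    return "satisfied" if positives >= len(first + second) / 2 else "unsatisfied"

intervals = [({POSITIVE: 0.9, NEGATIVE: 0.2}, {POSITIVE: 0.85, NEGATIVE: 0.1}),
             ({POSITIVE: 0.3, NEGATIVE: 0.9}, {POSITIVE: 0.88, NEGATIVE: 0.2})]
print(evaluate(intervals))  # → satisfied
```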
A micro-expression-based evaluation apparatus, the apparatus comprising:
a target image acquisition module, configured to acquire a first target image and a second target image in each of multiple preset time intervals, wherein the target objects of the first target image and the second target image are different;
an expression feature information acquisition module, configured to obtain corresponding first expression feature information and second expression feature information from the first target image and the second target image in each preset time interval, respectively;
an expression feature information processing module, configured to compare the first expression feature information and the second expression feature information in each preset time interval against a preset micro-expression library, to obtain a first comparison result and a second comparison result for each preset time interval;
a micro-expression type determination module, configured to determine a corresponding first micro-expression type and second micro-expression type according to the first comparison result and the second comparison result in each preset time interval; and
an evaluation result determination module, configured to determine an evaluation result according to the first micro-expression type and the second micro-expression type in each preset time interval.
A computer device, comprising a memory and a processor, the memory storing a computer program, wherein the processor, when executing the computer program, performs the following steps:
acquiring a first target image and a second target image in each of multiple preset time intervals, wherein the target objects of the first target image and the second target image are different;
obtaining corresponding first expression feature information and second expression feature information from the first target image and the second target image in each preset time interval, respectively;
comparing the first expression feature information and the second expression feature information in each preset time interval against a preset micro-expression library, to obtain a first comparison result and a second comparison result for each preset time interval;
determining a corresponding first micro-expression type and second micro-expression type according to the first comparison result and the second comparison result in each preset time interval; and
determining an evaluation result according to the first micro-expression type and the second micro-expression type in each preset time interval.
A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, performs the following steps:
acquiring a first target image and a second target image in each of multiple preset time intervals, wherein the target objects of the first target image and the second target image are different;
obtaining corresponding first expression feature information and second expression feature information from the first target image and the second target image in each preset time interval, respectively;
comparing the first expression feature information and the second expression feature information in each preset time interval against a preset micro-expression library, to obtain a first comparison result and a second comparison result for each preset time interval;
determining a corresponding first micro-expression type and second micro-expression type according to the first comparison result and the second comparison result in each preset time interval; and
determining an evaluation result according to the first micro-expression type and the second micro-expression type in each preset time interval.
With the above micro-expression-based evaluation method, apparatus, computer device and storage medium, a first target image and a second target image are acquired in each of multiple preset time intervals, the target objects of the two images being different; corresponding first and second expression feature information is obtained from the first and second target images in each preset time interval; the first and second expression feature information in each interval is compared against a preset micro-expression library, yielding a first and a second comparison result for each interval; the corresponding first and second micro-expression types are determined from those comparison results; and an evaluation result is determined from the first and second micro-expression types in each interval. By combining micro-expression recognition with the user's evaluation of the service process, the user's satisfaction during the service can be reflected in time, so that the evaluation result for the current service is learned promptly; and because micro-expression recognition is used, no physical button needs to be installed for the user to complete the evaluation, which reduces the cost of service evaluation.
Detailed description of the invention
Fig. 1 is an application scenario diagram of the micro-expression-based evaluation method in one embodiment;
Fig. 2 is a schematic flowchart of the micro-expression-based evaluation method in one embodiment;
Fig. 3 is a schematic flowchart of the target image generation step in one embodiment;
Fig. 4 is a schematic flowchart of the expression feature information extraction step in one embodiment;
Fig. 5 is a schematic flowchart of the step of comparing expression feature information with the preset micro-expression library in one embodiment;
Fig. 6 is a schematic flowchart of the step of determining the evaluation result in each preset time interval in one embodiment;
Fig. 7 is a schematic flowchart of the step of determining the micro-expression evaluation result in one embodiment;
Fig. 8 is a structural block diagram of the micro-expression-based evaluation apparatus in one embodiment;
Fig. 9 is an internal structure diagram of a computer device in one embodiment;
Fig. 10 is an internal structure diagram of a computer device in another embodiment.
Specific embodiment
To make the objects, technical solutions and advantages of this application clearer, the application is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are only used to explain the application and are not intended to limit it.
The micro-expression-based evaluation method provided by this application can be applied in an application environment as shown in Fig. 1, in which a terminal 102 communicates with a server 104 over a network. The terminal 102 can be, but is not limited to, a personal computer, a laptop, a smartphone, a tablet or a portable wearable device, and the server 104 can be implemented as an independent server or as a server cluster composed of multiple servers.
Specifically, the terminal 102 acquires the first target image and the second target image in each of multiple preset time intervals and sends them to the server 104. The server 104 obtains the corresponding first and second expression feature information from the first and second target images in each preset time interval; compares the first and second expression feature information corresponding to each interval against the preset micro-expression library, obtaining a first and a second comparison result for each interval; determines the corresponding first and second micro-expression types from those comparison results; and determines an evaluation result from the first and second micro-expression types corresponding to each interval. Finally, the server 104 sends the evaluation result to the terminal 102.
In one embodiment, as shown in Fig. 2, a micro-expression-based evaluation method is provided. Taking its application to the terminal or server in Fig. 1 as an example, the method includes the following steps:
Step 202: acquire a first target image and a second target image in each of multiple preset time intervals, wherein the target objects of the first target image and the second target image are different.
Here, a preset time interval is a time period set in advance. The first target image is obtained by processing an original image captured within each preset time interval; likewise, the second target image is obtained by processing an original image captured within each preset time interval. The first target image and the second target image are captured by two different image capture devices: the first image capture device may be a device that captures the staff member during the service process, and the second image capture device may be a device that captures the customer during the service process. Processing the original image captured within each preset time interval may consist of extracting the face region of the original image and using that face region as the target image. The target object is the subject captured in the first or second target image, such as a person or an animal, and only the subject's face region is retained.
Specifically, within each preset time interval the first image capture device captures an image of the staff member during the service process, yielding a first original image for each interval, while the second image capture device captures an image of the customer during the service process, yielding a second original image for each interval. By processing the first and second original images corresponding to each preset time interval, the first and second target images corresponding to each interval can be obtained.
Step 204: obtain corresponding first expression feature information and second expression feature information from the first target image and the second target image in each preset time interval, respectively.
Here, expression feature information is obtained by performing facial feature extraction on a target image. The first expression feature information is obtained by facial feature extraction on the first target image in each preset time interval; likewise, the second expression feature information is obtained by facial feature extraction on the second target image in each interval. Expression feature information may be facial landmark position information, contour feature information of the facial-feature region, texture feature information of the nasolabial-fold region, area feature information of the eyelid region, or a combination of the above.
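The feature kinds listed above can be gathered into one container per target image. The following is only an illustrative sketch; the field names and the flattening into a single vector are this sketch's assumptions, not terms defined by the patent.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ExpressionFeatures:
    """Illustrative container for the expression feature information."""
    landmarks: List[Tuple[float, float]] = field(default_factory=list)  # facial landmark positions
    contour: Optional[float] = None             # contour change of the facial-feature region
    nasolabial_texture: Optional[float] = None  # texture change of the nasolabial-fold region
    eyelid_area: Optional[float] = None         # area of the eyelid region

    def as_vector(self) -> List[float]:
        # Flatten whichever features are present into a comparable vector.
        vec = [c for point in self.landmarks for c in point]
        vec += [v for v in (self.contour, self.nasolabial_texture, self.eyelid_area)
                if v is not None]
        return vec

f = ExpressionFeatures(landmarks=[(0.1, 0.2)], eyelid_area=3.5)
print(f.as_vector())  # → [0.1, 0.2, 3.5]
```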
Specifically, after the first and second target images in each preset time interval are obtained, facial feature extraction is performed on each of them, yielding the first expression feature information corresponding to the first target image and the second expression feature information corresponding to the second target image in each preset time interval.
Step 206: compare the first expression feature information and the second expression feature information in each preset time interval against the preset micro-expression library, to obtain the first comparison result and the second comparison result for each preset time interval.
Here, the preset micro-expression library is an expression library used to determine the micro-expression type of expression feature information; it stores in advance the correspondence between each expression feature item and its micro-expression type. The comparison result between expression feature information and the preset micro-expression library may be the similarity between the expression feature information corresponding to each preset time interval and each preset expression feature item in the library. This similarity can be obtained by similarity calculations such as cosine similarity or Euclidean-distance similarity.
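The two similarity measures named above can be sketched as follows. The library entries and the mapping of Euclidean distance into a bounded similarity score are illustrative assumptions of this sketch, not details fixed by the patent.

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = a.b / (|a||b|); larger means more similar.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def euclidean_similarity(a, b):
    # Map Euclidean distance into (0, 1]; 1.0 means identical vectors.
    dist = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return 1.0 / (1.0 + dist)

features = [1.0, 0.0, 1.0]                      # extracted expression features
library = {"smile": [1.0, 0.0, 1.0],            # hypothetical library entries
           "frown": [0.0, 1.0, 0.0]}
comparison = {name: cosine_similarity(features, ref) for name, ref in library.items()}
print(comparison)
```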
Specifically, after the first and second expression feature information corresponding to each preset time interval is obtained, a similarity calculation is performed between it and each preset expression feature item in the preset micro-expression library, yielding the first comparison result corresponding to the first expression feature information and the second comparison result corresponding to the second expression feature information for each preset time interval.
Step 208: determine the corresponding first micro-expression type and second micro-expression type according to the first comparison result and the second comparison result in each preset time interval.
Here, the first micro-expression type is the micro-expression type corresponding to the first target image in each preset time interval; likewise, the second micro-expression type is the micro-expression type corresponding to the second target image in each interval. Micro-expression types include, but are not limited to, a positive micro-expression type and/or a negative micro-expression type; that is, the first micro-expression type may include a positive and/or a negative micro-expression type, and likewise for the second micro-expression type.
Specifically, after the expression feature information in each preset time interval has been compared with the preset micro-expression library to obtain the first and second comparison results for each interval, the first and second micro-expression types for each interval are determined from those results. Determining the first micro-expression type from the first comparison results may work as follows: when the first comparison result for some preset time interval exceeds a preset threshold, the micro-expression type corresponding to the preset expression feature item in that first comparison result is taken as the first micro-expression type of that interval. Likewise, determining the second micro-expression type from the second comparison results takes the micro-expression type corresponding to the preset expression feature item in the second comparison result of a given interval as the second micro-expression type of that interval.
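The threshold rule described above can be sketched as a small function. Breaking ties by the highest similarity score and returning `None` when no entry passes are this sketch's choices; the text only requires that a passing entry's type be used.

```python
def determine_type(comparison, threshold):
    """Return the micro-expression type whose similarity exceeds the preset
    threshold, or None when no library entry passes."""
    passing = {t: s for t, s in comparison.items() if s > threshold}
    if not passing:
        return None
    return max(passing, key=passing.get)

print(determine_type({"positive": 0.92, "negative": 0.4}, 0.8))  # → positive
```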
Step 210: determine the evaluation result according to the first micro-expression type and the second micro-expression type in each preset time interval.
Here, the evaluation result is the result representing the quality of the current service in the service process; for example, the service attitude may be very good, good, poor or very poor. Specifically, after the first and second micro-expression types corresponding to each preset time interval are obtained: since the first micro-expression type of each interval belongs to the object captured by the first image capture device, and the second micro-expression type of each interval belongs to the object captured by the second image capture device, a first evaluation result for the object captured by the first image capture device can be determined from the first micro-expression types across the intervals, and similarly a second evaluation result for the object captured by the second image capture device can be determined from the second micro-expression types. Finally, an objective evaluation result can be determined from the first evaluation result and the second evaluation result.
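One way to map the per-interval micro-expression types onto the four attitude grades mentioned above is by the fraction of positive types observed. The cut-off points and label strings below are illustrative assumptions of this sketch, not values given by the patent.

```python
def attitude_label(types):
    """Map per-interval micro-expression types ("positive"/"negative"/None)
    to one of four attitude grades; None means no confident match."""
    observed = [t for t in types if t is not None]
    if not observed:
        return "no evaluation"
    ratio = observed.count("positive") / len(observed)
    if ratio >= 0.75:
        return "very good"
    if ratio >= 0.5:
        return "good"
    if ratio >= 0.25:
        return "poor"
    return "very poor"

staff = ["positive", "positive", "negative", "positive"]     # first device
customer = ["positive", "negative", None, "positive"]        # second device
print(attitude_label(staff), attitude_label(customer))  # → very good good
```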
In the above micro-expression-based evaluation method, a first target image and a second target image are acquired in each of multiple preset time intervals; corresponding first and second expression feature information is obtained from the first and second target images in each interval; the first and second expression feature information corresponding to each interval is compared against the preset micro-expression library, yielding a first and a second comparison result for each interval; the corresponding first and second micro-expression types are determined from those comparison results; and an evaluation result is determined from the first and second micro-expression types corresponding to each interval. By combining micro-expression recognition with the user's evaluation of the service process, the user's satisfaction during the service can be reflected in time so that the evaluation result of the current service is learned promptly, and since micro-expression recognition is used, no physical button needs to be installed for the user to complete the evaluation, reducing the cost of service evaluation.
In one embodiment, as shown in Fig. 3, acquiring the first target image and the second target image in multiple preset time intervals comprises:
Step 302: acquire the first image captured in each preset time interval, the first image containing a first target object.
Step 304: acquire the second image captured by the second image capture device in each preset time interval, the second image containing a second target object, wherein the first target object and the second target object are different.
Here, the service process involves a staff member and a customer, and the quality of service for the whole process can be determined from the demeanor of both the staff member and the customer. Specifically, the first image capture device may be a device that captures the staff member's demeanor throughout the service process, and the second image capture device may be a device that captures the customer's demeanor throughout the process. The original images captured by the first and second image capture devices in each preset time interval are taken as the corresponding first image and second image.
Step 306: perform face recognition on the first target object in the first image corresponding to each preset time interval to obtain the first face image region corresponding to each interval, and take the first face image region corresponding to each interval as the first target image in that interval.
Step 308: perform face recognition on the second target object in the second image corresponding to each preset time interval to obtain the second face image region corresponding to each interval, and take the second face image region corresponding to each interval as the second target image in that interval.
If expression feature information were extracted directly from the first and second images captured by the image capture devices, the extracted information would tend to be inaccurate; therefore face recognition is first performed on the first image and the second image to obtain the corresponding face image regions, so that the expression feature information extracted afterwards is more accurate. Specifically, after the first image captured by the first image capture device and the second image captured by the second image capture device in each preset time interval are obtained, face recognition is performed on the first image corresponding to each interval to obtain the corresponding first face image region, where the first face image region includes the facial-feature region, the nasolabial-fold region and the eyelid region. Concretely, the face portion can be segmented from the first image according to a preset region, yielding the corresponding first face image region, and the first face image region corresponding to each preset time interval is taken as the first target image in that interval.
Similarly, for the second image corresponding to each preset time interval captured by the second image capture device, the face portion can first be segmented from each second image according to the preset region to obtain the corresponding second face image region. The preset region may be, for example, the facial area composed of the facial-feature region, the nasolabial-fold region and the eyelid region. The second face image region corresponding to each preset time interval is taken as the second target image in that interval.
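Face detection itself would typically be delegated to an existing detector (for example a Haar-cascade or CNN-based one, not shown here); the cropping of the detected region can then be sketched as below. The image-as-list-of-rows representation and the `(top, left, height, width)` box layout are this sketch's assumptions.

```python
def crop_face_region(image, box):
    """Crop the detected face region out of a grayscale image stored as a
    list of pixel rows. `box` is (top, left, height, width), assumed to
    come from an upstream face detector."""
    top, left, h, w = box
    return [row[left:left + w] for row in image[top:top + h]]

image = [[0, 1, 2, 3],
         [4, 5, 6, 7],
         [8, 9, 10, 11]]
face = crop_face_region(image, (1, 1, 2, 2))
print(face)  # → [[5, 6], [9, 10]]
```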
In one embodiment, obtaining the corresponding first expression feature information and second expression feature information from the first target image and the second target image in each preset time interval comprises: acquiring the first target image and the second target image in each preset time interval; performing image feature extraction on the first target image in each interval according to a preset feature extraction algorithm, to obtain the first expression feature information in each interval; and performing image feature extraction on the second target image in each interval according to the preset feature extraction algorithm, to obtain the second expression feature information in each interval.
Here, expression feature information is used to determine the micro-expression in a target image and includes, but is not limited to, expression landmark position information. Expression feature information is obtained by image feature extraction: a relevant image feature extraction algorithm extracts the corresponding expression feature information. Since the first and second target images are face image regions, the expression feature information obtained by image feature extraction on them is more accurate. Specifically, after the first and second target images in each preset time interval are obtained, image feature extraction is performed on each of them, yielding the first expression feature information corresponding to the first target image and the second expression feature information corresponding to the second target image in each interval.
In one embodiment, as shown in Fig. 4, performing image feature extraction on the first target image in each preset time interval according to the preset feature extraction algorithm to obtain the first expression feature information in each interval, and performing image feature extraction on the second target image in each interval according to the preset feature extraction algorithm to obtain the second expression feature information in each interval, comprises:
Step 402: segment the first target image and the second target image in each preset time interval to obtain the corresponding facial-feature region, nasolabial-fold region and eyelid region.
Specifically, after the first and second target images in each preset time interval are obtained, they can be segmented according to preset regions to obtain the corresponding facial-feature region, nasolabial-fold region and eyelid region. The facial-feature region is the region where the facial features are located, the nasolabial-fold region is the region formed by the nose and mouth, and the eyelid region is the region where the eyes are located.
Step 404: perform contour recognition on the facial-features regions corresponding to the first target image and the second target image in each preset time interval according to the preset feature extraction algorithm, respectively obtaining the first contour feature information corresponding to each first target image and the second contour feature information corresponding to each second target image.
The facial-features region is the main region affecting a person's micro-expression, and it has a clear outline. By performing contour recognition on the facial-features region, the contour feature information of the region can be obtained; the contour feature information includes the duration of the contour change and the degree of the contour change. The contour recognition method may be an edge detection algorithm or another relevant algorithm; no restriction is imposed here.
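The contour recognition in Step 404 is left open to any edge detection algorithm. A minimal Sobel-gradient edge detector, written from scratch purely for illustration (the kernel choice and threshold are assumptions, not details from this disclosure), is one possible instance:

```python
import numpy as np

def sobel_edges(gray, threshold=0.25):
    """Boolean edge map of a grayscale image via Sobel gradient magnitude."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    padded = np.pad(gray, 1, mode="edge")
    h, w = gray.shape
    gx = np.zeros((h, w), dtype=float)
    gy = np.zeros((h, w), dtype=float)
    for i in range(h):
        for j in range(w):
            window = padded[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(window * kx)  # horizontal gradient
            gy[i, j] = np.sum(window * ky)  # vertical gradient
    magnitude = np.hypot(gx, gy)
    magnitude /= magnitude.max() or 1.0     # normalize to [0, 1]
    return magnitude > threshold
```

The per-frame edge maps could then be compared over time to derive the contour-change duration and degree that the text describes.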
Step 406: perform texture analysis on the nasolabial fold regions corresponding to the first target image and the second target image in each preset time interval according to the preset feature extraction algorithm, respectively obtaining the first texture feature information corresponding to each first target image and the second texture feature information corresponding to each second target image.
The nasolabial fold region is an important region affecting a person's micro-expression, and it carries texture. By performing texture analysis on the nasolabial fold region, the texture feature information of the region can be obtained; the texture feature information includes the duration of the nasolabial fold change and the degree of the change. The texture analysis method may be a grayscale transformation or binarization; this embodiment imposes no restriction here.
Step 408: obtain the first area feature information of the eyelid region corresponding to the first target image in each preset time interval, and obtain the second area feature information of the eyelid region corresponding to the second target image in each preset time interval.
The eyelid region is likewise an important region affecting a person's micro-expression, and its skin is nearly planar. By calculating the area of the eyelid region in each frame of the target image, the area feature information of the eyelid region can be obtained; the area feature information includes the duration of the eyelid-region change and the degree of the eyelid-area change.
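A sketch of the area feature in Step 408, assuming the eyelid region is available as a binary mask per frame and assuming a frame-rate parameter that the disclosure does not specify:

```python
import numpy as np

def eyelid_area(mask):
    """Area of the eyelid region, as a pixel count of a binary mask."""
    return int(np.count_nonzero(mask))

def area_feature(masks, fps=25):
    """Duration and relative degree of the eyelid-area change across frames.
    `fps` is an assumed capture rate used to convert frames to seconds."""
    areas = [eyelid_area(m) for m in masks]
    changed = [i for i in range(1, len(areas)) if areas[i] != areas[i - 1]]
    duration = (len(changed) / fps) if changed else 0.0  # seconds with change
    base = max(areas[0], 1)
    degree = (max(areas) - min(areas)) / base            # relative area swing
    return duration, degree
```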
Step 410: determine the corresponding first expression feature information according to the first contour feature information, the first texture feature information, and the first area feature information of each first target image.
Step 412: determine the corresponding second expression feature information according to the second contour feature information, the second texture feature information, and the second area feature information of each second target image.
After the contour feature information, texture feature information, and area feature information corresponding to each target image are obtained, the corresponding expression feature information can be determined from them. Determining the expression feature information may include, but is not limited to, taking the contour feature information, texture feature information, and area feature information together as the corresponding expression feature information. Specifically, after the first contour feature information, the first texture feature information, and the first area feature information corresponding to each first target image are obtained, they are aggregated into the corresponding first expression feature information. Similarly, after the second contour feature information, the second texture feature information, and the second area feature information corresponding to each second target image are obtained, they are aggregated into the corresponding second expression feature information.
In one embodiment, as shown in FIG. 5, comparing the first expression feature information and the second expression feature information in each preset time interval respectively with the preset micro-expression library to obtain the first comparison result and the second comparison result in each preset time interval, and determining the corresponding first micro-expression type and second micro-expression type according to the first comparison result and the second comparison result in each preset time interval, includes:
Step 502: perform similarity calculation between the first expression feature information and the second expression feature information in each preset time interval and each preset micro-expression feature information in the preset micro-expression library, obtaining the first similarity corresponding to each first expression feature information and the second similarity corresponding to each second expression feature information.
Since the preset micro-expression library stores the mapping between preset micro-expression feature information and the corresponding micro-expression types, each preset micro-expression feature information corresponds one-to-one with a micro-expression type. The preset micro-expression library is built through extensive training, with the trained preset micro-expression feature information and the corresponding micro-expression types stored in the library. Specifically, after the first expression feature information and the second expression feature information corresponding to each preset time interval are obtained, similarity calculation is performed between them and each preset micro-expression feature information in the preset micro-expression library, yielding the first similarity corresponding to each first expression feature information and the second similarity corresponding to each second expression feature information. The similarity between the expression feature information and each preset micro-expression feature information in the library may be calculated in ways including, but not limited to, cosine similarity and Euclidean-distance similarity.
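The similarity calculation of Step 502 can be illustrated with cosine similarity over plain feature vectors; the library layout below (a mapping from micro-expression type name to feature vector) is an assumption made for the sketch:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def best_match(features, library):
    """Most similar preset entry; `library` maps type name -> feature vector."""
    return max(library.items(),
               key=lambda item: cosine_similarity(features, item[1]))
```

In the method described here, the best match would only be accepted when its similarity exceeds the preset threshold (Steps 504 and 506 below).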
Step 504: when the first similarity is greater than a preset threshold, determine the type of the preset micro-expression feature information corresponding to the first similarity as the first micro-expression type corresponding to the first expression feature information in the preset time interval.
Step 506: when the second similarity is greater than the preset threshold, determine the type of the preset micro-expression feature information corresponding to the second similarity as the second micro-expression type corresponding to the second expression feature information in the preset time interval.
A similarity greater than the preset threshold indicates that the preset micro-expression feature information and the expression feature information are highly similar. Specifically, after the first similarity corresponding to each first expression feature information and the second similarity corresponding to each second expression feature information are calculated, each is compared against the preset threshold. When the first similarity is greater than the preset threshold, the type of the corresponding preset micro-expression feature information is determined as the first micro-expression type of the first expression feature information in the preset time interval; likewise, when the second similarity is greater than the preset threshold, the type of the corresponding preset micro-expression feature information is determined as the second micro-expression type of the second expression feature information in the preset time interval. For example, suppose the similarity between the first expression feature information A corresponding to preset time interval B and the preset micro-expression feature information a in the preset micro-expression library is 90%, and the preset threshold is 60%. Because 90% > 60%, the micro-expression type b corresponding to the preset micro-expression feature information a is determined as the micro-expression type of preset time interval B.
In one embodiment, as shown in FIG. 6, determining the evaluation result according to the first micro-expression type and the second micro-expression type in each preset time interval includes:
Step 602: obtain the first micro-expression type in each preset time interval.
Step 604: determine the micro-expression evaluation result of the target object in the first target image according to the first micro-expression type in each preset time interval.
The first micro-expression type refers to the micro-expression type corresponding to the first target image in each preset time interval. Therefore, the first micro-expression type corresponding to each preset time interval can be used to determine the micro-expression evaluation result of the object captured by the first image acquisition device in the first target image of each preset time interval. Specifically, the first micro-expression type corresponding to each preset time interval is obtained first, and the micro-expression evaluation result of the captured object corresponding to the first target image is then determined from those types. Since the first micro-expression type includes, but is not limited to, positive-energy micro-expression types and negative-energy micro-expression types, the micro-expression evaluation result of the captured object corresponding to the first target image can be determined from the positive-energy and negative-energy micro-expression types corresponding to each preset time interval.
In one embodiment, determining the micro-expression evaluation result of the captured object corresponding to the first target image from the positive-energy and negative-energy micro-expression types corresponding to each preset time interval may be done by counting them: if the number of positive-energy micro-expression types across the preset time intervals is greater than the number of negative-energy micro-expression types, the micro-expression evaluation result of the captured object corresponding to the first target image is determined to be a positive-energy evaluation result.
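The count-based rule of this embodiment can be sketched directly; the label strings are placeholders chosen for the sketch, not names from this disclosure:

```python
def count_based_result(interval_types):
    """Evaluation by majority count of per-interval micro-expression labels."""
    positive = sum(1 for t in interval_types if t == "positive")
    negative = sum(1 for t in interval_types if t == "negative")
    # More positive-energy intervals than negative-energy ones
    # yields a positive-energy evaluation result.
    return "positive-energy" if positive > negative else "negative-energy"
```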
In one embodiment, determining the micro-expression evaluation result of the captured object corresponding to the first target image from the positive-energy and negative-energy micro-expression types may instead use the ratio of positive-energy micro-expression types to negative-energy micro-expression types. If this ratio is between 10% and 30%, the micro-expression evaluation result is regarded as an average service attitude; between 30% and 60%, a good service attitude; above 60%, a very good service attitude. Conversely, if the ratio of negative-energy micro-expression types to positive-energy micro-expression types is between 10% and 30%, the micro-expression evaluation result is a poor service attitude; between 30% and 60%, a very poor service attitude; above 60%, an extremely poor service attitude.
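A sketch of the positive-side ratio banding, under the assumption that the ratio is computed as the positive count over the negative count and that the 10%/30%/60% band edges fall as stated above:

```python
def attitude_from_ratio(positive, negative):
    """Map the positive/negative micro-expression ratio to an attitude grade.
    Band edges follow the 10%/30%/60% thresholds given in the text."""
    if negative == 0:
        return "very good"  # no negative intervals: best grade (assumption)
    ratio = positive / negative
    if ratio > 0.60:
        return "very good"
    if ratio > 0.30:
        return "good"
    if ratio >= 0.10:
        return "average"
    return "poor"
```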
Step 606: obtain the second micro-expression type in each preset time interval, where the first micro-expression type corresponds to the first target image in each preset time interval, and the second micro-expression type corresponds to the second target image in each preset time interval.
Step 608: determine the micro-expression evaluation result of the target object in the second target image according to the second micro-expression type in each preset time interval.
The second micro-expression type refers to the micro-expression type corresponding to the second target image in each preset time interval. Therefore, the second micro-expression type corresponding to each preset time interval can be used to determine the micro-expression evaluation result of the object captured by the second image acquisition device in the second target image of each preset time interval. Specifically, the second micro-expression type corresponding to each preset time interval is obtained first, and the micro-expression evaluation result of the captured object corresponding to the second target image is then determined from those types. Since the second micro-expression type includes, but is not limited to, positive-energy micro-expression types and negative-energy micro-expression types, the micro-expression evaluation result of the captured object corresponding to the second target image can be determined from the positive-energy and negative-energy micro-expression types corresponding to each preset time interval.
In one embodiment, determining the micro-expression evaluation result of the captured object corresponding to the second target image from the positive-energy and negative-energy micro-expression types corresponding to each preset time interval may be done by counting them: if the number of positive-energy micro-expression types across the preset time intervals is greater than the number of negative-energy micro-expression types, the micro-expression evaluation result of the captured object corresponding to the second target image is determined to be a positive-energy evaluation result.
Step 610: determine the target evaluation result according to the micro-expression evaluation result of the target object in the first target image and the micro-expression evaluation result of the target object in the second target image.
Specifically, after the micro-expression evaluation result of the captured object corresponding to the first target image and the micro-expression evaluation result of the captured object corresponding to the second target image are obtained, the target evaluation result can be determined from the two. The target evaluation result is used to reflect the service quality of the entire transaction service process. Determining the target evaluation result from the two micro-expression evaluation results may be done by presetting different weights for the micro-expression evaluation result of the captured object corresponding to the first target image and that of the captured object corresponding to the second target image, calculating a first composite value from the first micro-expression evaluation result and its weight, calculating a second composite value from the second micro-expression evaluation result and its weight, and finally determining the target evaluation result from the first composite value and the second composite value.
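The weighted combination of Step 610 might look like the following; the specific weights and the 0.5 decision threshold are illustrative assumptions, since the disclosure only states that different weights are preset:

```python
def target_evaluation(first_score, second_score, w_first=0.4, w_second=0.6):
    """Combine staff-side and client-side micro-expression scores.
    Scores are assumed normalized to [0, 1]; weights are placeholders."""
    first_composite = first_score * w_first    # first composite value
    second_composite = second_score * w_second  # second composite value
    combined = first_composite + second_composite
    return "positive-energy" if combined >= 0.5 else "negative-energy"
```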
In one embodiment, determining the target evaluation result from the micro-expression evaluation results of the captured objects corresponding to the first target image and the second target image may be done directly from the micro-expression evaluation result of the captured object corresponding to the second target image. Because the object captured in the second target image is the transacting client, during the transaction service process the target evaluation result can be determined entirely from the client's micro-expression evaluation result.
In one embodiment, as shown in FIG. 7, the first micro-expression type and the second micro-expression type include positive-energy micro-expression types and negative-energy micro-expression types. Determining the micro-expression evaluation result of the target object in the first target image according to the first micro-expression type in each preset time interval, and determining the micro-expression evaluation result of the target object in the second target image according to the second micro-expression type in each preset time interval, includes:
Step 702: obtain the number of first micro-expression types in the preset time intervals that are positive-energy micro-expression types.
Step 704: obtain the number of first micro-expression types in the preset time intervals that are negative-energy micro-expression types.
Step 706: calculate a micro-expression type fraction from the number of positive-energy micro-expression types and the number of negative-energy micro-expression types.
Step 708: when the micro-expression type fraction satisfies a preset ratio, determine that the micro-expression evaluation result of the target object in the first target image is a positive-energy micro-expression; otherwise, determine that it is a negative-energy micro-expression. The micro-expression evaluation result is at least one of a positive-energy micro-expression and a negative-energy micro-expression.
Here, the first micro-expression type and the second micro-expression type include positive-energy micro-expression types and negative-energy micro-expression types. Determining the micro-expression evaluation result of the captured object corresponding to the first target image from the first micro-expression types corresponding to the preset time intervals may be done by calculating a micro-expression type fraction from the number of positive-energy micro-expression types and the number of negative-energy micro-expression types, and then determining the micro-expression evaluation result from that fraction. Specifically, the number of first micro-expression types that are positive-energy micro-expression types and the number that are negative-energy micro-expression types are obtained for the preset time intervals; the micro-expression type fraction is calculated from the two; and the micro-expression evaluation result of the captured object corresponding to the first target image is determined according to the fraction.
If the ratio of positive-energy micro-expression types to negative-energy micro-expression types is between 10% and 30%, the micro-expression evaluation result is regarded as an average service attitude; between 30% and 60%, a good service attitude; above 60%, a very good service attitude. Conversely, if the ratio of negative-energy micro-expression types to positive-energy micro-expression types is between 10% and 30%, the micro-expression evaluation result is a poor service attitude; between 30% and 60%, a very poor service attitude; above 60%, an extremely poor service attitude.
Step 710: obtain the number of second micro-expression types in the preset time intervals that are positive-energy micro-expression types.
Step 712: obtain the number of second micro-expression types in the preset time intervals that are negative-energy micro-expression types.
Step 714: when the number of positive-energy micro-expression types in the preset time intervals is greater than or equal to the number of negative-energy micro-expression types, determine that the micro-expression evaluation result of the target object in the second target image is a positive-energy micro-expression; otherwise, determine that it is a negative-energy micro-expression.
Here, determining the micro-expression evaluation result of the captured object corresponding to the second target image from the second micro-expression types corresponding to the preset time intervals may be done directly from the number of positive-energy micro-expression types and the number of negative-energy micro-expression types. Specifically, the number of second micro-expression types that are positive-energy micro-expression types and the number that are negative-energy micro-expression types are obtained for the preset time intervals, and the micro-expression evaluation result of the captured object corresponding to the second target image is determined from the two numbers.
In one embodiment, since the object captured in the second target image is the transacting client, the client's micro-expression evaluation result can be determined directly from the number of positive-energy micro-expression types and the number of negative-energy micro-expression types corresponding to the preset time intervals: if the number of positive-energy micro-expression types is greater than the number of negative-energy micro-expression types, the micro-expression evaluation result of the captured object corresponding to the second target image is determined to be a positive-energy evaluation result.
It should be understood that although the steps in the above flowcharts are shown sequentially as indicated by the arrows, these steps are not necessarily executed in the order the arrows indicate. Unless explicitly stated herein, the execution of these steps is not strictly ordered, and they may be executed in other orders. Moreover, at least some of the steps in the above flowcharts may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different times; nor is their execution order necessarily sequential, as they may be executed in turn or alternately with at least part of the sub-steps or stages of other steps.
In the application scenario of a transaction service process, the micro-expression-based evaluation method may include the following steps:
1. Two cameras are pre-installed at the service window, aimed respectively at the staff member and the client.
2. The cameras are connected to the security micro-expression recognition system.
3. During the transaction, one frame is extracted from the staff video every 5 seconds (configurable) for micro-expression recognition.
4. At the end of the transaction, the client is automatically prompted to face the camera for a client satisfaction evaluation.
5. The expressions in the frames extracted during the staff member's work process account for 50% of the weighting.
6. The client's expression at the end of the transaction accounts for 50% (adjustable).
7. Positive-energy expressions include smiling, hearty laughter, joy, happiness, and the like.
8. Negative-energy expressions include anger, crying, melancholy, sadness, and the like.
9. The total proportion of positive-energy results in the staff member's micro-expression recognition results during the transaction is compared with the total proportion of negative-energy results: a positive-to-negative ratio between 10% and 30% is regarded as an average service attitude; between 30% and 60%, good; above 60%, very good. Conversely, a negative-to-positive ratio between 10% and 30% is regarded as poor; between 30% and 60%, very poor; above 60%, extremely poor.
10. At the end of the transaction, the client is regarded as satisfied if a positive-energy expression is given, and dissatisfied if a negative-energy expression is given.
In one embodiment, as shown in FIG. 8, a micro-expression-based evaluation apparatus 800 is provided, the apparatus including:
a target image acquisition module 802, configured to obtain the first target image and the second target image in multiple preset time intervals, where the target objects of the first target image and the second target image are different;
an expression feature information acquisition module 804, configured to obtain the corresponding first expression feature information and second expression feature information respectively according to the first target image and the second target image in each preset time interval;
an expression feature information processing module 806, configured to compare the first expression feature information and the second expression feature information in each preset time interval respectively with the preset micro-expression library, respectively obtaining the first comparison result and the second comparison result in each preset time interval;
a micro-expression type determination module 808, configured to determine the corresponding first micro-expression type and second micro-expression type according to the first comparison result and the second comparison result in each preset time interval; and
an evaluation result determination module 810, configured to determine the evaluation result according to the first micro-expression type and the second micro-expression type in each preset time interval.
In one embodiment, the target image acquisition module 802 is further configured to: obtain the first image collected in each preset time interval, the first image including a first target object; obtain the second image collected in each preset time interval, the second image including a second target object, where the first target object and the second target object are different; perform face recognition on the first target object in the first image in each preset time interval to obtain a first face image region in each preset time interval, and use the first face image region in each preset time interval as the first target image in each preset time interval; and perform face recognition on the second target object in the second image in each preset time interval to obtain a second face image region in each preset time interval, and use the second face image region in each preset time interval as the second target image in each preset time interval.
In one embodiment, the expression feature information acquisition module 804 is further configured to: obtain the first target image and the second target image in each preset time interval; perform image feature extraction on the first target image in each preset time interval according to the preset feature extraction algorithm to obtain the first expression feature information in each preset time interval; and perform image feature extraction on the second target image in each preset time interval according to the preset feature extraction algorithm to obtain the second expression feature information in each preset time interval.
In one embodiment, the expression feature information acquisition module 804 is further configured to: segment the first target image and the second target image in each preset time interval, respectively obtaining the corresponding facial-features region, nasolabial fold region, and eyelid region; perform contour recognition on the facial-features regions corresponding to the first target image and the second target image in each preset time interval according to the preset feature extraction algorithm, respectively obtaining the first contour feature information corresponding to each first target image and the second contour feature information corresponding to each second target image; perform texture analysis on the nasolabial fold regions corresponding to the first target image and the second target image in each preset time interval according to the preset feature extraction algorithm, respectively obtaining the first texture feature information corresponding to each first target image and the second texture feature information corresponding to each second target image; obtain the first area feature information of the eyelid region corresponding to the first target image in each preset time interval and the second area feature information of the eyelid region corresponding to the second target image in each preset time interval; determine the corresponding first expression feature information according to the first contour feature information, first texture feature information, and first area feature information of each first target image; and determine the corresponding second expression feature information according to the second contour feature information, second texture feature information, and second area feature information of each second target image.
In one embodiment, the expression feature information processing module 806 is further configured to: perform similarity calculation between the first expression feature information and the second expression feature information in each preset time interval and each preset micro-expression feature information in the preset micro-expression library, respectively, obtaining a first similarity corresponding to each piece of first expression feature information and a second similarity corresponding to each piece of second expression feature information; when the first similarity is greater than a preset threshold, determine the type of the preset micro-expression feature information corresponding to the first similarity as the first micro-expression type corresponding to the first expression feature information in the preset time interval; and when the second similarity is greater than the preset threshold, determine the type of the preset micro-expression feature information corresponding to the second similarity as the second micro-expression type corresponding to the second expression feature information in the preset time interval.
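The similarity comparison against the preset micro-expression library can be sketched as follows. Cosine similarity and the 0.8 threshold are assumed stand-ins; the patent only requires "a preset threshold" and leaves the similarity measure open.

```python
import numpy as np

def classify_micro_expression(features, library, threshold=0.8):
    """Compare one expression-feature vector against every preset
    micro-expression feature vector in the library; return the type of
    the best match whose similarity exceeds the threshold, else None.
    The library maps type names to preset feature vectors."""
    best_type, best_sim = None, threshold
    for expr_type, preset in library.items():
        sim = float(np.dot(features, preset) /
                    (np.linalg.norm(features) * np.linalg.norm(preset)))
        if sim > best_sim:                 # keep the strongest match
            best_type, best_sim = expr_type, sim
    return best_type
```

The same call would be made once with the first expression feature information (yielding the first micro-expression type) and once with the second (yielding the second micro-expression type) in each preset time interval.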
In one embodiment, the evaluation result determining module 810 is further configured to: obtain the first micro-expression type in each preset time interval; determine a micro-expression evaluation result of the target object in the first target image according to the first micro-expression type in each preset time interval; obtain the second micro-expression type corresponding to each preset time interval; determine a micro-expression evaluation result of the target object in the second target image according to the second micro-expression type in each preset time interval; and determine a target evaluation result according to the micro-expression evaluation result of the target object in the first target image and the micro-expression evaluation result of the target object in the second target image.
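The final combination of the two per-target micro-expression evaluation results into a target evaluation result is not spelled out in the disclosure; one plausible rule, shown purely as an assumption, is:

```python
def target_evaluation(first_result, second_result):
    """Combine the micro-expression evaluation results of the target
    objects in the first and second target images into one target
    evaluation result. This agreement-based rule is a hypothetical
    illustration; the patent does not fix the combination rule."""
    if first_result == second_result:
        return first_result          # both agree: keep the shared label
    return "mixed"                   # disagreement: flagged for review
```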
In one embodiment, the evaluation result determining module 810 is further configured to: obtain the number of first micro-expression types in each preset time interval that are positive-energy micro-expression types; obtain the number of first micro-expression types in each preset time interval that are negative-energy micro-expression types; calculate a micro-expression type score according to the number of positive-energy micro-expression types and the number of negative-energy micro-expression types; and when the micro-expression type score meets a preset ratio, determine that the micro-expression evaluation result of the target object in the first target image is a positive-energy micro-expression, and otherwise determine that the micro-expression evaluation result of the target object in the first target image is a negative-energy micro-expression, wherein the micro-expression evaluation result is at least one of a positive-energy micro-expression and a negative-energy micro-expression.
In one embodiment, the evaluation result determining module 810 is further configured to: obtain the number of second micro-expression types in each preset time interval that are positive-energy micro-expression types; obtain the number of second micro-expression types in each preset time interval that are negative-energy micro-expression types; and when the number of positive-energy micro-expression types in each preset time interval is greater than or equal to the number of negative-energy micro-expression types, determine that the micro-expression evaluation result of the target object in the second target image is a positive-energy micro-expression, and otherwise determine that the micro-expression evaluation result of the target object in the second target image is a negative-energy micro-expression.
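The two counting rules above (the preset-ratio rule for the first target image and the count-comparison rule for the second target image) can be sketched together; the string labels and the score definition are illustrative assumptions:

```python
def evaluate_by_count(types, positive_ratio=None):
    """Evaluate a sequence of per-interval micro-expression types.
    With positive_ratio set, apply the first-target rule: the share of
    positive-energy types must meet the preset ratio. Without it,
    apply the second-target rule: positive count >= negative count."""
    pos = sum(1 for t in types if t == "positive")
    neg = sum(1 for t in types if t == "negative")
    if positive_ratio is not None:                  # first-target rule
        score = pos / (pos + neg) if (pos + neg) else 0.0
        return "positive" if score >= positive_ratio else "negative"
    return "positive" if pos >= neg else "negative"  # second-target rule
```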
For specific limitations of the micro-expression-based evaluation apparatus, reference may be made to the limitations of the micro-expression-based evaluation method above; details are not repeated here. Each module in the above micro-expression-based evaluation apparatus may be implemented wholly or partly by software, hardware, or a combination thereof. The modules may be embedded in, or independent of, a processor in a computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke and execute the operations corresponding to the modules.
In one embodiment, a computer device is provided. The computer device may be a server, and its internal structure may be as shown in Figure 9. The computer device includes a processor, a memory, a network interface, and a database connected through a system bus. The processor of the computer device provides computing and control capability. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The database of the computer device stores the data in the preset micro-expression library. The network interface of the computer device communicates with an external terminal through a network connection. When executed by the processor, the computer program implements a micro-expression-based evaluation method.
In one embodiment, a computer device is provided. The computer device may be a terminal, and its internal structure may be as shown in Figure 10. The computer device includes a processor, a memory, a network interface, a display screen, and an input apparatus connected through a system bus. The processor of the computer device provides computing and control capability. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The network interface of the computer device communicates with an external terminal through a network connection. When executed by the processor, the computer program implements a micro-expression-based evaluation method. The display screen of the computer device may be a liquid crystal display or an electronic-ink display; the input apparatus of the computer device may be a touch layer covering the display screen, a key, trackball, or trackpad arranged on the housing of the computer device, or an external keyboard, trackpad, or mouse.
Those skilled in the art will understand that the structures shown in Figures 9 and 10 are merely block diagrams of partial structures relevant to the solution of the present application and do not limit the computer device to which the solution is applied; a specific computer device may include more or fewer components than shown in the figures, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, including a memory and a processor. The memory stores a computer program which, when executed by the processor, causes the processor to perform the steps of the above micro-expression-based evaluation method. The steps of the micro-expression-based evaluation method here may be the steps of the micro-expression-based evaluation method of each of the above embodiments.
In one embodiment, a computer-readable storage medium is provided, storing a computer program which, when executed by a processor, causes the processor to perform the steps of the above micro-expression-based evaluation method. The steps of the micro-expression-based evaluation method here may be the steps of the micro-expression-based evaluation method of each of the above embodiments.
Those of ordinary skill in the art will understand that all or part of the processes in the methods of the above embodiments may be completed by instructing relevant hardware through a computer program. The computer program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the embodiments of the above methods. Any reference to memory, storage, a database, or other media used in the embodiments provided in the present application may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features contains no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their descriptions are relatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be noted that those of ordinary skill in the art may make various modifications and improvements without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of the present application patent shall be subject to the appended claims.
Claims (10)
1. A micro-expression-based evaluation method, the method comprising:
obtaining a first target image and a second target image in a plurality of preset time intervals, wherein the target objects of the first target image and the second target image are different;
obtaining corresponding first expression feature information and second expression feature information respectively according to the first target image and the second target image in each preset time interval;
comparing the first expression feature information and the second expression feature information in each preset time interval respectively with a preset micro-expression library, respectively obtaining a first comparison result and a second comparison result in each preset time interval;
determining a corresponding first micro-expression type and second micro-expression type according to the first comparison result and the second comparison result in each preset time interval; and
determining an evaluation result according to the first micro-expression type and the second micro-expression type in each preset time interval.
2. The method according to claim 1, wherein obtaining the first target image and the second target image in the plurality of preset time intervals comprises:
obtaining a first image collected in each preset time interval, the first image including a first target object;
obtaining a second image collected in each preset time interval, the second image including a second target object, wherein the first target object and the second target object are different;
performing face recognition on the first target object in the first image corresponding to each preset time interval to obtain a first face image region corresponding to each preset time interval, and taking the first face image region corresponding to each preset time interval as the first target image in each preset time interval; and
performing face recognition on the second target object in the second image corresponding to each preset time interval to obtain a second face image region corresponding to each preset time interval, and taking the second face image region corresponding to each preset time interval as the second target image in each preset time interval.
3. The method according to claim 1, wherein obtaining the corresponding first expression feature information and second expression feature information respectively according to the first target image and the second target image in each preset time interval comprises:
obtaining the first target image and the second target image in each preset time interval;
performing image feature extraction on the first target image in each preset time interval according to a preset feature extraction algorithm to obtain the first expression feature information in each preset time interval; and
performing image feature extraction on the second target image in each preset time interval according to the preset feature extraction algorithm to obtain the second expression feature information in each preset time interval.
4. The method according to claim 3, wherein performing image feature extraction on the first target image in each preset time interval according to the preset feature extraction algorithm to obtain the first expression feature information in each preset time interval, and performing image feature extraction on the second target image in each preset time interval according to the preset feature extraction algorithm to obtain the second expression feature information in each preset time interval, comprises:
segmenting the first target image and the second target image in each preset time interval to respectively obtain a corresponding facial-feature region, nasolabial-fold region, and eyelid region;
performing contour recognition on the facial-feature regions corresponding to the first target image and the second target image in each preset time interval according to the preset feature extraction algorithm, respectively obtaining first contour feature information corresponding to each first target image and second contour feature information corresponding to each second target image;
performing texture analysis on the nasolabial-fold regions corresponding to the first target image and the second target image in each preset time interval according to the preset feature extraction algorithm, respectively obtaining first texture feature information corresponding to each first target image and second texture feature information corresponding to each second target image;
obtaining first area feature information of the eyelid region corresponding to the first target image in each preset time interval, and second area feature information of the eyelid region corresponding to the second target image in each preset time interval;
determining the corresponding first expression feature information according to the first contour feature information, first texture feature information, and first area feature information of each first target image; and
determining the corresponding second expression feature information according to the second contour feature information, second texture feature information, and second area feature information of each second target image.
5. The method according to claim 1, wherein comparing the first expression feature information and the second expression feature information in each preset time interval respectively with the preset micro-expression library, respectively obtaining the first comparison result and the second comparison result in each preset time interval, and determining the corresponding first micro-expression type and second micro-expression type according to the first comparison result and the second comparison result in each preset time interval, comprises:
performing similarity calculation between the first expression feature information and the second expression feature information in each preset time interval and each preset micro-expression feature information in the preset micro-expression library, respectively, obtaining a first similarity corresponding to each piece of first expression feature information and a second similarity corresponding to each piece of second expression feature information;
when the first similarity is greater than a preset threshold, determining the type of the preset micro-expression feature information corresponding to the first similarity as the first micro-expression type corresponding to the first expression feature information in the preset time interval; and
when the second similarity is greater than the preset threshold, determining the type of the preset micro-expression feature information corresponding to the second similarity as the second micro-expression type corresponding to the second expression feature information in the preset time interval.
6. The method according to claim 1, wherein determining the evaluation result according to the first micro-expression type and the second micro-expression type in each preset time interval comprises:
obtaining the first micro-expression type in each preset time interval;
determining a micro-expression evaluation result of the target object in the first target image according to the first micro-expression type in each preset time interval;
obtaining the second micro-expression type in each preset time interval, wherein the first micro-expression type corresponds to the first target image in each preset time interval and the second micro-expression type corresponds to the second target image in each preset time interval;
determining a micro-expression evaluation result of the target object in the second target image according to the second micro-expression type in each preset time interval; and
determining a target evaluation result according to the micro-expression evaluation result of the target object in the first target image and the micro-expression evaluation result of the target object in the second target image.
7. The method according to claim 6, wherein the first micro-expression type and the second micro-expression type include a positive-energy micro-expression type and a negative-energy micro-expression type, and determining the micro-expression evaluation result of the target object in the first target image according to the first micro-expression type in each preset time interval comprises:
obtaining the number of first micro-expression types in each preset time interval that are the positive-energy micro-expression type;
obtaining the number of first micro-expression types in each preset time interval that are the negative-energy micro-expression type;
calculating a micro-expression type score according to the number of positive-energy micro-expression types and the number of negative-energy micro-expression types; and
when the micro-expression type score meets a preset ratio, determining that the micro-expression evaluation result of the target object in the first target image is a positive-energy micro-expression, and otherwise determining that the micro-expression evaluation result of the target object in the first target image is a negative-energy micro-expression, wherein the micro-expression evaluation result is at least one of a positive-energy micro-expression and a negative-energy micro-expression;
and determining the micro-expression evaluation result of the target object in the second target image according to the second micro-expression type in each preset time interval comprises:
obtaining the number of second micro-expression types in each preset time interval that are the positive-energy micro-expression type;
obtaining the number of second micro-expression types in each preset time interval that are the negative-energy micro-expression type; and
when the number of positive-energy micro-expression types in each preset time interval is greater than or equal to the number of negative-energy micro-expression types, determining that the micro-expression evaluation result of the target object in the second target image is a positive-energy micro-expression, and otherwise determining that the micro-expression evaluation result of the target object in the second target image is a negative-energy micro-expression.
8. A micro-expression-based evaluation apparatus, wherein the apparatus comprises:
a target image obtaining module, configured to obtain a first target image and a second target image in a plurality of preset time intervals, wherein the target objects of the first target image and the second target image are different;
an expression feature information obtaining module, configured to obtain corresponding first expression feature information and second expression feature information respectively according to the first target image and the second target image in each preset time interval;
an expression feature information processing module, configured to compare the first expression feature information and the second expression feature information in each preset time interval respectively with a preset micro-expression library, respectively obtaining a first comparison result and a second comparison result in each preset time interval;
a micro-expression type determining module, configured to determine a corresponding first micro-expression type and second micro-expression type according to the first comparison result and the second comparison result in each preset time interval; and
an evaluation result determining module, configured to determine an evaluation result according to the first micro-expression type and the second micro-expression type in each preset time interval.
9. A computer device, comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811549282.5A CN109697421A (en) | 2018-12-18 | 2018-12-18 | Evaluation method, device, computer equipment and storage medium based on micro- expression |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109697421A true CN109697421A (en) | 2019-04-30 |
Family
ID=66231824
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811549282.5A Pending CN109697421A (en) | 2018-12-18 | 2018-12-18 | Evaluation method, device, computer equipment and storage medium based on micro- expression |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110222597A (en) * | 2019-05-21 | 2019-09-10 | 平安科技(深圳)有限公司 | The method and device that screen is shown is adjusted based on micro- expression |
CN110222597B (en) * | 2019-05-21 | 2023-09-22 | 平安科技(深圳)有限公司 | Method and device for adjusting screen display based on micro-expressions |
WO2021054889A1 (en) * | 2019-09-19 | 2021-03-25 | Arctan Analytics Pte. Ltd. | A system and method for assessing customer satisfaction from a physical gesture of a customer |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||