CN109686418A - Facial paralysis degree evaluation method, apparatus, electronic equipment and storage medium - Google Patents
- Publication number: CN109686418A
- Application number: CN201811534193.3A
- Authority
- CN
- China
- Prior art keywords
- target face
- value
- area
- evaluation
- state
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/30—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
Abstract
The present application proposes a facial paralysis degree evaluation method, apparatus, electronic device and storage medium, relating to the field of virtual rehabilitation. The method comprises: obtaining a first score value of a target face in a static state according to the facial coordinate values of the target face in the static state; obtaining a second score value and a third score value of the target face according to the facial coordinate values of the target face in a motion state, where the motion state includes at least one of raising the forehead, closing the eyes, wrinkling the nose, showing the teeth and pursing the lips, the second score value characterizes the local scoring of the facial expression while the target face is in motion, and the third score value characterizes the linkage scoring of the facial expression while the target face is in motion; and obtaining a facial paralysis degree evaluation value of the target face according to the first score value, the second score value and the third score value. The facial paralysis degree evaluation method, apparatus, electronic device and storage medium provided by the present application can make the evaluation of the degree of facial paralysis more objective.
Description
Technical field
This application relates to the field of virtual rehabilitation, and in particular to a facial paralysis degree evaluation method, apparatus, electronic device and storage medium.
Background technique
Facial paralysis can leave the patient with a crooked mouth and eyes, impair the patient's normal facial expression, and even damage the patient's appearance, with a strongly negative effect on the patient's psychological health and an obstacle to the patient's social interaction. China has a large number of facial paralysis patients and the harm caused by facial paralysis is serious: the incidence is rising year by year, the condition occurs in all age groups, and, owing to the increasing social and work pressure on young people, onset is trending younger.

Facial paralysis negatively affects the physical and mental health of patients and their daily life and work. Functionally, it impairs the eye and the mouth: in severe cases vision is impaired or the eye becomes infected, and chewing difficulties affect eating. Moreover, the defect in appearance leads to feelings of inferiority, social withdrawal, depression and the like. With early treatment, a facial paralysis patient can recover fully. During treatment, the rehabilitation physician must assess the patient's facial paralysis, formulate a targeted facial paralysis rehabilitation exercise prescription accordingly, and observe the patient's subsequent recovery through further assessment.
Summary of the invention
The purpose of the present application is to provide a facial paralysis degree evaluation method, apparatus, electronic device and storage medium that can make the evaluation of the degree of facial paralysis more objective.

To achieve the above purpose, the technical solutions adopted by the embodiments of the present application are as follows:
In a first aspect, an embodiment of the present application provides a facial paralysis degree evaluation method, applied to evaluating the degree of facial paralysis of a target face. The method comprises:

obtaining a first score value of the target face in a static state according to the facial coordinate values of the target face in the static state;

obtaining a second score value and a third score value of the target face according to the facial coordinate values of the target face in a motion state, where the motion state includes at least one of raising the forehead, closing the eyes, wrinkling the nose, showing the teeth and pursing the lips, the second score value characterizes the local scoring of the facial expression while the target face is in motion, and the third score value characterizes the linkage scoring of the facial expression while the target face is in motion;

obtaining a facial paralysis degree evaluation value of the target face according to the first score value, the second score value and the third score value.
In a second aspect, an embodiment of the present application provides a facial paralysis degree evaluation apparatus, applied to evaluating the degree of facial paralysis of a target face. The apparatus comprises:

a first processing module, configured to obtain a first score value of the target face in a static state according to the facial coordinate values of the target face in the static state;

a second processing module, configured to obtain a second score value and a third score value of the target face according to the facial coordinate values of the target face in a motion state, where the motion state includes at least one of raising the forehead, closing the eyes, wrinkling the nose, showing the teeth and pursing the lips, the second score value characterizes the local scoring of the facial expression while the target face is in motion, and the third score value characterizes the linkage scoring of the facial expression while the target face is in motion;

a third processing module, configured to obtain a facial paralysis degree evaluation value of the target face according to the first score value, the second score value and the third score value.
In a third aspect, an embodiment of the present application provides an electronic device comprising a memory for storing one or more programs, and a processor; when the one or more programs are executed by the processor, the above facial paralysis degree evaluation method is implemented.

In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the above facial paralysis degree evaluation method.
Compared with the prior art, the facial paralysis degree evaluation method, apparatus, electronic device and storage medium provided by the embodiments of the present application obtain a facial paralysis degree evaluation value characterizing the degree of facial paralysis of the target face from the first score value obtained while the target face is in a static state, the second score value characterizing the local scoring of the facial expression obtained while the target face is in a motion state, and the third score value characterizing the linkage scoring of the facial expression obtained while the target face is in a motion state. This avoids the influence of subjective human factors when evaluating a patient's degree of facial paralysis, making the evaluation of the degree of facial paralysis more objective.
To make the above objects, features and advantages of the present application clearer and easier to understand, preferred embodiments are described in detail below with reference to the accompanying drawings.
Brief description of the drawings
To explain the technical solutions of the embodiments of the present application more clearly, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings show only some embodiments of the application and are therefore not to be regarded as limiting its scope; a person of ordinary skill in the art can obtain other relevant drawings from these drawings without creative effort.
Fig. 1 shows a schematic diagram of an electronic device provided by an embodiment of the present application;

Fig. 2 shows a schematic flow chart of a facial paralysis degree evaluation method provided by an embodiment of the present application;

Fig. 3 is a schematic diagram of a facial feature point set distribution model;

Fig. 4 is a schematic flow chart of the sub-steps of S101 in Fig. 2;

Fig. 5 is a schematic flow chart of the sub-steps of S103 in Fig. 2;

Fig. 6 is another schematic flow chart of the sub-steps of S103 in Fig. 2;

Fig. 7 shows another schematic flow chart of the facial paralysis degree evaluation method provided by an embodiment of the present application;

Fig. 8 is a schematic diagram of face rotation angles;

Fig. 9 shows a schematic diagram of a facial paralysis degree evaluation apparatus provided by an embodiment of the present application.

In the figures: 100 - electronic device; 101 - memory; 102 - processor; 103 - storage controller; 104 - peripheral interface; 105 - radio frequency unit; 106 - communication bus/signal line; 107 - camera unit; 108 - display unit; 200 - facial paralysis degree evaluation apparatus; 201 - preprocessing module; 202 - first processing module; 203 - second processing module; 204 - third processing module.
Specific embodiment
To make the purposes, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. The components of the embodiments of the present application, as generally described and illustrated in the drawings herein, may be arranged and designed in a variety of different configurations.

Therefore, the following detailed description of the embodiments provided in the drawings is not intended to limit the claimed scope of the present application, but merely represents selected embodiments of the application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without creative effort fall within the scope of protection of the present application.
It should also be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be further defined or explained in subsequent drawings. Meanwhile, in the description of the present application, the terms "first", "second" and the like are used only to distinguish descriptions and are not to be understood as indicating or implying relative importance.

It should be noted that, in this document, relational terms such as first and second are used merely to distinguish one entity or operation from another entity or operation, without necessarily requiring or implying any actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or device. In the absence of further restrictions, an element qualified by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article or device that includes it.
Some embodiments of the present application are described in detail below with reference to the accompanying drawings. In the absence of conflict, the following embodiments and the features therein may be combined with one another.
Clinically used evaluation standards mainly include the House-Brackmann facial nerve grading system, the Nottingham facial nerve grading system and the like: by observing the characteristics of the patient's facial state, the rehabilitation physician manually classifies the patient as normal, mild dysfunction, moderate dysfunction, moderately severe dysfunction, severe dysfunction, total dysfunction and so on. However, when different physicians manually assess the same patient according to the same assessment standard, the assessment results are often inconsistent.
Based on the above drawbacks, a possible implementation provided by the embodiments of the present application is as follows: from the first score value obtained while the target face is in a static state, the second score value characterizing the local scoring of the facial expression obtained while the target face is in a motion state, and the third score value characterizing the linkage scoring of the facial expression obtained while the target face is in a motion state, a facial paralysis degree evaluation value characterizing the degree of facial paralysis of the target face is finally obtained.
Referring to Fig. 1, Fig. 1 shows a schematic structural diagram of an electronic device 100 provided by an embodiment of the present application. In this embodiment, the electronic device 100 may be, but is not limited to, a smartphone, a personal computer (PC), a tablet computer, a laptop, a personal digital assistant (PDA) or the like. The electronic device 100 includes a memory 101, a storage controller 103, one or more processors 102 (only one is shown in the figure), a peripheral interface 104, a radio frequency unit 105, a camera unit 107, a display unit 108 and the like. These components communicate with one another via one or more communication buses/signal lines 106.
The memory 101 may be used to store software programs and modules, such as the program instructions/modules corresponding to the facial paralysis degree evaluation apparatus 200 provided by the embodiments of the present application. By running the software programs and modules stored in the memory 101, the processor 102 executes various functional applications and image processing, such as the facial paralysis degree evaluation method provided by the embodiments of the present application. The memory 101 may be, but is not limited to, random access memory (RAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), etc.
The processor 102 may be an integrated circuit chip with signal processing capability. The processor 102 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), a speech processor, a video processor and the like; it may also be a digital signal processor, an application-specific integrated circuit, a field-programmable gate array or another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. It can implement or execute the methods, steps and logic diagrams disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor, or the processor 102 may be any conventional processor.
The peripheral interface 104 couples various input/output devices to the processor 102 and the memory 101. In some possible implementations, the peripheral interface 104, the processor 102 and the storage controller 103 may be implemented in a single chip; in other possible implementations of the present application, they may each be implemented by an independent chip.
The radio frequency unit 105 is used to receive and transmit electromagnetic waves and to convert between electromagnetic waves and electrical signals, thereby communicating with a communication network or other devices.

The camera unit 107 is used to capture pictures so that the processor 102 can process the captured photos.

The display unit 108 provides an image output interface for the user and displays image information, so that the user can obtain the facial paralysis degree evaluation result.
It can be understood that the structure shown in Fig. 1 is only illustrative: the electronic device 100 may include more or fewer components than shown in Fig. 1, or have a configuration different from that shown in Fig. 1. Each component shown in Fig. 1 may be implemented in hardware, software or a combination thereof.
Referring to Fig. 2, Fig. 2 shows a schematic flow chart of a facial paralysis degree evaluation method provided by an embodiment of the present application. The method is applied to evaluating the degree of facial paralysis of a target face: when the degree of facial paralysis of a patient is evaluated, a facial image of the patient is acquired, the respective score values of the facial muscles of the target face in the static state and in motion states are obtained according to the coordinate values of the patient's target face in the image, and the evaluation of the patient's degree of facial paralysis is obtained from these score values.
Optionally, as a possible implementation, the facial paralysis degree evaluation method includes the following steps:

S101: obtain a first score value of the target face in the static state according to the facial coordinate values of the target face in the static state.
In general, most facial paralysis patients have unilateral facial paralysis, so the face can be divided into an affected side and a healthy side: the affected side is the side on which facial paralysis occurs, and the healthy side is the side on which it does not. That is, even for a facial paralysis patient, half of the facial area is normal; the entire facial area is not paralyzed.

Therefore, based on the above symptom characteristics of facial paralysis patients, the target face in the image acquired by the electronic device 100 is divided into a first area and a second area. Using a feature point set model in the electronic device 100, the coordinate value of each feature point of the target face in the image is obtained. Then, from the image of the target face in the static state acquired by the electronic device 100 and the facial coordinate value of each facial feature point of the target face in the static state, the first score value of the target face in the static state is obtained, so as to evaluate the patient's degree of facial paralysis in the static state.
For example, referring to Fig. 3, Fig. 3 is a schematic diagram of a facial feature point set distribution model. The distribution of all the feature points included in the model can be obtained through the open-source Dlib library. From the acquired image of the target face, combined with the facial feature point set distribution model and the Dlib library, the electronic device 100 can obtain the facial coordinate value of each facial feature point of the target face and then calculate the first score value of the target face in the static state.
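The landmark step described here can be sketched in Python. This is a minimal illustration under stated assumptions, not the patent's implementation: it assumes dlib's common 68-point landmark layout (the model file name `shape_predictor_68_face_landmarks.dat` comes from the dlib distribution, not from this document) and splits the points into the two half-face areas by the line through point 27 and point 8, as the text describes below.

```python
# Hypothetical sketch: split 68 dlib-style landmarks into the first area
# (left of the midline) and the second area (right of the midline).
# Indices 27 (top of nose bridge) and 8 (chin tip) follow the common
# 68-point convention; they define the dividing line.

def side_of_midline(pt, p27, p8):
    # Cross product of (p8 - p27) with (pt - p27):
    # positive on one side of the line, negative on the other, 0 on it.
    return (p8[0] - p27[0]) * (pt[1] - p27[1]) - (p8[1] - p27[1]) * (pt[0] - p27[0])

def split_face(landmarks):
    """Partition landmark indices into (first_area, second_area) lists."""
    p27, p8 = landmarks[27], landmarks[8]
    first, second = [], []
    for i, pt in enumerate(landmarks):
        (first if side_of_midline(pt, p27, p8) > 0 else second).append(i)
    return first, second

# Obtaining the landmarks themselves would look like this (requires dlib):
#   detector = dlib.get_frontal_face_detector()
#   predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")
#   shape = predictor(image, detector(image)[0])
#   landmarks = [(shape.part(i).x, shape.part(i).y) for i in range(68)]
```

The split itself is a pure geometric function, so it works on any list of (x, y) tuples regardless of how the landmarks were detected.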
Optionally, in the image acquired by the electronic device 100, the target face is divided into a first area and a second area. For example, in the schematic diagram shown in Fig. 3, with the line through point 27 and point 8 as the boundary, the left side of the target face is taken as the first area and the right side of the target face as the second area. Accordingly, when evaluating the degree of facial paralysis of the target face in the static state, referring to Fig. 4, which is a schematic flow chart of the sub-steps of S101 in Fig. 2, as a possible implementation, S101 includes the following sub-steps:
S101-1: obtain an eyelid evaluation value of the target face in the static state according to the respective eye coordinate values of the first area and the second area.

In the embodiment of the present application, taking the case in Fig. 3 where the first area is the healthy side and the second area is the affected side as an example, the respective eye coordinate values of the first area and the second area are obtained from the facial coordinate value of each facial feature point in the two-dimensional image. For example, the coordinate values of feature points 40 and 41 of the lower eyelid of the first area are D40(x40, y40) and D41(x41, y41) respectively, and the coordinate values of feature points 46 and 47 of the lower eyelid of the second area are D46(x46, y46) and D47(x47, y47) respectively. The arithmetic mean of the ordinates of the two lower-eyelid feature points 40 and 41 in the first area is calculated as L = (y40 + y41)/2, and the arithmetic mean of the ordinates of the two lower-eyelid feature points 46 and 47 in the second area as R = (y46 + y47)/2. If the difference between R and L is greater than a set threshold e1, it is determined that, compared with the first area, the eyelid of the second area droops, and the eyelid evaluation value of the target face is recorded as a preset score, for example 1 point. Conversely, if the difference between R and L is less than or equal to the set threshold e1, it is determined that, compared with the first area, the eyelid of the second area has no obvious droop, and 0 points are recorded.
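A minimal sketch of this eyelid sub-step, assuming a mapping `lm` from feature-point index to (x, y) and an illustrative value for the threshold e1 (neither the container nor the threshold value is specified by the patent):

```python
# Illustrative sketch of S101-1: compare the mean lower-eyelid ordinate
# of the second (affected) area against the first (healthy) area.
# Image ordinates grow downward, so a larger mean ordinate reads as droop.

def eyelid_score(lm, e1):
    L = (lm[40][1] + lm[41][1]) / 2  # mean ordinate, first-area lower eyelid
    R = (lm[46][1] + lm[47][1]) / 2  # mean ordinate, second-area lower eyelid
    return 1 if (R - L) > e1 else 0  # 1: eyelid droop; 0: no obvious droop
```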
S101-2: obtain a nasal evaluation value of the target face in the static state according to the respective nose coordinate values of the first area and the second area.

Similarly, for the evaluation of the patient's nose in the static state, the respective nose coordinate values of the first area and the second area are obtained, and the nasal evaluation value is calculated.

For example, taking feature point 31 and feature point 35 as the nose feature points of the first area and the second area respectively, the nose coordinate values of the first area and the second area are D31(x31, y31) and D35(x35, y35) respectively, and the difference of the ordinates of feature points 31 and 35 is calculated as Δb = y31 - y35. If Δb is greater than a set threshold e2, it is determined that, compared with the first area, the second area has no fold, and the nasal evaluation value is recorded as 2 points. If Δb is less than or equal to the set threshold e2 but greater than a set threshold e3, i.e. e3 < Δb ≤ e2, it is determined that the folds of the first area and the second area are not entirely consistent, and the nasal evaluation value is recorded as 1 point. And if Δb is less than or equal to the set threshold e3, i.e. Δb ≤ e3, it is determined that the folds of the first area and the second area are consistent, and the nasal evaluation value is recorded as 0 points.
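A minimal sketch of this nose sub-step under the same assumptions as above; the thresholds e2 > e3 are illustrative values, not values from the patent:

```python
# Illustrative sketch of S101-2: nasolabial-fold comparison via the
# ordinate difference Δb = y31 - y35 of nose feature points 31 and 35.

def nose_score(lm, e2, e3):
    db = lm[31][1] - lm[35][1]  # Δb
    if db > e2:
        return 2  # fold absent on the affected side
    if db > e3:
        return 1  # folds not entirely consistent (e3 < Δb <= e2)
    return 0      # folds consistent (Δb <= e3)
```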
S101-3: obtain an oral evaluation value of the target face in the static state according to the respective mouth coordinate values of the first area and the second area.

For the evaluation of the patient's mouth in the static state, the respective mouth coordinate values of the first area and the second area are obtained, and the oral evaluation value is then calculated.

For example, taking feature point 48 and feature point 54 as the mouth feature points of the first area and the second area respectively, the mouth coordinate values of the first area and the second area are D48(x48, y48) and D54(x54, y54) respectively, and the absolute difference of the ordinates of feature points 48 and 54 is calculated as Δk = |y48 - y54|. If Δk is greater than a set threshold e4, it is determined that, compared with the first area, the corner of the mouth of the second area tilts upward or downward, and the oral evaluation value is recorded as 1 point. Conversely, if Δk is less than or equal to the set threshold e4, it is determined that, compared with the first area, the corner of the mouth of the second area has no obvious tilt, and the oral evaluation value is recorded as 0 points.
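A minimal sketch of this mouth sub-step under the same assumptions; the threshold e4 is an illustrative value:

```python
# Illustrative sketch of S101-3: mouth-corner tilt via the absolute
# ordinate difference Δk = |y48 - y54| of mouth corners 48 and 54.

def mouth_score(lm, e4):
    dk = abs(lm[48][1] - lm[54][1])  # Δk
    return 1 if dk > e4 else 0       # 1: corner tilts; 0: no obvious tilt
```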
S101-4: obtain the first score value of the target face in the static state according to the eyelid evaluation value, the nasal evaluation value and the oral evaluation value.

Combining the eyelid evaluation value, nasal evaluation value and oral evaluation value calculated above, as a possible implementation, the first score value of the target face in the static state is calculated as:

S = S0 * (S1 + S2 + S3),

where S is the first score value, S1 is the eyelid evaluation value, S2 is the nasal evaluation value, S3 is the oral evaluation value, and S0 is a preset proportionality coefficient; for example, S0 may be set to 5.
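The formula can be written directly as code; the default S0 = 5 follows the example in the text:

```python
# First score value in the static state: S = S0 * (S1 + S2 + S3).
def first_score(s1, s2, s3, s0=5):
    return s0 * (s1 + s2 + s3)
```

With the maximum evaluation values described above (eyelid 1, nose 2, mouth 1) and S0 = 5, the first score value ranges from 0 to 20.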
It is worth noting that, among the sub-steps of S101 above, there is no fixed execution order between S101-1, S101-2 and S101-3: the three sub-steps may sometimes be executed sequentially (for example S101-1, S101-2, S101-3 in turn, or S101-2, S101-1, S101-3 in turn), or sometimes executed concurrently, depending on the specific function or computer program involved.
Please continue to refer to Fig. 2. S103: obtain the second score value and the third score value of the target face according to the facial coordinate values of the target face in a motion state.

To evaluate the patient's degree of facial paralysis, it is necessary to consider not only the score of the target face in the static state, but also the score values while the facial muscles of the target face are in motion, where the motion state of the facial muscles includes at least one of raising the forehead, closing the eyes, wrinkling the nose, showing the teeth and pursing the lips. The second score value characterizes the local scoring of the facial expression while the target face is in motion; for example, the second score value obtained while the nose-wrinkling motion is performed characterizes the local scoring of the nose-wrinkling motion. The third score value characterizes the linkage scoring of the facial expression while the target face is in motion; for example, the third score value obtained during the same nose-wrinkling motion characterizes the evaluation of the linkage status of other regions of the target face, such as the mouth, the eyes or the eyebrows, while the nose-wrinkling motion is performed.
Optionally, it referring to Fig. 5, Fig. 5 is a kind of schematic flow chart of the sub-step of S103 in Fig. 2, is commented for second
The calculating of value, as a kind of possible implementation, S103 includes following sub-step:
S103-1, according to target face in the case where lifting forehead state, first area and the respective eyelid coordinate value of second area
And eyebrow coordinate value, obtain lift volume Motion evaluation value of the target face under motion state.
Continuing the example above, in which the first area is the healthy side and the second area is the affected side: from the image of the target face captured while the patient performs the forehead-lift action, compute the vertical-coordinate differences EBR_42-22 = y_42 − y_22 between feature points 42 and 22, EBR_42-23 = y_42 − y_23 between feature points 42 and 23, EBR_47-24 = y_47 − y_24 between feature points 47 and 24, EBR_46-25 = y_46 − y_25 between feature points 46 and 25, and EBR_45-26 = y_45 − y_26 between feature points 45 and 26, and take their arithmetic mean EBR_n = (EBR_42-22 + EBR_42-23 + EBR_47-24 + EBR_46-25 + EBR_45-26)/5. In the same way, compute the arithmetic mean EBR_or of the same vertical-coordinate differences when the target face is at rest. Then obtain the coordinate values of the eyebrow feature points in the first area — D_17(x_17, y_17), D_18(x_18, y_18), D_19(x_19, y_19), D_20(x_20, y_20) and D_21(x_21, y_21) — and compute the arithmetic mean of their vertical coordinates, lbr = (y_17 + y_18 + y_19 + y_20 + y_21)/5; likewise obtain the coordinate values of the eyebrow feature points in the second area — D_22(x_22, y_22), D_23(x_23, y_23), D_24(x_24, y_24), D_25(x_25, y_25) and D_26(x_26, y_26) — and compute rbr = (y_22 + y_23 + y_24 + y_25 + y_26)/5. The forehead-lift motion evaluation value is then assigned as follows: if EBR_n − EBR_or is less than the set threshold EBR_1, it is 1 point; if EBR_n − EBR_or is greater than or equal to EBR_1 but less than the set threshold EBR_2, it is 2 points; if EBR_n − EBR_or is greater than or equal to EBR_2 but less than the set threshold EBR_3, it is 3 points; if EBR_n − EBR_or is greater than or equal to EBR_3 and |lbr − rbr| exceeds the set threshold T_1, it is 4 points; and if EBR_n − EBR_or is greater than or equal to EBR_3 and |lbr − rbr| is less than or equal to T_1, it is 5 points.
Optionally, as one possible implementation, when computing EBR_n, the maximum value EBR_n-max over multiple consecutive frames — for example, the frames captured over 8 consecutive seconds — may be selected and used as the EBR_n above.
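As an illustration, the forehead-lift scoring above can be sketched in Python. The landmark container (a mapping from feature-point index to an (x, y) pair, numbered as in Fig. 3), the function name and the default threshold values are all hypothetical; real thresholds EBR_1, EBR_2, EBR_3 and T_1 would be calibrated clinically:

```python
def brow_raise_score(pts_motion, pts_rest, ebr_thresholds=(5.0, 10.0, 15.0), t1=4.0):
    # pts_*: mapping from feature-point index to (x, y), numbered as in Fig. 3.
    pairs = [(42, 22), (42, 23), (47, 24), (46, 25), (45, 26)]
    ebr = lambda pts: sum(pts[a][1] - pts[b][1] for a, b in pairs) / len(pairs)
    diff = ebr(pts_motion) - ebr(pts_rest)                    # EBR_n - EBR_or
    lbr = sum(pts_motion[i][1] for i in range(17, 22)) / 5    # first-area brow mean
    rbr = sum(pts_motion[i][1] for i in range(22, 27)) / 5    # second-area brow mean
    ebr1, ebr2, ebr3 = ebr_thresholds
    if diff < ebr1:
        return 1
    if diff < ebr2:
        return 2
    if diff < ebr3:
        return 3
    return 4 if abs(lbr - rbr) > t1 else 5
```

With the frame-selection option above, `pts_motion` would be the frame maximizing EBR_n over the captured sequence.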
S103-2: from the respective eyelid coordinate values of the first area and the second area while the target face is in the eye-closing state, obtain the eye-closing motion evaluation value of the target face in the motion state.
Again taking the first area as the healthy side and the second area as the affected side: from the image of the target face captured while the patient performs the eye-closing action, obtain the coordinate values of feature points 44, 43, 47 and 46 of the upper and lower eyelids in the second area — D_44(x_44, y_44), D_43(x_43, y_43), D_47(x_47, y_47) and D_46(x_46, y_46) — compute the differences EC_47-43 = y_47 − y_43 and EC_46-44 = y_46 − y_44, and take their arithmetic mean EC_n = (EC_47-43 + EC_46-44)/2. Following the same calculation, obtain EC_or from feature points 44, 43, 47 and 46 when the target face is at rest. Also obtain, in the first area, the coordinate values D_38(x_38, y_38) and D_37(x_37, y_37) and compute the arithmetic mean of the vertical coordinates of feature points 38 and 37, lt = (y_38 + y_37)/2; obtain D_40(x_40, y_40) and D_41(x_41, y_41) and compute the arithmetic mean of the vertical coordinates of feature points 40 and 41, lb = (y_40 + y_41)/2. In the second area, obtain D_43(x_43, y_43) and D_44(x_44, y_44) and compute the arithmetic mean of the vertical coordinates of feature points 43 and 44, rt = (y_43 + y_44)/2; obtain D_47(x_47, y_47) and D_46(x_46, y_46) and compute the arithmetic mean of the vertical coordinates of feature points 47 and 46, rb = (y_47 + y_46)/2. The eye-closing motion evaluation value is then assigned as follows: if EC_n − EC_or is less than the set threshold EC_1, it is 1 point; if EC_n − EC_or is greater than or equal to EC_1 but less than the set threshold EC_2, it is 2 points; if EC_n − EC_or is greater than or equal to EC_2 but less than the set threshold EC_3, it is 3 points; if EC_n − EC_or is greater than or equal to EC_3 and either |lt − rt| or |lb − rb| exceeds the set threshold T_2, it is 4 points; and if EC_n − EC_or is greater than or equal to EC_3 while both |lt − rt| and |lb − rb| are less than or equal to T_2, it is 5 points.
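Each of the five actions shares the same grading skeleton: a motion-minus-rest amplitude checked against three increasing thresholds, followed by a healthy-versus-affected symmetry check that splits the top band into 4 and 5 points. For eye closing, the "either |lt − rt| or |lb − rb| exceeds T_2" branch reduces to taking the larger of the two. A hedged sketch of that shared pattern (the helper name and all threshold values are illustrative, not from the patent):

```python
def grade_action(amplitude_diff, thresholds, asymmetry, sym_threshold):
    # amplitude_diff: motion value minus rest value (e.g. EC_n - EC_or).
    # thresholds: the three increasing amplitude thresholds (e.g. EC_1 < EC_2 < EC_3).
    # asymmetry: the healthy-vs-affected difference
    #            (for eye closing: max(abs(lt - rt), abs(lb - rb))).
    t1, t2, t3 = thresholds
    if amplitude_diff < t1:
        return 1
    if amplitude_diff < t2:
        return 2
    if amplitude_diff < t3:
        return 3
    return 4 if asymmetry > sym_threshold else 5
```

For example, `grade_action(EC_n - EC_or, (EC_1, EC_2, EC_3), max(abs(lt - rt), abs(lb - rb)), T_2)` reproduces the five eye-closing branches above.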
S103-3: from the respective nose coordinate values of the first area and the second area while the target face is in the nose-scrunching state, obtain the nose-scrunch motion evaluation value of the target face in the motion state.
In the same example, in which the first area is the healthy side and the second area is the affected side: from the image of the target face captured while the patient performs the nose-scrunch action, compute in the second area the vertical-coordinate difference NR_n = y_35 − y_27 between feature points 35 and 27; following the same calculation, obtain NR_or from the same feature points when the target face is at rest; and obtain the relative vertical-coordinate difference NR_j = |y_31 − y_35| between feature point 31 in the first area and feature point 35 in the second area. The nose-scrunch motion evaluation value is then assigned as follows: if NR_n − NR_or is less than the set threshold NR_1, it is 1 point; if NR_n − NR_or is greater than or equal to NR_1 but less than the set threshold NR_2, it is 2 points; if NR_n − NR_or is greater than or equal to NR_2 but less than the set threshold NR_3, it is 3 points; if NR_n − NR_or is greater than or equal to NR_3 and NR_j exceeds the set threshold T_3, it is 4 points; and if NR_n − NR_or is greater than or equal to NR_3 and NR_j is less than or equal to T_3, it is 5 points.
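As a self-contained instance of the nose-scrunch grading above (again, the function name and the default thresholds NR_1..NR_3 and T_3 are placeholders, not values from the patent):

```python
def nose_scrunch_score(pts_motion, pts_rest, nr_thresholds=(3.0, 6.0, 9.0), t3=2.0):
    # pts_*: mapping from feature-point index to (x, y).
    nr = lambda pts: pts[35][1] - pts[27][1]           # NR_n (motion) / NR_or (rest)
    diff = nr(pts_motion) - nr(pts_rest)
    nr_j = abs(pts_motion[31][1] - pts_motion[35][1])  # cross-side difference NR_j
    nr1, nr2, nr3 = nr_thresholds
    if diff < nr1:
        return 1
    if diff < nr2:
        return 2
    if diff < nr3:
        return 3
    return 4 if nr_j > t3 else 5
```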
S103-4: from the respective mouth coordinate values of the first area and the second area while the target face is in the teeth-showing state, obtain the teeth-showing motion evaluation value of the target face in the motion state.
In the same example, in which the first area is the healthy side and the second area is the affected side: from the image of the target face captured while the patient performs the teeth-showing action, compute the vertical-coordinate difference MOY_64-27 = y_64 − y_27 and the horizontal-coordinate difference MOX_64-27 = x_64 − x_27 between feature points 64 and 27, the vertical-coordinate difference MOY_63-27 = y_63 − y_27 between feature points 63 and 27, and the vertical-coordinate difference MOY_65-27 = y_65 − y_27 between feature points 65 and 27, and take the arithmetic mean MO_n = (MOY_64-27 + MOX_64-27 + MOY_63-27 + MOY_65-27)/4. Following the same calculation, obtain the arithmetic mean MO_or from the same feature points when the target face is at rest. Also obtain the horizontal coordinate of feature point 48 in the first area, of feature point 54 in the second area and of feature point 27, and compute the horizontal-coordinate differences lmo = x_48 − x_27 between feature points 48 and 27 and rmo = x_54 − x_27 between feature points 54 and 27. The teeth-showing motion evaluation value is then assigned as follows: if MO_n − MO_or is less than the set threshold MO_1, it is 1 point; if MO_n − MO_or is greater than or equal to MO_1 but less than the set threshold MO_2, it is 2 points; if MO_n − MO_or is greater than or equal to MO_2 but less than the set threshold MO_3, it is 3 points; if MO_n − MO_or is greater than or equal to MO_3 and |lmo − rmo| exceeds the set threshold T_4, it is 4 points; and if MO_n − MO_or is greater than or equal to MO_3 and |lmo − rmo| is less than or equal to T_4, it is 5 points.
S103-5: from the respective mouth coordinate values of the first area and the second area while the target face is in the lip-pucker state, obtain the lip-pucker motion evaluation value of the target face in the motion state.
In the same example, in which the first area is the healthy side and the second area is the affected side: from the image of the target face captured while the patient performs the lip-pucker action, compute in the second area the horizontal-coordinate difference MP_n = x_54 − x_27 between feature points 54 and 27; following the same calculation, obtain MP_or from the same feature points when the target face is at rest. Also obtain the horizontal coordinate of feature point 48 in the first area, of feature point 54 in the second area and of nose feature point 27, and compute the differences lmp = x_48 − x_27 between feature points 48 and 27 and rmp = x_54 − x_27 between feature points 54 and 27. The lip-pucker motion evaluation value is then assigned as follows: if MP_n − MP_or is less than the set threshold MP_1, it is 1 point; if MP_n − MP_or is greater than or equal to MP_1 but less than the set threshold MP_2, it is 2 points; if MP_n − MP_or is greater than or equal to MP_2 but less than the set threshold MP_3, it is 3 points; if MP_n − MP_or is greater than or equal to MP_3 and |lmp − rmp| exceeds the set threshold T_5, it is 4 points; and if MP_n − MP_or is greater than or equal to MP_3 and |lmp − rmp| is less than or equal to T_5, it is 5 points.
S103-6: from the forehead-lift motion evaluation value, the eye-closing motion evaluation value, the nose-scrunch motion evaluation value, the teeth-showing motion evaluation value and the lip-pucker motion evaluation value, obtain the second score value of the target face in the motion state.
Combining the forehead-lift, eye-closing, nose-scrunch, teeth-showing and lip-pucker motion evaluation values obtained above, as one possible implementation, the second score value of the target face in the motion state is calculated by the formula:

D = D_0 * (D_1 + D_2 + D_3 + D_4 + D_5),

where D is the second score value, D_1 is the forehead-lift motion evaluation value, D_2 is the eye-closing motion evaluation value, D_3 is the nose-scrunch motion evaluation value, D_4 is the teeth-showing motion evaluation value, D_5 is the lip-pucker motion evaluation value, and D_0 is a preset proportionality coefficient; for example, D_0 may be set to 4.
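With D_0 = 4 and each motion evaluation value ranging from 1 to 5, the second score value D falls between 20 and 100 — presumably the motivation for the example coefficient. A one-line sketch:

```python
def second_score(d1, d2, d3, d4, d5, d0=4):
    # D = D0 * (D1 + D2 + D3 + D4 + D5), with D0 = 4 as the example coefficient.
    return d0 * (d1 + d2 + d3 + d4 + d5)
```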
In facial-paralysis patients, synkinesis means that when one group of facial muscles moves, another group of facial muscles may move involuntarily along with it. Unlike the second score value, the third score value characterizes the linkage scoring of the facial expression while the target face is in the motion state; therefore, in the embodiments of the present application, the third score value can be calculated together with the second score value, the only difference being the coordinate points that enter the calculation.
Optionally, referring to Fig. 6, which is another schematic flow chart of the sub-steps of S103: the target face includes a target area selected by the user in the target face, namely the affected-side region selected by the user — for example, the second area in the example above. For the calculation of the third score value, as one possible implementation, S103 includes the following sub-steps:
S103-7: from the mouth coordinate values of the target area while the target face is in the forehead-lifting state, obtain the forehead-lift linkage evaluation value of the target face in the motion state.
S103-8: from the eyebrow coordinate values of the target area while the target face is in the eye-closing state, obtain the eye-closing linkage evaluation value of the target face in the motion state.
S103-9: from the mouth coordinate values of the target area while the target face is in the nose-scrunching state, obtain the nose-scrunch linkage evaluation value of the target face in the motion state.
S103-10: from the eye coordinate values of the target area while the target face is in the teeth-showing state, obtain the teeth-showing linkage evaluation value of the target face in the motion state.
S103-11: from the eye coordinate values of the target area while the target face is in the lip-pucker state, obtain the lip-pucker linkage evaluation value of the target face in the motion state.
Continuing the example in which the first area is the healthy side and the second area is the affected side, the second area serves as the target area. Taking the calculation of the forehead-lift linkage evaluation value as an example: from the image of the target face captured while the patient performs the forehead-lift action, take mouth feature point 54 and nose feature point 27 of the second area and compute the horizontal-coordinate difference X_27-54 = |x_27 − x_54| and the vertical-coordinate difference Y_27-54 = |y_27 − y_54| between the two feature points; in the same way, compute the corresponding differences X'_27-54 = |x'_27 − x'_54| and Y'_27-54 = |y'_27 − y'_54| when the target face is at rest. The offset of the two feature points in the forehead-lifting state relative to the rest state is then:

F = sqrt((X_27-54 − X'_27-54)^2 + (Y_27-54 − Y'_27-54)^2).

If the value of the offset F is less than the set threshold F_0, the forehead-lift linkage evaluation value is 0 points; if F is greater than or equal to F_0 but less than the set threshold F_1, it is 1 point; if F is greater than or equal to F_1 but less than the set threshold F_2, it is 2 points; and if F is greater than or equal to F_2, it is 3 points.
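A sketch of the offset computation and the 0-to-3 linkage grading above. The offset is taken as the Euclidean distance between the motion-state and rest-state coordinate differences, consistent with the surrounding definitions; function names and the default thresholds F_0 < F_1 < F_2 are illustrative:

```python
import math

def synkinesis_offset(p_motion, q_motion, p_rest, q_rest):
    # p, q: (x, y) of the two feature points (e.g. mouth point 54 and nose point 27).
    dx_m = abs(p_motion[0] - q_motion[0])   # X_27-54 in the motion state
    dy_m = abs(p_motion[1] - q_motion[1])   # Y_27-54 in the motion state
    dx_r = abs(p_rest[0] - q_rest[0])       # X'_27-54 at rest
    dy_r = abs(p_rest[1] - q_rest[1])       # Y'_27-54 at rest
    return math.hypot(dx_m - dx_r, dy_m - dy_r)   # offset F

def linkage_grade(f, f0=1.0, f1=2.0, f2=4.0):
    # 0-3 grading of F against thresholds F_0 < F_1 < F_2.
    if f < f0:
        return 0
    if f < f1:
        return 1
    if f < f2:
        return 2
    return 3
```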
Similarly, the eye-closing linkage evaluation value, the nose-scrunch linkage evaluation value, the teeth-showing linkage evaluation value and the lip-pucker linkage evaluation value are all calculated in the same way as the forehead-lift linkage evaluation value above and are not repeated here; the only difference lies in the feature points used: the forehead-lift linkage evaluation value uses feature points 54 and 27, the eye-closing linkage evaluation value uses feature points 24 and 27, the nose-scrunch linkage evaluation value uses feature points 54 and 27, the teeth-showing linkage evaluation value uses feature points 46 and 27, and the lip-pucker linkage evaluation value uses feature points 46 and 27.
S103-12: from the forehead-lift linkage evaluation value, the eye-closing linkage evaluation value, the nose-scrunch linkage evaluation value, the teeth-showing linkage evaluation value and the lip-pucker linkage evaluation value, obtain the third score value of the target face in the motion state.
Combining the forehead-lift, eye-closing, nose-scrunch, teeth-showing and lip-pucker linkage evaluation values obtained above, as one possible implementation, the third score value of the target face in the motion state is calculated by the formula:

I = I_0 * (I_1 + I_2 + I_3 + I_4 + I_5),

where I is the third score value, I_1 is the forehead-lift linkage evaluation value, I_2 is the eye-closing linkage evaluation value, I_3 is the nose-scrunch linkage evaluation value, I_4 is the teeth-showing linkage evaluation value, I_5 is the lip-pucker linkage evaluation value, and I_0 is a preset proportionality coefficient; for example, I_0 may be set to 1.
Please continue to refer to Fig. 2. S105: obtain the facial-paralysis degree evaluation value of the target face from the first score value, the second score value and the third score value.
Combining the first score value, the second score value and the third score value obtained above, as one possible implementation, the facial-paralysis degree evaluation value is calculated by the formula:

A = D − S − I,

where A is the facial-paralysis degree evaluation value, D is the second score value, S is the first score value, and I is the third score value.
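The composite A = D − S − I thus rewards voluntary movement while penalizing both resting asymmetry and synkinesis, a form that resembles composite clinical facial-grading scales. As a trivial sketch:

```python
def facial_paralysis_score(d, s, i):
    # A = D - S - I: motion score minus rest-state score minus synkinesis score,
    # so resting asymmetry and synkinesis both lower the overall evaluation value.
    return d - s - i
```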
Based on the above design, the facial-paralysis degree evaluation method provided by the embodiments of the present application combines the first score value obtained from the target face in the static state, the second score value characterizing the local scoring of the facial expression obtained from the target face in the motion state, and the third score value characterizing the linkage scoring of the facial expression obtained from the target face in the motion state, and finally obtains a facial-paralysis degree evaluation value characterizing the degree of facial paralysis of the target face. Compared with the prior art, this avoids the influence of subjective human factors when evaluating a patient's degree of facial paralysis and makes the evaluation of the degree of facial paralysis more objective.
When the electronic device 100 performs the above steps to evaluate a patient's degree of facial paralysis, the target face in the acquired image may not be horizontal, which introduces a large error into the above calculations: once the target face is tilted, both the horizontal-coordinate differences and the vertical-coordinate differences between pairs of feature points are reduced.
Therefore, optionally, referring to Fig. 7, which shows another schematic flow chart of the facial-paralysis degree evaluation method provided by the embodiments of the present application, as one possible implementation the method further includes the following step before S101 is executed:

S100: in the acquired multi-frame pictures, rotate the target face so that the target face is horizontal.
In general, even for a facial-paralysis patient, the two feature points at the inner corners of the eyes — feature points 39 and 42 in the schematic diagram shown in Fig. 3 — remain symmetric.

Therefore, optionally, as one possible implementation, when rotating the target face the electronic device 100 takes the angle between the line connecting the two inner-eye-corner points of the target face and the horizontal axis as the rotation angle of the target face.
For example, referring to Fig. 8, which is a schematic diagram of a face rotation angle: connect feature point 39 and feature point 42 to obtain line segment d, apply the arctangent function to the coordinate values of feature points 39 and 42 to obtain the inclination angle α between line segment d and the horizontal line (the x-axis in Fig. 8), and rotate the target face by the angle α so that line segment d is parallel to the horizontal line; the subsequent steps, S101 and onward, are then executed.
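A minimal sketch of this leveling step: the angle α comes from the arctangent of the inner-eye-corner segment, and each landmark is rotated by −α about one corner. In practice the whole frame would be rotated with an affine warp (e.g. OpenCV's `cv2.getRotationMatrix2D` with `cv2.warpAffine`); the helper names here are hypothetical:

```python
import math

def rotation_angle_deg(p39, p42):
    # Inclination of segment d (feature points 39 and 42) relative to the x-axis.
    return math.degrees(math.atan2(p42[1] - p39[1], p42[0] - p39[0]))

def level_point(pt, center, alpha_deg):
    # Rotate a single landmark by -alpha about `center`, making segment d horizontal.
    a = math.radians(-alpha_deg)
    dx, dy = pt[0] - center[0], pt[1] - center[1]
    return (center[0] + dx * math.cos(a) - dy * math.sin(a),
            center[1] + dx * math.sin(a) + dy * math.cos(a))
```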
Based on the above design, the facial-paralysis degree evaluation method provided by the embodiments of the present application rotates the target face in the acquired multi-frame pictures so that the target face is horizontal, thereby improving the computational accuracy of the facial-paralysis degree evaluation of the target face.
Referring to Fig. 9, which shows a schematic diagram of a facial-paralysis degree evaluation device 200 provided by the embodiments of the present application, applied to evaluating the degree of facial paralysis of a target face: the facial-paralysis degree evaluation device 200 includes a first processing module 202, a second processing module 203 and a third processing module 204.

The first processing module 202 is configured to obtain, from the facial coordinate values of the target face in the static state, the first score value of the target face in the static state.
Optionally, as one possible implementation, the first processing module 202 is specifically configured to:

obtain the eyelid evaluation value of the target face in the static state from the respective eye coordinate values of the first area and the second area;

obtain the nose evaluation value of the target face in the static state from the respective nose coordinate values of the first area and the second area;

obtain the mouth evaluation value of the target face in the static state from the respective mouth coordinate values of the first area and the second area;

obtain the first score value of the target face in the static state from the eyelid evaluation value, the nose evaluation value and the mouth evaluation value.
The second processing module 203 is configured to obtain, from the facial coordinate values of the target face in the motion state, the second score value and the third score value of the target face, where the motion state includes at least one of forehead-lifting, eye-closing, nose-scrunching, teeth-showing and lip-puckering, the second score value characterizes the local scoring of the facial expression while the target face is in the motion state, and the third score value characterizes the linkage scoring of the facial expression while the target face is in the motion state.
Optionally, as one possible implementation, the second processing module 203 is specifically configured to:

obtain the forehead-lift motion evaluation value of the target face in the motion state from the respective eyelid coordinate values and eyebrow coordinate values of the first area and the second area while the target face is in the forehead-lifting state;

obtain the eye-closing motion evaluation value of the target face in the motion state from the respective eyelid coordinate values of the first area and the second area while the target face is in the eye-closing state;

obtain the nose-scrunch motion evaluation value of the target face in the motion state from the respective nose coordinate values of the first area and the second area while the target face is in the nose-scrunching state;

obtain the teeth-showing motion evaluation value of the target face in the motion state from the respective mouth coordinate values of the first area and the second area while the target face is in the teeth-showing state;

obtain the lip-pucker motion evaluation value of the target face in the motion state from the respective mouth coordinate values of the first area and the second area while the target face is in the lip-pucker state;

obtain the second score value of the target face in the motion state from the forehead-lift motion evaluation value, the eye-closing motion evaluation value, the nose-scrunch motion evaluation value, the teeth-showing motion evaluation value and the lip-pucker motion evaluation value.
Optionally, as one possible implementation, the target face includes a target area selected by the user in the target face, and the second processing module 203 is further specifically configured to:

obtain the forehead-lift linkage evaluation value of the target face in the motion state from the mouth coordinate values of the target area while the target face is in the forehead-lifting state;

obtain the eye-closing linkage evaluation value of the target face in the motion state from the eyebrow coordinate values of the target area while the target face is in the eye-closing state;

obtain the nose-scrunch linkage evaluation value of the target face in the motion state from the mouth coordinate values of the target area while the target face is in the nose-scrunching state;

obtain the teeth-showing linkage evaluation value of the target face in the motion state from the eye coordinate values of the target area while the target face is in the teeth-showing state;

obtain the lip-pucker linkage evaluation value of the target face in the motion state from the eye coordinate values of the target area while the target face is in the lip-pucker state;

obtain the third score value of the target face in the motion state from the forehead-lift linkage evaluation value, the eye-closing linkage evaluation value, the nose-scrunch linkage evaluation value, the teeth-showing linkage evaluation value and the lip-pucker linkage evaluation value.
The third processing module 204 is configured to obtain the facial-paralysis degree evaluation value of the target face from the first score value, the second score value and the third score value.

Optionally, as one possible implementation, the facial-paralysis degree evaluation value is calculated by the formula:

A = D − S − I,

where A is the facial-paralysis degree evaluation value, D is the second score value, S is the first score value, and I is the third score value.
Optionally, as one possible implementation, and continuing to refer to Fig. 9, the facial-paralysis degree evaluation device 200 further includes a preprocessing module 201, which is configured to rotate the target face in the acquired multi-frame pictures so that the target face is horizontal.

Optionally, as one possible implementation, the preprocessing module 201 is specifically configured to: take the angle between the line connecting the two inner-eye-corner points of the target face and the horizontal axis as the rotation angle of the target face.
In the embodiments provided by the present application, it should be understood that the disclosed device and method may also be implemented in other manners. The device embodiments described above are merely illustrative. For example, the flow charts and block diagrams in the accompanying drawings show the possible architectures, functions and operations of the devices, methods and computer program products according to the embodiments of the present application. In this regard, each box in a flow chart or block diagram may represent a module, a program segment or a part of code, which contains one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions marked in the boxes may occur in an order different from that marked in the drawings. For example, two consecutive boxes may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should further be noted that each box in the block diagrams and/or flow charts, and combinations of boxes in the block diagrams and/or flow charts, may be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions.
In addition, the functional modules in the embodiments of the present application may be integrated together to form one independent part, each module may exist alone, or two or more modules may be integrated to form one independent part.

If the functions are implemented in the form of software functional modules and sold or used as an independent product, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application — in essence, the part that contributes to the prior art, or a part of the technical solution — may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to execute all or some of the steps of the method described in the embodiments of the present application. The aforementioned storage medium includes various media that can store program code, such as a USB flash disk, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk or an optical disk.
In summary, the facial-paralysis degree evaluation method, device, electronic equipment and storage medium provided by the embodiments of the present application combine the first score value obtained from the target face in the static state, the second score value characterizing the local scoring of the facial expression obtained from the target face in the motion state, and the third score value characterizing the linkage scoring of the facial expression obtained from the target face in the motion state to finally obtain a facial-paralysis degree evaluation value characterizing the degree of facial paralysis of the target face. Compared with the prior art, this avoids the influence of subjective human factors when evaluating a patient's degree of facial paralysis and makes the evaluation of the degree of facial paralysis more objective. In addition, the target face in the acquired multi-frame pictures is rotated so that the target face is horizontal, which improves the computational accuracy of the facial-paralysis degree evaluation of the target face.
The above is merely a preferred embodiment of the present application and is not intended to limit the present application; for those skilled in the art, various changes and variations of the present application are possible. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present application shall be included within the scope of protection of the present application.

It will be evident to those skilled in the art that the present application is not limited to the details of the above exemplary embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential characteristics of the present application. Therefore, in every respect, the embodiments are to be regarded as exemplary and non-restrictive; the scope of the present application is defined by the appended claims rather than by the above description, and it is therefore intended that all changes falling within the meaning and scope of equivalency of the claims are embraced in the present application. No reference sign in the claims shall be construed as limiting the claim concerned.
Claims (10)
1. A facial paralysis degree evaluation method, applied to evaluating the degree of facial paralysis of a target face, the method comprising:
obtaining, according to facial coordinate values of the target face in a static state, a first score value of the target face in the static state;
obtaining, according to facial coordinate values of the target face in a motion state, a second score value and a third score value of the target face, wherein the motion state includes at least one of forehead-raising, eye-closing, nose-wrinkling, teeth-showing and lip-pursing, the second score value characterizes a local score of the facial expression while the target face is in the motion state, and the third score value characterizes a linkage score of the facial expression while the target face is in the motion state; and
obtaining a facial paralysis degree evaluation value of the target face according to the first score value, the second score value and the third score value.
2. The method according to claim 1, wherein the target face includes a first area and a second area, and the step of obtaining the first score value of the target face in the static state according to the facial coordinate values of the target face in the static state comprises:
obtaining an eyelid evaluation value of the target face in the static state according to respective eye coordinate values of the first area and the second area;
obtaining a nasal evaluation value of the target face in the static state according to respective nose coordinate values of the first area and the second area;
obtaining a mouth evaluation value of the target face in the static state according to respective mouth coordinate values of the first area and the second area; and
obtaining the first score value of the target face in the static state according to the eyelid evaluation value, the nasal evaluation value and the mouth evaluation value.
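Claim 2 does not fix how each evaluation value is computed from the two areas' coordinate values. Purely as an illustrative assumption (not the patented computation), one plausible reading compares corresponding landmarks of the first and second areas for left–right symmetry:

```python
def region_evaluation(first_area_points, second_area_points):
    """Hypothetical symmetry measure: mean absolute vertical offset
    between corresponding landmarks of the first and second facial
    areas (0 means perfectly symmetric). The claim does not specify
    this computation; it is an illustrative assumption only."""
    pairs = list(zip(first_area_points, second_area_points))
    return sum(abs(p[1] - q[1]) for p, q in pairs) / len(pairs)

# Hypothetical (x, y) landmark coordinates in pixels:
eyelid = region_evaluation([(100, 150)], [(200, 150)])  # -> 0.0
nose = region_evaluation([(130, 220)], [(170, 224)])    # -> 4.0
mouth = region_evaluation([(110, 300)], [(190, 306)])   # -> 6.0
first_score = eyelid + nose + mouth  # one possible aggregation
```

The aggregation into the first score value is likewise unspecified; a simple sum is shown only to make the sketch complete.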
3. The method according to claim 1, wherein the target face includes a first area and a second area, and the step of obtaining the second score value of the target face according to the facial coordinate values of the target face in the motion state comprises:
obtaining a forehead-raising motion evaluation value of the target face in the motion state according to respective eyelid coordinate values and eyebrow coordinate values of the first area and the second area while the target face is in a forehead-raising state;
obtaining an eye-closing motion evaluation value of the target face in the motion state according to respective eyelid coordinate values of the first area and the second area while the target face is in an eye-closing state;
obtaining a nose-wrinkling motion evaluation value of the target face in the motion state according to respective nose coordinate values of the first area and the second area while the target face is in a nose-wrinkling state;
obtaining a teeth-showing motion evaluation value of the target face in the motion state according to respective mouth coordinate values of the first area and the second area while the target face is in a teeth-showing state;
obtaining a lip-pursing motion evaluation value of the target face in the motion state according to respective mouth coordinate values of the first area and the second area while the target face is in a lip-pursing state; and
obtaining the second score value of the target face in the motion state according to the forehead-raising motion evaluation value, the eye-closing motion evaluation value, the nose-wrinkling motion evaluation value, the teeth-showing motion evaluation value and the lip-pursing motion evaluation value.
4. The method according to claim 1, wherein the target face includes a target area selected by a user within the target face, and the step of obtaining the third score value of the target face according to the facial coordinate values of the target face in the motion state comprises:
obtaining a forehead-raising linkage evaluation value of the target face in the motion state according to mouth coordinate values of the target area while the target face is in a forehead-raising state;
obtaining an eye-closing linkage evaluation value of the target face in the motion state according to eyebrow coordinate values of the target area while the target face is in an eye-closing state;
obtaining a nose-wrinkling linkage evaluation value of the target face in the motion state according to mouth coordinate values of the target area while the target face is in a nose-wrinkling state;
obtaining a teeth-showing linkage evaluation value of the target face in the motion state according to eye coordinate values of the target area while the target face is in a teeth-showing state;
obtaining a lip-pursing linkage evaluation value of the target face in the motion state according to eye coordinate values of the target area while the target face is in a lip-pursing state; and
obtaining the third score value of the target face in the motion state according to the forehead-raising linkage evaluation value, the eye-closing linkage evaluation value, the nose-wrinkling linkage evaluation value, the teeth-showing linkage evaluation value and the lip-pursing linkage evaluation value.
5. The method according to claim 1, wherein the facial paralysis degree evaluation value is calculated from the first score value, the second score value and the third score value by the formula:
A = D - S - I,
wherein A is the facial paralysis degree evaluation value, D is the second score value, S is the first score value, and I is the third score value.
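The combination step of claim 5 can be sketched as follows (the numeric score values below are hypothetical and for illustration only; how D, S and I are obtained is defined in claims 2-4):

```python
def facial_paralysis_degree(d: float, s: float, i: float) -> float:
    """A = D - S - I, where D is the second (motion, local) score
    value, S is the first (static) score value, and I is the third
    (linkage) score value."""
    return d - s - i

# Hypothetical score values for illustration only:
a = facial_paralysis_degree(d=80.0, s=15.0, i=10.0)  # -> 55.0
```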
6. The method according to claim 1, wherein before the step of obtaining the first score value of the target face in the static state according to the facial coordinate values of the target face in the static state, the method further comprises:
rotating the target face in the captured multi-frame pictures so that the target face is in a horizontal state.
7. The method according to claim 6, wherein the step of rotating the target face in the captured multi-frame pictures comprises:
taking the angle between the line connecting the two inner eye corner points of the target face and the horizontal (abscissa) axis as the rotation angle of the target face.
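A minimal sketch of the rotation-angle step of claims 6-7, assuming the two inner eye corner landmarks have already been located (the coordinates below are hypothetical; a real implementation would obtain them from a facial landmark detector):

```python
import math

def face_rotation_angle(left_inner_eye, right_inner_eye):
    """Angle (in degrees) between the line joining the two inner eye
    corner points and the horizontal (abscissa) axis; claim 7 takes
    this angle as the rotation angle of the target face."""
    dx = right_inner_eye[0] - left_inner_eye[0]
    dy = right_inner_eye[1] - left_inner_eye[1]
    return math.degrees(math.atan2(dy, dx))

# Hypothetical landmark positions (x, y) in pixels:
angle = face_rotation_angle((120, 200), (180, 210))
# Rotating each frame by -angle brings the face to a horizontal state.
```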
8. A facial paralysis degree evaluation apparatus, applied to evaluating the degree of facial paralysis of a target face, the apparatus comprising:
a first processing module, configured to obtain, according to facial coordinate values of the target face in a static state, a first score value of the target face in the static state;
a second processing module, configured to obtain, according to facial coordinate values of the target face in a motion state, a second score value and a third score value of the target face, wherein the motion state includes at least one of forehead-raising, eye-closing, nose-wrinkling, teeth-showing and lip-pursing, the second score value characterizes a local score of the facial expression while the target face is in the motion state, and the third score value characterizes a linkage score of the facial expression while the target face is in the motion state; and
a third processing module, configured to obtain a facial paralysis degree evaluation value of the target face according to the first score value, the second score value and the third score value.
9. An electronic device, comprising:
a memory for storing one or more programs; and
a processor;
wherein, when the one or more programs are executed by the processor, the method according to any one of claims 1-7 is implemented.
10. A computer-readable storage medium on which a computer program is stored, wherein, when the computer program is executed by a processor, the method according to any one of claims 1-7 is implemented.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811534193.3A CN109686418A (en) | 2018-12-14 | 2018-12-14 | Facial paralysis degree evaluation method, apparatus, electronic equipment and storage medium |
PCT/CN2019/123478 WO2020119584A1 (en) | 2018-12-14 | 2019-12-06 | Facial paralysis degree evaluation method and apparatus, electronic device and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109686418A true CN109686418A (en) | 2019-04-26 |
Family
ID=66187741
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811534193.3A Pending CN109686418A (en) | 2018-12-14 | 2018-12-14 | Facial paralysis degree evaluation method, apparatus, electronic equipment and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN109686418A (en) |
WO (1) | WO2020119584A1 (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020119584A1 (en) * | 2018-12-14 | 2020-06-18 | 深圳先进技术研究院 | Facial paralysis degree evaluation method and apparatus, electronic device and storage medium |
CN111553250A (en) * | 2020-04-25 | 2020-08-18 | 深圳德技创新实业有限公司 | Accurate facial paralysis degree evaluation method and device based on face characteristic points |
CN111680545A (en) * | 2020-04-25 | 2020-09-18 | 深圳德技创新实业有限公司 | Semantic segmentation based accurate facial paralysis degree evaluation method and device |
CN112001213A (en) * | 2020-04-25 | 2020-11-27 | 深圳德技创新实业有限公司 | Accurate facial paralysis degree evaluation method and device based on 3D point cloud segmentation |
CN112001213B (en) * | 2020-04-25 | 2024-04-12 | 深圳德技创新实业有限公司 | Accurate facial paralysis degree evaluation method and device based on 3D point cloud segmentation |
CN112466437A (en) * | 2020-11-03 | 2021-03-09 | 桂林医学院附属医院 | Apoplexy information processing system |
CN113327247A (en) * | 2021-07-14 | 2021-08-31 | 中国科学院深圳先进技术研究院 | Facial nerve function evaluation method and device, computer equipment and storage medium |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113343927B (en) * | 2021-07-03 | 2023-06-23 | 郑州铁路职业技术学院 | Intelligent face recognition method and system suitable for facial paralysis patient |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016083826A1 (en) * | 2014-11-28 | 2016-06-02 | The Nottingham Trent University | Facial exercise system |
CN107713984A (en) * | 2017-02-07 | 2018-02-23 | 王俊 | Facial paralysis objective evaluation method and its system |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4756589B2 (en) * | 2005-12-07 | 2011-08-24 | ソニー株式会社 | Image processing apparatus and method, program, and recording medium |
CN106980815A (en) * | 2017-02-07 | 2017-07-25 | 王俊 | Objective facial paralysis evaluation method based on supervised House-Brackmann (H-B) grade scoring |
CN109686418A (en) * | 2018-12-14 | 2019-04-26 | 深圳先进技术研究院 | Facial paralysis degree evaluation method, apparatus, electronic equipment and storage medium |
-
2018
- 2018-12-14 CN CN201811534193.3A patent/CN109686418A/en active Pending
-
2019
- 2019-12-06 WO PCT/CN2019/123478 patent/WO2020119584A1/en active Application Filing
Non-Patent Citations (1)
Title |
---|
HAN Zhenzhen et al.: "Application and Analysis of Scales in the Treatment of Facial Paralysis", *Tianjin Journal of Traditional Chinese Medicine* |
Also Published As
Publication number | Publication date |
---|---|
WO2020119584A1 (en) | 2020-06-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109686418A (en) | Facial paralysis degree evaluation method, apparatus, electronic equipment and storage medium | |
Climent-Pérez et al. | A review on video-based active and assisted living technologies for automated lifelogging | |
CN110399849A (en) | Image processing method and device, processor, electronic equipment and storage medium | |
KR101284561B1 (en) | System and method for multi-modality emotion recognition and recording medium thereof | |
CN109659006A (en) | Facial muscle training method, device and electronic equipment | |
Cvetkoska et al. | Smart mirror E-health assistant—Posture analyze algorithm proposed model for upright posture | |
Kukharev et al. | Digital facial anthropometry: application and implementation | |
Maskeliūnas et al. | BiomacVR: A virtual reality-based system for precise human posture and motion analysis in rehabilitation exercises using depth sensors | |
TWI829944B (en) | Avatar facial expression generating system and method of avatar facial expression generation | |
Tanikawa et al. | Objective three-dimensional assessment of lip form in patients with repaired cleft lip | |
Agarwal et al. | Skin disease classification using CNN algorithms | |
Abdulghafor et al. | An analysis of body language of patients using artificial intelligence | |
Pikulkaew et al. | Pain detection using deep learning with evaluation system | |
TWI383776B (en) | Weight-predicted system and method thereof | |
Zhang et al. | Non-linear finite element model established on pectoralis major muscle to investigate large breast motions of senior women for bra design | |
Leone et al. | Ambient and wearable sensor technologies for energy expenditure quantification of ageing adults | |
Eldib et al. | Sleep analysis for elderly care using a low-resolution visual sensor network | |
Vrochidou et al. | Automatic Facial Palsy Detection—From Mathematical Modeling to Deep Learning | |
CN110164547A (en) | A kind of Chinese medicine facial diagnosis system based on human face region and tongue fur | |
Guarin et al. | Automatic facial landmark localization in clinical populations-improving model performance with a small dataset | |
Trotman et al. | Influence of objective three-dimensional measures and movement images on surgeon treatment planning for lip revision surgery | |
Faraway et al. | Shape change along geodesics with application to cleft lip surgery | |
Johnston et al. | A Fully Automated System for Sizing Nasal PAP Masks Using Facial Photographs | |
KR101098564B1 (en) | System for diagnosis of psychologicalsymptoms and record medium for recording program using the same | |
Venkata Chalapathi | A novel two-layer feature selection for emotion recognition with body movements |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20190426 |