CN111914871A - Artificial intelligence auxiliary evaluation method and system applied to beauty treatment and electronic device


Info

Publication number: CN111914871A
Application number: CN202010388355.8A
Authority: CN (China)
Other languages: Chinese (zh)
Inventor: 李至伟
Current Assignee: Individual (the listed assignee may be inaccurate; no legal analysis has been performed)
Original Assignee: Individual
Application filed by: Individual
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Prior art keywords: medical, artificial intelligence, module, evaluation result

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174 Facial expression recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation

Abstract

The invention provides an artificial intelligence assisted evaluation method and system applied to cosmetic medical treatment, and an electronic device. The method is applied to an artificial intelligence recognition and analysis module: a real-time facial expression evaluation result of a subject is input into the module, which selects and matches at least one of a medical knowledge rule module and a historical database of cosmetic medical auxiliary evaluation results to execute an artificial intelligence recognition and analysis program. The module then generates and outputs a cosmetic medical auxiliary evaluation result, so that the cosmetic medical action can be performed according to this auxiliary evaluation result and a personalized cosmetic effect can be achieved.

Description

Artificial intelligence auxiliary evaluation method and system applied to beauty treatment and electronic device
Technical Field
The present invention relates to an artificial intelligence (AI) assisted evaluation method, and more particularly to an AI-assisted evaluation method that is based on facial expressions and applied to cosmetic medical treatment, a system using the method, and an electronic device using the method.
Background
Cosmetic medical treatment, especially minimally invasive ("micro-plastic") cosmetic treatment of the face, has become popular and is accepted by people of all ages.
Facial micro-plastic cosmetic treatment today depends mainly on the physician's professional medical skill and knowledge: it follows general or standardized procedures, adjusted at most slightly by the individual physician's clinical experience. Because these conventional approaches rarely consider the individual patient in depth, the actual cosmetic treatment result often differs, more or less predictably, from the expected therapeutic effect.
Moreover, some actual post-treatment results are inferior because of an individual physician's misjudgment, which causes medical disputes and shortcomings in medical care.
Therefore, how to provide more and better assistive treatment methods and tools for personalized cosmetic medical needs is a technical problem that the cosmetic medical industry currently has to solve.
Disclosure of Invention
In view of the above shortcomings of the prior art, the present invention provides an artificial intelligence (AI) assisted evaluation method and system, and an electronic device, for cosmetic medical treatment. The AI-assisted evaluation is based on facial expressions and serves personalized cosmetic medical needs.
To solve the technical problem, the invention provides an artificial intelligence assisted evaluation method applied to cosmetic medical treatment, at least comprising the following steps: providing a real-time facial expression evaluation result of a subject; inputting the real-time facial expression evaluation result into an artificial intelligence recognition and analysis module; and selecting and matching at least one of a medical knowledge rule module and a historical database of cosmetic medical auxiliary evaluation results to execute an artificial intelligence recognition and analysis program and to generate and output a cosmetic medical auxiliary evaluation result.
Preferably, the method further comprises the step of feeding back and storing the cosmetic medical auxiliary evaluation result to at least one of the medical knowledge rule module and the historical database of cosmetic medical auxiliary evaluation results.
Preferably, the output cosmetic medical auxiliary evaluation result at least comprises: a combination and preferred order of evaluated treatment sites for the subject, or that combination and order together with a type and dose of injected filler.
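For illustration only, such an output could be represented as in the following minimal Python sketch; all class and field names are hypothetical, since the invention specifies the content of the result but not a concrete data format.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TreatmentSite:
    action_unit: str   # facial action unit to treat, e.g. "AU15"
    priority: int      # position in the preferred order C1-Cn

@dataclass
class CosmeticAuxiliaryEvaluationResult:
    sites: List[TreatmentSite]                 # combination and preferred order of sites
    filler_type: Optional[str] = None          # injected filler type D, if evaluated
    filler_dose_units: Optional[float] = None  # filler dose U, if evaluated

# Example values in the style of the embodiments described later:
result = CosmeticAuxiliaryEvaluationResult(
    sites=[TreatmentSite("AU1", 1), TreatmentSite("AU15", 2)],
    filler_type="botulinum toxin type A",
    filler_dose_units=8.0,
)
```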
Preferably, the medical knowledge rule module comprises a functional medical anatomical rule and a dynamic medical anatomical rule.
Preferably, the real-time facial expression evaluation result includes: a static expression evaluation result, or the static expression evaluation result and a dynamic expression evaluation result.
Preferably, the step of providing the real-time facial expression evaluation result of the subject further comprises: dividing a face into a plurality of facial action units according to the medical knowledge rule module; forming a plurality of emotion index combinations according to at least one of a static expression evaluation result and a dynamic expression evaluation result; and forming the real-time facial expression evaluation result according to the proportional result of each emotion index combination.
Preferably, the plurality of emotion index combinations comprises at least one of a positive emotion index combination and a negative emotion index combination.
Preferably, the negative emotion index combination is one of, or any combination of, a sadness index, an anger index, a worry index, a surprise index, a fear index, an aversion index, and a disgust index.
Preferably, the positive emotion index combination is one of, or any combination of, a happiness index, a satisfaction index, a feeling-moved index, a positivity index, and a relaxation index.
Preferably, the historical database of cosmetic medical auxiliary evaluation results comprises a plurality of cosmetic medical auxiliary evaluation results, which the artificial intelligence recognition and analysis module uses to execute an artificial intelligence deep learning/training program according to at least one artificial intelligence deep learning/training algorithm.
Preferably, the artificial intelligence deep learning/training program comprises the following steps: providing the plurality of cosmetic medical auxiliary evaluation results, wherein each result at least comprises basic data of a historical subject, a facial expression evaluation result, personal facial features, the functional and dynamic medical anatomical rules of the medical knowledge rule module, a combination and preferred order of evaluated treatment sites, and a type and dose of injected filler; and inputting the plurality of results into the artificial intelligence recognition and analysis module. One such record might look like the sketch below.
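A hedged sketch of one such historical record follows (Python; the field names are assumptions, as the invention lists the content of a record but not a schema):

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class HistoricalEvaluationRecord:
    sex: str                             # basic data B1
    age: int                             # basic data B2
    expression_result: Dict[str, float]  # facial expression evaluation result A'
    facial_features: Dict[str, float]    # personal facial features P1-P3
    medical_rules: List[str]             # applicable anatomical rules, e.g. R11-R14, R21-R24
    treatment_sites: List[str]           # combination and preferred order of treated sites
    filler_type: str                     # injected filler type D
    filler_dose_units: float             # filler dose U
```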
Preferably, the personal facial features include a static line feature of a habitual expression, a static contour feature, or a skin-type feature.
Preferably, the at least one artificial intelligence deep learning/training algorithm is at least one of an artificial neural network algorithm and a deep learning algorithm.
The present invention also provides an electronic device using the artificial intelligence assisted evaluation method. The electronic device at least comprises: a facial expression evaluation module for providing the real-time facial expression evaluation result; the artificial intelligence recognition and analysis module, which contains an artificial intelligence recognition and analysis program that receives the real-time facial expression evaluation result and generates the cosmetic medical auxiliary evaluation result; and an input/output module for outputting the cosmetic medical auxiliary evaluation result. The artificial intelligence recognition and analysis module receives at least one personal facial feature from at least one of the facial expression evaluation module and the input/output module.
Preferably, the electronic device is connected to at least one of the historical database of cosmetic medical auxiliary evaluation results and the medical knowledge rule module by at least one of wireless transmission and wired transmission.
Preferably, the electronic device is a handheld smart mobile device, a Personal Computer (PC) or a stand-alone smart device.
The invention also provides an artificial intelligence assisted evaluation system for cosmetic medical treatment, at least comprising: a facial expression evaluation module for providing a real-time facial expression evaluation result of a subject; and an artificial intelligence recognition and analysis module connected to the facial expression evaluation module. The artificial intelligence recognition and analysis module receives the real-time facial expression evaluation result, executes an artificial intelligence recognition and analysis program according to at least one of a connected medical knowledge rule module and a connected historical database of cosmetic medical auxiliary evaluation results, and adaptively generates and outputs a cosmetic medical auxiliary evaluation result.
Preferably, the artificial intelligence recognition and analysis module feeds back and stores the cosmetic medical auxiliary evaluation result to at least one of the medical knowledge rule module and the historical database of cosmetic medical auxiliary evaluation results.
Preferably, the cosmetic medical auxiliary evaluation result at least comprises a combination and preferred order of evaluated treatment sites for the subject, or that combination and order together with a type and dose of injected filler.
Preferably, the medical knowledge rule module comprises a functional medical anatomical rule and a dynamic medical anatomical rule.
Preferably, the real-time facial expression evaluation result includes a static expression evaluation result, or the static expression evaluation result and a dynamic expression evaluation result.
Preferably, the facial expression evaluation module includes: a facial image capturing unit for performing an image capturing operation to obtain a real-time facial image; and a facial action coding unit that divides the real-time facial action shown in the image into a plurality of facial action units according to the real-time facial image and the medical knowledge rule module. The static expression evaluation result and the dynamic expression evaluation result are formed from the change between one detection result of each facial action unit and another detection result of at least one other facial action unit, as sketched below.
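As a minimal sketch of this idea (assumed interfaces, not the claimed implementation), the dynamic evaluation can be viewed as the change of each action unit's detected intensity relative to the relaxed, static baseline:

```python
from typing import Dict

def static_evaluation(baseline_aus: Dict[str, float]) -> Dict[str, float]:
    # Action-unit intensities detected on the relaxed, emotionless face
    return dict(baseline_aus)

def dynamic_evaluation(baseline: Dict[str, float],
                       expressive: Dict[str, float]) -> Dict[str, float]:
    # Change between one detection result and another, per action unit
    return {au: expressive.get(au, 0.0) - level
            for au, level in baseline.items()}

a1 = static_evaluation({"AU1": 0.1, "AU4": 0.3, "AU15": 0.2})
a2 = dynamic_evaluation(a1, {"AU1": 0.7, "AU4": 0.9, "AU15": 0.6})
print(a2)  # per-unit intensity changes relative to the baseline
```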
Preferably, at least one of the facial expression evaluation module and the artificial intelligence recognition and analysis module further comprises an emotion analysis and face recognition unit for forming a plurality of emotion index combinations according to at least one of the static expression evaluation result and the dynamic expression evaluation result, and for forming the real-time facial expression evaluation result according to the proportional result of each of the emotion index combinations.
Preferably, the plurality of emotion index combinations comprises at least one of a positive emotion index combination and a negative emotion index combination.
Preferably, the negative emotion index combination is one of, or any combination of, a sadness index, an anger index, a worry index, a surprise index, a fear index, an aversion index, and a disgust index.
Preferably, the positive emotion index combination is one of, or any combination of, a happiness index, a satisfaction index, a feeling-moved index, a positivity index, and a relaxation index.
Preferably, the historical database of cosmetic medical auxiliary evaluation results comprises a plurality of cosmetic medical auxiliary evaluation results, which the artificial intelligence recognition and analysis module uses to execute an artificial intelligence deep learning/training program according to at least one artificial intelligence deep learning/training algorithm.
Preferably, each cosmetic medical auxiliary evaluation result at least comprises one of, or a combination of: basic data of a historical subject, a facial expression evaluation result, personal facial features, the functional and dynamic medical anatomical rules of the medical knowledge rule module, a combination and preferred order of evaluated treatment sites, and a type and dose of injected filler.
Preferably, the personal facial features include a static line feature of a habitual expression, a static contour feature, or a skin-type feature.
Preferably, the personal facial features are provided to the artificial intelligence recognition and analysis module from at least one of the facial expression evaluation module and an input/output module.
Preferably, the artificial intelligence deep learning/training program inputs the plurality of cosmetic medical auxiliary evaluation results into the artificial intelligence recognition and analysis module.
Preferably, the at least one artificial intelligence deep learning/training algorithm is at least one of an artificial neural network algorithm and a deep learning algorithm.
Preferably, the facial expression evaluation module, an input/output module and the artificial intelligence recognition and analysis module are assembled into an electronic device, wherein the electronic device is a handheld smart mobile device, a personal computer (PC) or a stand-alone smart device.
Preferably, the electronic device is connected to at least one of the historical database of cosmetic medical auxiliary evaluation results and the medical knowledge rule module by at least one of wireless transmission and wired transmission.
The invention uses the facial expression evaluation module to obtain a personalized real-time facial expression evaluation result, executes the artificial intelligence recognition and analysis program on that result, and combines the functional medical anatomical rules and dynamic medical anatomical rules of the medical knowledge rule module with the historical database of cosmetic medical auxiliary evaluation results, thereby providing a cosmetic medical auxiliary evaluation result and achieving the purpose of giving an exclusive, personalized cosmetic medical recommendation.
Drawings
FIG. 1: conceptual diagram of a preferred implementation of the artificial intelligence assisted evaluation system of the invention.
FIG. 2: flowchart of a preferred implementation of the artificial intelligence assisted evaluation method of the invention.
FIG. 3: block diagram of a preferred embodiment of the facial expression evaluation module in FIG. 1.
FIG. 4: flowchart of a preferred implementation of the facial expression evaluation module in FIG. 3.
FIG. 5: conceptual diagram of a preferred implementation of the facial action coding unit in FIG. 3.
FIG. 6A: conceptual diagram of an embodiment in which the facial action coding unit in FIG. 3 generates a static expression evaluation result and a dynamic expression evaluation result.
FIG. 6B: conceptual diagram of an embodiment in which the emotion analysis and face recognition unit quantitatively analyzes the action strength of the facial muscle groups of each facial action unit under different emotional expressions.
FIG. 7A: conceptual diagram of a preferred implementation of the artificial intelligence deep learning/training architecture of the artificial intelligence recognition and analysis module of the invention.
FIG. 7B: conceptual diagram of a preferred embodiment of the historical database of cosmetic medical auxiliary evaluation results used by the artificial intelligence recognition and analysis module in FIG. 7A.
FIG. 8: flowchart of a preferred implementation of the artificial intelligence deep learning/training program of FIG. 7A.
FIGS. 9A to 9F: schematic diagrams of a first preferred embodiment of the artificial intelligence assisted evaluation method and system of the invention.
FIG. 10: schematic diagram of a second preferred embodiment of the artificial intelligence assisted evaluation method and system of the invention.
FIG. 11: schematic diagram of a third preferred embodiment of the artificial intelligence assisted evaluation method and system of the invention.
[List of reference numerals]
100 artificial intelligence assisted evaluation system
110 facial expression evaluation module
111 facial action coding unit
112 emotion analysis and face recognition unit
113 facial image capturing unit
120 artificial intelligence recognition and analysis module
121 artificial intelligence recognition and analysis program
130 medical knowledge rule module
131 functional medical anatomical rules
132 dynamic medical anatomical rules
140 historical database of cosmetic medical auxiliary evaluation results
141 cosmetic medical auxiliary evaluation result
150 input/output module
160 electronic device
220 artificial intelligence recognition and analysis module
221 artificial intelligence deep learning/training program
230 medical knowledge rule module
231 functional medical anatomical rules
232 dynamic medical anatomical rules
240 historical database of cosmetic medical auxiliary evaluation results
241 cosmetic medical auxiliary evaluation result
A real-time facial expression evaluation result
A' facial expression evaluation result
A1 static expression evaluation result
A2 dynamic expression evaluation result
A31-A33, A3n emotion index combinations
AU1-AUn facial action units (static expression evaluation result)
AU1'-AUn' facial action units (dynamic expression evaluation result)
B basic data
B1 sex
B2 age
C, C1-Cn combination and preferred order of evaluated treatment sites
P personal facial features
P1, P1' static line features of habitual expressions
P2, P2' static contour features
P3, P3' skin-type features
D type of injected filler
U dose of injected filler
R1, R11-R14 medical rules of the functional medical anatomical rules
R2, R21-R24 medical rules of the dynamic medical anatomical rules
S10 artificial intelligence assisted evaluation method
S11-S14, S111-S113 flow steps of the artificial intelligence assisted evaluation method
S31-S32 flow steps of the artificial intelligence deep learning/training program
Detailed Description
The following embodiments are provided for illustration only and do not limit the scope of the invention. In addition, the drawings of the embodiments omit elements that are unnecessary or achievable by common techniques, so as to show the technical features of the invention clearly. Preferred embodiments of the invention are described below with reference to the drawings.
Please refer to FIG. 1 and FIG. 2, which are a conceptual diagram of a preferred implementation of the artificial intelligence assisted evaluation system and a flowchart of a preferred implementation of the artificial intelligence assisted evaluation method of the invention.
As shown in FIG. 1, the facial-expression-based artificial intelligence assisted evaluation system 100 for cosmetic medical treatment comprises: a facial expression evaluation module 110, an artificial intelligence recognition and analysis module 120, a medical knowledge rule module 130, a historical database 140 of cosmetic medical auxiliary evaluation results, and an input/output module 150.
The facial expression evaluation module 110 at least includes a facial action coding unit 111, an emotion analysis and face recognition unit 112, and a facial image capturing unit 113. The artificial intelligence recognition and analysis module 120 at least includes an artificial intelligence recognition and analysis program 121. The medical knowledge rule module 130 at least includes functional medical anatomical rules 131 and dynamic medical anatomical rules 132. The historical database 140 at least includes a plurality of cosmetic medical auxiliary evaluation results 1-N.
Furthermore, the input/output module 150 receives input and outputs various information. For example, it receives the basic data B1-B2 and/or personal facial features P1-P3 of the subject and inputs them into the artificial intelligence recognition and analysis module 120; or it outputs the combination and preferred order C1-Cn of evaluated treatment sites, and/or the injected filler type D and filler dose U, received from the artificial intelligence recognition and analysis module 120.
The facial expression evaluation module 110, the artificial intelligence recognition and analysis module 120, and the input/output module 150 may be assembled to form an electronic device 160, which may be a handheld smart mobile device, a personal computer (PC), or a stand-alone smart device. For example, the electronic device 160 may be a tablet computer, a smart mobile device, a notebook computer, a desktop computer, a stand-alone smart device or a stand-alone smart module, where the smart device or module may be integrated into or kept separate from a medical device (not shown).
The electronic device 160 is connected to the medical assistant evaluation result history database 140 and/or the medical knowledge rule module 130 in a wireless transmission manner and/or a wired transmission manner. For example, the medical assistant evaluation result history database 140 and/or the medical knowledge rule module 130 may be stored in a cloud storage platform, and the electronic device 160 may be connected to the cloud storage platform via various local/wide area networks (not shown).
Next, please refer to FIG. 2 together with FIG. 1. FIG. 2 shows the artificial intelligence assisted evaluation method S10 as applied to the system 100; the method S10 comprises the following steps. First, step S11 is executed: providing the real-time facial expression evaluation result A of the subject, together with the subject's basic data B1-B2 and/or personal facial features P1-P3.
The real-time facial expression evaluation result A may include a static expression evaluation result A1, a dynamic expression evaluation result A2, and a plurality of emotion index combinations A31-A3n generated from at least one of A1 and A2, all provided by the facial expression evaluation module 110.
In addition, the basic data B1-B2 and/or the personal facial features P1-P3 of the subject may be provided through the input/output module 150; in other preferred embodiments, the facial expression evaluation module 110 may directly provide another set of personal facial features P1'-P3'. These operations may be performed alternatively or stepwise without departing from the spirit of the invention.
Next, step S12 is executed: the real-time facial expression evaluation result A and the basic data B1-B2 and/or personal facial features P1-P3 of the subject are input into the artificial intelligence recognition and analysis module 120.
Next, step S13 is executed: the artificial intelligence recognition and analysis module 120 selects and matches at least one of the medical knowledge rule module 130 and the historical database 140 of cosmetic medical auxiliary evaluation results to execute the artificial intelligence recognition and analysis program 121, and generates and outputs a cosmetic medical auxiliary evaluation result 141; the module 120 may be connected to the medical knowledge rule module 130 and the historical database 140 by wireless or wired transmission.
Furthermore, the artificial intelligence recognition and analysis module 120 can adaptively generate the cosmetic medical auxiliary evaluation result for the subject according to the functional medical anatomical rules 131 and dynamic medical anatomical rules 132 of the medical knowledge rule module 130, and/or the plurality of evaluation results 1-N in the historical database 140; the cosmetic medical auxiliary evaluation result at least comprises the combination and preferred order C1-Cn of the subject's evaluated treatment sites.
Of course, the cosmetic medical auxiliary evaluation result may further include, for the combination and preferred order C1-Cn of evaluated treatment sites, an auxiliary recommendation of the injected filler type D and filler dose U.
In addition, the artificial intelligence recognition and analysis module 120 can output or display the combination and preferred order C1-Cn of evaluated treatment sites and/or the injected filler type D and filler dose U through the input/output module 150 shown in FIG. 1.
In another preferred implementation of step S13 (not shown), the artificial intelligence recognition and analysis module 120 can also execute the artificial intelligence recognition and analysis program 121 directly on the real-time facial expression evaluation result A using built-in medical knowledge rules and/or cosmetic medical auxiliary evaluation results, and generate another cosmetic medical auxiliary evaluation result; in that implementation, neither the medical knowledge rule module 130 nor the historical database 140 needs to be configured separately.
A more preferred method of the invention further executes step S14: feeding back and storing the cosmetic medical auxiliary evaluation result to at least one of the medical knowledge rule module 130 and the historical database 140. The artificial intelligence recognition and analysis module 120 stores each subject's cosmetic medical auxiliary evaluation result in the historical database 140, so that it can execute an artificial intelligence deep learning/training program according to at least one artificial intelligence deep learning/training algorithm; this is described further below and not repeated here.
Of course, step S14 can be omitted in other embodiments; all such variants are equivalent embodiments of the invention.
In more detail, the way the facial expression evaluation module 110 provides the real-time facial expression evaluation result A in step S11 may include the following steps. Please refer to FIG. 3 and FIG. 4, which are a conceptual diagram and a flowchart of a preferred implementation of the facial expression evaluation module.
As shown in FIG. 3, the facial expression evaluation module 110 comprises a facial action coding unit 111, an emotion analysis and face recognition unit 112, and a facial image capturing unit 113. After the facial image capturing unit 113 performs an image capturing operation (for example with a camera, not shown), a real-time facial image is obtained and output to the facial action coding unit 111; the facial action coding unit 111 then divides the real-time facial action shown in the image into a plurality of facial action units AU1-AUn according to the real-time facial image and the medical knowledge rule module 130 (see the description of FIG. 5 below).
The emotion analysis and face recognition unit 112 uses a plurality of emotion index combinations A31-A3n, which may be positive or negative emotion index combinations. A positive emotion index combination includes, for example, one or a combination of a happiness index, a satisfaction index, a feeling-moved index, a positivity index, and a relaxation index; a negative emotion index combination includes, for example, one or a combination of a sadness index, an anger index, a worry index, a surprise index, a fear index, an aversion index, and a disgust index.
Next, referring to FIG. 4, step S11 further comprises the following steps. Step S111 is executed: using the facial expression evaluation module 110, a facial muscle action is divided into a plurality of facial action units AU1-AUn according to the medical knowledge rule module 130. The facial action coding unit 111 can distinguish the various facial muscle actions according to the functional medical anatomical rules 131 and dynamic medical anatomical rules 132 of the medical knowledge rule module 130; these medical anatomical rules may describe, for example, the linkage between each muscle group and its adjacent muscle groups, and/or the function of each muscle group.
For example, please refer to FIG. 5, a conceptual diagram of a preferred implementation of the facial action coding unit 111. As shown in FIG. 5, the facial action coding unit 111 may define the corrugator (frown) muscle at the brow as facial action unit AU4 (Face Action Unit 4), and may define the depressor anguli oris and the mentalis (chin muscle), which can make the mouth corner sag, as facial action unit AU15 (Face Action Unit 15) and facial action unit AU17 (Face Action Unit 17), respectively.
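A hedged encoding of this mapping (only the three units named above; a full coding table would be larger):

```python
# Facial action units named in the description, mapped to their muscle groups.
ACTION_UNIT_MUSCLES = {
    "AU4":  "corrugator (frown muscle at the brow)",
    "AU15": "depressor anguli oris (pulls the mouth corner down)",
    "AU17": "mentalis (chin muscle)",
}
```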
Next, step S112 is executed: forming a plurality of emotion index combinations A31-A3n according to at least one of the static expression evaluation result A1 and the dynamic expression evaluation result A2. The facial action coding unit 111 forms A1 and A2 from the combinations of linkage or movement between each facial action unit and the other facial action units, in response to the change of the facial emotion and the strength of the muscle actions.
Please refer to FIG. 6A together with FIG. 3. FIG. 6A is a conceptual diagram of an embodiment in which the facial action coding unit 111 of the facial expression evaluation module 110 generates the static expression evaluation result A1 and the dynamic expression evaluation result A2. The facial expression evaluation module 110 first detects the facial expression changes of a subject in static and dynamic states (e.g., the change from the left image to the right image), and then matches A1 and A2 with the detected facial action units AU1, AU4 and AU15.
As shown on the left of FIG. 6A, the static expression evaluation result A1 may consist of the static parameter values of the subject's facial action units AU1-AUn when no emotion is shown; alternatively, a short recording is made while the subject's face is relaxed, from which it can be detected and analyzed whether the facial muscle groups exert force or act involuntarily, i.e., the linkage and dynamics between the muscle groups.
The dynamic expression evaluation result A2 reflects the different facial expressions the subject presents for different emotions, as shown on the right of FIG. 6A, for example a sad emotion; of course, A2 may also cover further emotional expressions, such as a laughing expression and others.
Next, please refer to FIG. 6B together with FIG. 3. FIG. 6B is a conceptual diagram of an embodiment in which the emotion analysis and face recognition unit 112 quantitatively analyzes the actuation strength of the facial muscle groups of each facial action unit under different emotional expressions, in order to provide more accurate dynamic parameter values. That is, the emotion analysis and face recognition unit 112 forms the emotion index combinations A31-A3n according to at least one of the static expression evaluation result A1 and the dynamic expression evaluation result A2.
The facial expression evaluation module 110 may, for example, classify facial emotional expressions into 7 categories: besides Neutral, these include the emotion definitions Happy, Sad, Angry, Surprise, Fear (Scared) and Disgusted, which correspond to the emotion index combinations A31-A3n of the invention.
The facial action coding unit 111 defines the features of a sad emotional expression as the codes of a plurality of facial action units, such as eyebrow down, frown lines, eyebrows in the '8:20' clock position, an inverted U shape, brow head raised, dark eye rings, lower lip protrusion, lip tightening, lip corner drop, and chin protrusion; these features serve the emotion analysis and face recognition unit 112 as the basis for judging the sadness index and for distinguishing it from the other emotion indexes (not shown).
That is, by recognizing the emotion-related expression features jointly expressed by the facial muscle groups, the invention can use the plurality of facial action units AU1-AUn to comprehensively analyze the real emotion presented or conveyed by a facial expression. In this way, the emotion analysis and face recognition unit 112 can objectively recognize and analyze the subtle changes of facial expression, including the micro-expressions between the static expression evaluation result A1 and the dynamic expression evaluation result A2, and thus evaluate and quantify the activation of the facial action units AU1-AUn more precisely.
Of course, in various preferred embodiments the emotion analysis and face recognition unit may be built into or loaded in at least one of the facial expression evaluation module and the artificial intelligence recognition and analysis module, so that the emotion index combinations can be formed according to at least one of the static expression evaluation result and the dynamic expression evaluation result.
Finally, step S113 is executed: forming the real-time facial expression evaluation result A according to the proportional result of each emotion index combination. The result A contains the proportional results of the emotion index combinations A31-A3n, for example an anger index of 14.1%, a sadness index of 35.2%, and so on; a minimal sketch follows.
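A minimal sketch of step S113 (only the normalization to proportional percentages is shown; how the per-emotion scores are produced is assumed):

```python
from typing import Dict

def proportional_result(emotion_scores: Dict[str, float]) -> Dict[str, float]:
    # Normalize raw per-emotion scores to percentages of the total expression.
    total = sum(emotion_scores.values())
    if total == 0:
        return {emotion: 0.0 for emotion in emotion_scores}
    return {emotion: round(100.0 * score / total, 1)
            for emotion, score in emotion_scores.items()}

print(proportional_result({"anger": 0.141, "sadness": 0.352, "neutral": 0.507}))
# -> {'anger': 14.1, 'sadness': 35.2, 'neutral': 50.7}
```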
Next, it is illustrated how the artificial intelligence recognition and analysis module, which recognizes and analyzes the real-time facial expression evaluation result A, performs artificial intelligence deep learning/training. Please refer to FIGS. 7A and 7B, which are, respectively, a conceptual diagram of the artificial intelligence deep learning/training architecture of the artificial intelligence recognition and analysis module and a conceptual diagram of the historical database of cosmetic medical auxiliary evaluation results used by it.
As shown in FIGS. 7A and 7B, the artificial intelligence deep learning/training architecture comprises: an artificial intelligence recognition and analysis module 220, a medical knowledge rule module 230, and a historical database 240 of cosmetic medical auxiliary evaluation results; the module 220 includes an artificial intelligence deep learning/training program 221, the medical knowledge rule module 230 includes functional medical anatomical rules 231 and dynamic medical anatomical rules 232, and the historical database 240 includes a plurality of cosmetic medical auxiliary evaluation results 241.
Each cosmetic medical auxiliary evaluation result at least comprises: the name of the subject, basic data B, the facial expression evaluation result A' obtained under test, the personal facial features P, a plurality of medical rules R1 and R2 of the functional medical anatomical rules 231 and dynamic medical anatomical rules 232, preferred combinations C1 and C2 of the combination and preferred order of evaluated treatment sites, and the injected filler type D and filler dose U.
In practice, the basic data B may include sex B1 and age B2. The facial expression evaluation result A' at least includes the static expression evaluation result A1, the dynamic expression evaluation result A2 and a plurality of emotion index combinations A31-A33; A1 may consist of the static parameter values of the facial action units AU1-AUn when the subject shows no emotion, and A2 may consist of the dynamic parameter values of the facial action units AU1'-AUn' produced by the subject under different emotions. The emotion index combinations A31-A33 may include, for example, a fear index A31, an anger index A32 and an aversion index A33 of a negative emotion index combination, or a happiness index A31, a feeling-moved index A32 and a satisfaction index A33 of a positive emotion index combination.
Furthermore, the personal facial features P may include static line features P1 of habitual expressions, static contour features P2, or skin-type features P3; the features P1-P3 can be provided by at least one of the facial expression evaluation module 110 and the input/output module 150.
The medical rules R1 of the functional medical anatomical rules 231 include, for example, rules R11-R14 for the degree of stretching and tension of each facial muscle group under different emotional expressions; the medical rules R2 of the dynamic medical anatomical rules 232 include, for example, linkage and contraction rules R21-R24 between the facial muscle groups under different emotional expressions.
Finally, the preferred combinations C1, C2 may be, for example, one of or a combination of the facial action units AU1-AUn at the sites of the subject to be treated. The injected filler types D may include: a hydrogel agent W, a botulinum toxin agent X, a hyaluronic acid agent Y and a collagen agent Z. The hydrogel agent W, hyaluronic acid agent Y and collagen agent Z can reduce the static lines of facial expression, thereby lowering the negative emotion index combination (sadness index, anger index, etc.) and raising the positive emotion index combination (happiness index, satisfaction index, etc.).
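For illustration, the filler classes and the effects this paragraph attributes to them could be encoded as a simple lookup (a hypothetical sketch; the entry for the botulinum toxin agent X reflects its use in the embodiments below rather than this paragraph):

```python
# Injected filler types D and their described effects on facial expression.
FILLER_EFFECTS = {
    "hydrogel (W)":        "reduces static expression lines",
    "botulinum toxin (X)": "relaxes dynamic muscle action (see the embodiments below)",
    "hyaluronic acid (Y)": "reduces static expression lines",
    "collagen (Z)":        "reduces static expression lines",
}
```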
Of course, the cosmetic medical auxiliary evaluation results 1-N can be adjusted according to the treatment targets of actual cosmetic medical practice and should not be limited by this embodiment; those skilled in the art can equivalently modify or redesign their content and further adapt it to the subject's actual cosmetic medical needs.
Next, the artificial intelligence recognition and analysis module 220 executes the artificial intelligence deep learning/training program 221 on the architecture described above. Please refer to FIG. 8, a flowchart of a preferred implementation of the artificial intelligence deep learning/training program.
As shown in FIG. 8, the flow of the artificial intelligence deep learning/training program 221 begins with step S31: providing the cosmetic medical auxiliary evaluation results 241, each of which at least comprises: basic data B of a historical subject, a facial expression evaluation result A', personal facial features P, the functional medical anatomical rules 231 and dynamic medical anatomical rules 232 of the medical knowledge rule module 230, a combination and preferred order of evaluated treatment sites, and the injected filler type D and filler dose U. The content of the results 241 is shown in FIG. 7B; the static expression evaluation result A1 and the dynamic expression evaluation result A2 may be selected according to actual usage requirements.
Then step S32 is executed: inputting the cosmetic medical auxiliary evaluation results 241 into the artificial intelligence recognition and analysis module 220, which executes the artificial intelligence deep learning/training program 221 according to at least one artificial intelligence deep learning/training algorithm. The algorithm may be a machine learning algorithm, an artificial neural network algorithm, a fuzzy logic algorithm, a deep learning algorithm, or a combination thereof; in this embodiment at least one of an artificial neural network algorithm and a deep learning algorithm is preferred. A toy sketch follows.
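The following toy sketch illustrates such a training step; it is not the claimed model. It fits a small scikit-learn neural network that maps an expression evaluation to a recommended dose, and the feature encoding and the three records are invented for the example:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Each row: [sadness %, anger %, AU1, AU4, AU15 intensities] from a
# historical cosmetic medical auxiliary evaluation result (toy data).
X = np.array([
    [35.2, 14.1, 0.7, 0.9, 0.6],
    [13.9,  5.0, 0.6, 0.2, 0.1],
    [ 2.0, 10.9, 0.1, 0.3, 0.8],
])
y = np.array([8.0, 8.0, 4.0])  # botulinum dose (units) actually administered

model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(X, y)

# Predict a dose for a new subject's expression evaluation:
print(model.predict([[30.0, 12.0, 0.6, 0.8, 0.5]]))
```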
Several preferred implementation concepts are described below to explain how cosmetic medical actions can be performed with the artificial intelligence assisted evaluation method and system of the invention. The main therapeutic target of the following embodiments is to improve the negative emotion index combinations in the facial expressions of subjects 1-3: by reducing or improving the negative emotion indexes associated with negative emotions, personal appeal and interpersonal relationships can be enhanced.
For example, the main therapeutic purpose of these preferred embodiments is to reduce or improve involuntary frowning or drooping mouth corners, so as to lessen the impression of anger or sternness and even reduce negative micro-expressions; other preferred embodiments further enhance the positive emotion index combination, to achieve better, more precise and personalized cosmetic medical treatment.
Please refer to FIGS. 9A to 9F, schematic diagrams of a first preferred embodiment of the artificial intelligence assisted evaluation method and system of the invention.
Taking FIGS. 9A to 9F together with FIGS. 1 to 4 and 7B as an example, the facial expression evaluation module 110 detects a plurality of facial action units AU1-AUn of subject 1 and provides a real-time facial expression evaluation result A according to the change between a detection result of each facial action unit and another detection result of at least one other facial action unit; the result A is formed according to the proportional result of each emotion index combination generated from the static expression evaluation result A1 and the dynamic expression evaluation result A2.
As shown in FIG. 9A, the subject may frown involuntarily or have drooping mouth corners due to aging; the micro-expressions produced by the facial action units AU1-AUn are recorded and combined to form the static expression evaluation result A1.
In addition, as shown in FIG. 9B, the dynamic expression evaluation result A2 reflects the different facial expressions subject 1 presents under different emotions, for example an angry expression, a laughing expression, and so on.
Then the emotion analysis and face recognition unit 112 further quantitatively analyzes the action strength of the facial muscle groups of each facial action unit under the different emotional expressions (covering both the static expression evaluation result A1 and the dynamic expression evaluation result A2), to provide more accurate dynamic parameter values as treatment references for the emotion index combinations A31-A33, the combination and preferred order of evaluated treatment sites C1-C2, and the filler type D and filler dose U.
Of course, in other preferred embodiments the emotion analysis and face recognition unit may be a process step built into the artificial intelligence recognition and analysis module, or part of the artificial intelligence recognition and analysis program (not shown).
Further, as shown in FIGS. 9C and 9D, the emotion index combinations A31-A33 of subject 1 comprise a sadness index of 35.2%, an anger index of 14.1% and a fear index of 17.7%, together with the information on each facial action unit AU1-AUn corresponding to these index combinations.
On the other hand, the facial expression evaluation module 110 may further provide the static line features P1 of habitual expressions, the static contour features P2 or the skin-type features P3 of the personal facial features P by incorporating a three-dimensional (3D) face simulation unit and a skin-type detection unit (not shown).
Then, as shown in FIG. 9E, the real-time facial expression evaluation result A is input into the artificial intelligence recognition and analysis module 120, or the module 120 actively receives it and selects whether to match at least one of the medical knowledge rule module 130 and the historical database 140 of cosmetic medical auxiliary evaluation results to execute the artificial intelligence recognition and analysis program 121. A cosmetic medical auxiliary evaluation result for subject 1 is then generated and output, comprising at least the combination and preferred order C1-C2 of evaluated treatment sites, and the injected filler type D and filler dose U.
In this example, the resulting cosmetic medical auxiliary recommendation was to administer Botulinum Toxin 8 s.U. to the muscle group associated with facial action unit AU1 (medial frontalis), and Botulinum Toxin 4 s.U. each to the muscle group of facial action unit AU15 (depressor anguli oris, DAO) and the muscle group of facial action unit AU17 (mentalis).
As a result, referring to FIG. 9F and comparing the facial expression evaluation results A' of subject 1 before treatment, one week after and three weeks after treatment, the facial sadness index dropped from 35.2% to 0% one week after treatment; the facial anger index fell from 14.1% before treatment to 7.8% after one week, and even to 0% three weeks after the botulinum treatment.
On the other hand, for the cosmetic medical auxiliary recommendation concerning the sadness index, the implementation of the invention can provide a further treatment reference guideline: for example, when the sadness index accounts for more than 10% of the total emotion index combination (total expression) and facial action units AU1, AU4 and AU15 in the facial action coding unit are all enhanced (i.e., their percentages increase), it is suggested that Botulinum Toxin type A can be injected at the corresponding muscle group positions, as sketched below. Of course, by adopting more case data (cosmetic medical auxiliary evaluation results) and running the artificial intelligence deep learning/training program repeatedly, the invention can clearly provide still better treatment proposals and is not limited to this evaluation result.
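A minimal sketch of this guideline (the 10% threshold and the action units are from the text; the enhancement test and the returned suggestion are simplified assumptions):

```python
from typing import Dict, List

def sadness_rule(indices: Dict[str, float], enhanced_aus: List[str]) -> str:
    # Sadness index above 10% of the total expression, with AU1, AU4 and
    # AU15 all enhanced, suggests botulinum toxin type A at those muscles.
    if indices.get("sadness", 0.0) > 10.0 and \
       all(au in enhanced_aus for au in ("AU1", "AU4", "AU15")):
        return "inject botulinum toxin type A at the AU1/AU4/AU15 muscle groups"
    return "no injection suggested by this rule"

print(sadness_rule({"sadness": 35.2}, ["AU1", "AU4", "AU15"]))
```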
Please refer to FIG. 10, a schematic diagram of a second preferred embodiment of the artificial intelligence assisted evaluation method and system of the invention.
As shown in FIG. 10, the facial action units AU1-AUn of subject 2 are detected by the facial expression evaluation module 110 to provide a real-time facial expression evaluation result A. It shows that subject 2's highest emotion index is the neutral index at 26.3%, followed by the sadness index at 13.9%; accordingly, in the cosmetic medical auxiliary evaluation result the first recommended treatment site is facial action unit AU1, classified as the Inner Brow Raiser, which mainly causes the sadness index.
On this basis, by combining the real-time facial expression evaluation result A with the medical knowledge rule module 130 and the historical database 140 of cosmetic medical auxiliary evaluation results, the functional and dynamic medical anatomical rules indicate the position of the muscle group highly related to and linked with facial action unit AU1, and a cosmetic medical auxiliary evaluation result for personalized treatment can then be provided, for example: a treatment reference recommendation of Botulinum Toxin 8 s.U. for the muscle group associated with AU1 (inner frontalis).
Comparing the facial expression evaluation results A' of subject 2 before treatment, one week after and three weeks after treatment shows that the facial sadness index fell from 13.9% to 8.4% one week after treatment, and even reached 0% three weeks after the botulinum treatment. In short, the artificial intelligence assisted evaluation method and system of the invention achieve a quite remarkable cosmetic medical therapeutic effect.
Please refer to FIG. 11, a schematic diagram of a third preferred embodiment of the artificial intelligence assisted evaluation method and system of the invention.
As shown in FIG. 11, the facial expression evaluation module 110 detects the facial action units AU1-AUn of subject 3 to provide a real-time facial expression evaluation result A; regardless of the gender or age of the person, the relevant emotional expressions are essentially related to the muscle groups of facial action units AU15 and AU17. Comparing the facial expression evaluation results A' of subject 3 before treatment, one week after and three weeks after treatment shows that subject 3's facial anger index improved significantly after three weeks of treatment.
That is, for the cosmetic medical auxiliary recommendation concerning the anger index, the implementation of the invention can likewise give a further treatment reference guideline, for example: when the anger index exceeds 10% and facial action units AU15 and AU17 in the facial action coding unit are both enhanced (i.e., their percentages increase), it is suggested that Botulinum Toxin type A can be injected at the corresponding muscle groups (the depressor anguli oris and mentalis positions).
By contrast with the above preferred embodiments, the currently known practice of cosmetic medical treatment is determined solely by the individual physician. Such conventional practice is easily influenced by the physician's personal experience and stereotyped impressions, so a fully objective assessment cannot be obtained, and the significant loss of missing the differences in each individual's micro-expressions remains.
In detail, when the objective of treatment is to reduce the anger index, physicians often use botulinum toxin to weaken the action of the brow-frowning (corrugator) muscles, while neglecting that the muscle actions of each person's angry expression differ slightly: some people simultaneously pull the mouth corners downward, some contract and raise the chin muscle, and some raise the brows with slight action on the inner side of the muscle. Such muscle movements may be only partially visible to the naked eye, or too fine to be perceived at all, producing blind spots and error zones in treatment, and in turn causing counterproductive medical effects and unnecessary medical disputes.
For example, in fig. 9A to 9F, if the subject 1 were instead treated according to the cosmetic medical suggestion judged by a conventional physician, the main treatment sites would be concentrated on the facial action unit AU15 and the facial action unit AU17, and the physician would omit the facial action unit AU1 associated with the anger index, so that the cosmetic medical treatment effect would be poor (not shown).
Next, for the subject 2 in fig. 10, under a cosmetic medical suggestion based on the personal judgment of a conventional physician, the main treatment site would usually be the facial action unit AU2, on the basis of the judgment that the orbicularis oculi muscle of the subject 2 causes the eye corners to droop and thus produces the sadness index. However, after such a cosmetic medical action is performed and the facial expression evaluation results A' of the subject 2 before treatment, one week after treatment, and three weeks after treatment are compared, it is found that the sadness index of the face of the subject 2 decreases from 6.8% to 5.1% one week after treatment, but returns to 6.7% three weeks after the botulinum toxin treatment. Because the physician misjudged the treatment site of the subject 2, the cosmetic medical treatment effect is poor and the sadness index cannot be effectively improved (not shown).
Finally, for the subject 3 in fig. 11, under a cosmetic medical suggestion judged by a conventional physician, the main treatment would usually be the injection of a total of 4 Units of the botulinum toxin AbobotulinumtoxinA at the facial action unit AU17. However, after such a cosmetic medical action is performed and the facial expression evaluation results A' of the subject 3 before treatment, one week after treatment, and three weeks after treatment are compared, it is found that the anger index of the face of the subject 3 decreases from 10.9% to 5.9% one week after treatment, but rises to 13.9% three weeks after the botulinum toxin treatment. This is because the physician not only neglected part of the treatment sites of the subject 3 (the facial action unit AU15), but also injected an insufficient dose of AbobotulinumtoxinA into the facial action unit AU17; as a result, the facial expression of the subject cannot be improved, and the anger index even increases, a counterproductive effect (not shown).
Therefore, in contrast, in the artificial intelligence assisted evaluation method and system applied to cosmetic medical treatment of the present invention, the facial expression evaluation module 110 provides the real-time facial expression evaluation result A of the subject, and the artificial intelligence identification and analysis module selects and matches a plurality of medical rules in the medical knowledge rule module and the cosmetic medical assistant evaluation result history database to execute the artificial intelligence identification and analysis program, thereby generating and outputting the cosmetic medical assistant evaluation result, which at least includes the combination and preferred order of the evaluated treatment site results of the subject and/or the type and dosage of the injected filler. In this way, the present invention can not only accurately analyze and evaluate the correct and complete treatment sites (the facial action units AU1-AUn), but also accurately provide the type and dosage of the injected filler, thereby achieving a personalized aesthetic cosmetic medical treatment effect.
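As a rough picture of that input-to-output contract, the following sketch wires toy stand-ins together. Every function body is a deliberate placeholder under assumed data shapes; the actual identification and analysis program, medical rules and history database are not disclosed at this level of detail.

```python
def rank_treatment_sites(result_a):
    """Toy ranking: order AUs by how strongly they contribute to the result."""
    contributions = result_a.get("au_contributions", {})
    return sorted(contributions, key=contributions.get, reverse=True)

def match_filler(sites, history_db):
    """Toy lookup: reuse the filler that worked for the closest past case."""
    for record in history_db:
        if set(sites) <= set(record["sites"]):
            return record["filler"]
    return {"type": "Botulinum Toxin type A", "dosage_units": None}

def assisted_evaluation(result_a, history_db):
    """Real-time facial expression evaluation result A in,
    cosmetic medical assistant evaluation result out."""
    sites = rank_treatment_sites(result_a)
    return {"treatment_sites": sites, "filler": match_filler(sites, history_db)}

history = [{"sites": ["AU1", "AU15", "AU17"],
            "filler": {"type": "Botulinum Toxin type A", "dosage_units": 8}}]
print(assisted_evaluation({"au_contributions": {"AU1": 0.6, "AU15": 0.3}}, history))
```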
In addition, the present invention can also be used to reinforce cosmetic medical behavior for combinations of positive emotion indexes, or to make cosmetic medical suggestions for preventing and improving treatment sites of the aging human face, for example: facial muscles that relax with age and cause the mouth corners to droop, so that the face tends to produce the anger index, and the like.
On the other hand, the method and the system can be applied to various cosmetic medical or aesthetic fields, can serve as a basis for judging the treatment effect before and after a cosmetic medical operation, and can also be applied to the field of medical teaching, to train physicians to remedy or improve the blind spots and error zones of prior treatments.
The above description presents only preferred embodiments of the present invention and is not intended to limit the scope of the claims; all equivalent changes and modifications that do not depart from the spirit of the present invention shall fall within the scope of the present invention.

Claims (34)

1. An artificial intelligence auxiliary evaluation method applied to beauty treatment is characterized by at least comprising the following steps:
providing a real-time facial expression evaluation result of a testee;
inputting the real-time facial expression evaluation result into an artificial intelligent identification and analysis module; and
selecting and matching at least one of a medical knowledge rule module and a cosmetic medical assistant evaluation result history database to execute an artificial intelligence identification and analysis program, and generating and outputting a cosmetic medical assistant evaluation result.
2. The artificial intelligence aided assessment method of claim 1, further comprising the steps of:
feeding back and storing the cosmetic medical assistant evaluation result to at least one of the medical knowledge rule module and the cosmetic medical assistant evaluation result history database.
3. The artificial intelligence aided assessment method according to claim 1, wherein the outputted result of the aided assessment of cosmetic medical treatment at least comprises:
a combination and preferred order of evaluated treatment site results of the subject, or the combination and preferred order of evaluated treatment site results together with a type and dosage of injected filler.
4. The method of claim 1, wherein the medical knowledge rule module comprises a functional medical anatomical rule and a dynamic medical anatomical rule.
5. The artificial intelligence aided assessment method of claim 1, wherein the real-time facial expression assessment result comprises:
a static expression evaluation result, or the static expression evaluation result and a dynamic expression evaluation result.
6. The artificial intelligence aided assessment method of claim 5, wherein said step of providing said real-time facial expression assessment result of said subject further comprises the steps of:
dividing a face into a plurality of face action units according to the medical knowledge rule module;
forming a plurality of emotion index combinations according to at least one of the static expression evaluation result and the dynamic expression evaluation result; and
forming the real-time facial expression evaluation result according to the proportional result of each emotion index combination.
7. The method of claim 6, wherein the combination of emotional indicators is at least one of a positive emotional indicator combination and a negative emotional indicator combination.
8. The method of claim 7, wherein the negative emotion index combination is one or a combination of a sadness index, an anger index, a fear index, a surprise index, a dread index, a contempt index and a disgust index.
9. The method of claim 7, wherein the positive emotion index combination is one or a combination of a happiness index, a satisfaction index, a moved index, a positivity index and a relaxation index.
10. The method of claim 1, wherein the cosmetic medical assistant evaluation result history database comprises a plurality of cosmetic medical assistant evaluation results, for the artificial intelligence identification and analysis module to perform an artificial intelligence deep learning/training procedure according to at least one artificial intelligence deep learning/training algorithm.
11. The artificial intelligence aided assessment method of claim 10, wherein the artificial intelligence deep learning/training procedure comprises the steps of:
providing the plurality of cosmetic medical assistant evaluation results, wherein each cosmetic medical assistant evaluation result at least comprises: basic data of a historical subject, a facial expression evaluation result, a facial feature, a functional medical anatomy rule and a dynamic medical anatomy rule in the medical knowledge rule module, a combination and preferred order of evaluated treatment site results, and a type and dosage of injected filler; and
inputting the plurality of cosmetic medical assistant evaluation results to the artificial intelligence identification and analysis module.
12. The method of claim 11, wherein the facial features of the individual comprise static texture features, static contour features, or skin features under a habitual expression.
13. The method of claim 12, wherein the at least one artificial intelligence deep learning/training algorithm is at least one of an artificial neural network algorithm and a deep learning algorithm.
14. An electronic device using the artificial intelligence aided assessment method according to claim 1, wherein the electronic device comprises at least:
a facial expression evaluation module for providing the real-time facial expression evaluation result;
the artificial intelligence identification and analysis module comprises an artificial intelligence identification and analysis program, and the artificial intelligence identification and analysis program receives the real-time facial expression evaluation result and generates the cosmetology medical auxiliary evaluation result; and
an input/output module for outputting the result of the auxiliary evaluation of beauty treatment;
the artificial intelligence identification analysis module receives at least one human facial feature from at least one of the facial expression evaluation module and the input/output module.
15. The electronic device of claim 14, wherein the electronic device is connected, in at least one of a wired manner and a wireless manner, to at least one of the cosmetic medical assistant evaluation result history database and the medical knowledge rule module.
16. The electronic device of claim 14, wherein the electronic device is a handheld smart mobile device, a personal computer, or a standalone smart device.
17. An artificial intelligence auxiliary evaluation system for cosmetic medical treatment, characterized by comprising at least:
the facial expression evaluation module is used for providing a real-time facial expression evaluation result of a testee; and
an artificial intelligent identification and analysis module connected with the facial expression evaluation module;
the artificial intelligence identification and analysis module receives the real-time facial expression evaluation result, executes an artificial intelligence identification and analysis program according to at least one of a connected medical knowledge rule module and a connected cosmetic medical assistant evaluation result history database, and adaptively generates and outputs a cosmetic medical assistant evaluation result.
18. The system of claim 17, wherein the artificial intelligence identification and analysis module feeds back and stores the cosmetic medical assistant evaluation result to at least one of the medical knowledge rule module and the cosmetic medical assistant evaluation result history database.
19. The system of claim 17, wherein the cosmetic medical assistant evaluation result at least comprises: a combination and preferred order of evaluated treatment site results of the subject, or the combination and preferred order of evaluated treatment site results together with a type and dosage of injected filler.
20. The system of claim 17, wherein the medical knowledge rule module comprises a functional medical anatomical rule and a dynamic medical anatomical rule.
21. The system of claim 17, wherein the real-time facial expression evaluation result comprises a static expression evaluation result, or the static expression evaluation result and a dynamic expression evaluation result.
22. The system of claim 21, wherein the facial expression evaluation module comprises:
a face image capturing unit for performing an image capturing operation to obtain a real-time face image; and
a face action coding unit, which divides a real-time face action presented in the image into a plurality of face action units according to the real-time face image and the medical knowledge rule module;
the static expression evaluation result and the dynamic expression evaluation result are formed according to the change between one detection result of each facial action unit and another detection result of at least one other facial action unit.
23. The system of claim 22, wherein at least one of the facial expression evaluation module and the artificial intelligence recognition analysis module further comprises:
an emotion analysis and face recognition unit for forming a plurality of emotion index combinations according to at least one of the static expression evaluation result and the dynamic expression evaluation result;
and for forming the real-time facial expression evaluation result according to the proportional result of each of the plurality of emotion index combinations.
24. The system of claim 23, wherein the combination of mood indicators is at least one of a positive mood indicator combination and a negative mood indicator combination.
25. The system of claim 24, wherein the negative emotion index combination is one or a combination of a sadness index, an anger index, a fear index, a surprise index, a dread index, a contempt index and a disgust index.
26. The system of claim 24, wherein the positive emotion index combination is one or a combination of a happiness index, a satisfaction index, a moved index, a positivity index and a relaxation index.
27. The system of claim 17, wherein the cosmetic medical assistant evaluation result history database comprises a plurality of cosmetic medical assistant evaluation results, for the artificial intelligence identification and analysis module to perform an artificial intelligence deep learning/training procedure according to at least one artificial intelligence deep learning/training algorithm.
28. The system of claim 27, wherein each cosmetic medical assistant evaluation result comprises at least one or a combination of: basic data of a historical subject, a facial expression evaluation result, facial features, the functional medical anatomy rules and dynamic medical anatomy rules in the medical knowledge rule module, a combination and preferred order of evaluated treatment site results, and a type and dosage of injected filler.
29. The system of claim 28, wherein the facial features of the individual comprise static texture features, static contour features, or skin features under a habitual expression.
30. The system of claim 28, wherein the facial features of the individual are provided to the artificial intelligence identification and analysis module from at least one of the facial expression evaluation module and an input/output module.
31. The system of claim 27, wherein the artificial intelligence deep learning/training procedure inputs the plurality of cosmetic medical assistant evaluation results to the artificial intelligence identification and analysis module.
32. The system of claim 27, wherein the at least one artificial intelligence deep learning/training algorithm is at least one of an artificial neural network algorithm and a deep learning algorithm.
33. The system of claim 17, wherein the facial expression evaluation module, an input/output module and the artificial intelligence identification and analysis module of the system are assembled to form an electronic device; and the electronic device is a handheld smart mobile device, a personal computer or an independently operating smart device.
34. The system of claim 33, wherein the electronic device is connected, in at least one of a wired manner and a wireless manner, to at least one of the cosmetic medical assistant evaluation result history database and the medical knowledge rule module.
CN202010388355.8A 2019-05-09 2020-05-09 Artificial intelligence auxiliary evaluation method and system applied to beauty treatment and electronic device Pending CN111914871A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962845355P 2019-05-09 2019-05-09
US62/845,355 2019-05-09

Publications (1)

Publication Number Publication Date
CN111914871A true CN111914871A (en) 2020-11-10

Family

ID=73237854

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010388355.8A Pending CN111914871A (en) 2019-05-09 2020-05-09 Artificial intelligence auxiliary evaluation method and system applied to beauty treatment and electronic device

Country Status (2)

Country Link
CN (1) CN111914871A (en)
TW (1) TWI756681B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201320031A (en) * 2011-11-15 2013-05-16 Univ Nat Taiwan Normal Testing system and method having face expressions recognizing auxiliary
TW201521820A (en) * 2013-12-13 2015-06-16 Pei Er Biotechnology Co Ltd Medical cosmetic treatment method and device
CN107007257A (en) * 2017-03-17 2017-08-04 深圳大学 The automatic measure grading method and apparatus of the unnatural degree of face
ES2633152A1 (en) * 2017-02-27 2017-09-19 Universitat De Les Illes Balears Method and system for the recognition of the state of mood by means of image analysis (Machine-translation by Google Translate, not legally binding)
CN107993280A (en) * 2017-11-30 2018-05-04 广州星天空信息科技有限公司 Beauty method and system based on threedimensional model

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI315042B (en) * 2006-11-21 2009-09-21 Jing Jing Fan Method of three-dimensional digital human model construction from two photos and obtaining anthropometry information
ITPD20130010A1 (en) * 2013-01-23 2014-07-24 Amato Dott Aldo PROCEDURE FOR THE AESTHETIC ANALYSIS OF THE DENTAL INSTRUMENT IN THE SMILE AREA AND FOR THE SUPPORT FOR THE IDENTIFICATION OF DENTISTRY AND DENTAL TECHNICAL AESTHETIC TREATMENTS

Also Published As

Publication number Publication date
TW202044279A (en) 2020-12-01
TWI756681B (en) 2022-03-01

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination