WO2019085495A1 - Micro-expression recognition method and apparatus, system and computer-readable storage medium - Google Patents

Micro-expression recognition method and apparatus, system and computer-readable storage medium Download PDF

Info

Publication number
WO2019085495A1
Authority
WO
WIPO (PCT)
Prior art keywords
gene
emotional
micro
emotion
expression recognition
Prior art date
Application number
PCT/CN2018/091372
Other languages
English (en)
Chinese (zh)
Inventor
袁晖
Original Assignee
深圳市科迈爱康科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市科迈爱康科技有限公司
Publication of WO2019085495A1

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174 Facial expression recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships

Definitions

  • the present application relates to the field of face recognition, and in particular, to a micro-expression recognition method, apparatus, system, and computer readable storage medium.
  • the specific manifestations of human emotions and expressions are related to genes. People with the same genetic characteristics may produce very similar expressions under the same emotions. Israeli researchers have confirmed that people in the same family show similarities in how they express their joys and griefs; a considerable number even share family-specific mannerisms, such as biting their lips when angry or sticking out their tongues when thinking. In addition, certain specific gene sequences have been found to correlate strongly with human emotional expression. For example, the DNA sequence of the 5-HTTLPR region can be divided into two main types: a shorter "s" type and a longer "l" type. Past studies have found that people carrying an "s" allele (that is, individuals with genotype "ss" or "sl") have more sensitive emotional reactions and are more susceptible to environmental and personal experiences.
  • in some populations, the ADRA2B gene carries a three-glutamate deletion variant, and carriers of this variant are more sensitive to negative emotions. People with COMT genotypes "mm", "vv", and "mv" also differ in how they experience emotions such as fear and anxiety. At present, research on expression recognition mostly focuses on recognition algorithms built on standard databases, whereas images or videos collected in real environments are inevitably affected by the environment and by the personal characteristics of the subjects, and the factors that trigger emotions are subtle and diverse. Therefore, when identifying facial expressions, the corresponding genetic characteristics must also be considered.
  • the main purpose of the present application is to provide a micro-expression recognition method aimed at solving the problem that emotional genes affect expression recognition.
  • the present application provides a micro-expression recognition method, including the following contents:
  • the user's emotion is determined based on the emotional gene threshold and the emotional score.
  • the step of acquiring the facial feature value of the facial image of the person to be recognized, and calculating the emotional score of the facial feature value includes:
  • the step of acquiring a corresponding emotional gene threshold according to the detection result includes:
  • the emotion gene threshold corresponding to the detection result is acquired in a storage area of the preset emotion gene threshold.
  • the step of acquiring an emotional gene threshold corresponding to the detection result in a storage area of the preset emotional gene threshold according to the detection result of the emotional gene includes:
  • An emotional gene threshold corresponding to the emotional gene type is acquired in a storage area of the preset emotional gene threshold on the condition of the emotional gene type.
  • before the step of acquiring the emotional gene threshold corresponding to the emotional gene type in the storage area of the preset emotional gene threshold on the condition of the emotional gene type, the method further includes:
  • before the step of acquiring the emotional gene detection result of the person to be identified, the method further includes:
  • before the step of acquiring the emotional gene detection result corresponding to the person to be identified according to the facial image, the method further includes:
  • the step of acquiring the facial feature value of the facial image of the person to be recognized, and calculating the emotional score of the facial feature value further includes:
  • the emotion score is calculated according to the weight parameter and the region feature value.
  • before the step of acquiring the weight parameter of each of the feature regions, the method further includes:
  • the weight parameter is adjusted with the weight parameter adjustment value.
  • the step of acquiring the facial feature value of each feature area in the to-be-identified face image further includes:
  • the calculated emotion score is adjusted by the emotion score adjustment value.
  • the step of acquiring the emotional gene detection result of the person to be identified, and acquiring the corresponding emotional gene threshold according to the detection result further includes:
  • the emotion gene threshold is calculated according to each of the acquired emotional gene thresholds.
  • the step of acquiring the weight parameter adjustment value of the emotional gene according to the confirmed emotional gene of the face image further includes:
  • the corresponding emotional gene adjustment threshold is acquired to adjust the calculated emotional gene threshold.
  • the present application further provides a micro-expression recognition device, the micro-expression recognition device comprising a memory, a processor, and a computer program stored on the memory and operable on the processor, wherein the steps of the micro-expression recognition method described above are implemented when the computer program is executed by the processor.
  • the present application also provides a computer readable storage medium having a micro-expression recognition program stored thereon, the micro-expression recognition program being executed by a processor to implement the steps of the micro-expression recognition method as described above.
  • the present application also provides a micro-expression recognition system, the micro-expression recognition system comprising a monitoring module, a storage module, and a data module;
  • the monitoring module is configured to detect a micro-expression of the person to be identified, and to submit micro-expression reminding information when a micro-expression feature of the person is detected, wherein the micro-expression reminding information includes the micro-expression features of the person to be identified;
  • the storage module is configured to store the genetic features and micro-expression features of the person to be identified;
  • the data module is configured to correct the genetic feature detection result of the person to be identified according to the result of the micro-expression recognition.
  • a micro-expression recognition method is provided by an embodiment of the present application: the emotional gene detection result of the person to be recognized is acquired and a corresponding emotional gene threshold is acquired according to the detection result; the facial feature values of the person's face image are acquired and an emotion score is calculated from them; and the user's emotion is determined based on the emotional gene threshold and the emotion score. The corresponding emotion is determined within the adjusted emotional gene threshold range, achieving the beneficial effect of improving micro-expression recognition efficiency.
  • FIG. 1 is a schematic structural diagram of a terminal/device in a hardware operating environment according to an embodiment of the present application
  • FIG. 2 is a schematic flowchart of a first embodiment of a micro-expression recognition method according to the present application
  • FIG. 3 is a schematic diagram showing the refinement steps of step S20 in FIG. 2.
  • the main solution of the embodiment of the present application is: acquiring the emotional gene detection result of the person to be identified, and acquiring a corresponding emotional gene threshold according to the detection result; acquiring the facial feature values of the face image of the person to be identified, and calculating an emotion score from the facial feature values; and determining the user's emotion according to the emotional gene threshold and the emotion score.
  • existing micro-expression recognition does not exclude the influence of emotional genes during the recognition process, so micro-expressions may not be recognized correctly.
  • the present application provides a solution for determining a corresponding micro-expression by obtaining a corresponding emotional gene threshold range of an emotional gene, thereby achieving a beneficial effect of improving micro-expression recognition efficiency.
  • FIG. 1 is a schematic structural diagram of a terminal in a hardware operating environment involved in an embodiment of the present application.
  • the terminal in the embodiment of the present application may be a PC, or may be a mobile terminal device having a display function, such as a smart phone, a tablet computer, an e-book reader, and a portable computer.
  • the terminal may include a processor 1001, such as a CPU, a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002.
  • the communication bus 1002 is used to implement connection communication between these components.
  • the user interface 1003 can include a display and an input unit such as a keyboard; optionally, the user interface 1003 can also include a standard wired interface and a wireless interface.
  • the network interface 1004 can optionally include a standard wired interface, a wireless interface (such as a WI-FI interface).
  • the memory 1005 may be a high-speed RAM memory or a stable (non-volatile) memory, such as disk storage.
  • the memory 1005 can also optionally be a storage device independent of the aforementioned processor 1001.
  • the terminal structure shown in FIG. 1 does not constitute a limitation of the terminal, which may include more or fewer components than those illustrated, a combination of certain components, or a different arrangement of components.
  • an operating system may be included in the memory 1005 as a computer storage medium.
  • a network communication module may be included in the memory 1005 as a computer storage medium.
  • a user interface module may be included in the memory 1005 as a computer storage medium.
  • a micro-expression recognition program may be included in the memory 1005 as a computer storage medium.
  • the network interface 1004 is mainly used to connect to the background server and perform data communication with the background server;
  • the user interface 1003 is mainly used to connect the client (user end), and perform data communication with the client;
  • the processor 1001 can be used to call the micro-expression recognition program stored in the memory 1005 and perform the following operations:
  • the user's emotion is determined based on the emotional gene threshold and the emotional score.
  • processor 1001 may call the micro-expression recognition program stored in the memory 1005, and further perform the following operations:
  • processor 1001 may call the micro-expression recognition program stored in the memory 1005, and further perform the following operations:
  • the emotion gene threshold corresponding to the detection result is acquired in a storage area of the preset emotion gene threshold.
  • processor 1001 may call the micro-expression recognition program stored in the memory 1005, and further perform the following operations:
  • An emotional gene threshold corresponding to the emotional gene type is acquired in a storage area of the preset emotional gene threshold on the condition of the emotional gene type.
  • processor 1001 may call the micro-expression recognition program stored in the memory 1005, and further perform the following operations:
  • processor 1001 may call the micro-expression recognition program stored in the memory 1005, and further perform the following operations:
  • processor 1001 may call the micro-expression recognition program stored in the memory 1005, and further perform the following operations:
  • processor 1001 may call the micro-expression recognition program stored in the memory 1005, and further perform the following operations:
  • the emotion score is calculated according to the weight parameter and the region feature value.
  • processor 1001 may call the micro-expression recognition program stored in the memory 1005, and further perform the following operations:
  • the weight parameter is adjusted with the weight parameter adjustment value.
  • processor 1001 may call the micro-expression recognition program stored in the memory 1005, and further perform the following operations:
  • the calculated emotion score is adjusted by the emotion score adjustment value.
  • processor 1001 may call the micro-expression recognition program stored in the memory 1005, and further perform the following operations:
  • the emotion gene threshold is calculated according to each of the acquired emotional gene thresholds.
  • processor 1001 may call the micro-expression recognition program stored in the memory 1005, and further perform the following operations:
  • the corresponding emotional gene adjustment threshold is acquired to adjust the calculated emotional gene threshold.
  • FIG. 2 is a schematic flowchart diagram of a first embodiment of a micro-expression recognition method according to the present application, where the micro-expression recognition method includes:
  • Step S10 Obtain an emotional gene detection result of the person to be identified, and obtain a corresponding emotional gene threshold according to the detection result;
  • the corresponding facial image to be recognized is acquired.
  • the source of the face image to be recognized may be a video, a photo, or the like. If the source is a video, a frontal face frame is selected from the video to improve recognition accuracy. According to the acquired face image to be recognized, the emotional gene detection result of that face image is acquired.
  • the emotional gene detection is a prior art emotional gene detection function, and will not be described here. After the detection result is obtained, according to the detection result, a corresponding emotional gene threshold of the emotional gene in the detection result is obtained.
  • the emotional genes include gene types whose specific gene sequences have been found to correlate strongly with human emotional expression, including but not limited to the 5-HTTLPR gene, the ADRA2B gene, and the COMT gene; these gene types are all known emotional gene information.
  • a corresponding emotional gene threshold is obtained based on the content of the gene.
  • the obtained corresponding emotional gene thresholds are a set of thresholds set by the relevant research and development personnel for the known emotional gene types; the set thresholds are stored in the relevant storage area and identified by the corresponding emotion type, which facilitates acquiring the emotional gene threshold by emotional gene type.
  • the emotional gene threshold is an emotion adjustment threshold in the present application; that is, the preset threshold range is adjusted correspondingly according to the obtained emotional gene threshold.
  • the preset threshold range is an emotion score threshold range set according to the corresponding micro-expression content, which does not include the influence of the related emotional genes; therefore, after detecting the face image to be recognized, the corresponding emotional gene adjustment threshold is obtained according to the emotional gene detection result of that face image to adjust the preset threshold range.
  • for example, if the detected emotional gene is the ADRA2B gene, whose carriers are more sensitive to negative emotions, the acquired emotional gene adjustment threshold for the ADRA2B gene raises the preset threshold ranges of negative emotions such as crying and fear. That is, the target emotion threshold ranges within the preset threshold range are raised to form a new emotional gene threshold, and micro-expression recognition is then performed on the face image to be recognized using this emotional gene threshold.
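As an illustration of this adjustment step, a minimal sketch follows; the emotion names, the gene label, and the offset values are assumptions for illustration only and do not come from the application.

```python
# Hypothetical sketch: adjust preset emotion threshold ranges according to a
# detected emotional gene. All gene labels, emotions, and offsets below are
# illustrative assumptions, not values from the application.

# Preset (gene-independent) score ranges per emotion.
PRESET_THRESHOLDS = {
    "joy":  (65, 80),
    "fear": (70, 85),
    "cry":  (60, 75),
}

# Per-gene adjustment: widen the ranges of negative emotions for carriers
# who are more sensitive to them, as (delta_low, delta_high) offsets.
GENE_ADJUSTMENTS = {
    "ADRA2B_deletion": {"fear": (-5, +5), "cry": (-5, +5)},
}

def adjusted_thresholds(detected_gene):
    """Return a new emotion -> (low, high) map adjusted for the gene."""
    adjusted = dict(PRESET_THRESHOLDS)
    for emotion, (dlow, dhigh) in GENE_ADJUSTMENTS.get(detected_gene, {}).items():
        low, high = adjusted[emotion]
        adjusted[emotion] = (low + dlow, high + dhigh)
    return adjusted

print(adjusted_thresholds("ADRA2B_deletion")["fear"])  # (65, 90)
```

A gene with no entry in the adjustment table leaves the preset ranges untouched, mirroring the case where no relevant emotional gene is detected.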
  • the method further includes:
  • the method further includes:
  • the step of acquiring the emotional gene detection result of the person to be identified, and acquiring the corresponding emotional gene threshold according to the detection result further includes:
  • the emotion gene threshold is calculated according to each of the acquired emotional gene thresholds.
  • the corresponding emotion gene adjustment threshold is acquired, and the preset threshold range is adjusted by each of the emotion gene thresholds.
  • since several emotional genes may each adjust the preset threshold range of the same emotion, upper and lower limits are set on that emotion's preset threshold range to bound its span, so that the preset threshold range does not become too large and cause micro-expression recognition errors.
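The bounding of combined adjustments might be sketched as follows; the deltas and clamp limits are illustrative assumptions, not values from the application.

```python
# Hypothetical sketch: combine threshold adjustments from several emotional
# genes for one emotion, clamping the result to fixed upper/lower limits so
# the range does not grow too large. All numbers are illustrative.

def combine_adjustments(preset, deltas, hard_limits):
    """preset: (low, high) range; deltas: list of (dlow, dhigh), one per gene;
    hard_limits: (min_low, max_high) clamp bounds for the final range."""
    low, high = preset
    for dlow, dhigh in deltas:
        low += dlow
        high += dhigh
    min_low, max_high = hard_limits
    # Clamp so the accumulated range never exceeds the permitted bounds.
    return (max(low, min_low), min(high, max_high))

# Two genes both widen the "fear" range; the clamp keeps it bounded.
print(combine_adjustments((70, 85), [(-5, +5), (-4, +6)], (65, 92)))  # (65, 92)
```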
  • Step S20 acquiring a facial feature value of the face image of the person to be recognized, and calculating an emotional score of the facial feature value;
  • the specific operation on the facial feature values is as follows: the face image to be recognized is divided into several regions, including but not limited to the forehead, eyebrows, eyes, infraorbital triangle, nose, nasolabial fold, and mouth.
  • the regions are divided using existing facial feature region partitioning methods for micro-expression recognition; based on the divided feature regions of the face image, N feature values are defined for each feature region.
  • forehead: number and length of wrinkles; eyebrows: distance between the eyebrows, eyebrow angle, number of inflection points on the outer eyebrow contour, and depth of wrinkles between the eyebrows; eye: palpebral fissure area, eye area, pupil position, and lower-eyelid elevation; infraorbital triangle: area size and contour position; nose: nose width; nasolabial fold: fold depth and fold shape; mouth: positions of the left and right mouth corners, philtrum length, shapes of the upper and lower lip lines, mouth width, tongue extension, and marionette-line depth and shape, etc.
  • a weight is assigned to each feature region, where the weight parameters are preset, predefined values. For example, for the emotion "joy":
  • the weights of the regions are: eye 40%; infraorbital triangle 20%; mouth 30%; nasolabial fold 10%;
  • the palpebral fissure area is greatly reduced and the lower eyelid rises; the infraorbital triangle moves upward; the nasolabial fold deepens; and the mouth corners move outward.
  • Step S30 determining the emotion of the user according to the emotional gene threshold and the emotion score.
  • the emotional gene threshold includes a classification of all human emotions, and each emotion has a corresponding threshold range. If the emotion score of the face image calculated in step S20 is 70, and the threshold range of "joy" in the emotional gene threshold is 65-80, then according to the emotion score the emotion of the face image to be recognized is "joy".
  • the corresponding emotional gene threshold range is obtained, the emotion score of the face image is calculated, and the emotion is confirmed against the obtained threshold range;
  • the emotion corresponding to the score is thus identified with the influence of emotional genes taken into account, which improves the efficiency of micro-expression recognition.
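Step S30 then amounts to locating the computed score inside one of the per-emotion threshold ranges. A minimal sketch follows; apart from the "joy" 65-80 range taken from the worked example, the emotion names and ranges are illustrative assumptions.

```python
# Minimal sketch of step S30: map an emotion score to the emotion whose
# adjusted threshold range contains it. Only the "joy" 65-80 range comes from
# the description; the other entries are illustrative assumptions.

EMOTION_GENE_THRESHOLDS = {
    "joy":     (65, 80),
    "neutral": (40, 64),
    "fear":    (81, 95),
}

def determine_emotion(score, thresholds=EMOTION_GENE_THRESHOLDS):
    """Return the emotion whose (low, high) range contains the score."""
    for emotion, (low, high) in thresholds.items():
        if low <= score <= high:
            return emotion
    return "unknown"

print(determine_emotion(70))  # "joy", matching the worked example
```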
  • FIG. 3 is a schematic diagram of the refinement steps of step S20 in FIG. 2.
  • the step of calculating the sentiment score of the facial feature value further includes:
  • Step S21 Obtain a facial feature value of each feature region in the to-be-identified face image.
  • Step S22 acquiring weight parameters of each of the feature regions
  • Step S23 calculating the emotion score according to the weight parameter and the region feature value.
  • the face image to be recognized is divided into feature regions, and values are assigned to the feature regions; according to the divided feature regions of the face image, N feature values are defined for each feature region.
  • forehead: number and length of wrinkles; eyebrows: distance between the eyebrows, eyebrow angle, number of inflection points on the outer eyebrow contour, and depth of wrinkles between the eyebrows; eye: palpebral fissure area, eye area, pupil position, and lower-eyelid elevation; infraorbital triangle: area size and contour position; nose: nose width; nasolabial fold: fold depth and fold shape; mouth: positions of the left and right mouth corners, philtrum length, shapes of the upper and lower lip lines, mouth width, tongue extension, and marionette-line depth and shape, etc.
  • a weight is assigned to each feature region, where the weight parameters are preset, predefined values. For example, for the emotion "joy":
  • the weights of the regions are: eye 40%; infraorbital triangle 20%; mouth 30%; nasolabial fold 10%;
  • the palpebral fissure area is greatly reduced and the lower eyelid rises; the infraorbital triangle moves upward; the nasolabial fold deepens; and the mouth corners move outward.
  • for example, with an eye score of 80, an infraorbital triangle score of 70, a mouth score of 60, and a nasolabial fold score of 60, the emotion score is 80×40% + 70×20% + 60×30% + 60×10% = 70.
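The weighted sum above can be sketched directly; the region names and "joy" weights follow the worked example, but this is an illustration rather than the application's implementation.

```python
# Sketch of the emotion-score computation: a weighted sum of per-region
# feature scores. Weights are the "joy" percentages from the example.

JOY_WEIGHTS = {"eye": 40, "infraorbital_triangle": 20,
               "mouth": 30, "nasolabial_fold": 10}  # percentages

def emotion_score(region_scores, weights_percent):
    """Weighted sum of per-region scores; weights are given in percent."""
    return sum(weights_percent[r] * region_scores[r] for r in weights_percent) / 100

scores = {"eye": 80, "infraorbital_triangle": 70,
          "mouth": 60, "nasolabial_fold": 60}
print(emotion_score(scores, JOY_WEIGHTS))  # 70.0, as in the worked example
```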
  • the method further includes:
  • the weight parameter is adjusted with the weight parameter adjustment value.
  • the weight parameter is adjusted with the adjustment value, and the adjusted weight parameters are applied to the feature values of each feature region of the person to be identified to obtain the emotion score.
  • after the step of acquiring the facial feature value of each feature region in the face image to be recognized, the method further includes:
  • the calculated emotion score is adjusted by the emotion score adjustment value.
  • the preset family feature expressions include known family-related feature expressions. For example, when the subject of the face image to be recognized belongs to family A, family A has the family feature expression "the right corner of the mouth rises more than the left when joyful";
  • when identifying, the difference in curvature between the left and right mouth corners of the face image is determined to judge whether this family feature appears in the face image to be recognized;
  • the preset family feature expressions are matched within the feature regions of the face image to be recognized, and the corresponding emotion score adjustment value is acquired for each matched expression. For example, suppose the mouth corner in the feature region of the face image has been confirmed as a determined family feature expression;
  • the emotion score adjustment value corresponding to the mouth-corner family feature expression is 20; that is, the adjustment value of 20 is added to the calculated emotion score to form a new emotion score;
  • the emotion score adjusted by the family-feature adjustment value is then compared with the corresponding preset emotion threshold range to confirm the emotion corresponding to the score.
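The family-feature correction could be sketched as follows; the +20 adjustment value and the mouth-corner trait come from the example, while the feature names and the detection itself are hypothetical stubs.

```python
# Sketch of the family-feature-expression correction: when a preset family
# feature (e.g. "right mouth corner rises more than the left") has been
# detected in the face image, add its emotion-score adjustment value.
# Only the +20 value comes from the worked example; names are assumptions.

FAMILY_FEATURE_ADJUSTMENTS = {
    "right_mouth_corner_higher": 20,  # family A's mouth-corner trait
}

def adjust_for_family_features(base_score, detected_features):
    """Add the adjustment value of every detected family feature expression."""
    score = base_score
    for feature in detected_features:
        score += FAMILY_FEATURE_ADJUSTMENTS.get(feature, 0)
    return score

# Base score 70 plus the mouth-corner family adjustment gives 90, which is
# then compared against the preset emotion threshold ranges.
print(adjust_for_family_features(70, ["right_mouth_corner_higher"]))  # 90
```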
  • the embodiment of the present application further provides a computer readable storage medium, where the micro-expression recognition program is stored, and when the micro-expression recognition program is executed by the processor, the following operations are implemented:
  • the user's emotion is determined based on the emotional gene threshold and the emotional score.
  • micro-expression recognition program when executed by the processor, the following operations are also implemented:
  • micro-expression recognition program when executed by the processor, the following operations are also implemented:
  • the emotion gene threshold corresponding to the detection result is acquired in a storage area of the preset emotion gene threshold.
  • micro-expression recognition program when executed by the processor, the following operations are also implemented:
  • An emotional gene threshold corresponding to the emotional gene type is acquired in a storage area of the preset emotional gene threshold on the condition of the emotional gene type.
  • micro-expression recognition program when executed by the processor, the following operations are also implemented:
  • micro-expression recognition program when executed by the processor, the following operations are also implemented:
  • micro-expression recognition program when executed by the processor, the following operations are also implemented:
  • micro-expression recognition program when executed by the processor, the following operations are also implemented:
  • the emotion score is calculated according to the weight parameter and the region feature value.
  • micro-expression recognition program when executed by the processor, the following operations are also implemented:
  • the weight parameter is adjusted with the weight parameter adjustment value.
  • micro-expression recognition program when executed by the processor, the following operations are also implemented:
  • the calculated emotion score is adjusted by the emotion score adjustment value.
  • micro-expression recognition program when executed by the processor, the following operations are also implemented:
  • the emotion gene threshold is calculated according to each of the acquired emotional gene thresholds.
  • micro-expression recognition program when executed by the processor, the following operations are also implemented:
  • the corresponding emotional gene adjustment threshold is acquired to adjust the calculated emotional gene threshold.
  • the embodiment of the present application further provides a micro-expression recognition system, the micro-expression recognition system including a monitoring module, a storage module, and a data module;
  • the monitoring module is configured to detect a micro-expression of the person to be identified, and to submit micro-expression reminding information when a micro-expression feature of the person is detected, wherein the micro-expression reminding information includes the micro-expression features of the person to be identified;
  • the storage module is configured to store the genetic features and micro-expression features of the person to be identified;
  • the data module is configured to correct the genetic feature detection result of the person to be identified according to the result of the micro-expression recognition.
  • the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as the ROM/RAM described above, a magnetic disk, or an optical disk) and including a number of instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to perform the methods described in the various embodiments of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a micro-expression recognition method, a micro-expression recognition apparatus, a computer-readable storage medium, and a system. The method comprises: acquiring an emotional gene detection result of a person to be recognized, and acquiring a corresponding emotional gene threshold according to the detection result (S10); acquiring a facial feature value of a face image of the person to be recognized, and calculating an emotion score of the facial feature value (S20); and determining the user's emotion according to the emotional gene threshold and the emotion score (S30). By detecting a user's emotional gene, the emotional gene threshold is adjusted correspondingly, achieving the beneficial effect of improving the accuracy of facial micro-expression recognition.
PCT/CN2018/091372 2017-11-01 2018-06-15 Micro-expression recognition method and apparatus, system and computer-readable storage medium WO2019085495A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711060466.0A CN107895146B (zh) 2017-11-01 2017-11-01 微表情识别方法、装置、系统及计算机可读存储介质
CN201711060466.0 2017-11-01

Publications (1)

Publication Number Publication Date
WO2019085495A1 true WO2019085495A1 (fr) 2019-05-09

Family

ID=61803013

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/091372 WO2019085495A1 (fr) 2017-11-01 2018-06-15 Micro-expression recognition method and apparatus, system, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN107895146B (fr)
WO (1) WO2019085495A1 (fr)


Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107895146B (zh) * 2017-11-01 2020-05-26 深圳市科迈爱康科技有限公司 Micro-expression recognition method, apparatus and system, and computer-readable storage medium
CN108810624A (zh) * 2018-06-08 2018-11-13 广州视源电子科技股份有限公司 Program feedback method and apparatus, and playback device
CN110197107B (zh) * 2018-08-17 2024-05-28 平安科技(深圳)有限公司 Micro-expression recognition method, apparatus, computer device, and storage medium
CN109559193A (zh) * 2018-10-26 2019-04-02 深圳壹账通智能科技有限公司 Intelligent-recognition-based product push method, apparatus, computer device, and medium
CN111127830A (zh) * 2018-11-01 2020-05-08 奇酷互联网络科技(深圳)有限公司 Monitoring-device-based alarm method, alarm system, and readable storage medium
CN109829362A (zh) * 2018-12-18 2019-05-31 深圳壹账通智能科技有限公司 Security-check auxiliary analysis method, apparatus, computer device, and storage medium
CN109711300A (zh) * 2018-12-18 2019-05-03 深圳壹账通智能科技有限公司 Assisted communication method for the blind, apparatus, computer device, and storage medium
CN110046955A (zh) * 2019-03-12 2019-07-23 平安科技(深圳)有限公司 Face-recognition-based marketing method, apparatus, computer device, and storage medium
CN110097470A (zh) * 2019-03-19 2019-08-06 深圳壹账通智能科技有限公司 Artificial-intelligence-based financial product promotion method, apparatus, and computer device
CN110177205A (zh) * 2019-05-20 2019-08-27 深圳壹账通智能科技有限公司 Terminal device, micro-expression-based photographing method, and computer-readable storage medium
CN110222597B (zh) * 2019-05-21 2023-09-22 平安科技(深圳)有限公司 Method and apparatus for adjusting screen display based on micro-expressions
CN110377380A (zh) * 2019-06-21 2019-10-25 深圳壹账通智能科技有限公司 Theme color adjustment method, apparatus, device, and computer-readable storage medium
CN110399837B (zh) * 2019-07-25 2024-01-05 深圳智慧林网络科技有限公司 User emotion recognition method, apparatus, and computer-readable storage medium
CN110399836A (zh) * 2019-07-25 2019-11-01 深圳智慧林网络科技有限公司 User emotion recognition method, apparatus, and computer-readable storage medium
CN110852220B (zh) * 2019-10-30 2023-08-18 深圳智慧林网络科技有限公司 Intelligent facial expression recognition method, terminal, and computer-readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104616005A (zh) * 2015-03-10 2015-05-13 南京宜开数据分析技术有限公司 Domain-adaptive facial expression analysis method
CN104766041A (zh) * 2014-01-07 2015-07-08 腾讯科技(深圳)有限公司 Image recognition method, apparatus, and system
CN106257489A (zh) * 2016-07-12 2016-12-28 乐视控股(北京)有限公司 Expression recognition method and system
CN107895146A (zh) * 2017-11-01 2018-04-10 深圳市科迈爱康科技有限公司 Micro-expression recognition method, apparatus and system, and computer-readable storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101338337B (zh) * 2007-07-04 2011-11-02 北京华安佛医药研究中心有限公司 Use, method, and kit for predicting depression and drug efficacy from polymorphic-site genotypes
CN105825192B (zh) * 2016-03-24 2019-06-25 深圳大学 Facial expression recognition method and system
CN106600530B (zh) * 2016-11-29 2019-02-15 北京小米移动软件有限公司 Photo synthesis method and apparatus


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHU, BI ET AL.: "Interaction Effects of 5-HTTLPR, Gender, and Family Socioeconomic Status on Facial Expression Recognition in Chinese College Students", PSYCHOLOGICAL DEVELOPMENT AND EDUCATION, 15 March 2013 (2013-03-15), pages 136, ISSN: 1001-4918 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110378228A (zh) * 2019-06-17 2019-10-25 深圳壹账通智能科技有限公司 Face-to-face review video data processing method, apparatus, computer device, and storage medium
CN110555374A (zh) * 2019-07-25 2019-12-10 深圳壹账通智能科技有限公司 Resource sharing method, apparatus, computer device, and storage medium
CN110781810A (zh) * 2019-10-24 2020-02-11 合肥盛东信息科技有限公司 Facial emotion recognition method
CN110781810B (zh) * 2019-10-24 2024-02-27 合肥盛东信息科技有限公司 Facial emotion recognition method
CN111488813A (zh) * 2020-04-02 2020-08-04 咪咕文化科技有限公司 Video emotion labeling method, apparatus, electronic device, and storage medium
CN111488813B (zh) * 2020-04-02 2023-09-08 咪咕文化科技有限公司 Video emotion labeling method, apparatus, electronic device, and storage medium

Also Published As

Publication number Publication date
CN107895146A (zh) 2018-04-10
CN107895146B (zh) 2020-05-26

Similar Documents

Publication Publication Date Title
WO2019085495A1 (fr) Micro-expression recognition method and apparatus, system, and computer-readable storage medium
WO2019029261A1 (fr) Micro-expression recognition method, device, and storage medium
WO2015184760A1 (fr) Air-gesture input method and apparatus
WO2016082267A1 (fr) Voice recognition method and system
WO2019041406A1 (fr) Indecent image recognition device, terminal and method, and computer-readable storage medium
WO2020190112A1 (fr) Method, apparatus, device, and medium for generating caption information of multimedia data
WO2018143707A1 (fr) Makeup evaluation system and operating method thereof
WO2019051899A1 (fr) Terminal control method and device, and storage medium
WO2013022226A4 (fr) Method and apparatus for generating customer personal information, recording medium therefor, and POS system
WO2013009020A2 (fr) Method and apparatus for generating viewer face-tracing information, recording medium therefor, and three-dimensional display apparatus
WO2020138624A1 (fr) Noise-canceling apparatus and method therefor
WO2021132851A1 (fr) Electronic device, scalp-care system, and control method therefor
WO2019216593A1 (fr) Method and apparatus for pose processing
WO2019051683A1 (fr) Fill-light photography method, mobile terminal, and computer-readable storage medium
WO2019231252A1 (fr) Electronic device for authenticating a user, and control method therefor
WO2015009111A1 (fr) Biometrics-based authentication method and apparatus
WO2019051895A1 (fr) Terminal control method and device, and storage medium
WO2019051890A1 (fr) Terminal control method and device, and computer-readable storage medium
WO2018166236A1 (fr) Claim settlement bill recognition method, apparatus and device, and computer-readable storage medium
EP3740936A1 (fr) Method and apparatus for pose processing
WO2015184982A1 (fr) Classifier training method and apparatus, and identity authentication method and system
WO2015127859A1 (fr) Sensitive text detection method and apparatus
WO2015133699A1 (fr) Object recognition apparatus, and recording medium on which a method and computer program therefor are recorded
WO2018149191A1 (fr) Insurance policy underwriting method, apparatus and device, and computer-readable storage medium
WO2019051905A1 (fr) Air conditioner control method, air conditioner, and computer-readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18874396

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18874396

Country of ref document: EP

Kind code of ref document: A1