CN111110256A - Emotion measuring method based on human face infrared thermal image - Google Patents

Emotion measuring method based on human face infrared thermal image

Info

Publication number
CN111110256A
Authority
CN
China
Prior art keywords
infrared thermal
thermal image
standard
emotion
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010032776.7A
Other languages
Chinese (zh)
Inventor
李争光
武茜
施祥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Lover Health Science and Technology Development Co Ltd
Original Assignee
Zhejiang Lover Health Science and Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Lover Health Science and Technology Development Co Ltd filed Critical Zhejiang Lover Health Science and Technology Development Co Ltd
Priority to CN202010032776.7A priority Critical patent/CN111110256A/en
Publication of CN111110256A publication Critical patent/CN111110256A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165: Evaluating the state of mind, e.g. depression, anxiety
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059: Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/0062: Arrangements for scanning
    • A61B 5/0064: Body surface scanning
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7203: Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Psychiatry (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • Signal Processing (AREA)
  • Child & Adolescent Psychology (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physiology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The invention discloses an emotion measuring method based on comparative infrared thermal images of the human face, which comprises the following steps: collecting a reference infrared thermal image, a standard infrared thermal image and a target infrared thermal image of the face while the tested person is in a calm state, receives a standard stimulus, and receives a target stimulus, respectively; determining a standard infrared thermal image pseudo-color image from the reference infrared thermal image and the standard infrared thermal image, and determining a target infrared thermal image pseudo-color image from the reference infrared thermal image and the target infrared thermal image; constructing an infrared thermal image-emotion intensity relationship model from the standard infrared thermal image pseudo-color image and the annotated emotion intensity of the corresponding standard stimulus; and obtaining the emotion intensity corresponding to the target infrared thermal image pseudo-color image by using the infrared thermal image-emotion intensity relationship model, i.e., the emotion intensity evoked by the target stimulus. The method is simple and easy to implement, overcomes the shortcomings of the current mainstream emotion measuring techniques, which rely mainly on subjective self-report, and improves the accuracy of emotion measurement.

Description

Emotion measuring method based on human face infrared thermal image
Technical Field
The invention relates to human body emotion measuring technology, and in particular to an emotion measuring method based on infrared thermal images of the human face.
Background
Human emotion measurement is an important technology in many fields, such as neuroscience and psychology research, product design evaluation, and human-computer interaction.
Currently, the mainstream emotion measuring method is the self-report rating scale, in which the measured subject answers the questions of a designed scale (such as a Likert scale) and reports the perceived emotion and its degree in a self-describing manner. This method suffers from strong subjectivity, large individual differences, and susceptibility to other influencing factors. Therefore, many researchers are exploring physiological signals of the autonomic nervous system, which are not affected by subjective will, as measures of emotion.
Since Professor Rosalind Picard of the MIT Media Lab proposed the new field of affective computing in 1997, great progress has been made in recognizing human emotions from different signals. At present, different emotional states of the human body can be identified from signals such as voice, facial expression, body posture, heart rate, skin conductance and myoelectricity, electroencephalogram, blood pressure, and respiratory rhythm, with accuracy above 90%.
However, unlike emotion recognition, emotion measurement requires not so much identifying the type of an emotion as accurately quantifying the degree of a given type of emotion. This makes a large number of current emotion recognition methods and models unsuitable for emotion measurement.
On the other hand, many physiological signal measurements, such as electroencephalography (EEG), electrocardiography (ECG), functional magnetic resonance imaging (fMRI), and functional near-infrared spectroscopy (fNIRS), require contact with the human body, or even electrodes implanted in it. This imposes many limitations in practice and inevitably influences the subject's emotion. Moreover, the detection equipment is generally complex and expensive, which limits its wide application in practice.
Thermal imaging can accurately measure the skin surface temperature of the human body in a non-contact manner and can reflect emotional states; the measuring equipment is compact, easy to operate, and inexpensive compared with many large physiological signal detectors. Thermal imaging is therefore very promising as a means of emotion measurement where environmental conditions are controllable.
Patent application publication No. CN110287895A discloses an emotion measuring method based on a convolutional neural network. It establishes a personal profile data packet and classifies personal facial expression categories using face recognition, deploys image acquisition devices in the working area to acquire facial images of each person in a unit in real time, establishes a personal facial expression category recognition model, recognizes expressions over a prediction period to form each person's emotional state, and finally obtains a quantization table of each person's emotional state. Because that method measures emotion with a convolutional neural network, it needs a large number of data samples, whose richness directly affects the recognition accuracy of the expression category recognition model; and because it collects visible-light images of the face, it is vulnerable to disguise and environmental interference. It therefore has certain limitations.
Disclosure of Invention
The invention aims to provide an emotion measuring method based on human face infrared thermal imagery, which can improve the accuracy of emotion measurement.
In order to achieve the above purpose, the invention provides the following technical scheme:
An emotion measuring method based on infrared thermal images of the human face comprises the following steps:
collecting a reference infrared thermal image, a standard infrared thermal image and a target infrared thermal image of the face while the tested person is in a calm state, receives a standard stimulus, and receives a target stimulus, respectively;
determining a standard infrared thermal image pseudo-color image from the reference infrared thermal image and the standard infrared thermal image, and determining a target infrared thermal image pseudo-color image from the reference infrared thermal image and the target infrared thermal image;
constructing an infrared thermal image-emotion intensity relationship model from the standard infrared thermal image pseudo-color image and the annotated emotion intensity of the corresponding standard stimulus;
and obtaining the emotion intensity corresponding to the target infrared thermal image pseudo-color image by using the infrared thermal image-emotion intensity relationship model, i.e., the emotion intensity evoked by the target stimulus.
Compared with the prior art, the invention has the beneficial effects that:
the emotion measuring method does not need any direct contact with the body of the tested person, and can even hide and disguise the shooting process without the perception of the tested person, so that the influence of the experimental process on the emotion of the tested person is avoided; the characteristic quantity of infrared thermography emotion recognition can be quickly obtained on the basis of a small sample, the training cost is low, and massive large data support is not needed; the identification content is a human physiological spontaneous signal and cannot be disguised like information such as facial expression, voice and the like; the obtained result is a quantitative value of the designated emotion degree corresponding to the target stimulation, so that subsequent analysis and processing and comparison of results among different experiments are facilitated, and the method has a clearer and more specific guiding significance for research and design.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flowchart of an emotion measuring method based on infrared thermography of a human face according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of establishing an infrared thermography-emotional intensity relationship model according to an embodiment of the present invention;
fig. 3 is a schematic diagram of human face region division according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the invention is described in further detail below with reference to the accompanying drawings and examples. It should be understood that the detailed description and specific examples, while indicating preferred embodiments of the invention, are intended for purposes of illustration only and are not intended to limit the scope of the invention.
Fig. 1 is a flowchart of an emotion measuring method based on infrared thermography of a human face according to an embodiment of the present invention. Fig. 2 is a schematic flow chart of establishing an infrared thermography-emotional intensity relationship model according to an embodiment of the present invention. Referring to fig. 1 and 2, the emotion measuring method based on infrared thermography of a human face provided by the embodiment includes the following steps:
s101, acquiring a standard infrared thermal image, a standard infrared thermal image and a target infrared thermal image of the face when the tested person is in a calm state, receives standard stimulation and receives target stimulation.
To ensure the accuracy and comparability of the emotion measurement results, the acquisition environment must be strictly controlled. When the reference, standard and target infrared thermal images are acquired, the air temperature, humidity, air flow speed, illumination intensity, noise level, the distance and relative position between the camera and the tested person's face, and the degree of facial occlusion must all be held within the required ranges. Specifically: the ambient air temperature should be 18-24 °C; the relative humidity 40%-60%; the air flow speed below 0.2 m/s; the illumination intensity 150-300 lx; the noise level 30-45 dB, with no meaningful sound (such as speech or music); the tested person should remain seated; the camera lens should face the tested person's face at a distance of about 1 m; the tested person should wear no ear ornaments, earphones or cap, the face should be in a naturally clean state, and hair or other objects must not occlude the forehead, ears, etc. These parameters directly affect the facial temperature measured by the infrared camera; if they are not controlled, the influencing factors of the facial temperature measurement become very complex, the relationship between the facial infrared thermal image and emotion is weakened, and the accuracy and precision of the method suffer. If one or more of these parameters is itself the object of study, its range should be set according to the specific research purpose; for example, when studying the influence of noise on emotion, the noise level can be controlled according to the research requirements.
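As a hypothetical illustration (the parameter names, ranges dictionary, and helper function below are ours, not part of the patent), the prescribed environmental ranges can be checked programmatically before an acquisition session:

```python
# Prescribed acquisition-environment ranges from the method description.
# Keys and structure are illustrative, not from the patent text.
RANGES = {
    "air_temp_c":   (18.0, 24.0),    # ambient air temperature, deg C
    "rel_humidity": (0.40, 0.60),    # relative humidity, fraction
    "airflow_mps":  (0.0, 0.2),      # air flow speed, m/s
    "illum_lx":     (150.0, 300.0),  # illumination intensity, lx
    "noise_db":     (30.0, 45.0),    # background noise level, dB
}

def check_environment(readings: dict) -> list:
    """Return the names of any parameters outside the prescribed ranges."""
    out_of_range = []
    for name, (lo, hi) in RANGES.items():
        if not (lo <= readings[name] <= hi):
            out_of_range.append(name)
    return out_of_range
```

A session would proceed only when `check_environment` returns an empty list; parameters that are themselves under study would simply be dropped from the dictionary.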
After the environment has stabilized, the tested person rests calmly for a period of time; an infrared camera then shoots the face, and the resulting infrared photograph or video of the face in the calm state serves as the reference infrared thermal image.
the method comprises the steps of selecting series of stimuli (such as visual stimuli, auditory stimuli, tactile stimuli and the like) capable of inducing specific emotions of tested personnel with different intensities to induce the emotions of the tested personnel, and then synchronously recording facial infrared thermal images or videos of the tested personnel in different emotional states to serve as standard infrared thermal images. Wherein the series of stimuli is desirably taken from a database of generally recognized and emotionally annotated mood-inducing stimuli, such as the International Standard library of emoticons (IAPS).
The target stimulus to be evaluated is then given to the tested person, and the facial infrared thermal image or video at that moment is recorded synchronously as the target infrared thermal image.
S102, determining a standard infrared thermal image pseudo-color image from the reference infrared thermal image and the standard infrared thermal image, and determining a target infrared thermal image pseudo-color image from the reference infrared thermal image and the target infrared thermal image.
In an embodiment, determining the standard infrared thermal image pseudo-color image from the reference infrared thermal image and the standard infrared thermal image includes:
normalizing the standard infrared thermal image with the reference infrared thermal image as the background value, and reconstructing the standard infrared thermal image pseudo-color image from the normalized data.
Determining the target infrared thermal image pseudo-color image from the reference infrared thermal image and the target infrared thermal image includes:
normalizing the target infrared thermal image with the reference infrared thermal image as the background value, and reconstructing the target infrared thermal image pseudo-color image from the normalized data.
Specifically, the normalization takes the reference infrared thermal image as the baseline, computes the relative values (such as the rate of change) of the facial temperatures at different positions of the standard and target infrared thermal images with respect to the reference infrared thermal image, and draws the pseudo-color images from these relative values.
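A minimal sketch of this normalization step, assuming each thermal image is available as a 2-D array of per-pixel temperatures (the function and variable names are ours, not the patent's):

```python
import numpy as np

def relative_change(image: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Per-pixel rate of change of a standard or target thermal image
    relative to the calm-state reference image: R = (theta - theta0) / theta0."""
    return (image - reference) / reference
```

The pseudo-color image would then be obtained by mapping the resulting `R` array through a color scale, for example with a plotting library's colormap facility.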
S103, constructing an infrared thermal image-emotion intensity relation model according to the standard infrared thermal image pseudo-color image and the corresponding standard stimulated labeled emotion intensity.
In this embodiment, after the standard infrared thermal image pseudo-color images are obtained, the infrared thermal image-emotion intensity relationship model is established with the standard infrared thermal image pseudo-color images and the known annotated emotion intensities of the series of standard stimuli as the data source. Specifically, constructing the model from the standard infrared thermal image pseudo-color images and the annotated emotion intensities of the corresponding standard stimuli includes:
referring to distribution of arteriovenous vessels and capillary vessels on the surface layer of the human face and common human face partitions, dividing the human face into a plurality of regions, specifically comprising different regions such as a frontal region (left and right), an upper frontal region, a brow heart region, a brow region (left and right), an inner eye corner region (left and right), an outer eye corner region (left and right), a lower eyelid region (left and right), a nasal wing region (left and right), a nasal tip region, a cheekbone region (left and right), a nasal labrum region (left and right), an upper lip region (left and right), a mouth corner region (left and right), a jaw region (left and right), a chin region and an ear root region (left and right); fig. 3 shows a schematic diagram of dividing the face region of a human face. As shown in fig. 3, including the a-left forehead region; b-right frontal area; c-the prefrontal area; d-left eyebrow area; e-right eyebrow area; f-the eyebrow region; g-left inner corner of the eye region; h-right inner canthus area; i-left outer corner of the eye region; j-right external canthus region; k-the lower left eyelid area; l-lower right eyelid area; m-left alar region; n-right alar region; o-the nasal tip region; p-left zygomatic zone; q-the right zygomatic zone; r-left nasolabial sulcus region; S-Right nasolabial sulcus region; t-left upper lip region; u-upper right lip region; v-left corner of mouth region; w-right corner of mouth region; x-left jaw area; y-right jaw area; the Z-chin region; 1-left ear root zone; 2-right ear root zone.
In the standard infrared thermal image pseudo-color images, the regions are sorted by the standard deviation of each region's relative temperature-change value, from large to small, and the top n regions (for example, 8) are selected as the characteristic regions; the relative facial temperature values of the characteristic regions are taken as the characteristic quantities;
and determining the mapping relation between the characteristic quantity and the emotion intensity according to the characteristic quantity and the labeled emotion intensity of the standard stimulus, and further obtaining an infrared thermal image-emotion intensity relation model.
Specifically, multiple linear regression can be performed between the characteristic quantities and the annotated emotion intensities of the standard stimuli to determine the mapping relationship between the characteristic quantities and the emotion intensity, thereby obtaining the infrared thermal image-emotion intensity relationship model.
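The feature selection and regression steps can be sketched as follows, assuming the per-region relative temperature values have already been extracted into arrays (all function and variable names are illustrative, not from the patent):

```python
import numpy as np

def select_feature_regions(region_values: np.ndarray, n: int) -> np.ndarray:
    """region_values: shape (num_stimuli, num_regions), relative facial
    temperature values per region. Returns the indices of the n regions
    whose values vary most (largest standard deviation) across stimuli."""
    stds = region_values.std(axis=0)
    return np.argsort(stds)[::-1][:n]

def fit_intensity_model(features: np.ndarray, intensities: np.ndarray) -> np.ndarray:
    """Least-squares fit of intensity = a0 + a1*x1 + ... + am*xm.
    features: (num_stimuli, m). Returns the coefficient vector (length m+1)."""
    X = np.hstack([np.ones((features.shape[0], 1)), features])
    coef, *_ = np.linalg.lstsq(X, intensities, rcond=None)
    return coef

def predict_intensity(coef: np.ndarray, feature_vec: np.ndarray) -> float:
    """Apply the fitted model to a target image's feature vector."""
    return float(coef[0] + coef[1:] @ feature_vec)
```

In practice the number of standard stimuli must exceed the number of selected regions for the regression to be determined; regularized variants would be a natural substitute with very small samples.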
S104, obtaining the emotion intensity corresponding to the target infrared thermal image pseudo-color image by using the infrared thermal image-emotion intensity relationship model, i.e., the emotion intensity evoked by the target stimulus.
After the infrared thermal image-emotion intensity relationship model has been obtained, it can be used to measure emotion intensity. Specifically, obtaining the emotion intensity corresponding to the target infrared thermal image pseudo-color image with the model comprises the following steps:
and extracting the characteristic quantity of the characteristic area of the target infrared thermal image pseudo-color image, inputting the characteristic quantity into an infrared thermal image-emotion intensity relation model, and outputting an emotion intensity quantized value after calculation.
The steps for extracting the characteristic quantities of the characteristic regions of the target infrared thermal image pseudo-color image in S104 are the same as those for the standard infrared thermal image pseudo-color images in S103: the regions with the most significant changes in the standard infrared thermal image pseudo-color images serve as the characteristic regions, and the relative facial temperature values of those regions serve as the characteristic quantities. This is not repeated here.
The emotion measuring method based on comparative infrared thermal images of the human face is described in detail below with a specific example:
(1) The tested person sits quietly in a quiet laboratory with stable, uniform temperature, humidity and illumination, in a fixed position and sitting posture; after 5 minutes of calm, a facial infrared thermal image is shot for 30 seconds as the reference infrared thermal image.
(2) A series of recognized emotion-inducing acoustic stimuli annotated with emotion intensity are played through a loudspeaker or earphones to induce subjective annoyance of different degrees in the tested person. During each acoustic stimulus, the facial infrared thermal image of the tested person is shot throughout. In this way, n annotated stimuli yield n standard infrared thermal images.
(3) The acoustic stimulus to be evaluated is played in the same way as in step (2), and the facial infrared thermal image recorded synchronously while the tested person receives it is the target infrared thermal image.
(4) All the obtained infrared thermal images are gridded, and the temperature of each grid cell is read with the software supplied with the infrared camera. For a given grid cell, let θs and θa be its temperatures in the standard and target infrared thermal images, respectively, and θ0 its temperature in the same cell of the reference infrared thermal image. The relative values Rs and Ra are computed as follows, and pseudo-color images drawn from them give the standard and target infrared thermal image pseudo-color images:
Rs = (θs - θ0)/θ0
Ra = (θa - θ0)/θ0
(5) The regions of the standard infrared thermal image pseudo-color images that change markedly and representatively under annotated stimuli of different emotion intensities are identified by comparative analysis and taken as the characteristic regions, and the relative values of the pseudo-color images in those regions are taken as the characteristic quantities x. In this way, m characteristic quantities x1, x2, …, xm of a pseudo-color image are obtained.
(6) A relationship model between the annotated noise annoyance degree Y and the m characteristic quantities is established by regression analysis or similar methods:
Y = a1·f(x1) + a2·f(x2) + … + am·f(xm)
(7) The measured values b1, b2, …, bm of the characteristic quantities in the target infrared thermal image pseudo-color image are substituted into the relation obtained in step (6), giving the subjective annoyance degree Yb corresponding to the target acoustic stimulus:
Yb = a1·f(b1) + a2·f(b2) + … + am·f(bm)
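Steps (4) to (7) can be sketched end-to-end on synthetic data, assuming f is the identity and taking individual grid cells as stand-ins for the characteristic regions (everything below, including the synthetic temperatures, is illustrative only):

```python
import numpy as np

def run_example() -> float:
    rng = np.random.default_rng(0)
    H, W, n = 4, 4, 6                        # grid size; number of standard stimuli
    theta0 = np.full((H, W), 33.0)           # (1) calm-state reference grid, deg C
    true_y = np.linspace(1.0, 6.0, n)        # (2) annotated annoyance degrees
    # synthetic standard images whose warming loosely tracks annoyance
    thetas = theta0[None] * (1 + 0.002 * true_y[:, None, None] * rng.random((n, H, W)))
    Rs = (thetas - theta0) / theta0          # (4) relative values, shape (n, H, W)
    # (5) characteristic "regions": the m grid cells varying most across stimuli
    m = 3
    flat = Rs.reshape(n, -1)
    feat_idx = np.argsort(flat.std(axis=0))[::-1][:m]
    X = flat[:, feat_idx]
    # (6) regress annoyance degree Y on the m characteristic quantities
    A = np.hstack([np.ones((n, 1)), X])
    coef, *_ = np.linalg.lstsq(A, true_y, rcond=None)
    # (7) substitute a target image's feature values b into the fitted relation
    theta_a = theta0 * (1 + 0.002 * 3.5 * rng.random((H, W)))
    Ra = ((theta_a - theta0) / theta0).ravel()[feat_idx]
    return float(coef[0] + coef[1:] @ Ra)    # Yb, predicted annoyance degree
```

With real data the grid cells would be replaced by the anatomical regions of Fig. 3, and f could be chosen by the regression analysis rather than fixed to the identity.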
This non-contact emotion measuring method based on comparative infrared thermal images of the human face is simple and easy to implement, makes up for the shortcomings of the current mainstream emotion measuring techniques, which rely mainly on subjective self-report, improves the accuracy of emotion measurement, and can play a positive role in many fields such as neuroscience and psychology research, product design evaluation, and human-computer interaction.
The above-mentioned embodiments are intended to illustrate the technical solutions and advantages of the present invention. It should be understood that they are only preferred embodiments of the present invention and are not intended to limit it; any modifications, additions, equivalents, etc. made within the principles of the present invention shall fall within its scope of protection.

Claims (8)

1. An emotion measuring method based on infrared thermal images of the human face, comprising the following steps:
collecting a reference infrared thermal image, a standard infrared thermal image and a target infrared thermal image of the face while the tested person is in a calm state, receives a standard stimulus, and receives a target stimulus, respectively;
determining a standard infrared thermal image pseudo-color image from the reference infrared thermal image and the standard infrared thermal image, and determining a target infrared thermal image pseudo-color image from the reference infrared thermal image and the target infrared thermal image;
constructing an infrared thermal image-emotion intensity relationship model from the standard infrared thermal image pseudo-color image and the annotated emotion intensity of the corresponding standard stimulus;
and obtaining the emotion intensity corresponding to the target infrared thermal image pseudo-color image by using the infrared thermal image-emotion intensity relationship model, i.e., the emotion intensity evoked by the target stimulus.
2. The emotion measuring method based on infrared thermal images of the human face of claim 1, wherein the laboratory temperature, humidity, air flow speed, illumination intensity, noise level, the distance and relative position between the camera and the tested person's face, and the degree of facial occlusion of the tested person are controlled when the reference, standard and target infrared thermal images are collected.
3. The emotion measuring method based on infrared thermal images of the human face of claim 1, wherein determining the standard infrared thermal image pseudo-color image from the reference infrared thermal image and the standard infrared thermal image comprises:
normalizing the standard infrared thermal image with the reference infrared thermal image as the background value, and reconstructing the standard infrared thermal image pseudo-color image from the normalized data.
4. The emotion measuring method based on infrared thermal images of the human face of claim 1, wherein determining the target infrared thermal image pseudo-color image from the reference infrared thermal image and the target infrared thermal image comprises:
normalizing the target infrared thermal image with the reference infrared thermal image as the background value, and reconstructing the target infrared thermal image pseudo-color image from the normalized data.
5. The emotion measuring method based on infrared thermal images of the human face of claim 3 or 4, wherein the normalization takes the reference infrared thermal image as the baseline, computes the relative values of the facial temperatures at different positions of the standard and target infrared thermal images with respect to the reference infrared thermal image, and draws the pseudo-color images from these relative values.
6. The emotion measuring method based on infrared thermal images of the human face of claim 1, wherein constructing the infrared thermal image-emotion intensity relationship model from the standard infrared thermal image pseudo-color image and the annotated emotion intensity of the corresponding standard stimulus comprises:
dividing the face into a number of regions with reference to the distribution of arteriovenous and capillary vessels in the surface layer of the face and to common facial partitions; in the standard infrared thermal image pseudo-color image, sorting the regions by the standard deviation of each region's relative temperature-change value from large to small, selecting the top n regions as characteristic regions, and taking the relative facial temperature values of the characteristic regions as the characteristic quantities;
and determining the mapping relationship between the characteristic quantities and the emotion intensity from the characteristic quantities and the annotated emotion intensities of the standard stimuli, thereby obtaining the infrared thermal image-emotion intensity relationship model.
7. The emotion measurement method based on human facial infrared thermal images of claim 6, wherein multiple linear regression is performed on the feature quantities and the annotated emotion intensities of the standard stimuli to determine the mapping relation between the feature quantities and emotion intensity, thereby obtaining the infrared thermal image-emotion intensity relationship model.
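The multiple linear regression of claim 7 can be realized with ordinary least squares. A minimal sketch under assumed names; the patent does not specify the fitting procedure beyond "multiple linear regression".

```python
import numpy as np

def fit_intensity_model(features, intensities):
    """Fit emotion intensity = w . features + b by ordinary least squares.

    features: (num_samples, n) feature-region relative temperatures;
    intensities: (num_samples,) annotated intensities for the standard
    stimuli. Returns the weight vector w and intercept b."""
    X = np.column_stack([np.asarray(features, dtype=float),
                         np.ones(len(features))])  # intercept column
    coef, *_ = np.linalg.lstsq(X, np.asarray(intensities, dtype=float),
                               rcond=None)
    return coef[:-1], coef[-1]

def predict_intensity(features, w, b):
    """Apply the fitted model to one or more feature vectors."""
    return np.asarray(features, dtype=float) @ w + b
```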
8. The emotion measurement method based on human facial infrared thermal images of claim 1, wherein inputting the target infrared thermal pseudo-color image into the infrared thermal image-emotion intensity relationship model to obtain the corresponding emotion intensity comprises:
extracting the feature quantities of the feature regions of the target infrared thermal pseudo-color image, inputting them into the infrared thermal image-emotion intensity relationship model, and outputting a quantized emotion intensity value after calculation.
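The inference step of claim 8 reduces to averaging the relative-temperature map over the selected feature regions and applying the fitted linear model. A hedged sketch: the mask-based region representation and function name are assumptions for illustration.

```python
import numpy as np

def region_features(relative_map, region_masks, chosen):
    """Mean relative temperature of each selected feature region.

    relative_map: (H, W) relative-temperature image; region_masks: list of
    boolean (H, W) masks, one per facial region; chosen: indices of the
    feature regions selected when the model was built."""
    relative_map = np.asarray(relative_map, dtype=float)
    return np.array([relative_map[region_masks[i]].mean() for i in chosen])

# With weights w and intercept b from the fitted model, the quantized
# emotion intensity of a new frame would then be, e.g.,
#   intensity = round(float(region_features(rel, masks, chosen) @ w + b))
```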
CN202010032776.7A 2020-01-13 2020-01-13 Emotion measuring method based on human face infrared thermal image Pending CN111110256A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010032776.7A CN111110256A (en) 2020-01-13 2020-01-13 Emotion measuring method based on human face infrared thermal image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010032776.7A CN111110256A (en) 2020-01-13 2020-01-13 Emotion measuring method based on human face infrared thermal image

Publications (1)

Publication Number Publication Date
CN111110256A true CN111110256A (en) 2020-05-08

Family

ID=70488936

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010032776.7A Pending CN111110256A (en) 2020-01-13 2020-01-13 Emotion measuring method based on human face infrared thermal image

Country Status (1)

Country Link
CN (1) CN111110256A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112507916A (en) * 2020-12-16 2021-03-16 苏州金瑞阳信息科技有限责任公司 Face detection method and system based on facial expression
CN113017634A (en) * 2021-03-22 2021-06-25 Oppo广东移动通信有限公司 Emotion evaluation method, emotion evaluation device, electronic device and computer-readable storage medium
CN114973354A (en) * 2022-04-27 2022-08-30 上海迎智正能文化发展有限公司 Individual emotion instant monitoring system and judgment method based on group face infrared thermal image
WO2022242245A1 (en) * 2021-05-19 2022-11-24 林纪良 Method for classifying physiological emotional responses by electroencephalograph

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105426812A (en) * 2015-10-27 2016-03-23 浪潮电子信息产业股份有限公司 Expression recognition method and apparatus


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Liu Zhilei: "Research on the Application of Probabilistic Graphical Models in Affective Computing", China Doctoral Dissertations Full-text Database, Information Science and Technology Series *


Similar Documents

Publication Publication Date Title
CN111110256A (en) Emotion measuring method based on human face infrared thermal image
US11154203B2 (en) Detecting fever from images and temperatures
Bulagang et al. A review of recent approaches for emotion classification using electrocardiography and electrodermography signals
US10791938B2 (en) Smartglasses for detecting congestive heart failure
CN106264568B (en) Contactless mood detection method and device
Krishna et al. An efficient mixture model approach in brain-machine interface systems for extracting the psychological status of mentally impaired persons using EEG signals
US10638938B1 (en) Eyeglasses to detect abnormal medical events including stroke and migraine
Donmez et al. Emotion classification from EEG signals in convolutional neural networks
WO2016004117A1 (en) System and signatures for a multi-modal physiological periodic biomarker assessment
KR20150098607A (en) Configuration and spatial placement of frontal electrode sensors to detect physiological signals
KR20150076167A (en) Systems and methods for sensory and cognitive profiling
CN110650685B (en) Method for assessing psychophysiological state of human
US20190313966A1 (en) Pain level determination method, apparatus, and system
Sengupta et al. A multimodal system for assessing alertness levels due to cognitive loading
WO2014150684A1 (en) Artifact as a feature in neuro diagnostics
CN107085464B (en) Emotion identification method based on P300 characters spells task
Kołodziej et al. Electrodermal activity measurements for detection of emotional arousal
CN106667467A (en) Child physiological parameter acquiring and emotion detecting system
Butt et al. Multimodal personality trait recognition using wearable sensors in response to public speaking
CN109998497A (en) System and plane of illumination illumination testing apparatus are sentenced in inspection of falling asleep in luminous environment
Abd Latif et al. Thermal imaging based affective state recognition
Lopez-Martinez et al. Multi-task multiple kernel machines for personalized pain recognition from functional near-infrared spectroscopy brain signals
Kesedžić et al. Classification of cognitive load based on neurophysiological features from functional near-infrared spectroscopy and electrocardiography signals on n-back task
Pandey et al. Detecting moments of distraction during meditation practice based on changes in the EEG signal
AU2021101097A4 (en) A system and method for automatic playlist generation by analysing human emotions through physiological signals

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination