CN117274694A - Emotion score determination model construction and emotion score determination method, device and equipment - Google Patents

Emotion score determination model construction and emotion score determination method, device and equipment

Info

Publication number
CN117274694A
Authority
CN
China
Prior art keywords
emotion
class
score
determining
scores
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311227037.3A
Other languages
Chinese (zh)
Inventor
罗智勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dishi Technology Co ltd
Original Assignee
Beijing Dishi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dishi Technology Co ltd filed Critical Beijing Dishi Technology Co ltd
Priority to CN202311227037.3A
Publication of CN117274694A
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165: Evaluating the state of mind, e.g. depression, anxiety
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235: Details of waveform analysis
    • A61B 5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems, involving training the classification device
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/764: Arrangements for image or video recognition or understanding using pattern recognition or machine learning, using classification, e.g. of video objects
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77: Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/774: Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/82: Arrangements for image or video recognition or understanding using pattern recognition or machine learning, using neural networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/40: Scenes; Scene-specific elements in video content
    • G06V 20/46: Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/174: Facial expression recognition
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20: Movements or behaviour, e.g. gesture recognition
    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L: SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 25/00: Speech or voice analysis techniques not restricted to a single one of groups G10L 15/00 - G10L 21/00
    • G10L 25/48: Speech or voice analysis techniques not restricted to a single one of groups G10L 15/00 - G10L 21/00, specially adapted for particular use
    • G10L 25/51: Speech or voice analysis techniques specially adapted for comparison or discrimination
    • G10L 25/63: Speech or voice analysis techniques specially adapted for estimating an emotional state
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00: Energy efficient computing, e.g. low power processors, power management or thermal management


Abstract

The invention relates to the technical field of computer vision and discloses a method, a device and equipment for constructing an emotion score determination model. The model characterizes the association between the scores of first class labels and the scores of second class labels, where a first classification mode comprises a plurality of first class labels and a second classification mode comprises a plurality of second class labels. Once the first class label score of an emotion image to be analyzed has been determined, the emotion score determination model can determine the second class label and score corresponding to that image, overcoming the defect in the related art that scoring results are inaccurate when emotion scores under the second classification mode are determined based on a scale.

Description

Emotion score determination model construction and emotion score determination method, device and equipment
Technical Field
The invention relates to the technical field of computer vision, and in particular to a method, a device and equipment for constructing an emotion score determination model and determining an emotion score.
Background
The seven-emotion scale of traditional Chinese medicine (TCM) is a tool for evaluating an individual's emotional state. Based on TCM emotion theory, it divides emotion into seven basic categories: joy, anger, worry, thought, sadness, fear and fright. These scales can help doctors or researchers understand an individual's emotional state and its intensity. For individuals, participating in the assessment can increase awareness of their own emotions and help them adjust their emotional state.
Currently, when emotion scoring is performed with the TCM seven-emotion scale, the scale is usually filled in manually and then analyzed to determine the emotion types and scores of the participants. However, determining a tester's emotion score from a scale suffers from the following drawbacks: the scale depends on the participants' subjective answers, and answer bias, memory distortion or social-expectation effects can make the evaluation result inaccurate; in face-to-face or telephone interviews, participants may be affected by social-response bias, i.e., they may tend to give answers they consider correct or socially desirable rather than ones reflecting their real situation; and most scale-based methods rely on participants' recall and self-description and cannot effectively observe or record non-verbal or non-subjective behaviors, attitudes or emotions.
Disclosure of Invention
In view of the above, the invention provides a method, a device and equipment for constructing an emotion score determination model and for determining emotion scores, which solve the problem that evaluating emotion scores with the TCM seven-emotion scale is not accurate enough.
In a first aspect, the present invention provides a method for constructing an emotion score determination model, the method comprising: acquiring a plurality of emotion sample images; classifying the plurality of emotion sample images according to a first classification mode to obtain a plurality of sets, wherein each emotion sample image in the same set corresponds to the same first class label; determining a first class score for each emotion sample image in each set; classifying each emotion sample image in each set according to a second classification mode to obtain a plurality of subsets, wherein one set corresponds to the plurality of subsets, and each emotion sample image in the same subset corresponds to the same second class label; determining second class scores corresponding to the emotion sample images in the subset respectively; and constructing an emotion score determining model according to the second class scores and the first class scores respectively corresponding to the emotion sample images in the subsets, wherein the emotion score determining model is used for representing the association relation between each first class label score and each second class label score.
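The grouping described in the first aspect (sets keyed by first class label, subsets keyed by second class label within each set) can be sketched as follows. This is a minimal illustration rather than the patented implementation; the sample record layout and key names (`first_label`, `first_score`, `second_label`, `second_score`) are assumptions made for the sketch.

```python
from collections import defaultdict

def build_sets_and_subsets(samples):
    """Group annotated samples into sets (by first class label) and
    subsets (by second class label within each set)."""
    # Sets: one bucket per first class label.
    sets_ = defaultdict(list)
    for s in samples:
        sets_[s["first_label"]].append(s)

    # Subsets: within each set, one bucket per second class label.
    subsets = {}
    for first_label, members in sets_.items():
        by_second = defaultdict(list)
        for s in members:
            by_second[s["second_label"]].append(s)
        subsets[first_label] = dict(by_second)
    return dict(sets_), subsets

# Hypothetical annotated samples; labels and scores are invented.
samples = [
    {"first_label": "positive", "first_score": 0.9,
     "second_label": "joy", "second_score": 8},
    {"first_label": "positive", "first_score": 0.7,
     "second_label": "joy", "second_score": 6},
    {"first_label": "negative", "first_score": 0.8,
     "second_label": "anger", "second_score": 7},
]
sets_, subsets = build_sets_and_subsets(samples)
```

Each resulting subset then yields the paired score sequences needed later for the correlation analysis between the two classification modes.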
According to the emotion score determination model construction method provided by the invention, a plurality of emotion sample images are classified according to a first classification mode to obtain a plurality of sets, each emotion sample image in the same set corresponding to the same first class label, and a first class score is determined for each emotion sample image in each set. Within each set, the emotion sample images are further classified according to a second classification mode to obtain a plurality of subsets, each emotion sample image in the same subset corresponding to the same second class label, and a second class score is determined for each emotion sample image in each subset. An emotion score determination model is then constructed from the first class scores and second class scores of the emotion sample images in each subset; the model characterizes the association between the score of each first class label and the score of each second class label. Subsequently, once the first class label score of an emotion image to be analyzed has been determined, the model can determine the second class label and score corresponding to that image, overcoming the defect in the related art that scoring results are inaccurate when emotion scores under the second classification mode are determined based on a scale.
In an alternative embodiment, the step of constructing the emotion score determination model according to the second class scores and first class scores respectively corresponding to the emotion sample images in each subset includes: determining a first class score sequence and a second class score sequence for each subset based on the second class scores and first class label scores of the emotion sample images in that subset; calculating the Pearson coefficient between the first class score sequence and the second class score sequence of each subset; and constructing the emotion score determination model based on the Pearson coefficients between the first class score sequences and second class score sequences of the subsets.
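The Pearson coefficient between a subset's first class score sequence and second class score sequence can be computed directly from its definition; the score values below are invented for illustration.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired score sequences for one subset.
first_scores = [0.9, 0.7, 0.5, 0.3]
second_scores = [8, 6, 5, 2]
r = pearson(first_scores, second_scores)
```

A coefficient near +1 indicates that the two scoring schemes rank the images in the subset almost identically, which is what makes the association usable as a model.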
According to the method provided by this alternative embodiment, the emotion score determination model is built from the first class score sequences and second class score sequences of the subsets, so that the constructed model captures the association between the two classification modes more reliably.
In an alternative embodiment, the step of classifying the plurality of emotion sample images according to a first classification mode to obtain a plurality of sets includes: inputting the plurality of emotion sample images into a preset emotion analysis model, so that the preset emotion analysis model outputs a first class label for each emotion sample image; and determining the plurality of sets based on the first class label of each emotion sample image.
According to the method provided by this alternative embodiment, classifying each emotion sample image under the first classification mode with a preset emotion analysis model makes the classification result more accurate and the classification more efficient.
In an alternative embodiment, the step of classifying, within each set, each emotion sample image according to a second classification mode to obtain a plurality of subsets and the second class scores corresponding to the images in the subsets includes: identifying the labeling information of each emotion sample image in the set; determining the second class label corresponding to each emotion sample image in the set based on that labeling information; and determining the plurality of subsets in each set based on the second class labels of the emotion sample images in the set.
In a second aspect, an embodiment of the present invention provides an emotion score determination method, the method including: acquiring a plurality of emotion images to be analyzed of a target tester; processing the plurality of emotion images to be analyzed according to a first classification mode to obtain a first class label and score for each image to be analyzed; inputting the first class label and score of each image to be analyzed into an emotion score determination model, so that the emotion score determination model outputs the scores of the second class labels respectively corresponding to the different images to be analyzed, the emotion score determination model being constructed by the emotion score determination model construction method of the first aspect or any implementation corresponding to the first aspect; and determining the emotion score of the target tester based on the scores of the second class labels respectively corresponding to the different images to be analyzed.
According to the emotion score determination method provided by the invention, the second class labels and scores corresponding to the emotion images to be analyzed are determined using the emotion score determination model, overcoming the defect in the related art that scoring results are inaccurate when emotion scores under the second classification mode are determined based on a scale.
In an alternative embodiment, the step of acquiring a plurality of emotion images to be analyzed of the target tester includes: acquiring a facial video of the target tester; and splitting the facial video frame by frame to obtain the plurality of emotion images to be analyzed.
In an alternative embodiment, the step of determining the emotion score of the target tester based on the scores of the second class labels corresponding to the different images to be analyzed, respectively, includes: determining the distribution probability of each second-class label based on scores of the second-class labels respectively corresponding to different images to be analyzed; an emotion score for the target tester is determined based on the probability of distribution of each second category label.
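The two steps above can be sketched as follows: the distribution probability of a second class label is its share of the analyzed frames, and the overall score weights each label's mean score by that probability. The probability-weighted mean is an assumed aggregation rule for illustration; the text does not fix a formula.

```python
from collections import Counter

def emotion_score(per_frame):
    """Aggregate per-frame (second_label, score) pairs into label
    distribution probabilities and an overall emotion score."""
    counts = Counter(lbl for lbl, _ in per_frame)
    total = len(per_frame)
    probs = {lbl: c / total for lbl, c in counts.items()}

    # Mean score per label.
    sums = Counter()
    for lbl, score in per_frame:
        sums[lbl] += score
    means = {lbl: sums[lbl] / counts[lbl] for lbl in counts}

    # Assumed rule: overall score = sum over labels of P(label) * mean score.
    overall = sum(probs[lbl] * means[lbl] for lbl in probs)
    return probs, overall

# Hypothetical per-frame results for one tester.
frames = [("joy", 8), ("joy", 6), ("worry", 4), ("joy", 7)]
probs, overall = emotion_score(frames)
```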
In a third aspect, the present invention provides an emotion score determination model construction device, including: the first acquisition module is used for acquiring a plurality of emotion sample images; the first classification module is used for classifying the plurality of emotion sample images according to a first classification mode to obtain a plurality of sets, wherein each emotion sample image in the same set corresponds to the same first class label; a first determining module for determining a first class score for each emotion sample image in each set; the second classification module is used for classifying each emotion sample image in each set according to a second classification mode to obtain a plurality of subsets, wherein one set corresponds to the plurality of subsets, and each emotion sample image in the same subset corresponds to the same second class label; the second determining module is used for determining second class scores corresponding to the emotion sample images in the subset respectively; the construction module is used for constructing an emotion score determination model according to the second class scores and the first class scores which correspond to the emotion sample images in the subsets, and the emotion score determination model is used for representing the association relation between the first class label scores and each second class label score.
In an alternative embodiment, the construction module comprises: a first determining sub-module for determining a first class score sequence and a second class score sequence for each subset based on the second class scores and first class label scores of the emotion sample images in that subset; a computing sub-module for computing the Pearson coefficient between the first class score sequence and the second class score sequence of each subset; and a construction sub-module for constructing the emotion score determination model based on the Pearson coefficients between the first class score sequences and second class score sequences of the subsets.
In an alternative embodiment, the first classification module includes: the processing sub-module is used for inputting a plurality of emotion sample images into the preset emotion analysis model, so that the preset emotion analysis model outputs a first class label of each emotion sample image; a second determination sub-module for determining a plurality of sets based on the first category labels of each emotion sample image.
In an alternative embodiment, the second classification module includes: the identification sub-module is used for identifying the labeling information of each emotion sample image in the collection; a third determining sub-module, configured to determine a second class label corresponding to each emotion sample image in the set based on the labeling information of each emotion sample image in the set; a fourth determination sub-module for determining a plurality of subsets in each set based on the second class labels of each emotion sample image in each set.
In a fourth aspect, the present invention provides an emotion score determination device, comprising: the second acquisition module is used for acquiring a plurality of emotion images to be analyzed of the target tester; the processing module is used for processing the plurality of emotion images to be analyzed according to a first classification mode to obtain a first class label and a score of each image to be analyzed; the third determining module is used for inputting the first class labels and the scores of each image to be analyzed into an emotion score determining model, so that the emotion score determining model outputs scores of the second class labels corresponding to the different images to be analyzed respectively, and the emotion score determining model is constructed by the emotion score determining model construction method of the first aspect or any implementation mode corresponding to the first aspect; and the fourth determining module is used for determining the emotion scores of the target testers based on the scores of the second class labels corresponding to the images to be analyzed.
In an alternative embodiment, the second acquisition module includes: an acquisition sub-module for acquiring a facial video of the target tester; and a splitting sub-module for splitting the facial video frame by frame to obtain the plurality of emotion images to be analyzed.
In a fifth aspect, the present invention provides a computer device comprising a processor and a memory storing computer instructions, the processor executing the computer instructions to perform the emotion score determination model construction method according to the first aspect or any embodiment thereof, or the emotion score determination method according to the second aspect or any embodiment thereof.
In a sixth aspect, the present invention provides a computer-readable storage medium having stored thereon computer instructions for causing a computer to execute the emotion score determination model construction method of the first aspect or any of the embodiments corresponding thereto, or the emotion score determination method of the second aspect or any of the embodiments corresponding thereto.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described, and it is obvious that the drawings in the description below are some embodiments of the present invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of an emotion score determination model construction method according to an embodiment of the present invention;
FIG. 2 is a flow chart of another emotion score determination model construction method according to an embodiment of the present invention;
FIG. 3 is a flow chart of an emotion score determination method according to an embodiment of the present invention;
FIG. 4 is a flow chart of another emotion score determination method in accordance with an embodiment of the present invention;
FIG. 5 is a block diagram of a structure of an emotion score determination model construction device according to an embodiment of the present invention;
fig. 6 is a block diagram of a structure of an emotion score determination device according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a hardware structure of a computer device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In the related art, when emotion scoring is performed with the TCM seven-emotion scale, the scale is usually filled in manually and then analyzed to determine the emotion types and scores of the participants. However, determining testers' emotion scores from scales is heavily influenced by the testers' subjectivity, and the scoring result is not accurate enough.
In view of this, an embodiment of the invention provides a method for constructing an emotion score determination model, which can be applied to a processor. According to the method, a plurality of emotion sample images are classified according to a first classification mode to obtain a plurality of sets, each emotion sample image in the same set corresponding to the same first class label, and a first class score is determined for each emotion sample image in each set. Within each set, the emotion sample images are further classified according to a second classification mode to obtain a plurality of subsets, each emotion sample image in the same subset corresponding to the same second class label, and a second class score is determined for each emotion sample image in each subset. An emotion score determination model is then constructed from the first class scores and second class scores of the emotion sample images in each subset; the model characterizes the association between the score of each first class label and the score of each second class label. Subsequently, once the first class label score of an emotion image to be analyzed has been determined, the model can determine the second class label and score corresponding to that image, overcoming the defect in the related art that scoring results are inaccurate when emotion scores under the second classification mode are determined based on a scale.
According to an embodiment of the present invention, there is provided an emotion score determination model construction method embodiment, it being noted that the steps shown in the flowchart of the drawings may be performed in a computer system such as a set of computer-executable instructions, and that, although a logical order is shown in the flowchart, in some cases, the steps shown or described may be performed in an order different from that herein.
In this embodiment, a method for constructing an emotion score determination model is provided, which may be used in the above processor, and fig. 1 is a flowchart of a method for constructing an emotion score determination model according to an embodiment of the present invention, as shown in fig. 1, where the flowchart includes the following steps:
step S101, a plurality of emotion sample images are acquired.
For example, the emotion sample images may depict different emotional states and expressions, and may include scenes or emotional episodes in which people display different emotions and levels of emotional arousal, so as to cover a wide variety of participants.
Step S102, classifying the plurality of emotion sample images according to a first classification mode to obtain a plurality of sets, wherein each emotion sample image in the same set corresponds to the same first category label.
The first classification mode may be any emotion classification mode. In this embodiment of the application, it may include, but is not limited to, emotion classification using a mental-health emotion analysis index. The first classification mode includes a plurality of categories, each corresponding to a first class label. The plurality of emotion sample images are classified based on the first classification mode to obtain a plurality of sets, the emotion sample images in each set corresponding to the same first class label; each first class label and its corresponding information in the first classification mode are shown in Table 1 below.
TABLE 1
Step S103, determining a first class score of each emotion sample image in each set.
Illustratively, the first class score characterizes the intensity of the corresponding emotion category. In this embodiment of the application, emotion changes are inferred by detecting head and neck movements and fluctuations in the emotion sample images, using non-contact real-time visual data acquisition, artificial-intelligence (AI) deep learning, and techniques from psychology and physiology, thereby determining the emotion score of each emotion sample image.
Step S104, classifying each emotion sample image in each set according to a second classification mode to obtain a plurality of subsets, wherein one set corresponds to the plurality of subsets, and each emotion sample image in the same subset corresponds to the same second class label.
Illustratively, the second classification mode may be an emotion classification mode different from the first classification mode. In this embodiment of the application, it may include, but is not limited to, emotion classification methods based on the TCM seven-emotion assessment. The general assessment items of the TCM seven-emotion scale are as follows: a. joy: emotion manifested as happiness, delight and optimism; b. anger: including anger, annoyance, irritability, etc.; c. worry: including depression, sorrow, melancholy, etc.; d. thought: including overthinking, anxiety, nervous tension, etc.; e. sadness: including sadness, a sense of loss, etc.; f. fear: including fear, panic, etc.; g. fright: including surprise, shock, restlessness, etc.
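For reference, the seven assessment items above can be kept as a simple label map. The English glosses are approximate renderings of the traditional Chinese medicine terms, and the data structure is an assumption for illustration.

```python
# Approximate English glosses for the seven TCM emotion categories.
SEVEN_EMOTIONS = {
    "joy": ["happiness", "delight", "optimism"],
    "anger": ["anger", "annoyance", "irritability"],
    "worry": ["depression", "sorrow", "melancholy"],
    "thought": ["overthinking", "anxiety", "nervous tension"],
    "sadness": ["sadness", "sense of loss"],
    "fear": ["fear", "panic"],
    "fright": ["surprise", "shock", "restlessness"],
}
```

Each key would serve as a second class label in the subsequent steps.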
Step S105, determining second class scores corresponding to the emotion sample images in the subset respectively.
For example, in the embodiment of the present application, the second class score corresponding to each emotion sample image in the subset may be determined according to a preset scoring rule.
Step S106, constructing an emotion score determination model according to the second class scores and the first class scores respectively corresponding to the emotion sample images in the subsets, wherein the emotion score determination model is used for representing the association relation between each first class label score and each second class label score.
Illustratively, based on the second class score and the first class score respectively corresponding to the emotion sample images in each subset, a correlation between the second class score and the first class score in each subset may be determined, and an emotion score determination model may be constructed based on the correlation.
According to the emotion score determination model construction method provided by this embodiment, a plurality of emotion sample images are classified in a first classification mode to obtain a plurality of sets, where each emotion sample image in the same set corresponds to the same first class label, and the first class score of each emotion sample image in each set is determined. Within each set, the emotion sample images are classified in a second classification mode to obtain a plurality of subsets, where each emotion sample image in the same subset corresponds to the same second class label, and the second class score of each emotion sample image in each subset is determined. An emotion score determination model is then constructed based on the first class scores and second class scores corresponding to the emotion sample images in the subsets; the model is used for representing the association relation between each first class label score and each second class label score. Subsequently, when the first class label score of an emotion image to be analyzed is determined, the second class label and score corresponding to that image can be determined using the emotion score determination model, thereby overcoming the defect in the related art that scoring results are inaccurate when emotion scores in the second classification mode are determined based on a scale.
In this embodiment, a method for constructing an emotion score determination model is provided, which may be used in the processor described above, and fig. 2 is a flowchart of a method for constructing an emotion score determination model according to an embodiment of the present invention, as shown in fig. 2, where the flowchart includes the following steps:
step S201, a plurality of emotion sample images are acquired. Please refer to step S101 in the embodiment shown in fig. 1 in detail, which is not described herein.
Step S202, classifying the plurality of emotion sample images according to a first classification mode to obtain a plurality of sets, wherein each emotion sample image in the same set corresponds to the same first category label. Please refer to step S102 in the embodiment shown in fig. 1 in detail, which is not described herein.
Specifically, the step S202 includes:
step S2021, inputting the plurality of emotion sample images into the preset emotion analysis model, so that the preset emotion analysis model outputs the first class label of each emotion sample image. Illustratively, in the embodiment of the present application, the preset emotion analysis model may be a model constructed by using a computer vision technology, artificial intelligence, a deep learning model convolutional neural network, a face recognition algorithm, or the like, and the model may implement classification of the emotion sample image in the first classification manner. Artificial intelligence (Artificial Intelligence, AI) refers to intelligence implemented by a computer, which is a field covering multiple disciplines, including machine learning, pattern recognition, natural language processing, intelligent control, computer vision, and the like. Artificial intelligence attempts to simulate the intelligent and mental processes of humans in theory and practice, enabling machines to understand, infer, learn and make decisions like humans. Artificial intelligence systems can process large amounts of data and extract patterns and rules therefrom through learning and use this information to autonomously make decisions or predict future conditions. The application of artificial intelligence has been very widespread today, including speech recognition, image recognition, intelligent customer service, automation control, etc. Deep learning is a machine learning technique that employs structures and algorithms similar to human brain neural networks, with the aim of learning a characteristic representation of data and classifying or predicting. Deep learning models are typically composed of multiple hierarchies, each hierarchy learning a higher level abstract feature representation of data through computation. 
The deep learning model is usually trained with a back propagation algorithm; its performance is gradually improved by continuously fine-tuning parameters through repeated iterative training. Emotion recognition and assessment are performed using deep learning models such as convolutional neural networks and recurrent neural networks, which carry out emotion classification and quantification on visual data such as facial expressions and sound. These models can learn and infer emotional states through training on large-scale data sets. Computer vision is a technical field that studies how to let a computer acquire, understand and interpret visual information by processing digital images or video. It involves several disciplines, including image processing, pattern recognition, machine learning and artificial intelligence, with the aim of giving a computer abilities resembling the human visual system. Using computer vision techniques, facial expressions, body language, etc. in video can be analyzed and identified; this involves facial expression recognition, motion recognition, and the like.
In the embodiment of the application, emotion analysis is performed through a non-contact real-time visual data acquisition technology and artificial intelligence (AI) deep learning, combined with techniques from psychology, physiology and the like, and emotion changes are estimated by detecting head and neck movement and fluctuation. Three-dimensional (3D) head and neck movements and fluctuations associated with human emotion are detected, accumulating frames as frame differences across several views. The vibration parameters (frequency and amplitude) of each element (pixel) of the visual image are calculated to measure the micro-motion and the temporal and spatial fluctuation of the measured subject, thereby providing a means for quantitatively measuring the subject's emotion.
The video image first needs to be acquired and decomposed into a series of consecutive frames. Then, for each frame, information on head and neck motion and fluctuation can be obtained by calculating the difference between adjacent frames. The temporal and spatial fluctuations of each pixel are obtained by comparing pixel values between adjacent frames and calculating the gray-value differences of the pixels or the Euclidean distances between them.
Second, these fluctuation parameters can be further analyzed to measure the minute movements and emotional changes of the measured subject, by calculating vibration parameters such as frequency and amplitude. The frequency can represent the rate of head and neck movement and fluctuation, while the amplitude can represent its intensity. By analyzing changes in these vibration parameters, the emotional state of the subject can be inferred.
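The frame-difference and vibration-parameter computation described above can be sketched as follows (an illustrative Python sketch assuming grayscale frames as NumPy arrays; the amplitude and dominant-frequency estimators here are assumptions for illustration, not the patented algorithm):

```python
import numpy as np

def vibration_parameters(frames, fps=30.0):
    """Estimate per-pixel vibration amplitude and dominant frequency from
    inter-frame gray-value differences (illustrative estimators only)."""
    stack = np.stack(frames).astype(np.float64)      # shape (T, H, W)
    diffs = np.abs(np.diff(stack, axis=0))           # |frame_{t+1} - frame_t|
    amplitude = diffs.mean(axis=0)                   # mean fluctuation per pixel
    # Dominant temporal frequency per pixel via an FFT of each pixel's time series
    spectrum = np.abs(np.fft.rfft(stack - stack.mean(axis=0), axis=0))
    freqs = np.fft.rfftfreq(stack.shape[0], d=1.0 / fps)
    dominant = freqs[np.argmax(spectrum[1:], axis=0) + 1]  # skip the DC bin
    return amplitude, dominant
```

A pixel that oscillates strongly between frames yields a high amplitude, and the frequency map indicates how fast the head and neck region fluctuates.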
By analyzing the pixel differences between adjacent frames, the emotional state may be determined: when the target moves rapidly or changes drastically, the pixel difference is large, reflecting a heightened emotional state; conversely, when the target is in a relatively steady state, the pixel difference is smaller, reflecting a lower emotional state.
Different emotional states lead to different head and neck movement and fluctuation patterns, which in turn exhibit different variations in the vibration parameters. By collecting and analyzing vibration parameter data, a correlation model between emotional states and vibration parameter changes can be established. In this way the psychological emotion condition of the target is calculated, and the relevant psychological/physiological indexes can be obtained from a 30-60 second test of the subject, where each emotion has a corresponding algorithm.
For example, to detect an anger emotional state: first, a set of sample data including anger states and non-anger states is collected, and the corresponding vibration parameters are extracted. Then, an emotion classification model is trained on the sample data to map vibration parameters to the anger state or the non-anger state. Finally, newly collected vibration parameter data are classified using the trained model, the probability distribution of the anger state or the intensity of the anger emotion is calculated, and the anger emotional state of the target can thus be determined from the vibration parameters.
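The anger-state example above can be sketched with a simple classifier over vibration-parameter features (an illustrative Python sketch; a nearest-centroid model with distance-based probabilities stands in for whatever classifier the embodiment actually uses, and all names are assumptions):

```python
import numpy as np

def train_centroids(features, labels):
    """Nearest-centroid model: one mean feature vector per emotion class."""
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify(model, x):
    """Return (label, probability-like score per class) by centroid distance."""
    dists = {c: np.linalg.norm(x - mu) for c, mu in model.items()}
    weights = {c: np.exp(-d) for c, d in dists.items()}  # closer => higher weight
    total = sum(weights.values())
    probs = {c: w / total for c, w in weights.items()}
    return max(probs, key=probs.get), probs
```

Here each feature vector could hold, e.g., a dominant frequency and an amplitude; a new sample is assigned the class whose training centroid it lies nearest to.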
Step S2022, a plurality of sets are determined based on the first class label of each emotion sample image. In the embodiment of the application, the images belonging to the same first category are taken as one set based on the first category label of each emotion sample image, so that a plurality of sets are obtained.
Step S203, determining a first class score of each emotion sample image in each set. Please refer to step S103 in the embodiment shown in fig. 1 in detail, which is not described herein.
Step S204, classifying each emotion sample image in each set according to a second classification mode to obtain a plurality of subsets, wherein one set corresponds to the plurality of subsets, and each emotion sample image in the same subset corresponds to the same second class label. Please refer to step S104 in the embodiment shown in fig. 1 in detail, which is not described herein.
Specifically, the step S204 includes:
Step S2041, identifying labeling information of each emotion sample image in the set. Illustratively, in the embodiment of the application, the main emotion expressed in each video image is determined by labeling each emotion sample image. Each image may be assigned a corresponding emotion score using the emotion labels provided by the traditional Chinese medicine seven-emotion rating scale.
Step S2042, determining a second class label corresponding to each emotion sample image in the collection based on the labeling information of each emotion sample image in the collection. For example, in an embodiment of the present application, the processor may determine the second class label and the corresponding second class score of each emotion sample image in each set based on the labeling information by identifying the labeling information of each emotion sample image.
Step S2043 determines a plurality of subsets in each set based on the second class labels of each emotion sample image in the set. Illustratively, the emotional sample images belonging to the same second category in each collection are taken as a subset based on the second category labels of the emotional sample images in each collection.
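The subset construction of step S2043 can be sketched as follows (a minimal Python sketch; the function name and data layout are assumptions for illustration):

```python
from collections import defaultdict

def group_by_label(images, labels):
    """Group the emotion sample images of one set into subsets,
    one subset per second class label."""
    subsets = defaultdict(list)
    for img, label in zip(images, labels):
        subsets[label].append(img)
    return dict(subsets)
```

Applied within each first-class set, this yields the "one set corresponds to a plurality of subsets" structure used in the later steps.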
Step S205, determining second class scores corresponding to the emotion sample images in the subset respectively. Please refer to step S105 in the embodiment shown in fig. 1 in detail, which is not described herein.
And S206, constructing an emotion score determination model according to the second class scores and the first class scores respectively corresponding to the emotion sample images in the subsets, wherein the emotion score determination model is used for representing the association relationship between each first class label score and each second class label score. Please refer to step S106 in the embodiment shown in fig. 1 in detail, which is not described herein.
Specifically, the step S206 includes:
step S2061, determining a first class score sequence and a second class score sequence for the corresponding subset based on the second class scores and the first class label scores for the emotion sample images in the respective subsets.
Illustratively, a second class score sequence is generated based on the second class scores corresponding to the emotion sample images in each subset, and a first class score sequence is generated based on the first class scores of the emotion sample images in the corresponding subset.
Step S2062, calculating pearson coefficients between the first class score sequences and the second class score sequences for each subset.
Illustratively, the pearson coefficients between the first class score sequence and the second class score sequence corresponding to each subset are calculated, and the calculated pearson coefficients can be used to characterize the correlation between the first class score and the second class score; in the embodiment of the present application, the correlation between the mental health emotion analysis index and the seven emotion evaluation index of the traditional Chinese medicine is shown in the following table 2.
TABLE 2
Step S2063, constructing an emotion score determination model based on the pearson coefficients between the first class score sequences and the second class score sequences corresponding to the subsets.
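The Pearson-coefficient computation of steps S2061-S2063 can be sketched as follows (a minimal Python sketch; the dictionary layout for subsets is an assumption, not the patented implementation):

```python
import numpy as np

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length score sequences."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

def build_association(subsets):
    """subsets: {second_class_label: (first_class_scores, second_class_scores)}.
    Returns one Pearson coefficient per subset, characterizing the correlation
    between the first class scores and the second class scores."""
    return {label: pearson(f, s) for label, (f, s) in subsets.items()}
```

The resulting per-subset coefficients are the association relations that the emotion score determination model represents (compare Table 2).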
In this embodiment, a method for determining emotion score is provided, which may be used in the above processor, and fig. 3 is a flowchart of a method for determining emotion score according to an embodiment of the present invention, as shown in fig. 3, where the flowchart includes the following steps:
step S301, a plurality of emotion images to be analyzed of the target tester are acquired.
Illustratively, the target tester may be any tester to be subjected to emotion assessment, and the plurality of images to be analyzed may be facial expression images of the target tester.
Step S302, processing the plurality of emotion images to be analyzed according to a first classification mode to obtain a first class label and a score of each image to be analyzed.
The first class label and the score of each image to be analyzed are determined by a first classification mode, and in the embodiment of the application, the first class label and the score of each image to be analyzed can be determined by a preset emotion analysis model.
Step S303, inputting the first category labels and the scores of each image to be analyzed into an emotion score determination model, so that the emotion score determination model outputs scores of different images to be analyzed corresponding to the second category labels respectively, and the emotion score determination model is constructed by the emotion score determination model construction method in the embodiment.
Illustratively, the first class label of each image to be analyzed is input into an emotion score determination model, which outputs the score of the image to be analyzed under the second class labels.
And S304, determining the emotion scores of the target testers based on the scores of the second category labels corresponding to the images to be analyzed.
According to the emotion score determination method provided in this embodiment, the second class labels and scores corresponding to the emotion images to be analyzed are determined using the emotion score determination model, overcoming the defect in the related art that scoring results are not accurate enough when emotion scores in the second classification mode are determined based on a scale.
In this embodiment, a method for determining emotion score is provided, which may be used in the above processor, and fig. 4 is a flowchart of a method for determining emotion score according to an embodiment of the present invention, as shown in fig. 4, where the flowchart includes the following steps:
Step S401, a plurality of emotion images to be analyzed of a target tester are obtained. Please refer to step S301 in the embodiment shown in fig. 3 in detail, which is not described herein.
Specifically, the step S401 includes:
step S4011, obtaining a face visual image video of a target tester.
Illustratively, the target tester may be any tester needing emotion detection, and in the embodiment of the application, the face visual image video may be a video of the target tester's facial expression changes within 60 s.
Step S4012, splitting the target face visual image video frame by frame to obtain a plurality of emotion images to be analyzed.
Illustratively, the face visual image is split frame by frame, so that a plurality of split video images can be obtained, and the obtained multi-frame video images are used as a plurality of emotion images to be analyzed.
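The frame-by-frame split of step S4012 can be sketched as follows (an illustrative Python sketch; the capture object mimics the `read()` interface of a video decoder such as OpenCV's `cv2.VideoCapture`, and `split_frames` is an assumed helper name):

```python
def split_frames(capture):
    """Split a face video source frame by frame into a list of images.
    `capture` exposes a cv2.VideoCapture-style read() -> (ok, frame) method,
    e.g. capture = cv2.VideoCapture("face_60s.mp4")  # hypothetical file name
    """
    frames = []
    while True:
        ok, frame = capture.read()
        if not ok:          # decoder signals end of video
            break
        frames.append(frame)
    return frames
```

Each returned frame then serves as one emotion image to be analyzed in step S402.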
And step S402, processing the plurality of emotion images to be analyzed according to a first classification mode to obtain a first class label and a score of each image to be analyzed. Please refer to step S302 in the embodiment shown in fig. 1 in detail, which is not described herein.
Step S403, inputting the first category labels and the scores of each image to be analyzed into an emotion score determination model, so that the emotion score determination model outputs scores of the second category labels corresponding to the different images to be analyzed, and the emotion score determination model is constructed by the emotion score determination model construction method in the embodiment. Please refer to step S303 in the embodiment shown in fig. 1 in detail, which is not described herein.
And step S404, determining the emotion scores of the target testers based on the scores of the second category labels corresponding to the images to be analyzed. Please refer to step S304 in the embodiment shown in fig. 1 in detail, which is not described herein.
Specifically, the step S404 includes:
step S4041, determining the distribution probability of each second-class label based on the scores of the second-class labels corresponding to the images to be analyzed.
The probability of the distribution of the second-class labels in the plurality of images to be analyzed is determined based on the score of each image to be analyzed corresponding to the second-class labels.
Step S4042, determining the emotion score of the target tester based on the distribution probability of each second class label.
In an exemplary embodiment of the present application, the second class label with the largest distribution probability may be used as the second class label of the target tester based on the distribution probability of the second class label in the plurality of images to be analyzed, and the emotion score of the target tester may be determined based on the scoring data corresponding to the second class label with the largest distribution probability.
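Steps S4041-S4042 can be sketched as follows (a minimal Python sketch; averaging the scores of the most probable label is one assumed reading of "the corresponding scoring data", not the only possible one):

```python
from collections import Counter

def emotion_score(per_image_labels, per_image_scores):
    """Compute the distribution probability of each second class label over the
    analysed frames, pick the most probable label, and average its scores."""
    counts = Counter(per_image_labels)
    total = len(per_image_labels)
    probs = {label: c / total for label, c in counts.items()}
    top = max(probs, key=probs.get)
    scores = [s for l, s in zip(per_image_labels, per_image_scores) if l == top]
    return top, probs, sum(scores) / len(scores)
```

For a 60 s video split into many frames, the label that dominates the frame-level classifications becomes the tester's second class label, and its averaged score becomes the emotion score.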
The embodiment also provides a device for constructing the emotion score determination model, which is used for implementing the above embodiments and preferred implementation manners; what has already been described is not repeated here. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. While the means described in the following embodiments are preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
The embodiment provides an emotion score determination model construction device, as shown in fig. 5, including:
a first obtaining module 501, configured to obtain a plurality of emotion sample images;
the first classification module 502 is configured to classify the plurality of emotion sample images according to a first classification manner to obtain a plurality of sets, where each emotion sample image in the same set corresponds to the same first class label;
a first determining module 503, configured to determine a first class score of each emotion sample image in each set;
a second classification module 504, configured to classify, in each set, each emotion sample image in the set according to a second classification manner, so as to obtain a plurality of subsets, where one set corresponds to the plurality of subsets, and each emotion sample image in the same subset corresponds to the same second class label;
a second determining module 505, configured to determine a second class score corresponding to each emotion sample image in the subset;
the construction module 506 is configured to construct an emotion score determination model according to the second category scores and the first category scores corresponding to the emotion sample images in the subsets, where the emotion score determination model is used to characterize an association relationship between each first category label score and each second category label score.
In some alternative embodiments, the build module 506 includes:
a first determining sub-module for determining a first class score sequence and a second class score sequence for the corresponding subset based on the second class scores and the first class label scores for the emotion sample images in the subsets;
the computing sub-module is used for computing the pearson coefficients between the first class scoring sequences and the second class scoring sequences corresponding to the subsets;
and the construction sub-module is used for constructing an emotion score determination model based on the Pearson coefficients between the first class score sequences and the second class score sequences corresponding to the subsets.
In some alternative embodiments, the first classification module includes:
the processing sub-module is used for inputting a plurality of emotion sample images into the preset emotion analysis model, so that the preset emotion analysis model outputs a first class label of each emotion sample image;
a second determination sub-module for determining a plurality of sets based on the first category labels of each emotion sample image.
In some alternative embodiments, the second classification module includes:
the identification sub-module is used for identifying the labeling information of each emotion sample image in the collection;
a third determining sub-module, configured to determine a second class label corresponding to each emotion sample image in the set based on the labeling information of each emotion sample image in the set;
A fourth determination sub-module for determining a plurality of subsets in each set based on the second class labels of each emotion sample image in each set.
The present embodiment provides an emotion score determination device, as shown in fig. 6, including:
a second obtaining module 601, configured to obtain a plurality of emotion images to be analyzed of a target tester;
the processing module 602 is configured to process the plurality of emotion images to be analyzed according to a first classification manner to obtain a first class label and a score of each of the images to be analyzed;
the third determining module 603 is configured to input the first class label and the score of each image to be analyzed into an emotion score determining model, so that the emotion score determining model outputs scores of the second class labels corresponding to the different images to be analyzed, and the emotion score determining model is obtained by constructing the emotion score determining model constructing method in the above embodiment;
a fourth determining module 604, configured to determine an emotion score of the target tester based on scores of the second category labels corresponding to the images to be analyzed.
In some alternative embodiments, the second acquisition module 601 includes:
the acquisition sub-module is used for acquiring a face visual image video of the target tester;
and the splitting sub-module is used for splitting the target face visual image video frame by frame to obtain a plurality of emotion images to be analyzed.
In some alternative embodiments, the fourth determination module includes:
a fifth determining submodule, configured to determine a distribution probability of each second-class label based on scores of the second-class labels corresponding to the images to be analyzed;
and a sixth determination submodule for determining the emotion score of the target tester based on the distribution probability of each second category label.
Further functional descriptions of the above respective modules and units are the same as those of the above corresponding embodiments, and are not repeated here.
The emotion score determination model construction means and emotion score determination means in this embodiment are presented in the form of functional units, which may be ASIC (Application Specific Integrated Circuit) circuits, processors and memories executing one or more software or firmware programs, and/or other devices that can provide the above-described functionality.
The embodiment of the invention also provides computer equipment, which is provided with the emotion score determination model construction device shown in the figure 5 or the emotion score determination device shown in the figure 6.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a computer device according to an alternative embodiment of the present invention. As shown in fig. 7, the computer device includes: one or more processors 10, a memory 20, and interfaces for connecting the various components, including high-speed interfaces and low-speed interfaces. The various components are communicatively coupled to each other using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executing within the computer device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output device, such as a display device coupled to the interface. In some alternative embodiments, multiple processors and/or multiple buses may be used, if desired, along with multiple memories. Also, multiple computer devices may be connected, each providing a portion of the necessary operations (e.g., as a server array, a set of blade servers, or a multiprocessor system). One processor 10 is illustrated in fig. 7.
The processor 10 may be a central processor, a network processor, or a combination thereof. The processor 10 may further include a hardware chip, among others. The hardware chip may be an application specific integrated circuit, a programmable logic device, or a combination thereof. The programmable logic device may be a complex programmable logic device, a field programmable gate array, a general-purpose array logic, or any combination thereof.
Wherein the memory 20 stores instructions executable by the at least one processor 10 to cause the at least one processor 10 to perform a method for implementing the embodiments described above.
The memory 20 may include a storage program area that may store an operating system, at least one application program required for functions, and a storage data area; the storage data area may store data created according to the use of the computer device, etc. In addition, the memory 20 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some alternative embodiments, memory 20 may optionally include memory located remotely from processor 10, which may be connected to the computer device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Memory 20 may include volatile memory, such as random access memory; the memory may also include non-volatile memory, such as flash memory, hard disk, or solid state disk; the memory 20 may also comprise a combination of the above types of memories.
The computer device also includes a communication interface 30 for the computer device to communicate with other devices or communication networks.
The embodiments of the present invention also provide a computer-readable storage medium. The method according to the embodiments described above may be implemented in hardware or firmware, or as computer code that may be recorded on a storage medium, or as code originally stored on a remote storage medium or a non-transitory machine-readable storage medium and downloaded through a network to be stored on a local storage medium, so that the method described herein may be executed from software stored on a storage medium using a general-purpose computer, a special-purpose processor, or programmable or special-purpose hardware. The storage medium can be a magnetic disk, an optical disk, a read-only memory, a random access memory, a flash memory, a hard disk, a solid state disk, or the like; further, the storage medium may also comprise a combination of the above types of memories. It will be appreciated that a computer, processor, microprocessor controller or programmable hardware includes a storage element that can store or receive software or computer code that, when accessed and executed by the computer, processor or hardware, implements the methods illustrated by the above embodiments.
Although embodiments of the present invention have been described in connection with the accompanying drawings, various modifications and variations may be made by those skilled in the art without departing from the spirit and scope of the invention, and such modifications and variations fall within the scope of the invention as defined by the appended claims.

Claims (15)

1. An emotion score determination model construction method, the method comprising:
acquiring a plurality of emotion sample images;
classifying the plurality of emotion sample images according to a first classification mode to obtain a plurality of sets, wherein each emotion sample image in the same set corresponds to the same first class label;
determining a first class score for each emotion sample image in each set;
classifying each emotion sample image in each set according to a second classification mode to obtain a plurality of subsets, wherein one set corresponds to the plurality of subsets, and each emotion sample image in the same subset corresponds to the same second class label;
determining second class scores corresponding to the emotion sample images in the subset respectively;
and constructing an emotion score determining model according to the second class scores and the first class scores respectively corresponding to the emotion sample images in the subsets, wherein the emotion score determining model is used for representing the association relation between each first class label score and each second class label score.
2. The method of claim 1, wherein the step of constructing the emotion score determination model from the second class scores and the first class scores respectively corresponding to the emotion sample images in the respective subsets comprises:
determining a first class score sequence and a second class score sequence for the corresponding subset based on the second class scores and the first class scores of the emotion sample images in the subsets;
calculating Pearson coefficients between the first class score sequence and the second class score sequence corresponding to each subset;
and constructing the emotion score determination model based on the Pearson coefficients between the first class score sequences and the second class score sequences corresponding to the subsets.
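The Pearson coefficient in claim 2 measures how strongly a subset's first class scores and second class scores vary together. A minimal sketch, with hypothetical score values (the `pearson` helper is an illustration, not the patent's implementation):

```python
import math

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length score sequences:
    # covariance of the sequences divided by the product of their standard deviations.
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    std_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    std_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (std_x * std_y)

# Hypothetical score sequences for the emotion sample images of one subset.
first_class_scores = [0.9, 0.7, 0.8, 0.4]
second_class_scores = [0.85, 0.65, 0.75, 0.35]

r = pearson(first_class_scores, second_class_scores)
```

A coefficient near 1 indicates the two scoring schemes rank the images consistently for that subset; values near 0 or -1 indicate weak or inverted association.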
3. The method of claim 1, wherein the step of classifying the plurality of emotion sample images in a first classification manner to obtain a plurality of sets comprises:
inputting the plurality of emotion sample images into a preset emotion analysis model, so that the preset emotion analysis model outputs a first class label of each emotion sample image;
determining a plurality of sets based on the first class labels of each emotion sample image.
4. The method of claim 1, wherein, in each set, the step of classifying each emotion sample image in the set according to a second classification mode to obtain a plurality of subsets comprises:
identifying labeling information of each emotion sample image in the set;
determining a second class label corresponding to each emotion sample image in the set based on the labeling information of each emotion sample image in the set;
and determining a plurality of subsets in each set based on the second class labels of each emotion sample image in the set.
5. An emotion score determination method, the method comprising:
acquiring a plurality of emotion images to be analyzed of a target tester;
processing the plurality of emotion images to be analyzed according to a first classification mode to obtain a first class label and a score of each image to be analyzed;
inputting the first class labels and the scores of each image to be analyzed into an emotion score determination model, so that the emotion score determination model outputs scores of the second class labels respectively corresponding to the different images to be analyzed, wherein the emotion score determination model is constructed by the emotion score determination model construction method according to any one of claims 1 to 4;
and determining the emotion scores of the target testers based on the scores of the second class labels corresponding to the different images to be analyzed.
6. The method of claim 5, wherein the step of acquiring a plurality of emotion images to be analyzed of the target tester comprises:
acquiring a face visual image video of the target tester;
and splitting the face visual image video frame by frame to obtain the plurality of emotion images to be analyzed.
7. The method of claim 5, wherein the step of determining the emotion score of the target tester based on the scores of the second class labels respectively corresponding to the different images to be analyzed comprises:
determining the distribution probability of each second class label based on the scores of the second class labels respectively corresponding to the different images to be analyzed;
and determining the emotion score of the target tester based on the distribution probability of each second class label.
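The two steps of claim 7 can be sketched as follows; the per-frame labels and the per-label base scores are hypothetical illustration values (the patent does not fix the mapping from label probabilities to the final score, so a probability-weighted sum is assumed here):

```python
from collections import Counter

# Hypothetical second class labels assigned to each image (video frame) to be analyzed.
frame_labels = ["sad", "sad", "calm", "sad", "happy", "calm"]

# Step 1: distribution probability of each second class label across the images.
counts = Counter(frame_labels)
total = len(frame_labels)
probabilities = {label: count / total for label, count in counts.items()}

# Hypothetical per-label base scores; illustration values only.
label_scores = {"happy": 90.0, "calm": 70.0, "sad": 30.0}

# Step 2: emotion score of the target tester as a
# probability-weighted combination of the label scores.
emotion_score = sum(probabilities[label] * label_scores[label] for label in probabilities)
```

With these example frames, "sad" occurs in half of the images, so its base score dominates the weighted result.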
8. An emotion score determination model construction apparatus, characterized by comprising:
the first acquisition module is used for acquiring a plurality of emotion sample images;
the first classification module is used for classifying the plurality of emotion sample images according to a first classification mode to obtain a plurality of sets, wherein each emotion sample image in the same set corresponds to the same first class label;
a first determining module for determining a first class score for each emotion sample image in each set;
the second classification module is used for classifying each emotion sample image in each set according to a second classification mode to obtain a plurality of subsets, wherein one set corresponds to the plurality of subsets, and each emotion sample image in the same subset corresponds to the same second class label;
the second determining module is used for determining second class scores corresponding to the emotion sample images in the subset respectively;
the construction module is used for constructing an emotion score determination model according to the second class scores and the first class scores which correspond to the emotion sample images in the subsets respectively, and the emotion score determination model is used for representing the association relation between each first class label score and each second class label score.
9. The apparatus of claim 8, wherein the build module comprises:
a first determining sub-module for determining a first class score sequence and a second class score sequence for the corresponding subset based on the second class scores and the first class scores of the emotion sample images in the subsets;
the computing sub-module is used for computing the Pearson coefficients between the first class score sequences and the second class score sequences corresponding to the subsets;
and the construction sub-module is used for constructing the emotion score determination model based on the Pearson coefficients between the first class score sequences and the second class score sequences corresponding to the subsets.
10. The apparatus of claim 8, wherein the first classification module comprises:
the processing sub-module is used for inputting the plurality of emotion sample images into a preset emotion analysis model, so that the preset emotion analysis model outputs a first class label of each emotion sample image;
a second determining sub-module for determining a plurality of sets based on the first class labels of each emotion sample image.
11. The apparatus of claim 8, wherein the second classification module comprises:
the identification sub-module is used for identifying the labeling information of each emotion sample image in the set;
a third determining sub-module, configured to determine a second class label corresponding to each emotion sample image in the set based on labeling information of each emotion sample image in the set;
a fourth determination sub-module for determining a plurality of subsets in each set based on the second class labels of each emotion sample image in the set.
12. An emotion score determination device, characterized in that the device comprises:
the second acquisition module is used for acquiring a plurality of emotion images to be analyzed of the target tester;
the processing module is used for processing the plurality of emotion images to be analyzed according to a first classification mode to obtain a first class label and a score of each image to be analyzed;
a third determining module, configured to input the first class label and the score of each image to be analyzed into an emotion score determination model, so that the emotion score determination model outputs scores of the second class labels respectively corresponding to the different images to be analyzed, wherein the emotion score determination model is constructed by the emotion score determination model construction method according to any one of claims 1 to 4;
and the fourth determining module is used for determining the emotion scores of the target testers based on the scores of the second class labels corresponding to the images to be analyzed.
13. The apparatus of claim 12, wherein the second acquisition module comprises:
the acquisition sub-module is used for acquiring a face visual image video of the target tester;
and the splitting sub-module is used for splitting the face visual image video frame by frame to obtain the plurality of emotion images to be analyzed.
14. A computer device, comprising:
a memory and a processor, the memory and the processor being communicatively connected to each other, the memory having stored therein computer instructions, the processor executing the computer instructions to perform the mood score determination model construction method of any one of claims 1 to 4 or to perform the mood score determination method of any one of claims 5 to 7.
15. A computer-readable storage medium having stored thereon computer instructions for causing a computer to perform the emotion score determination model construction method of any one of claims 1 to 4 or the emotion score determination method of any one of claims 5 to 7.
CN202311227037.3A 2023-09-21 2023-09-21 Emotion score determination model construction and emotion score determination method, device and equipment Pending CN117274694A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311227037.3A CN117274694A (en) 2023-09-21 2023-09-21 Emotion score determination model construction and emotion score determination method, device and equipment


Publications (1)

Publication Number Publication Date
CN117274694A true CN117274694A (en) 2023-12-22

Family

ID=89215391

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311227037.3A Pending CN117274694A (en) 2023-09-21 2023-09-21 Emotion score determination model construction and emotion score determination method, device and equipment

Country Status (1)

Country Link
CN (1) CN117274694A (en)

Similar Documents

Publication Publication Date Title
Singh et al. Deep learning and machine learning based facial emotion detection using CNN
CN111134664B (en) Epileptic discharge identification method and system based on capsule network and storage medium
CN110570941A (en) System and device for assessing psychological state based on text semantic vector model
CN113673244B (en) Medical text processing method, medical text processing device, computer equipment and storage medium
CN117198468B (en) Intervention scheme intelligent management system based on behavior recognition and data analysis
CN115906002B (en) Learning input state evaluation method based on multi-granularity data fusion
CN115607156B (en) Multi-mode-based psychological cognitive screening evaluation method, system and storage medium
CN111080624A (en) Sperm movement state classification method, device, medium and electronic equipment
Meshram et al. Diagnosis of depression level using multimodal approaches using deep learning techniques with multiple selective features
CN110192860B (en) Brain imaging intelligent test analysis method and system for network information cognition
US11955245B2 (en) Method and system for mental index prediction
Villegas-Ch et al. Identification of emotions from facial gestures in a teaching environment with the use of machine learning techniques
CN114203300A (en) Health state evaluation method and system, server and storage medium
CN117257302A (en) Personnel mental health state assessment method and system
CN116864128A (en) Psychological state assessment system and method based on physical activity behavior pattern monitoring
CN117334337A (en) Cancer patient pain intelligent evaluation and early warning system based on image recognition technology
CN116383618A (en) Learning concentration assessment method and device based on multi-mode data
CN117274694A (en) Emotion score determination model construction and emotion score determination method, device and equipment
CN113017634B (en) Emotion evaluation method, emotion evaluation device, electronic device, and computer-readable storage medium
CN111582404B (en) Content classification method, device and readable storage medium
CN114022698A (en) Multi-tag behavior identification method and device based on binary tree structure
Kumar et al. Exploring Multi-Class Stress Detection Using Deep Neural Networks
CN117786600A (en) Cognitive evaluation method, device, electronic equipment and storage medium
Saxena Feature Space Augmentation: Improving Prediction Accuracy of Classical Problems in Cognitive Science and Computer Vison
CN118197630A (en) Psychological and artificial intelligence-based human brain simulation learning system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination