CN117034137A - Emotion type determination method, emotion type determination device, wearable device and computer medium - Google Patents

Info

Publication number: CN117034137A
Application number: CN202310911512.2A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: feature, emotion, child, probabilities, user
Legal status: Pending (assumed; Google has not performed a legal analysis)
Inventors: 苏杉, 杜少杰
Current Assignee / Original Assignee: Binzhou Polytechnic
Application filed by Binzhou Polytechnic
Priority to CN202310911512.2A
Publication of CN117034137A

Classifications

    • G06F18/2415 — Pattern recognition; classification techniques based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G06F18/213 — Pattern recognition; feature extraction, e.g. by transforming the feature space; summarisation; mappings, e.g. subspace methods
    • G06F18/2431 — Pattern recognition; classification techniques relating to the number of classes; multiple classes
    • G06F18/256 — Pattern recognition; fusion techniques of classification results relating to different input data, e.g. multimodal recognition

Abstract

The application discloses an emotion type determination method, an emotion type determination device, a wearable device and a computer medium. The method provides an emotion prediction model based on a naive Bayes algorithm and can predict a user's long-term emotion type from basic personal information such as marital status and educational background. The embodiments of the application therefore predict the long-term emotion type from a user's relevant basic information, so that psychological adjustment can subsequently be carried out according to that emotion type and a good parent-child relationship can be established.

Description

Emotion type determination method, emotion type determination device, wearable device and computer medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and apparatus for determining an emotion type, a wearable device, and a computer medium.
Background
The concept of emotion prediction originates in the field of psychology, where it is used to judge a person's emotional response to an event. In the field of computers, emotion prediction means identifying, analysing and predicting a person's emotion from related information by means of computer algorithms. Current emotion recognition and prediction realized by computer technology identifies a user's current emotional state from short-term emotion data, such as facial expression, voice intonation and physiological characteristics, and predicts the emotional response that may occur at the next moment.
Disclosure of Invention
The embodiment of the application provides a method and a device for determining emotion types, wearable equipment and a computer storage medium.
In a first aspect, an embodiment of the present application provides a method for determining an emotion type, where the method includes:
acquiring sample user data; wherein the sample user data comprises: a user identification, and a gender, marital status, child behavior cause, number of children, educational background, social support strength and emotion type corresponding to the user identification; the emotion types include: a sadness type, an anger type, a self-responsibility type and a breakdown type;
determining prior probabilities corresponding to the various emotion types respectively based on the total number of the user identifications and the number of the various emotion types;
extracting the characteristics of the sample user data to obtain gender characteristics, marital status characteristics, child behavior incentive characteristics, number of children characteristics, education background characteristics and social support strength characteristics;
determining a gender feature probability, a marital status feature probability, a child behavior cause feature probability, a child number feature probability, an educational background feature probability and a social support strength feature probability based on the total number of user identifications and the respective numbers of gender features, marital status features, child behavior cause features, child number features, educational background features and social support strength features;
determining, based on the number of user identifications of each emotion type and the numbers of gender features, marital status features, child behavior cause features, child number features, educational background features and social support strength features under each emotion type, the conditional probabilities of the gender feature, the marital status feature, the child behavior cause feature, the child number feature, the educational background feature and the social support strength feature under each emotion type;
establishing an emotion prediction model based on the prior probabilities of the various emotion types, the gender feature probability, the marital status feature probability, the child behavior cause feature probability, the child number feature probability, the educational background feature probability, the social support strength feature probability, and the conditional probabilities of the gender feature, the marital status feature, the child behavior cause feature, the child number feature, the educational background feature and the social support strength feature under each emotion type;
inputting user data to be tested into the emotion prediction model to obtain the emotion type of the user data to be tested; wherein the user data to be tested includes: the gender, marital status, child behavior cause, number of children, educational background and social support strength of the user to be tested.
In a second aspect, an embodiment of the present application provides an emotion type determining apparatus, including:
the sample user data acquisition module is used for acquiring sample user data; wherein the sample user data comprises: a user identification, and a gender, marital status, child behavior cause, number of children, educational background, social support strength and emotion type corresponding to the user identification; the emotion types include: a sadness type, an anger type, a self-responsibility type and a breakdown type;
the prior probability determining module is used for determining prior probabilities corresponding to the various emotion types respectively based on the total number of the user identifications and the number of the various emotion types;
the feature extraction module is used for carrying out feature extraction on the sample user data to obtain gender features, marital state features, child behavior incentive features, child number features, education background features and social support strength features;
a feature probability determination module for determining a gender feature probability, a marital status feature probability, a child behavior cause feature probability, a child number feature probability, an educational background feature probability and a social support strength feature probability based on the total number of user identifications and the respective numbers of gender features, marital status features, child behavior cause features, child number features, educational background features and social support strength features;
a conditional probability determining module, configured to determine, based on the number of user identifications of each emotion type and the numbers of gender features, marital status features, child behavior cause features, child number features, educational background features and social support strength features under each emotion type, the conditional probabilities of the gender feature, the marital status feature, the child behavior cause feature, the child number feature, the educational background feature and the social support strength feature under each emotion type;
a model building module, configured to build an emotion prediction model based on the prior probabilities of the various emotion types, the gender feature probability, the marital status feature probability, the child behavior cause feature probability, the child number feature probability, the educational background feature probability, the social support strength feature probability, and the conditional probabilities of the gender feature, the marital status feature, the child behavior cause feature, the child number feature, the educational background feature and the social support strength feature under each emotion type;
the emotion prediction obtaining module is used for inputting user data to be tested into the emotion prediction model to obtain the emotion type of the user data to be tested; wherein the user data to be tested includes: the gender, marital status, child behavior cause, number of children, educational background and social support strength of the user to be tested.
In a third aspect, embodiments of the present application provide a computer storage medium storing a plurality of instructions adapted to be loaded by a processor and to perform the above-described method steps.
In a fourth aspect, embodiments of the present application provide a wearable device, which may include: a glove, a virtual reality (VR) headset, a processor, and a memory;
wherein the memory stores a computer program adapted to be loaded by the processor and to perform the above-mentioned method steps.
The technical scheme provided by the embodiments of the application has the beneficial effects that at least:
the embodiment of the application predicts a long-term emotion type from personal basic information such as marital status and educational background. Compared with most current technical means, which recognize and predict short-term emotion from emotion data such as voice intonation and physiological characteristics, the prediction method provided by the embodiment of the application both widens the range of emotions covered and is easier to implement. In addition, predicting the long-term influence of a child's behavior on the user's emotion allows the user's emotion to be adjusted in advance and extreme events to be prevented, which is of great benefit in improving the parent-child relationship.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a system architecture diagram of a method for determining emotion type according to an embodiment of the present application;
fig. 2 is a schematic flow chart of a method for determining emotion type according to an embodiment of the present application;
fig. 3 is a flowchart of another emotion type determining method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an emotion type determining device according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a wearable device according to an embodiment of the present application.
Detailed Description
When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the application as detailed in the accompanying claims.
In the description of the present application, it should be understood that the terms "first", "second" and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. The specific meaning of the above terms in the present application will be understood by those of ordinary skill in the art according to the specific case. Furthermore, in the description of the present application, unless otherwise indicated, "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A alone, both A and B, or B alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
Fig. 1 illustrates a system architecture diagram to which the emotion type determination method according to an embodiment of the present application is applied. The emotion type determination method provided by the embodiment of the application can be applied to a terminal. Specifically, a server may be connected to the terminal through a network. The network is used to provide a communication link between the terminal and the server. The network may include various connection types, such as wired or wireless communication links, or fiber optic cables, among others. Terminals include, but are not limited to: wearable devices, monitoring devices, handheld devices, personal computers, tablet computers, vehicle-mounted devices, smart phones, computing devices, or other processing devices connected to a wireless modem, etc. Terminal devices in different networks may be called by different names, for example: monitoring devices, user equipment, access terminals, subscriber units, subscriber stations, mobile stations, remote terminals, mobile devices, user terminals, wireless communication devices, user agents or user equipment, cellular telephones, cordless telephones, personal digital assistants (personal digital assistant, PDA), terminal devices in fifth generation mobile communication technology (5th generation mobile networks, 5G) networks or future evolution networks, and the like. The terminal system refers to an operating system capable of running on the terminal; it is a program that manages and controls the terminal hardware and terminal applications and is an indispensable system application for the terminal. Such systems include, but are not limited to, the Android system, the iOS system, the Windows Phone (WP) system, the Ubuntu mobile operating system, and the like.
It should be understood that the number of terminals, networks and servers in fig. 1 is merely illustrative. There may be any number of terminals, networks and servers according to practical requirements. For example, the server may be a server cluster formed by a plurality of servers. The user can use the terminal to interact with the server through the network, for example to obtain an optimized version, etc.
Next, the emotion type determination method provided by an embodiment of the present application is described in connection with the system architecture diagram of fig. 1.
In one embodiment, as shown in fig. 2, a flow diagram of an emotion type determination method is provided. As shown in fig. 2, the emotion type determination method may include the following steps:
s201, sample user data is acquired.
Wherein the sample user data may include: a user identification, and a gender, marital status, child behavior cause, number of children, educational background, social support strength and emotion type corresponding to the user identification. The emotion types may include: a sadness type, an anger type, a self-responsibility type and a breakdown type.
Possibly, the main characteristics of the four emotion types in the embodiment of the present application are shown in table 1.
TABLE 1
Type                      Characteristics
Sadness type              The most common type of reaction: the user feels helpless, lonely and lost because of the child's abnormal behavior.
Anger type                The user believes the child's abnormal behavior is caused by external factors (spouse, school or work environment, other people) and develops anger.
Self-responsibility type  The user blames himself or herself and believes the child's abnormal behavior was caused by not having done well enough.
Breakdown type            The user's emotions are out of control and the user cannot live a normal life.
Specifically, the sample user data in the embodiment of the application represents data, obtained from news reports and events occurring nearby, about users whose children show negative behaviors. The user identification represents the serial number of the user in the sample database. Child behavior causes may include: related to the child itself, related to the parent-child relationship, and related to a third party. The number of children indicates whether the child is an only child. The educational background indicates whether the sample user has received higher education. The social support strength indicates whether the sample user has a job or friends: "strong" indicates that the sample user has both a job and friends, "medium" indicates that the sample user has a job but no friends, or friends but no job, and "weak" indicates that the sample user has neither a job nor friends.
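As a small illustration only (a sketch, not text of the application), the social support strength rule just described can be written as a simple helper; the function name and boolean flags are assumptions made for the example:

```python
def social_support_strength(has_job: bool, has_friends: bool) -> str:
    # Strong: the sample user has both a job and friends.
    # Weak: the sample user has neither a job nor friends.
    # Medium: the sample user has exactly one of the two.
    if has_job and has_friends:
        return "strong"
    if not has_job and not has_friends:
        return "weak"
    return "medium"


assert social_support_strength(True, True) == "strong"
assert social_support_strength(True, False) == "medium"
assert social_support_strength(False, False) == "weak"
```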
Possibly, the sample user data in an embodiment of the present application may be the 96 samples shown in table 2 (only part is shown).
TABLE 2
S202, determining prior probabilities corresponding to various emotion types respectively based on the total number of user identifications and the number of various emotion types.
For example, if the 96 samples in table 2 include 37 sadness-type sample users, 9 anger-type sample users, 11 self-responsibility-type sample users and 39 breakdown-type sample users, the prior probabilities of the respective types of emotional reaction are: sadness type 37/96 = 38.5%, anger type 9/96 = 9.4%, self-responsibility type 11/96 = 11.5%, breakdown type 39/96 = 40.6%.
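The prior-probability step of this example can be reproduced with a few lines of code; the sketch below (an illustration, not the application's implementation) uses only the class counts quoted above:

```python
from collections import Counter

# Emotion-type counts among the 96 samples quoted above.
type_counts = Counter({"sadness": 37, "anger": 9, "self-responsibility": 11, "breakdown": 39})
total = sum(type_counts.values())  # 96

priors = {emotion: count / total for emotion, count in type_counts.items()}
# priors == {'sadness': 0.3854..., 'anger': 0.09375, 'self-responsibility': 0.1145..., 'breakdown': 0.40625}
```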
S203, carrying out feature extraction on the sample user data to obtain gender features, marital status features, child behavior cause features, child number features, educational background features and social support strength features.
Specifically, the feature extraction result for each item of the sample user data can be seen in table 3.
TABLE 3
S204, determining a gender feature probability, a marital status feature probability, a child behavior cause feature probability, a child number feature probability, an educational background feature probability and a social support strength feature probability based on the total number of user identifications and the respective numbers of gender features, marital status features, child behavior cause features, child number features, educational background features and social support strength features.
See, for example, the probabilities for various features in the 96 sample user data shown in Table 4.
TABLE 4
S205, determining, based on the number of user identifications of each emotion type and the numbers of gender features, marital status features, child behavior cause features, child number features, educational background features and social support strength features under each emotion type, the conditional probabilities of the gender feature, the marital status feature, the child behavior cause feature, the child number feature, the educational background feature and the social support strength feature under each emotion type.
Specifically, the embodiment of the application may first determine the number of samples of each emotion type, and then determine the number of each feature under each emotion type. For example, among the 96 sample user data there are 37 samples corresponding to the sadness type, of which 21 are male samples and 16 are female samples; 30 are first-marriage samples, 3 are single-parent samples and 4 are remarried samples. Further, the specific numbers of the respective features corresponding to the respective emotion types are shown in table 5 below.
TABLE 5
Further, the conditional probability of each feature may be determined by calculating its proportion. For example, under the sadness type, the conditional probability of the male feature is 21/37 = 56.8% and the conditional probability of the female feature is 16/37 = 43.2%. Under the sadness type, the conditional probability corresponding to the first-marriage feature is 30/37 = 81.1%, the conditional probability corresponding to the single-parent feature is 3/37 = 8.1%, and the conditional probability corresponding to the remarried feature is 4/37 = 10.8%. Specifically, the conditional probabilities of the respective features of the respective emotion types are shown in table 6 below.
TABLE 6
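The conditional probabilities quoted above for the sadness type can be obtained in the same proportional way; the sketch below (an illustration only) uses just the gender and marital-status counts given in the text:

```python
# Feature counts under the sadness type (37 samples), as quoted above.
sadness_counts = {
    "gender": {"male": 21, "female": 16},
    "marital status": {"first marriage": 30, "single parent": 3, "remarried": 4},
}

conditional_sadness = {
    feature: {value: count / 37 for value, count in values.items()}
    for feature, values in sadness_counts.items()
}
# conditional_sadness["gender"]["male"]                    -> 0.5676  (21/37 = 56.8%)
# conditional_sadness["marital status"]["first marriage"]  -> 0.8108  (30/37 = 81.1%)
```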
S206, establishing an emotion prediction model based on the prior probabilities of the various emotion types, the gender feature probability, the marital status feature probability, the child behavior cause feature probability, the child number feature probability, the educational background feature probability, the social support strength feature probability, and the conditional probabilities of the gender feature, the marital status feature, the child behavior cause feature, the child number feature, the educational background feature and the social support strength feature under each emotion type.
Possibly, the embodiment of the application may use a naive Bayes algorithm to build the emotion prediction model, i.e. to predict the user's emotional response.
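As an illustration of such a naive Bayes model (a sketch under assumptions, not the application's implementation: the sample rows, field values and use of scikit-learn are invented for the example), the whole train-and-predict pipeline could look like this:

```python
from sklearn.naive_bayes import CategoricalNB
from sklearn.preprocessing import OrdinalEncoder

# Each row: gender, marital status, child behavior cause, number of children,
# educational background, social support strength (placeholder values).
samples = [
    ["male",   "first marriage", "child-related",        "only child", "higher education",    "strong"],
    ["female", "single parent",  "parent-child related", "not only",   "no higher education", "weak"],
    ["female", "remarried",      "third-party related",  "only child", "higher education",    "medium"],
    ["male",   "first marriage", "parent-child related", "not only",   "no higher education", "medium"],
]
labels = ["sadness", "breakdown", "anger", "self-responsibility"]

encoder = OrdinalEncoder()            # map each categorical feature value to an integer code
X = encoder.fit_transform(samples)
model = CategoricalNB(alpha=1.0)      # learns the priors and per-type conditional probabilities
model.fit(X, labels)

new_user = [["female", "first marriage", "parent-child related", "only child",
             "no higher education", "strong"]]
print(model.predict(encoder.transform(new_user)))        # predicted emotion type
print(model.predict_proba(encoder.transform(new_user)))  # probability of each emotion type
```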
Possibly, the embodiment of the application can predict, from the prior and conditional probabilities, the probability of an emotional reaction that has not yet occurred, and determine the type according to the maximum probability value; specifically, the emotion prediction model is established according to a prediction formula of the following form:
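The original formula image is not reproduced in this text; a standard naive Bayes scoring rule consistent with the description above (given here as an assumption) is:

```latex
% x_1,...,x_6 are the user's gender, marital status, child behavior cause,
% number of children, educational background and social support strength;
% c ranges over the four emotion types.
P(c \mid x_1,\dots,x_6) \;\propto\; P(c)\prod_{i=1}^{6} P(x_i \mid c),
\qquad
\hat{c} \;=\; \arg\max_{c}\; P(c)\prod_{i=1}^{6} P(x_i \mid c)
```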
s207, inputting the user data to be tested into the emotion prediction model to obtain the emotion type of the user data to be tested.
Wherein, the user data to be measured includes: the gender, marital status, child behavior causes, number of children, educational background, and social support intensity of the user to be tested.
For example, the user data to be tested is: "gender is female", "marital status is first marriage", "the behavior cause is related to the parent-child relationship", "no other children", "no higher-education background", "strong social support strength". The specific calculation process is as follows:
Similarly, the scores of the other emotion types can be calculated; the anger type, the self-responsibility type and the breakdown type give 0.068, 0.606 and 0.779 respectively. According to the principle of maximum probability, the breakdown type corresponding to 0.779 is determined as the emotion type of the user data to be tested.
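Using the scores quoted in this example, the final decision step is simply a maximum over the per-type scores (a sketch; the sadness-type score is computed in the same way and, as in the text, is not listed):

```python
# Naive Bayes scores for the user to be tested, as quoted above.
scores = {"anger": 0.068, "self-responsibility": 0.606, "breakdown": 0.779}

predicted = max(scores, key=scores.get)
print(predicted)  # -> "breakdown": the emotion type with the maximum score
```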
In this way, the application can input the personal information of the user to be tested into the emotion prediction model to predict the long-term emotional response that may be produced under the influence of another person's behavior. The embodiment of the application therefore not only predicts the long-term emotional response of the user to be tested, but also avoids the inaccuracy of predicting an emotional response from the user's historical emotional responses or from the user's facial information in video images.
In addition, the embodiment of the application can also determine the virtual scene corresponding to the emotion type; further, a corresponding virtual scene can be constructed according to the emotion type.
Specifically, virtual reality (VR) technology uses data from real life together with electronic signals generated by computer technology, combined with various output devices, to convert those signals into phenomena that people can perceive. These phenomena may be real, tangible objects in reality, or things that cannot be seen with the naked eye, and they are represented by three-dimensional models.
Possibly, according to the embodiment of the application, corresponding virtual scenes can be established for each of the 4 emotion types. For example, a virtual scene corresponding to the sadness type is established for the sad emotion, and the sad emotion needs to be displayed by a virtual character in the virtual scene, for example by crying sadly and the like. A virtual scene corresponding to the anger type is created for the angry emotion, and in this virtual scene the virtual character needs to exhibit the angry emotion, for example by quarrelling and the like.
According to the embodiment of the application, the virtual scene established by virtual reality technology can present the emotional changes of the user to be tested to the experiencer, and the vivid virtual scene and interaction mode improve the experiencer's sense of immersion, thereby relieving the tension between the experiencer and the user to be tested.
In some implementations, fig. 3 schematically shows a flowchart of a method for determining an emotion type according to an embodiment of the present application. As shown in fig. 3, the emotion type determination method may include at least the steps of:
s301, sample user data is acquired.
Specifically, S301 corresponds to S201, and will not be described here again.
S302, based on the total number of user identifications and the number of various emotion types, the prior probabilities respectively corresponding to the various emotion types are determined.
Specifically, S302 corresponds to S202, and will not be described here.
S303, carrying out feature extraction on the sample user data to obtain gender features, marital status features, child behavior incentive features, child number features, education background features and social support strength features.
Specifically, S303 corresponds to S203, and will not be described here.
S304, determining a gender feature probability, a marital status feature probability, a child behavior cause feature probability, a child number feature probability, an educational background feature probability and a social support strength feature probability based on the total number of user identifications and the respective numbers of gender features, marital status features, child behavior cause features, child number features, educational background features and social support strength features.
Specifically, S304 corresponds to S204, and will not be described here.
S305, determining, based on the number of user identifications of each emotion type and the numbers of gender features, marital status features, child behavior cause features, child number features, educational background features and social support strength features under each emotion type, the conditional probabilities of the gender feature, the marital status feature, the child behavior cause feature, the child number feature, the educational background feature and the social support strength feature under each emotion type.
Specifically, S305 corresponds to S205, and will not be described here.
S306, establishing an emotion prediction model based on the prior probabilities of the various emotion types, the gender feature probability, the marital status feature probability, the child behavior cause feature probability, the child number feature probability, the educational background feature probability, the social support strength feature probability, and the conditional probabilities of the gender feature, the marital status feature, the child behavior cause feature, the child number feature, the educational background feature and the social support strength feature under each emotion type.
Specifically, S306 corresponds to S206, and will not be described here.
S307, inputting the user data to be tested into the emotion prediction model to obtain the emotion type of the user data to be tested.
Specifically, S307 is identical to S207, and will not be described here again.
S308, acquiring a plurality of life images and text information of the user to be tested.
Specifically, in the embodiment of the application, the experiencer provides several life photos and related text information of the user to be tested, from which a virtual scene is created.
S309, generating a background environment and a character image corresponding to the user to be tested based on the multiple life images of the user to be tested.
Possibly, in the embodiment of the application, software can be used to automatically create three-dimensional models of the user's image, the environment of the place and the objects in the place from the life photos provided by the experiencer.
S310, generating language information of the personage in the virtual scene based on the text information of the user to be detected and the emotion type of the user data to be detected.
It can be understood that the purpose of the application is to simulate the user's language system, so that the experiencer feels present in the scene and does not feel detached because the language follows a fixed pattern.
S311, displaying the virtual scene corresponding to the emotion type based on the background environment, the character image and the language information of the character image corresponding to the user to be tested.
Possibly, the embodiment of the application can determine, based on the emotion type of the user data to be tested, a preset emotion change simulation state of the character image corresponding to the user data to be tested; and display the virtual scene corresponding to the emotion type based on the background environment corresponding to the user to be tested, the character image, the language information of the character image, and the preset emotion change simulation state of the character image.
The preset emotion change simulation state of the character image in the embodiment of the application indicates the series of behaviors the user may produce under the corresponding emotion in a dynamic scene. For example, a sad user may react by continually shedding tears, sitting on the ground, etc.; an angry user may react by quarrelling violently, assigning blame, etc.
Further, the embodiment of the application can determine, based on the virtual scene corresponding to the emotion type and the preset emotion change simulation state of the character image, an interaction mode corresponding to that preset emotion change simulation state; and display the virtual scene corresponding to the emotion type based on the background environment corresponding to the user to be tested, the character image, the language information of the character image, the preset emotion change simulation state of the character image, and the interaction mode corresponding to the preset emotion change simulation state of the character image.
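Purely as an illustration (the state and interaction-mode names below are invented, not taken from the application), the mapping from emotion type to preset emotion change simulation state and interaction mode could be kept in simple lookup tables:

```python
# Hypothetical lookup tables; the application only requires that each emotion type
# maps to a preset emotion change simulation state and each state to an interaction mode.
SIMULATION_STATES = {
    "sadness": "continually shedding tears, sitting on the ground",
    "anger": "quarrelling violently, assigning blame",
    "self-responsibility": "self-reproach, withdrawn posture",
    "breakdown": "loss of emotional control",
}
INTERACTION_MODES = {
    "sadness": "comforting dialogue",
    "anger": "de-escalating dialogue",
    "self-responsibility": "reassuring dialogue",
    "breakdown": "calming dialogue",
}

def build_scene(emotion_type: str, background: str, avatar: str, speech: str) -> dict:
    """Assemble the data needed to display the virtual scene for one emotion type."""
    return {
        "background": background,                            # from the life photos
        "avatar": avatar,                                     # 3D character image of the user
        "speech": speech,                                     # generated language information
        "simulation_state": SIMULATION_STATES[emotion_type],
        "interaction_mode": INTERACTION_MODES[emotion_type],
    }
```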
In this way, the embodiment of the application can construct a virtual scene from pictures of the living places of the user to be tested provided by the experiencer, so as to display the emotional responses of the user to be tested, and can interact with the user to be tested shown in the virtual scene through the preset interaction mode corresponding to each emotional response, so that the experiencer personally feels the user's emotional changes. On the one hand, fusing the character information and the text information to establish the emotion prediction model improves the usability of the acquired information and avoids wasting resources; on the other hand, features can be extracted from multiple angles to establish a more effective emotion prediction model, thereby relieving the tension between the experiencer and the user to be tested.
Fig. 4 is a schematic diagram of an emotion type determination device according to an exemplary embodiment of the present application. The emotion type determining device may be provided in a terminal or the like, and executes the emotion type determining method according to any of the above embodiments of the present application. As shown in fig. 4, the emotion type determining device may include:
a sample user data acquisition module 41 for acquiring sample user data; wherein the sample user data comprises: a user identification, and a gender, marital status, child behavior cause, number of children, educational background, social support strength and emotion type corresponding to the user identification; the emotion types include: a sadness type, an anger type, a self-responsibility type and a breakdown type;
A prior probability determining module 42, configured to determine prior probabilities that the various emotion types respectively correspond based on the total number of the user identities and the number of the various emotion types;
a feature extraction module 43, configured to perform feature extraction on the sample user data to obtain a gender feature, a marital status feature, a child behavior incentive feature, a number of children feature, an educational background feature, and a social support intensity feature;
a feature probability determination module 44 for determining a gender feature probability, a marital status feature probability, a child behavior cause feature probability, a child number feature probability, an educational background feature probability and a social support strength feature probability based on the total number of user identifications and the respective numbers of gender features, marital status features, child behavior cause features, child number features, educational background features and social support strength features;
a conditional probability determining module 45, configured to determine, based on the number of user identifications of each emotion type and the numbers of gender features, marital status features, child behavior cause features, child number features, educational background features and social support strength features under each emotion type, the conditional probabilities of the gender feature, the marital status feature, the child behavior cause feature, the child number feature, the educational background feature and the social support strength feature under each emotion type;
a model building module 46, configured to build an emotion prediction model based on the prior probabilities of the various emotion types, the gender feature probability, the marital status feature probability, the child behavior cause feature probability, the child number feature probability, the educational background feature probability, the social support strength feature probability, and the conditional probabilities of the gender feature, the marital status feature, the child behavior cause feature, the child number feature, the educational background feature and the social support strength feature under each emotion type;
an emotion prediction obtaining module 47, configured to input user data to be tested into the emotion prediction model to obtain the emotion type of the user data to be tested; wherein the user data to be tested includes: the gender, marital status, child behavior cause, number of children, educational background and social support strength of the user to be tested.
In some embodiments, after the emotion prediction obtaining module 47, the apparatus further comprises:
and the virtual scene determining module is used for determining the virtual scene corresponding to the emotion type.
In some embodiments, the apparatus further comprises:
the first acquisition module is used for acquiring a plurality of life images and text information of the user to be tested;
the first generation module is used for generating a background environment and a figure image corresponding to the user to be detected based on the plurality of life images of the user to be detected;
after the emotion type determination module, the apparatus further includes:
the second generation unit is used for generating language information of the character image in the virtual scene based on the text information of the user to be detected and the emotion type of the user data to be detected;
the first display unit is used for displaying the virtual scene corresponding to the emotion type based on the background environment, the character image and the language information of the character image corresponding to the user to be detected.
In some embodiments, before the first display unit, the apparatus further comprises:
the first determining module is used for determining a preset emotion change simulation state of the character image corresponding to the user data to be detected based on the emotion type of the user data to be detected;
the first display unit is specifically configured to: display the virtual scene corresponding to the emotion type based on the background environment corresponding to the user to be tested, the character image, the language information of the character image, and the preset emotion change simulation state of the character image.
It should be noted that, when the emotion type determining apparatus provided in the foregoing embodiment performs the emotion type determining method, only the division of the above functional modules is used as an example; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the emotion type determining device provided in the above embodiment belongs to the same concept as the emotion type determining method embodiments; its detailed implementation process is described in the method embodiments and is not repeated here.
The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
Referring to fig. 5, a schematic structural diagram of a wearable device is provided in an embodiment of the present application. As shown in fig. 5, the wearable device 50 may include: at least one processor 51, at least one network interface 54, a user interface 53, a memory 55, at least one communication bus 52.
Wherein the communication bus 52 is used to enable connected communication between these components.
The user interface 53 may include a glove and a VR headset; optionally, the user interface 53 may also include a standard wired interface and a wireless interface.
The network interface 54 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), among others.
Wherein the processor 51 may include one or more processing cores. The processor 51 uses various interfaces and lines to connect the various parts of the wearable device 50, and performs various functions of the wearable device 50 and processes data by running or executing instructions, programs, code sets or instruction sets stored in the memory 55 and invoking data stored in the memory 55. Optionally, the processor 51 may be implemented in at least one hardware form of digital signal processing (Digital Signal Processing, DSP), field-programmable gate array (Field-Programmable Gate Array, FPGA), or programmable logic array (Programmable Logic Array, PLA). The processor 51 may integrate one or a combination of a central processing unit (Central Processing Unit, CPU), a graphics processor (Graphics Processing Unit, GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs and the like; the GPU is used for rendering and drawing the content to be displayed by the VR headset; the modem is used to handle wireless communications. It will be appreciated that the modem may instead not be integrated into the processor 51 and may be implemented by a separate chip.
The memory 55 may include a random access memory (Random Access Memory, RAM) or a read-only memory (Read-Only Memory, ROM). Optionally, the memory 55 includes a non-transitory computer-readable storage medium. The memory 55 may be used to store instructions, programs, code, code sets or instruction sets. The memory 55 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the above method embodiments, and the like; the data storage area may store the data referred to in the above method embodiments, and the like. Optionally, the memory 55 may also be at least one storage device located remotely from the aforementioned processor 51. As shown in fig. 5, as a computer storage medium, the memory 55 may include an operating system, a network communication module, a user interface module and an emotion type determination application program.
In the wearable device 50 shown in fig. 5, the user interface 53 is mainly used as an interface for providing input for the user and obtaining the data input by the user; and the processor 51 may be used to invoke the emotion type determination application stored in the memory 55 and specifically perform the following operations:
acquiring sample user data; wherein the sample user data comprises: a user identification, and a gender, marital status, child behavior cause, number of children, educational background, social support strength and emotion type corresponding to the user identification; the emotion types include: a sadness type, an anger type, a self-responsibility type and a breakdown type;
determining prior probabilities corresponding to the various emotion types respectively based on the total number of the user identifications and the number of the various emotion types;
extracting the characteristics of the sample user data to obtain gender characteristics, marital status characteristics, child behavior incentive characteristics, number of children characteristics, education background characteristics and social support strength characteristics;
determining a gender feature probability, a marital status feature probability, a child behavior cause feature probability, a child number feature probability, an educational background feature probability and a social support strength feature probability based on the total number of user identifications and the respective numbers of gender features, marital status features, child behavior cause features, child number features, educational background features and social support strength features;
determining, based on the number of user identifications of each emotion type and the numbers of gender features, marital status features, child behavior cause features, child number features, educational background features and social support strength features under each emotion type, the conditional probabilities of the gender feature, the marital status feature, the child behavior cause feature, the child number feature, the educational background feature and the social support strength feature under each emotion type;
establishing an emotion prediction model based on the prior probabilities of the various emotion types, the gender feature probability, the marital status feature probability, the child behavior cause feature probability, the child number feature probability, the educational background feature probability, the social support strength feature probability, and the conditional probabilities of the gender feature, the marital status feature, the child behavior cause feature, the child number feature, the educational background feature and the social support strength feature under each emotion type;
inputting user data to be tested into the emotion prediction model to obtain the emotion type of the user data to be tested; wherein the user data to be tested includes: the gender, marital status, child behavior cause, number of children, educational background and social support strength of the user to be tested.
In some embodiments, the processor 51 executes a method for creating a virtual scene corresponding to the emotion type based on the emotion type of the user data to be tested.
In some embodiments, the processor 51 further performs:
acquiring a plurality of life images and text information of the user to be tested;
generating a background environment and a character image corresponding to the user to be detected based on the multiple life images of the user to be detected;
after executing the determining the virtual scene corresponding to the emotion type, the processor 51 further executes:
generating language information of the character image in the virtual scene based on the text information of the user to be detected and the emotion type of the user data to be detected;
and displaying the virtual scene corresponding to the emotion type based on the background environment, the character image and the language information of the character image corresponding to the user to be tested.
In some embodiments, before executing the displaying the virtual scene corresponding to the emotion type, the processor 51 further executes:
determining a preset emotion change simulation state of the character image corresponding to the user data to be detected based on the emotion type of the user data to be detected;
the processor 51, when executing the displaying of the virtual scene corresponding to the emotion type based on the background environment corresponding to the user to be tested, the character image, and the language information of the character image, specifically performs the following:
and displaying the virtual scene corresponding to the emotion type based on the background environment corresponding to the user to be detected, the character image, the language information of the character image and the preset emotion change simulation state of the character image.
Embodiments of the present application also provide a computer-readable storage medium having instructions stored therein, which, when executed on a computer or processor, cause the computer or processor to perform one or more of the steps of the embodiments shown in fig. 2 and 3 described above. The above constituent modules of the emotion type determining device may be stored in the computer-readable storage medium if implemented in the form of software functional units and sold or used as independent products.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, produces a flow or function in accordance with embodiments of the present application, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in or transmitted across a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by a wired (e.g., coaxial cable, fiber optic, digital subscriber line (Digital Subscriber Line, DSL)), or wireless (e.g., infrared, wireless, microwave, etc.). The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains an integration of one or more available media. The usable medium may be a magnetic medium (e.g., a floppy Disk, a hard Disk, a magnetic tape), an optical medium (e.g., a digital versatile Disk (Digital Versatile Disc, DVD)), or a semiconductor medium (e.g., a Solid State Disk (SSD)), or the like.
Those skilled in the art will appreciate that implementing all or part of the above-described embodiment methods may be accomplished by way of a computer program, which may be stored in a computer-readable storage medium, instructing relevant hardware, and which, when executed, may comprise the embodiment methods as described above. And the aforementioned storage medium includes: a Read Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk or an optical disk, or the like. The technical features in the present examples and embodiments may be arbitrarily combined without conflict.
The above-described embodiments are merely illustrative of the preferred embodiments of the present application and are not intended to limit the scope of the present application, and various modifications and improvements made by those skilled in the art to the technical solution of the present application should fall within the scope of protection defined by the claims of the present application without departing from the design spirit of the present application.

Claims (7)

1. A method of emotion type determination, the method comprising:
acquiring sample user data; wherein the sample user data comprises: a user identification, and the gender, marital status, child behavior cause, number of children, educational background, social support intensity, and emotion type corresponding to the user identification; the emotion types include: a sadness type, an anger type, a self-responsibility type, and a breakdown type;
determining prior probabilities respectively corresponding to the various emotion types based on the total number of the user identifications and the number of each emotion type;
performing feature extraction on the sample user data to obtain a gender feature, a marital status feature, a child behavior cause feature, a child number feature, an educational background feature, and a social support intensity feature;
determining a gender feature probability, a marital status feature probability, a child behavior cause feature probability, a child number feature probability, an educational background feature probability, and a social support intensity feature probability based on the total number of the user identifications, the number of the gender features, the number of the marital status features, the number of the child behavior cause features, the number of the child number features, the number of the educational background features, and the number of the social support intensity features;
determining, for each of the emotion types, conditional probabilities of the gender feature, the marital status feature, the child behavior cause feature, the child number feature, the educational background feature, and the social support intensity feature based on the number of the gender features, the number of the marital status features, the number of the child behavior cause features, the number of the child number features, the number of the educational background features, and the number of the social support intensity features under that emotion type;
establishing an emotion prediction model based on the prior probabilities respectively corresponding to the various emotion types, the gender feature probability, the marital status feature probability, the child behavior cause feature probability, the child number feature probability, the educational background feature probability, the social support intensity feature probability, and the conditional probabilities, under each of the emotion types, of the gender feature, the marital status feature, the child behavior cause feature, the child number feature, the educational background feature, and the social support intensity feature;
inputting user data to be tested into the emotion prediction model to obtain the emotion type of the user data to be tested; wherein the user data to be tested includes: the gender, marital status, child behavior cause, number of children, educational background, and social support intensity of the user to be tested.
2. The method of claim 1, wherein after obtaining the emotion type of the user data to be tested, the method further comprises: determining a virtual scene corresponding to the emotion type.
3. The method of claim 2, wherein the method further comprises:
acquiring a plurality of life images and text information of the user to be tested;
generating a background environment and a character image corresponding to the user to be tested based on the plurality of life images of the user to be tested;
after the virtual scene corresponding to the emotion type is determined, the method further comprises:
generating language information of the character image in the virtual scene based on the text information of the user to be tested and the emotion type of the user data to be tested;
displaying the virtual scene corresponding to the emotion type based on the background environment, the character image, and the language information of the character image corresponding to the user to be tested.
4. The method of claim 3, wherein prior to displaying the virtual scene corresponding to the emotion type, the method further comprises:
determining a preset emotion change simulation state of the character image corresponding to the user data to be tested based on the emotion type of the user data to be tested;
the displaying the virtual scene corresponding to the emotion type based on the background environment, the character image, and the language information of the character image corresponding to the user to be tested comprises:
displaying the virtual scene corresponding to the emotion type based on the background environment corresponding to the user to be tested, the character image, the language information of the character image, and the preset emotion change simulation state of the character image.
5. An emotion type determination device, characterized in that the device comprises:
a sample user data acquisition module, configured to acquire sample user data; wherein the sample user data comprises: a user identification, and the gender, marital status, child behavior cause, number of children, educational background, social support intensity, and emotion type corresponding to the user identification; the emotion types include: a sadness type, an anger type, a self-responsibility type, and a breakdown type;
a prior probability determination module, configured to determine prior probabilities respectively corresponding to the various emotion types based on the total number of the user identifications and the number of each emotion type;
a feature extraction module, configured to perform feature extraction on the sample user data to obtain a gender feature, a marital status feature, a child behavior cause feature, a child number feature, an educational background feature, and a social support intensity feature;
a feature probability determination module, configured to determine a gender feature probability, a marital status feature probability, a child behavior cause feature probability, a child number feature probability, an educational background feature probability, and a social support intensity feature probability based on the total number of the user identifications, the number of the gender features, the number of the marital status features, the number of the child behavior cause features, the number of the child number features, the number of the educational background features, and the number of the social support intensity features;
a conditional probability determination module, configured to determine, for each of the emotion types, conditional probabilities of the gender feature, the marital status feature, the child behavior cause feature, the child number feature, the educational background feature, and the social support intensity feature based on the number of the gender features, the number of the marital status features, the number of the child behavior cause features, the number of the child number features, the number of the educational background features, and the number of the social support intensity features under that emotion type;
a model building module, configured to establish an emotion prediction model based on the prior probabilities respectively corresponding to the various emotion types, the gender feature probability, the marital status feature probability, the child behavior cause feature probability, the child number feature probability, the educational background feature probability, the social support intensity feature probability, and the conditional probabilities, under each of the emotion types, of the gender feature, the marital status feature, the child behavior cause feature, the child number feature, the educational background feature, and the social support intensity feature;
an emotion prediction obtaining module, configured to input user data to be tested into the emotion prediction model to obtain the emotion type of the user data to be tested; wherein the user data to be tested includes: the gender, marital status, child behavior cause, number of children, educational background, and social support intensity of the user to be tested.
6. A wearable device, comprising: gloves, a virtual reality (VR) headset, a processor, and a memory; wherein
the processor communicates with the gloves and the VR headset, respectively, and the memory stores a computer program adapted to be loaded by the processor to perform the method steps of any one of claims 1-4.
7. A computer storage medium storing a plurality of instructions adapted to be loaded by a processor and to perform the method steps of any of claims 1-4.
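For readers less familiar with naive Bayes classification, the following minimal Python sketch illustrates how the prior probabilities, feature probabilities, and conditional probabilities recited in claims 1 and 5 could be combined into an emotion prediction model. It is an assumption-laden illustration, not the claimed implementation: the feature encoding, the Laplace smoothing, and every identifier are choices made here only for readability and do not limit the claims.

```python
# Illustrative naive Bayes sketch for claims 1 and 5 (hypothetical names and encoding):
# priors P(emotion), conditionals P(feature value | emotion), and prediction by
# maximizing the joint probability P(emotion) * prod_f P(feature_f | emotion).
from collections import Counter, defaultdict

FEATURES = ["gender", "marital_status", "child_behavior_cause",
            "child_number", "educational_background", "social_support_intensity"]

def train(samples):
    """samples: list of dicts holding the six features plus an 'emotion_type' label."""
    total = len(samples)
    emotion_counts = Counter(s["emotion_type"] for s in samples)
    # Prior probability of each emotion type: its count / total number of user records.
    priors = {e: c / total for e, c in emotion_counts.items()}
    # Per-emotion, per-feature value counts for the conditional probabilities.
    cond = defaultdict(lambda: defaultdict(Counter))
    for s in samples:
        for f in FEATURES:
            cond[s["emotion_type"]][f][s[f]] += 1
    # Distinct values observed for each feature (used for add-one smoothing).
    values = {f: {s[f] for s in samples} for f in FEATURES}

    def conditional(emotion, feature, value):
        # P(feature = value | emotion), with Laplace smoothing (an assumption;
        # the claims do not specify a smoothing scheme).
        counts = cond[emotion][feature]
        return (counts[value] + 1) / (emotion_counts[emotion] + len(values[feature]))

    return priors, conditional

def predict(priors, conditional, user):
    """Return the emotion type maximizing P(emotion) * prod_f P(feature_f | emotion)."""
    best, best_score = None, -1.0
    for emotion, prior in priors.items():
        score = prior
        for f in FEATURES:
            score *= conditional(emotion, f, user[f])
        if score > best_score:
            best, best_score = emotion, score
    return best

# Example usage (hypothetical records):
# priors, conditional = train(sample_records)
# predict(priors, conditional, {"gender": "female", "marital_status": "married",
#                               "child_behavior_cause": "study", "child_number": 2,
#                               "educational_background": "college",
#                               "social_support_intensity": "medium"})
```

Note that the marginal feature probabilities recited in the claims form the normalizing denominator of the Bayes posterior; because that denominator is identical for every emotion type, the sketch omits it when selecting the most probable type.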
CN202310911512.2A 2023-07-24 2023-07-24 Emotion type determination method, emotion type determination device, wearable device and computer medium Pending CN117034137A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310911512.2A CN117034137A (en) 2023-07-24 2023-07-24 Emotion type determination method, emotion type determination device, wearable device and computer medium

Publications (1)

Publication Number Publication Date
CN117034137A (en) 2023-11-10

Family

ID=88634535

Country Status (1)

Country Link
CN (1) CN117034137A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination