CN112906399B - Method, apparatus, device and storage medium for determining emotional state


Info

Publication number
CN112906399B
Authority
CN
China
Prior art keywords
information
user
response
emotional state
interaction
Legal status
Active
Application number
CN202110195018.1A
Other languages
Chinese (zh)
Other versions
CN112906399A (en)
Inventor
龚鑫 (Gong Xin)
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202110195018.1A
Publication of CN112906399A
Application granted
Publication of CN112906399B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00: Handling natural language data
    • G06F 40/30: Semantic analysis
    • G06F 16/00: Information retrieval; database structures therefor; file system structures therefor
    • G06F 16/30: Information retrieval of unstructured textual data
    • G06F 16/35: Clustering; classification
    • G06F 16/90: Details of database functions independent of the retrieved data types
    • G06F 16/95: Retrieval from the web
    • G06F 16/953: Querying, e.g. by the use of web search engines
    • G06F 16/9535: Search customisation based on user profiles and personalisation
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on GUI, based on specific properties of the displayed interaction object or a metaphor-based environment
    • G06F 3/0483: Interaction with page-structured environments, e.g. book metaphor
    • G06Q: Information and communication technology [ICT] specially adapted for administrative, commercial, financial, managerial or supervisory purposes
    • G06Q 50/00: ICT specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/01: Social networking

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Marketing (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Computational Linguistics (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The present disclosure provides a method, an apparatus, a device and a storage medium for determining an emotional state, which are applied to the field of computer technology, and in particular to the fields of natural language processing, big data and intelligent recommendation. The specific implementation scheme is as follows: in response to detecting generation of first interaction information, an emotional state of a user at the moment the first interaction information was generated is determined according to the first interaction information; and time sequence information of the user's emotional state is updated according to the emotional state and the generation moment of the first interaction information.

Description

Method, apparatus, device and storage medium for determining emotional state
Technical Field
The present disclosure relates to the field of computer technology, more particularly to the fields of natural language processing, big data and intelligent recommendation, and specifically to a method, apparatus, device and storage medium for determining an emotional state.
Background
As the economy develops, the pace of life keeps accelerating, and the pressure people feel grows with it. This is especially true for working people, whose stress comes not only from work but also from family and daily life. Contemporary society pays relatively little attention to the mental health of working people, even though, compared with other groups, they occupy an important position in both the family and society. There is therefore a mismatch between the importance of workers' mental health and the attention it receives.
Disclosure of Invention
Provided are a method, an apparatus, a device, a medium and a program product capable of determining an emotional state simply and efficiently.
According to a first aspect, there is provided a method of determining an emotional state, comprising: in response to detecting the generation of first interaction information, determining an emotional state of the user at the moment the first interaction information was generated according to the first interaction information; and updating time sequence information of the emotional state of the user according to the emotional state and the generation moment of the first interaction information.
According to a second aspect, there is provided an apparatus for determining an emotional state, comprising: an emotional state determining module, configured to determine, in response to detecting the generation of first interaction information, an emotional state of the user at the moment the first interaction information was generated according to the first interaction information; and a time sequence information updating module, configured to update time sequence information of the emotional state of the user according to the emotional state and the generation moment of the first interaction information.
According to a third aspect, there is provided an electronic device comprising: at least one processor; and a memory communicatively coupled to the at least one processor, wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of determining an emotional state provided by the present disclosure.
According to a fourth aspect, there is provided a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of determining an emotional state provided by the present disclosure.
According to a fifth aspect, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the method of determining an emotional state provided by the present disclosure.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The drawings are for a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 is a schematic diagram of an application scenario of the method, apparatus, device and storage medium for determining an emotional state according to an embodiment of the present disclosure;
FIG. 2 schematically illustrates a flow chart of a method of determining an emotional state according to an embodiment of the disclosure;
FIG. 3 schematically illustrates a diagram of a method of determining an emotional state according to an embodiment of the disclosure;
FIG. 4 schematically illustrates a diagram of a method of determining an emotional state according to another embodiment of the disclosure;
FIG. 5 schematically illustrates changes to a presentation page during execution of the method of determining an emotional state;
FIG. 6 schematically illustrates changes to a presentation page during information input;
FIG. 7 schematically illustrates a diagram of a method of determining an emotional state according to yet another embodiment of the disclosure;
FIG. 8 schematically illustrates a presentation interface of a client application adapted to implement the method of determining an emotional state;
FIG. 9 is a block diagram of an apparatus for determining an emotional state according to an embodiment of the present disclosure; and
FIG. 10 is a block diagram of an electronic device for implementing the method of determining an emotional state according to an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present disclosure to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
The present disclosure provides a method of determining an emotional state, the method including an emotional state determination process and a timing information update process. In the emotion state determination process, in response to detecting the generation of the first interaction information, determining the emotion state of the user at the moment of the generation of the first interaction information according to the first interaction information. In the time sequence information updating process, the time sequence information of the emotion state of the user is updated according to the emotion state and the generation moment of the first interaction information.
An application scenario of the method and apparatus provided by the present disclosure will be described below with reference to fig. 1.
FIG. 1 is an application scenario diagram of the method, apparatus, device, medium and program product for determining an emotional state according to an embodiment of the present disclosure.
As shown in FIG. 1, the application scenario 100 of this embodiment may include a terminal device 110, on which various client applications may be installed, such as shopping applications, web browser applications, search applications, cloud-storage applications, mailbox clients, social applications and the like (by way of example only). Terminal device 110 may be any of a variety of electronic devices having a display screen and processing capability, including but not limited to smart phones, tablet computers, laptop computers, desktop computers and the like.
Illustratively, as shown in FIG. 1, the social application of the terminal device 110 may, for example, present the page 111 in response to a user operation. The social application may generate interaction information 120 in response to the user selecting a "post" control in page 111, the interaction information 120 including the text information "Ran into my idol, so excited" obtained from page 111 in response to a user operation. It is to be understood that this page 111 is merely an example to facilitate understanding of the present disclosure, which is not limited thereto.
According to an embodiment of the present disclosure, the terminal device 110 may have a processing function for determining, according to the interaction information 120, the emotional state 130 of the user at the time the text information was published, and for maintaining timing information of the user's emotional state according to the emotional state 130, resulting in updated timing information 140. Terminal device 110 may then, in response to a user operation to view emotional-state timing information, present the page 112 shown in FIG. 1, in which the change of the user's emotional state over a predetermined period of time is displayed. The predetermined period may be, for example, the period from when the user began using the social application up to the current time, or the most recent week, month, quarter, half year, or the like.
As shown in fig. 1, in an embodiment, the application scenario 100 may further include a server 150, and the terminal device 110 and the server 150 may communicate through a network, where the network may include a wired or wireless communication network, for example.
Illustratively, a user may interact with server 150 over the network using terminal device 110. Server 150 may be, for example, an application server providing support for the social application running on terminal device 110. In this embodiment, after terminal device 110 generates interaction information 120 in response to a user operation, it may transmit the interaction information 120 to server 150 via the network, and server 150 processes the interaction information to determine the emotional state 130 of the user. Server 150 may also feed the determined emotional state 130 back to terminal device 110, so that terminal device 110 updates the timing information according to the emotional state 130 and displays it.
In one embodiment, server 150 may be, for example, a server that incorporates a blockchain. Alternatively, the server 150 may be a virtual server or a cloud server.
It should be noted that the method of determining an emotional state provided in the present disclosure may be performed by the terminal device 110, or some operations may be performed by the terminal device 110 and others by the server 150. Accordingly, the apparatus for determining an emotional state provided in the present disclosure may be provided in the terminal device 110, or some of its modules may be provided in the terminal device 110 and others in the server 150.
It should be understood that the terminal device, presentation page and server in fig. 1 are merely illustrative. Any type of terminal device, presentation page, and server may be provided, as desired for implementation.
The method of determining an emotional state provided by the present disclosure will be described in detail below with reference to FIGS. 2 to 8, in conjunction with the application scenario described in FIG. 1.
FIG. 2 is a flow chart of a method of determining an emotional state according to an embodiment of the present disclosure. As shown in FIG. 2, the method 200 of determining an emotional state according to this embodiment includes operations S210 and S230.
In response to detecting the generation of the first interaction information, an emotional state of the user at the time of the generation of the first interaction information is determined according to the first interaction information in operation S210.
According to an embodiment of the present disclosure, the first interaction information is information generated in response to a user's operation of the social application. For example, the first interaction information may be generated in response to the user browsing, publishing, uploading, selecting and/or sharing information via the social application, and accordingly includes the information browsed, published, uploaded, selected and/or shared. The information the user browses, selects and/or shares may include, for example, at least one of: information published by other users, emotion-test information, recommended articles and the like. The information uploaded by the user may include, for example, emotion-test information, recommended articles and the like. The information published by the user may include, for example, at least one of: personal dynamic information shared with other users, comments on information posted by other users, information submitted when replying to other users' comments, and the like.
For example, when a user takes an emotion test through the social application, first interaction information may be generated in response to the user selecting an option for each test question, the first interaction information including the answer the user selected. First interaction information may also be generated in response to the user sharing an online activity in the social application with other users; accordingly, the first interaction information may include the activity profile of the shared online activity, and so on.
According to an embodiment of the present disclosure, when the first interaction information is generated, it can be parsed and the emotion words in it extracted. The emotional state of the user is then determined from those emotion words. For example, if an emotion word expresses a positive, optimistic meaning, the emotional state of the user may be determined to be a pleasant state; if it expresses a negative, pessimistic meaning, the emotional state may be determined to be a depressed state.
According to embodiments of the present disclosure, the degree of the user's emotional state may also be determined from the meaning the emotion words express. For example, if the emotion words include words such as "happy" or "like", the emotional state may be determined to be a first-level pleasant state, and if they include words such as "pleased" or "free", a second-level pleasant state, the first level being higher than the second. It will be appreciated that the above level determination and classification rules for emotional states are merely examples to facilitate understanding of the present disclosure, which is not limited thereto.
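To make the two preceding paragraphs concrete, the following is a minimal Python sketch of such a lexicon-based determination; the word lists, scores, thresholds and function names are illustrative assumptions, not values taken from the patent:

```python
# A minimal sketch of the lexicon-based determination: extract known emotion
# words, score them, and map the score to a state and level. The word lists,
# scores and thresholds are illustrative assumptions, not the patent's own.

POSITIVE_WORDS = {"happy": 2, "like": 1, "pleased": 1, "free": 1}
NEGATIVE_WORDS = {"sad": -2, "tired": -1, "lonely": -2}

def extract_emotion_words(text):
    """Return the known emotion words appearing in the interaction text."""
    return [t for t in text.lower().split()
            if t in POSITIVE_WORDS or t in NEGATIVE_WORDS]

def classify_emotional_state(text):
    score = sum(POSITIVE_WORDS.get(w, NEGATIVE_WORDS.get(w, 0))
                for w in extract_emotion_words(text))
    if score >= 2:
        return "pleasant (level 1)"  # strongly positive words dominate
    if score > 0:
        return "pleasant (level 2)"
    if score == 0:
        return "neutral"
    return "depressed"

print(classify_emotional_state("so happy and free today"))  # pleasant (level 1)
```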
According to embodiments of the present disclosure, a predetermined classification model may also be employed to process the first interaction information, and the emotional state of the user determined from the classification result. The predetermined classification model may be, for example, an emotion classification model built on a semantic processing model, so that semantic understanding of the interaction information is achieved and the accuracy of the determined emotional state is improved.
In operation S230, time series information of the emotional state of the user is updated according to the emotional state and the generation time of the first interaction information.
According to embodiments of the present disclosure, the timing information may be information of a user's emotional state that varies with time. In this embodiment, the emotional state may be digitized, the digitized emotional state may be a value on the ordinate, the generation time may be a value on the abscissa, and the value at the first interaction information generation time may be added to the existing time series information, thereby obtaining updated time series information.
According to embodiments of the present disclosure, when no timing information exists yet, a graph with the digitized emotional state on the ordinate and time on the abscissa may first be constructed. The updated timing information is obtained by adding the value of the emotional state determined in operation S210 and the generation moment of the first interaction information to the graph as a coordinate pair.
According to embodiments of the present disclosure, the emotional state may be digitized according to a mapping between emotional states and numerical values representing emotion. Alternatively, the digitization may be performed according to a mapping between the level of the emotional state and a numerical value representing emotion.
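A minimal sketch of how operation S230 could maintain such a series, assuming the hypothetical state-to-value mapping below (the patent does not fix any particular mapping):

```python
# Illustrative sketch of the update in operation S230: the emotional state
# is digitized through an assumed state-to-value mapping and appended to
# the user's series, keyed by the generation moment of the interaction
# information (time on the abscissa, digitized state on the ordinate).

from datetime import datetime

STATE_VALUES = {"pleasant (level 1)": 2, "pleasant (level 2)": 1,
                "neutral": 0, "depressed": -1}  # assumed mapping

def update_timing_info(series, state, generated_at):
    """Append one (time, digitized state) point and keep the series sorted."""
    series.append((generated_at, STATE_VALUES[state]))
    series.sort(key=lambda point: point[0])

series = []
update_timing_info(series, "pleasant (level 1)", datetime.now())
```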
According to embodiments of the present disclosure, the social application may present the updated timing information in response to a user operation, so that the user learns of changes in emotional state in real time. For example, the social application may be provided with an operation control for displaying the timing information, and the updated timing information is displayed in response to the user selecting that control.
According to embodiments of the present disclosure, the user's emotional state is determined from the interaction information and the timing-information diagram of the emotional state is updated in real time, so that the user's emotional state can be recorded while the user socializes through the social application. Compared with the related art, the user does not need to actively select or upload emotional-state information, which improves, to a certain extent, the accuracy of the recorded emotional states.
FIG. 3 schematically illustrates a diagram of a method of determining an emotional state according to an embodiment of the disclosure.
According to embodiments of the present disclosure, an emotion value may be assigned to the user for use in determining the emotional state. The emotion value may be adjusted based on interaction information associated with the user that the social application generates. For example, this embodiment may obtain the emotion value assigned to the user in response to detecting the generation of interaction information associated with the user, and then adjust the emotion value based on that interaction information: the emotion value is increased when the interaction information conveys positive energy or positive emotion, and decreased when it conveys negative energy or negative emotion. In this way, the user can be motivated, to a certain extent, to improve his or her personal emotional state, facilitating the spread of positive information in social applications.
According to embodiments of the present disclosure, the interaction information associated with the user may include, for example, the first interaction information described above. In an embodiment, it may also include second interaction information generated in response to other users' operations on the first interaction information, for example in response to other users browsing information shared by the user, or commenting on, replying to, or liking information the user published. Adjusting the emotion value according to the second interaction information can, to a certain extent, encourage the user to share positive information and reduce how often the user shares negative information.
For example, a predetermined initial emotion value may be assigned to the user when the user first uses the social application. The emotion value may be raised upon detecting other users commenting on, liking and/or rewarding positive information posted by the user, and lowered when the user publishes negative information through the social application or uses a psychological counseling service.
According to an embodiment of the disclosure, when the emotion value is to be reduced, part of it may, for example, be withheld as a deposit: if other users' comments on the information confirm that negative energy or negative information was indeed spread, the withheld part is then deducted from the user's emotion value. In this way, unreasonable adjustments of the emotion value caused by limited information-recognition accuracy can be avoided.
According to embodiments of the present disclosure, the adjustment amount of the emotion value may be consulted when determining the user's emotional state, thereby improving the accuracy of the determined state. For example, in the embodiment 300 shown in FIG. 3, in response to detecting the first interaction information 301, the adjustment amount 302 of the emotion value may first be determined based on the first interaction information 301. After the adjustment amount 302 is determined, the emotion value 303 is adjusted, and at the same time the emotional state 304 of the user at the moment the first interaction information was generated may be determined from the adjustment amount 302.
Illustratively, the first interaction information may include information published by the user, such as posted dynamic information or comments. After obtaining the first interaction information, this embodiment may determine the category of the published information using the predetermined classification model described above, and determine an adjustment amount according to that category in order to adjust the emotion value. The predetermined classification model may be built, for example, on a Long Short-Term Memory (LSTM) network model, or may be a naive Bayesian classifier or the like. In an embodiment, the predetermined classification model may analyze the published information, perform word segmentation and word-sense analysis on it, and output an emotion tag for each word. When determining the adjustment amount, the emotion tags can be tallied: the number of words carrying each kind of tag in the published information is counted, a weighted sum of the counts is computed using weights assigned to words carrying each kind of tag, and the weighted sum is used as the adjustment amount. The embodiment may maintain in advance a mapping between adjustment amounts and emotional states, and once the adjustment amount is obtained, determine from this mapping the emotional state of the user at the moment the first interaction information was generated.
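A compact sketch of the tag-counting and weighted-sum computation just described, with assumed tag names, weights and a sign-based mapping standing in for the patent's unspecified mapping:

```python
# Sketch of the adjustment-amount computation described above: count the
# words carrying each emotion tag, take a weighted sum of the counts as
# the adjustment, and map the adjustment to an emotional state. Tag names,
# weights and the sign-based mapping are illustrative assumptions.

from collections import Counter

TAG_WEIGHTS = {"positive": 1.0, "neutral": 0.0, "negative": -1.0}

def adjustment_amount(word_tags):
    """word_tags: one emotion tag per word, as output by the classifier."""
    counts = Counter(word_tags)
    return sum(TAG_WEIGHTS.get(tag, 0.0) * n for tag, n in counts.items())

def state_from_adjustment(delta):
    """Assumed mapping between the adjustment amount and an emotional state."""
    if delta > 0:
        return "pleasant"
    if delta < 0:
        return "depressed"
    return "neutral"

delta = adjustment_amount(["positive", "positive", "neutral", "negative"])
print(delta, state_from_adjustment(delta))  # 1.0 pleasant
```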
FIG. 4 schematically illustrates a diagram of a method of determining an emotional state according to another embodiment of the disclosure.
According to embodiments of the present disclosure, while the time series information of the emotional state is maintained, the user's future emotional state may be predicted from it, and the information recommended to the user via the social application may be determined according to the prediction. In this way, positive, uplifting information can be recommended when emotional depression is predicted, thereby adjusting the user's emotional state; when an emotional uplift is predicted, information is recommended according to the user's personal preferences.
Illustratively, as shown in FIG. 4, after the updated time series information 410 is obtained by the method described above, the embodiment 400 may analyze the trend of the emotional state represented by the updated time series information 410 and determine the predicted emotional state 420 of the user from the analysis result, since a user's emotional state generally persists for some time before changing. For example, if the user's recent emotional state is determined from the updated timing information 410 to be positive, the predicted emotional state 420 may be determined to be the positive state 421; if it is determined to be negative, the predicted emotional state 420 may be determined to be the negative state 422. It will be appreciated that this method of determining the predicted emotional state is merely an example to facilitate understanding of the disclosure, which is not limited thereto. In an embodiment, a deep learning model such as a back-propagation neural network model may also be employed to predict the user's emotional state.
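As a rough illustration of such trend analysis, the stand-in below averages the most recent digitized-state values and reads off the sign; the window size is an assumption, and it does not implement the back-propagation model mentioned above:

```python
# A deliberately simple stand-in for the trend analysis described above:
# average the most recent digitized-state values and read the sign as the
# predicted state. The window size is an assumption; the back-propagation
# model mentioned in the text is not implemented here.

def predict_emotional_state(series, window=5):
    """series: chronological (time, digitized state) pairs."""
    recent = [value for _, value in series[-window:]]
    if not recent:
        return "unknown"
    mean = sum(recent) / len(recent)
    return "positive" if mean >= 0 else "negative"

print(predict_emotional_state([(1, 2), (2, 1), (3, -1)]))  # positive
```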
For example, if the predicted emotional state is determined to be negative, positive information 430 may be obtained as recommendation information to push to the user when the user uses the social application, improving the user's emotional state to a certain extent and relieving the stress the user feels from various sources. For example, when the user uses the social application, the user may be presented with a page 440 as shown in FIG. 4 containing several pieces of recommendation information, which may include a positive-energy article such as "Congestion with world warmth", a fairy tale, and/or a short joke, etc.
FIG. 5 schematically illustrates changes to a presentation page during execution of the method of determining an emotional state.
According to embodiments of the present disclosure, while the timing information of the emotional state is maintained, it can be associated with the first interaction information from which the emotional state at each moment was determined, so that when browsing the timing information the user can quickly jump to the corresponding first interaction information and conveniently review what gave rise to the emotional state at a historical moment.
Illustratively, in the embodiment 500 shown in FIG. 5, after the user's emotional state is determined in response to detecting the generation of the first interaction information, a link between the timing information and the interactive text may be established according to the association between the emotional state and the interactive text included in the first interaction information. Once the link is established, page 510 in FIG. 5, which displays the updated timing information, may be presented in response to the user operating the control for displaying timing information. The interactive text is text the user browsed, published and/or shared. In the timing information, the displayed entry for each moment is linked to the first interaction information generated at that moment, so that in response to an operation on the entry for a given moment, the interactive text of the first interaction information generated at that moment can be presented. For example, in response to the user selecting the entry for the latest moment, page 510 may jump to page 520, which shows the interactive text "Ran into my idol, so excited" of the first interaction information generated at that moment.
FIG. 6 schematically illustrates changes to a presentation page during information input.
According to embodiments of the present disclosure, while the user inputs information through the social application, the input entered so far can be obtained in response to the input operation. Real-time semantic analysis is then performed on it to determine its emotion value, which is displayed to the user. In this way the user learns, conveniently and in time, what emotion the input conveys, which encourages the user to publish positive information and improves its spread.
For example, a predetermined semantic analysis model may be employed to determine the emotion value of the information entered so far. The predetermined semantic analysis model may, for example, use dependency syntax to extract the emotion words in the input and the relation words that stand in specified dependency relations to them. With the emotion words and relation words in hand, the emotion value of the input is computed from the preset emotion values of the emotion words and the relation weight of each type of relation word.
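A sketch of this computation, assuming the dependency parse has already produced (emotion word, relation) pairs; all word values, relation weights and the [-1, 1] range are illustrative assumptions:

```python
# Sketch of the emotion-value computation for in-progress input. It assumes
# a dependency parse has already produced (emotion word, relation type)
# pairs; the preset word values, relation weights and the [-1, 1] range are
# illustrative assumptions, not values given by the patent.

EMOTION_VALUES = {"excited": 0.8, "happy": 0.6, "sad": -0.6}
RELATION_WEIGHTS = {"degree_adverb": 1.5,  # e.g. an intensifier like "very"
                    "negation": -1.0,      # e.g. "not"
                    "none": 1.0}

def input_emotion_value(pairs):
    """pairs: (emotion word, dependency relation of its modifier)."""
    total = sum(EMOTION_VALUES.get(word, 0.0) * RELATION_WEIGHTS.get(rel, 1.0)
                for word, rel in pairs)
    return max(-1.0, min(1.0, total))  # clamp to the assumed range

print(input_emotion_value([("excited", "degree_adverb")]))  # 1.0
```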
Illustratively, in the embodiment 600 shown in FIG. 6, in response to the user's input operation on page 610, the input information "Ran into my idol" 611 may be acquired, and its emotion value 612 may be presented at any location around it. It can be understood that the range of the emotion value can be set according to actual requirements; for example, it may lie in any range such as [-10, 10] or [-1, 1].
According to embodiments of the present disclosure, while the user inputs information through the social application, images matching the input can, for example, be recommended to the user based on that input, so that when the user wants to attach an image, images reflecting the emotion the input expresses are easy to find, improving the convenience of input and the appeal of the shared information.
For example, a predetermined keyword extraction model may be employed to extract target keywords in the inputted information. The target keyword may be an emotion word, a verb capable of representing emotion, or the like. The embodiment may maintain a keyword lexicon on which the predetermined keyword extraction model is constructed. The predetermined keyword extraction model may perform word segmentation processing on the input information, compare the words obtained by the word segmentation with the keyword lexicon, and use the words belonging to the keyword lexicon as target keywords.
For example, after the target keyword is obtained, image information matching it may be retrieved from an image library, where each piece of image information carries a label characterizing the emotion it expresses. The matching image information is determined by matching the target keyword against these labels: whether a label matches can be decided from the similarity between the target keyword and the label, the image information being considered a match when the similarity exceeds a preset threshold. Once obtained, the image information may be presented at any location in page 610, as shown in FIG. 6.
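A sketch of the keyword-to-image matching just described; the lexicon, the character-level Jaccard similarity and the threshold are placeholders for whatever the system actually uses:

```python
# Sketch of the keyword-to-image matching step: segment the input, keep the
# words found in a keyword lexicon, then compare them with image labels via
# a similarity score and threshold. The character-level Jaccard similarity
# here is a placeholder for whatever measure the system actually uses.

KEYWORD_LEXICON = {"excited", "idol", "happy", "sad"}

def extract_keywords(text):
    return {t for t in text.lower().split() if t in KEYWORD_LEXICON}

def char_jaccard(a, b):
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def match_images(text, image_labels, threshold=0.5):
    """Return ids of images whose label is similar enough to some keyword."""
    keywords = extract_keywords(text)
    return [img for img, label in image_labels.items()
            if any(char_jaccard(k, label) > threshold for k in keywords)]

print(match_images("ran into my idol so excited",
                   {"img1.png": "excited", "img2.png": "calm"}))  # ['img1.png']
```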
For example, when the image information includes a plurality of images, the number that can be displayed at one time in page 610 may be determined from the size of the display area and the size of a single image. If the image information contains more images than can be shown at once, the images whose labels are most similar to the target keyword may be displayed first. Page 610 may also switch the currently displayed images in response to a sliding operation on the display area; for example, sliding from right to left gradually reveals the images not yet displayed. In an embodiment, the display effect of the image information is indicated by 613 in FIG. 6.
According to embodiments of the present disclosure, the social application may also insert image information into the input in response to the user selecting any one of the images. For example, when the user selects the left image illustrated in FIG. 6, page 610 may switch to page 620, where the input information 621 includes the selected image in addition to the aforementioned text "Ran into my idol".
For example, if the image information contains more images than can be displayed at once, the selected image may be removed from the displayed set and a not-yet-displayed image shown in its place, switching the display effect from the one indicated by 613 in FIG. 6 to the one indicated by 623.
FIG. 7 schematically illustrates a diagram of a method of determining an emotional state according to yet another embodiment of the disclosure.
According to embodiments of the present disclosure, the social application can maintain a plurality of information partitions, each corresponding to one predetermined storage partition, so that generated interaction information is stored and displayed by category. When the user publishes information, the social application can analyze it, determine its category, and recommend to the user the information partition corresponding to that category. The predetermined storage partition corresponding to each information partition may be used to store published information of one or more related types.
Illustratively, in the embodiment 700 shown in FIG. 7, after the information 710 published by the user is obtained (e.g., in response to the user selecting a "publish" control), the category of the published information may be determined using the predetermined classification model 720 described earlier. For example, the predetermined classification model may analyze the published information, perform word segmentation and word-sense analysis on it, output an emotion tag for each word, and take the most frequent of those emotion tags as the category of the published information.
After the category of the published information is obtained, the storage partition that stores information of that category is taken as the partition matching the category (i.e., the target storage partition 740). Each storage partition is indexed, for example, by the category of the information it stores; this embodiment may compare category 730 against the index of each storage partition and take the partition indexed by category 730 as the target storage partition.
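A sketch of this index lookup, using the partition names from the FIG. 7 example; the per-word emotion tagging is stubbed out and the index itself is an assumption:

```python
# Sketch of target-partition selection: the most frequent emotion tag over
# the words of the published text is taken as its category and looked up
# in an index mapping categories to partitions. Partition names follow the
# example in FIG. 7; the per-word tagging itself is stubbed out here.

from collections import Counter

PARTITION_INDEX = {"excited": "excited partition",
                   "calm": "calm partition",
                   "sad": "wounded partition"}  # category -> partition

def category_of(word_tags):
    """Majority emotion tag over the words of the published text."""
    return Counter(word_tags).most_common(1)[0][0]

def target_partition(word_tags):
    return PARTITION_INDEX.get(category_of(word_tags))

print(target_partition(["excited", "excited", "calm"]))  # excited partition
```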
According to an embodiment of the present disclosure, when the user finishes entering information and selects the "publish" control illustrated in page 750 of FIG. 7, the maintained information partitions may be presented to the user as selection information for the plurality of predetermined storage partitions, with the selection information for the target storage partition (i.e., the information partition corresponding to it) highlighted, making it easy for the user to choose the matching partition for information storage. The selection information for the target storage partition may be highlighted, for example, by adding a shadow effect, a zoom-in effect, a ghost effect, or the like. In an embodiment, as shown in FIG. 7, when the information published by the user is "Ran into my idol, so excited", the target storage partition may be the one corresponding to the "excited partition", and the maintained information partitions may include, for example, an "excited partition", a "calm partition" and a "wounded partition". After the selection information is presented, the input information may be stored in the predetermined storage partition targeted by whichever selection information the user selects; for example, if the user selects the "excited partition", the published information is stored in the corresponding storage partition.
According to embodiments of the present disclosure, when the selection the user makes differs from the selection information for the target storage partition, and in particular when the target storage partition corresponds to the "wounded partition" but the user selects the "excited partition", the emotion value assigned to the user can be deducted, so that publishing negative information in a partition meant for spreading positive information incurs a corresponding penalty or warning, reducing the spread of negative information in positive-information partitions.
Any of the interaction information described above may be generated, for example, in response to the storing of information published by the user. When adjusting the emotion value based on such interaction information, it may be determined whether the storage partition in which the published information was stored is the target storage partition; if not, the emotion value assigned to the user may be reduced.
In an embodiment, if the target storage partition corresponds to an information partition that spreads negative information but the user selects a storage partition corresponding to a partition that spreads positive information, a larger deduction may be applied. If the target storage partition corresponds to a positive-information partition (e.g., the excited partition) but the user selects a negative-information partition (e.g., the wounded partition), a smaller deduction may be applied. In this way the accuracy of partitioned storage can be ensured while the spread of positive information is improved.
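A sketch of this asymmetric deduction, with assumed point values:

```python
# Sketch of the asymmetric deduction described above: storing a negative
# post in a positive-information partition costs more than the reverse.
# The point values and the default are illustrative assumptions.

PENALTIES = {("negative", "positive"): 10,  # negative post into positive partition
             ("positive", "negative"): 3}   # positive post into negative partition

def emotion_penalty(target_kind, chosen_kind):
    """No deduction when the user stores the post in its target partition."""
    if target_kind == chosen_kind:
        return 0
    return PENALTIES.get((target_kind, chosen_kind), 5)

print(emotion_penalty("negative", "positive"))  # 10
```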
In an embodiment, when the user selects a storage partition other than the target storage partition, a prompt may, for example, pop up warning that a certain number of emotion points will be deducted if the user confirms the choice, thereby nudging the user to store the information in the target storage partition.
FIG. 8 schematically illustrates a presentation interface of a client application adapted to implement the method of determining an emotional state.
According to embodiments of the present disclosure, the social application may also provide emotion-test information to the user so that the user learns of his or her mental-health status in time. The first interaction information described above may also be generated in response to the user's selection operation on predetermined emotion test information; accordingly, the first interaction information includes the option information the user selected. For example, in the embodiment 800 shown in FIG. 8, if the emotion test information includes the test question presented on page 810, "I feel the best time of day is?", the selectable options may include "morning", "noon", "afternoon" and "evening". If the user selects "morning", the first interaction information may include "I feel that the best time of day is morning".
According to embodiments of the present disclosure, after the user completes the selection operations of the emotion test, a test result for the predetermined emotion test information may be obtained, and a virtual badge may, for example, be given to the user according to the test result, making the emotion test more engaging.
For example, the social application maintains a plurality of predetermined virtual badges for each emotion test; these may include a "sun" pattern, a "white cloud" pattern, a "black cloud" pattern, a "raindrop" pattern and the like, corresponding respectively to the test results "sunlight", "sunk", "light sadness" and "heavy sadness". It will be appreciated that these test results and virtual badges are merely examples to facilitate understanding of the present disclosure, which is not limited thereto; the social application may also maintain predetermined virtual badges for only some of the emotion tests, as actual needs dictate.
According to embodiments of the present disclosure, after a virtual badge is given to the user, the user may wear it, for example on the personal home page 820, to show the badges he or she owns to other users, increasing the appeal of the social application. Granting virtual badges in this way may, to a certain extent, motivate the user to earn more positive-energy badges and thereby to improve personal mood. In the embodiment shown in FIG. 8, the worn virtual badge 821 may be displayed, for example, under the personal user name "XXXXX".
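For completeness, a sketch of the result-to-badge lookup using the pairs named above; the lookup structure is an illustrative assumption:

```python
# Sketch of matching a test result to a predetermined virtual badge, using
# the result/badge pairs named in the text; the lookup itself is an
# illustrative assumption.

BADGES = {"sunlight": "sun", "sunk": "white cloud",
          "light sadness": "black cloud", "heavy sadness": "raindrop"}

def badge_for(test_result):
    return BADGES.get(test_result)  # None when no badge is maintained

print(badge_for("sunlight"))  # sun
```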
Based on the method of determining an emotional state described above, the present disclosure also provides an apparatus for determining an emotional state. The device will be described in detail below in connection with fig. 9.
FIG. 9 is a block diagram of an apparatus for determining an emotional state according to an embodiment of the present disclosure.
As shown in fig. 9, the apparatus 900 for determining an emotional state according to this embodiment may include an emotional state determining module 910 and a timing information updating module 930.
The emotional state determining module 910 is configured to determine, in response to detecting the generation of first interaction information, an emotional state of the user at the moment the first interaction information was generated according to the first interaction information. In an embodiment, the emotional state determining module 910 may be configured to perform operation S210 described above, which will not be repeated here.
The time sequence information updating module 930 is configured to update the time sequence information of the user's emotional state according to the emotional state and the generation moment of the first interaction information. In an embodiment, the time sequence information updating module 930 may be configured to perform operation S230 described above, which will not be repeated here.
The apparatus 900 for determining an emotional state may further include, for example, an emotion value acquisition module and an emotion value adjustment module. The emotion value acquisition module is used for acquiring the emotion value assigned to the user in response to detecting the generation of either the first interaction information or the second interaction information. The emotion value adjustment module is used for adjusting the emotion value based on that interaction information. The second interaction information is generated in response to other users' operations on the first interaction information, while the first interaction information is generated in response to the user's own operations.
According to an embodiment of the present disclosure, the first interaction information includes information published by the user. The emotion value adjustment module may include an information classification sub-module and an adjustment sub-module. The information classification submodule is used for determining the category of information issued by the user by adopting a preset classification model. The adjusting submodule is used for adjusting the emotion value according to the category of the information issued by the user.
The apparatus 900 for determining an emotional state may further include, for example, an emotion prediction module and a recommendation information acquisition module. The emotion prediction module is used for determining a predicted emotional state of the user according to the updated time sequence information. The recommendation information acquisition module is used for acquiring positive information as recommendation information when the predicted emotional state is a negative state.
According to an embodiment of the present disclosure, the first interaction information includes interactive text, and the apparatus 900 for determining an emotional state may further include, for example, a link establishment module and an information presentation module. The link establishment module is used for establishing a link between the timing information and the interactive text according to the temporal association between the emotional state and the interactive text. The information presentation module is used for presenting the interactive text in response to an operation on target information in the timing information, the target information being the entry in the timing information for the moment the first interaction information was generated.
The apparatus 900 for determining an emotional state may further include, for example, an input information obtaining module, an emotion value determining module, and an information presentation module according to embodiments of the present disclosure. The input information obtaining module is used for responding to the input operation and obtaining input information. The emotion value determining module is used for determining emotion values of the input information by adopting a preset semantic analysis model. The information display module is used for displaying the emotion value on a display page of the input information.
The apparatus 900 for determining an emotional state may further include a keyword extraction module and an information insertion module, for example, according to an embodiment of the present disclosure. The keyword extraction module is used for extracting target keywords in the input information by adopting a preset keyword extraction model. The information display module is also used for displaying the image information matched with the target keywords on a display page of the input information. The information inserting module is used for inserting the image information into the input information in response to the selected operation of the image information.
The apparatus 900 for determining an emotional state may further include, for example, a storage partition determination module, an information display module and an information storage module. The storage partition determining module is used for determining, in response to the user publishing information, the storage partition among a plurality of predetermined storage partitions that matches the category of the published information, as the target storage partition. The information display module is used for displaying selection information for the plurality of predetermined storage partitions and highlighting the selection information for the target storage partition. The information storage module is used for storing the published information into the predetermined storage partition targeted by whichever selection information the user selects.
According to an embodiment of the present disclosure, the emotion value adjustment module is specifically configured to reduce the emotion value when a predetermined storage partition in which information issued by a user is stored is a predetermined storage partition other than the target storage partition.
The apparatus 900 for determining an emotional state described above may further include, for example, a test result determination module, a virtual badge determination module, and a virtual badge assignment module according to embodiments of the present disclosure. The test result determining module is used for determining a test result aiming at the preset emotion test information in response to the selection operation of the preset emotion test information. The virtual badge determination module is to determine a virtual badge of the plurality of predetermined virtual badges that matches the test result. The virtual badge allocation module is used for giving the user a virtual badge matched with the test result.
According to an embodiment of the present disclosure, the interaction information includes at least one of: interaction information generated in response to publishing information; interaction information generated in response to a selection operation on predetermined emotion test information; interaction information generated in response to a sharing operation on displayed information; and interaction information generated in response to an operation on an online activity of the interactive application.
According to embodiments of the present disclosure, the present disclosure also provides an electronic device, a readable storage medium and a computer program product.
Fig. 10 illustrates a schematic block diagram of an electronic device 1000 that may be used to implement the method of determining an emotional state according to embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in Fig. 10, the device 1000 includes a computing unit 1001 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 1002 or a computer program loaded from a storage unit 1008 into a Random Access Memory (RAM) 1003. The RAM 1003 can also store various programs and data required for the operation of the device 1000. The computing unit 1001, the ROM 1002, and the RAM 1003 are connected to one another by a bus 1004. An input/output (I/O) interface 1005 is also connected to the bus 1004.
Various components in the device 1000 are connected to the I/O interface 1005, including: an input unit 1006, such as a keyboard or a mouse; an output unit 1007, such as various types of displays and speakers; a storage unit 1008, such as a magnetic disk or an optical disk; and a communication unit 1009, such as a network card, a modem, or a wireless communication transceiver. The communication unit 1009 allows the device 1000 to exchange information/data with other devices via a computer network, such as the Internet, and/or various telecommunication networks.
The computing unit 1001 may be any of various general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the computing unit 1001 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, or microcontroller. The computing unit 1001 performs the methods and processes described above, for example, the method of determining an emotional state. For example, in some embodiments, the method of determining an emotional state may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 1008. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 1000 via the ROM 1002 and/or the communication unit 1009. When the computer program is loaded into the RAM 1003 and executed by the computing unit 1001, one or more steps of the method of determining an emotional state described above may be performed. Alternatively, in other embodiments, the computing unit 1001 may be configured to perform the method of determining an emotional state by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various implementations may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out the methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowcharts and/or block diagrams to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
It should be appreciated that steps may be reordered, added, or deleted using the various forms of flow shown above. For example, the steps recited in the present disclosure may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solutions of the present disclosure can be achieved; no limitation is imposed herein.
The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.

Claims (10)

1. A method of determining an emotional state, comprising:
in response to detecting generation of first interaction information, determining, according to the first interaction information, an emotional state of a user at the moment the first interaction information is generated;
updating time sequence information of the emotional state of the user according to the emotional state and the generation moment of the first interaction information;
in response to obtaining information published by the user, determining, from among a plurality of predetermined storage partitions, a storage partition matching the category of the information published by the user as a target storage partition;
displaying selection information for the plurality of predetermined storage partitions, and highlighting the selection information for the target storage partition;
in response to a selection operation on any one piece of the selection information for the plurality of predetermined storage partitions, storing the information published by the user into the predetermined storage partition targeted by that piece of selection information;
in response to detecting generation of either the first interaction information or second interaction information, obtaining an emotion value given to the user;
determining the category of the information published by the user using a predetermined classification model; and
adjusting the emotion value according to the category of the information published by the user;
wherein the first interaction information is generated in response to an operation by the user and comprises the information published by the user; the second interaction information is generated in response to operations by other users on the first interaction information; and either interaction information includes information generated in response to storing the information published by the user;
wherein said adjusting the emotion value comprises: reducing the emotion value in a case where the predetermined storage partition in which the information published by the user is stored is a predetermined storage partition other than the target storage partition.
2. The method of claim 1, further comprising:
determining a predicted emotional state of the user according to the updated time sequence information; and
obtaining positive information as recommendation information in a case where the predicted emotional state is a negative state.
3. The method of claim 1, wherein the first interaction information comprises interaction text, the method further comprising:
establishing a link between the time sequence information and the interaction text according to the temporal association between the emotional state and the interaction text; and
presenting the interaction text in response to an operation on target information in the time sequence information,
wherein the target information is the information in the time sequence information corresponding to the moment at which the first interaction information was generated.
4. The method of claim 1, further comprising:
obtaining input information in response to an input operation;
determining an emotion value of the input information using a predetermined semantic analysis model; and
displaying the emotion value on a display page of the input information.
5. The method of claim 4, further comprising:
extracting target keywords from the input information using a predetermined keyword extraction model;
displaying image information matching the target keywords on the display page of the input information; and
inserting the image information into the input information in response to a selection operation on the image information.
6. The method of claim 1, further comprising:
determining a test result for predetermined emotion test information in response to a selection operation on the predetermined emotion test information;
determining, from among a plurality of predetermined virtual badges, a virtual badge matching the test result; and
awarding the user the virtual badge matching the test result.
7. The method of any of claims 1-6, wherein the first interaction information comprises at least one of:
interaction information generated in response to publication of information;
interaction information generated in response to a selection operation on predetermined emotion test information;
interaction information generated in response to a sharing operation on presented information;
interaction information generated in response to an operation on an online activity of a social application.
8. An apparatus for determining an emotional state, comprising:
an emotional state determining module configured to determine, in response to detecting generation of first interaction information, an emotional state of a user at the moment the first interaction information is generated, according to the first interaction information;
a time sequence information updating module configured to update time sequence information of the emotional state of the user according to the emotional state and the generation moment of the first interaction information;
a target storage partition determining module configured to determine, in response to obtaining information published by the user, a storage partition matching the category of the information published by the user from among a plurality of predetermined storage partitions as a target storage partition;
a display module configured to display selection information for the plurality of predetermined storage partitions and to highlight the selection information for the target storage partition;
a storage module configured to store, in response to a selection operation on any one piece of the selection information for the plurality of predetermined storage partitions, the information published by the user into the predetermined storage partition targeted by that piece of selection information;
an acquisition module configured to obtain, in response to detecting generation of either the first interaction information or second interaction information, an emotion value given to the user;
a category determining module configured to determine the category of the information published by the user using a predetermined classification model; and
an adjusting module configured to adjust the emotion value according to the category of the information published by the user;
wherein the first interaction information is generated in response to an operation by the user and comprises the information published by the user; the second interaction information is generated in response to operations by other users on the first interaction information; and either interaction information includes information generated in response to storing the information published by the user;
wherein said adjusting the emotion value comprises: reducing the emotion value in a case where the predetermined storage partition in which the information published by the user is stored is a predetermined storage partition other than the target storage partition.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
10. A non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1-7.
CN202110195018.1A 2021-02-20 2021-02-20 Method, apparatus, device and storage medium for determining emotional state Active CN112906399B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110195018.1A CN112906399B (en) 2021-02-20 2021-02-20 Method, apparatus, device and storage medium for determining emotional state

Publications (2)

Publication Number Publication Date
CN112906399A (en) 2021-06-04
CN112906399B (en) 2023-11-10

Family

ID=76124202

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110195018.1A Active CN112906399B (en) 2021-02-20 2021-02-20 Method, apparatus, device and storage medium for determining emotional state

Country Status (1)

Country Link
CN (1) CN112906399B (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170046496A1 (en) * 2015-08-10 2017-02-16 Social Health Innovations, Inc. Methods for tracking and responding to mental health changes in a user
US10699104B2 (en) * 2018-05-03 2020-06-30 International Business Machines Corporation Image obtaining based on emotional status

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104538043A (en) * 2015-01-16 2015-04-22 北京邮电大学 Real-time emotion reminder for call
CN109977100A (en) * 2016-05-24 2019-07-05 甘肃百合物联科技信息有限公司 A method of enhancing positive psychology state
CN106730234A (en) * 2017-01-11 2017-05-31 上海北辰软件股份有限公司 A kind of intelligent mood persuasion system
KR20180121069A (en) * 2017-04-28 2018-11-07 이화여자대학교 산학협력단 Music content providing method and music content creation method for managing ptsd(post-traumatic stress disorder)
US10817316B1 (en) * 2017-10-30 2020-10-27 Wells Fargo Bank, N.A. Virtual assistant mood tracking and adaptive responses
CN109308466A (en) * 2018-09-18 2019-02-05 宁波众鑫网络科技股份有限公司 The method that a kind of pair of interactive language carries out Emotion identification
CN109446378A (en) * 2018-11-08 2019-03-08 北京奇艺世纪科技有限公司 Information recommendation method, Sentiment orientation determine method and device and electronic equipment
CN109885713A (en) * 2019-01-03 2019-06-14 刘伯涵 Facial expression image recommended method and device based on voice mood identification
CN110519617A (en) * 2019-07-18 2019-11-29 平安科技(深圳)有限公司 Video comments processing method, device, computer equipment and storage medium
CN110674300A (en) * 2019-09-30 2020-01-10 京东城市(北京)数字科技有限公司 Method and apparatus for generating information
CN110717542A (en) * 2019-10-12 2020-01-21 广东电网有限责任公司 Emotion recognition method, device and equipment
CN112233698A (en) * 2020-10-09 2021-01-15 中国平安人寿保险股份有限公司 Character emotion recognition method and device, terminal device and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on Social Emotion Based on Weibo Data; 余吉飞, 凌巍, 王威, 田彩丽, 文玉, 梁美燕; Computer Programming Skills & Maintenance (06); 66-68 *
Research on Health Information Dissemination Behavior in WeChat Moments; 金晓玲, 冯慧慧, 周中允; Management Science (01); 77-86 *

Also Published As

Publication number Publication date
CN112906399A (en) 2021-06-04

Similar Documents

Publication Publication Date Title
US10534635B2 (en) Personal digital assistant
CN109522483B (en) Method and device for pushing information
WO2022141861A1 (en) Emotion classification method and apparatus, electronic device, and storage medium
WO2020253503A1 (en) Talent portrait generation method, apparatus and device, and storage medium
CN113590776B (en) Knowledge graph-based text processing method and device, electronic equipment and medium
CN113722438B (en) Sentence vector generation method and device based on sentence vector model and computer equipment
CN114265979A (en) Method for determining fusion parameters, information recommendation method and model training method
CN112330455A (en) Method, device, equipment and storage medium for pushing information
US10909323B2 (en) Automatic generation of scientific article metadata
CN111400613A (en) Article recommendation method, device, medium and computer equipment
CN114036398A (en) Content recommendation and ranking model training method, device, equipment and storage medium
CN112560461A (en) News clue generation method and device, electronic equipment and storage medium
CN112926308B (en) Method, device, equipment, storage medium and program product for matching text
CN113392920B (en) Method, apparatus, device, medium, and program product for generating cheating prediction model
KR102438679B1 (en) Operating method of server for providing media marketing service
CN117057855A (en) Data processing method and related device
EP4307136A1 (en) Sorting method and apparatus for search results, and electronic device and storage medium
CN112906399B (en) Method, apparatus, device and storage medium for determining emotional state
CN113239273B (en) Method, apparatus, device and storage medium for generating text
CN115248890A (en) User interest portrait generation method and device, electronic equipment and storage medium
CN113221534B (en) Text emotion analysis method and device, electronic equipment and storage medium
US11907508B1 (en) Content analytics as part of content creation
KR102547098B1 (en) Systems and methods to support overseas direct purchase services
CN108628861B (en) Method and device for pushing information
CN117313677A (en) Comment generation method and device, electronic equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant