CN111326222A - Emotion analysis method, device and system based on user diary - Google Patents
- Publication number: CN111326222A
- Application number: CN202010071623.3A
- Authority: CN (China)
- Prior art keywords: user, diary, emotion, emotional state, data
- Prior art date: 2020-01-21
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/20—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for electronic clinical trials or questionnaires
- G16H10/60—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for calculating health indices; for individual health risk assessment
Abstract
The embodiments of the invention provide an emotion analysis method, device and system based on a user diary, which can analyze and record a user's emotional state and issue a warning that the user may have an emotional problem, without revealing the content of the user's diary. The method comprises the following steps: the user terminal uploads first diary data and second diary data to a server, where the first diary data includes a first emotional state filled in by the user and the second diary data does not include a first emotional state filled in by the user; the terminal receives a second emotional state fed back by the server according to the second diary data; a user emotion record is generated according to the first emotional state and the second emotional state; and warning information is issued when it is detected in the user emotion record that the number of consecutive days of low mood exceeds a first preset value or that the emotion change amplitude exceeds a second preset value.
Description
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a method, a device and a system for emotion analysis based on a user diary.
Background
With the popularity of electronic devices, people increasingly prefer diary applications (apps) on mobile devices over paper notebook diaries. A diary app has the advantages of convenient storage, editing and classification, and can set background pictures, cover images, fonts, music and the like for a diary or diary book, making the diaries rich and colorful. These advantages have attracted many users to diary apps.
The functionality typically provided by existing diary apps includes, but is not limited to: 1. creating a diary or diary book; 2. editing a diary or diary book; 3. setting pictures, fonts and music for a diary or diary book; 4. synchronizing diaries or diary books among different devices, so that data is backed up in the cloud and not lost; 5. providing users with a variety of vivid and interesting diary templates; 6. automatically inserting weather information for the user; 7. a privacy lock, i.e. a lock function similar to that of a paper notebook.
However, developers of diary apps have overlooked that users typically write diaries to record their daily experiences and true moods; that is, the diary content reflects the user's emotional state. Existing diary apps cannot recognize the emotional state expressed in a diary and feed it back to the user, and therefore cannot help the user manage emotions.
Disclosure of Invention
To this end, the present invention provides a method, apparatus and system for emotion analysis based on a user's diary in an attempt to solve or at least alleviate at least one of the problems presented above.
According to an aspect of an embodiment of the present invention, there is provided a method for emotion analysis based on a user diary, including:
the user terminal uploads the first diary data and the second diary data to the server; the first diary data includes a first emotional state filled in by the user; the second diary data does not include the first emotional state filled in by the user;
receiving a second emotional state fed back by the server according to the second diary data;
generating a user emotion record according to the first emotion state and the second emotion state;
and issuing warning information when it is detected in the user emotion record that the number of consecutive days of low mood exceeds a first preset value or that the emotion change amplitude exceeds a second preset value.
Optionally, the method further comprises:
plotting the generated user emotion record as a chart.
Optionally, the first emotional state or the second emotional state comprises one of at least three predefined emotional states.
Optionally, issuing the warning information comprises:
sending the warning information to a contact preset by the user.
According to another aspect of the embodiments of the present invention, there is provided a method for emotion analysis based on a user diary, including:
the server acquires diary data uploaded by the user terminal;
analyzing a second emotional state corresponding to the diary content contained in the diary data according to a pre-trained model;
if it is detected that the diary data contains a first emotional state filled in by the user and that the first emotional state is inconsistent with the second emotional state, inputting the diary content and the first emotional state into the model for training;
and if the diary data is detected not to contain the first emotional state filled by the user, returning the second emotional state to the user terminal.
Optionally, the first emotional state or the second emotional state comprises one of at least three predefined emotional states.
Optionally, analyzing a second emotional state corresponding to the diary content included in the diary data according to a pre-trained model, including:
extracting keywords in the diary content according to a pre-trained model, and grading the keywords;
integrating scores of the plurality of keywords and determining an emotion value corresponding to the diary content;
and determining a second emotional state according to the emotional value and the corresponding relation between the predetermined emotional value and the emotional state.
According to still another aspect of the present invention, there is provided a user terminal device including:
the diary data uploading unit is used for uploading the first diary data and the second diary data to the server; the first diary data includes a first emotional state filled in by the user; the second diary data does not include the first emotional state filled in by the user;
a feedback receiving unit, configured to receive a second emotional state fed back by the server according to the second diary data;
the recording unit is used for generating a user emotion record according to the first emotion state and the second emotion state;
and the warning unit is used for issuing warning information when it is detected in the user emotion record that the number of consecutive days of low mood exceeds a first preset value or that the emotion change amplitude exceeds a second preset value.
Optionally, the user terminal device further includes:
a drawing unit, used for plotting the generated user emotion record as a chart.
Optionally, the first emotional state or the second emotional state comprises one of at least three predefined emotional states.
Optionally, the warning unit is specifically configured to:
send the warning information to a contact preset by the user.
According to still another aspect of an embodiment of the present invention, there is provided a server apparatus including:
the diary data receiving unit is used for acquiring diary data uploaded by the user terminal;
the analysis unit is used for analyzing a second emotional state corresponding to the diary content contained in the diary data according to the pre-trained model;
the first response unit is used for inputting the diary content and the first emotional state into the model for training if it is detected that the diary data contains the first emotional state filled in by the user and that the first emotional state is inconsistent with the second emotional state;
and the second response unit is used for returning the second emotional state to the user terminal if the diary data is detected not to contain the first emotional state filled by the user.
Optionally, the first emotional state or the second emotional state comprises one of at least three predefined emotional states.
Optionally, the analysis unit is specifically configured to:
extracting keywords in the diary content according to a pre-trained model, and grading the keywords;
integrating scores of the plurality of keywords and determining an emotion value corresponding to the diary content;
and determining a second emotional state according to the emotional value and the corresponding relation between the predetermined emotional value and the emotional state.
According to still another aspect of an embodiment of the present invention, there is provided an emotion analysis system based on a user diary, including:
the user terminal equipment is used for uploading the first diary data and the second diary data to the server, the first diary data including a first emotional state filled in by the user and the second diary data not including a first emotional state filled in by the user; receiving a second emotional state fed back by the server according to the second diary data; generating a user emotion record according to the first emotional state and the second emotional state; and issuing warning information when it is detected in the user emotion record that the number of consecutive days of low mood exceeds a first preset value or that the emotion change amplitude exceeds a second preset value;
the server equipment is used for acquiring diary data uploaded by the user terminal; analyzing a second emotional state corresponding to the diary content contained in the diary data according to a pre-trained model; if the diary data is detected to contain a first emotion state filled by the user and the first emotion state is inconsistent with a second emotion state, the diary content and the first emotion state are input into the model for training; and if the diary data is detected not to contain the first emotional state filled by the user, returning the second emotional state to the user terminal.
According to the embodiments of the invention, the user terminal uploads first diary data and second diary data to the server, where the first diary data includes a first emotional state filled in by the user and the second diary data does not; the terminal receives a second emotional state fed back by the server according to the second diary data, generates a user emotion record according to the first and second emotional states, and issues warning information when the number of consecutive days of low mood in the user emotion record exceeds a first preset value or the emotion change amplitude exceeds a second preset value. The user's emotions are thus continuously analyzed and recorded; when a problem with the user's emotions is detected, the warning can be issued without leaking the diary content, and the server can continuously optimize the model with the data uploaded by the user, improving the model's accuracy.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the invention and together with the description serve to explain the principles of the invention.
FIG. 1 is a block diagram of a computing device according to an embodiment of the invention;
FIG. 2 is a flow diagram of a method of sentiment analysis based on a user diary according to an embodiment of the present invention;
FIG. 3 is yet another flow diagram of a method of emotion analysis based on a user's diary, in accordance with an embodiment of the present invention;
FIG. 4 is a flow diagram of a method of sentiment analysis based on a user diary, in accordance with a specific embodiment of the present invention;
fig. 5 is a block diagram of a structure of a user terminal device according to an embodiment of the present invention;
fig. 6 is a block diagram of a server apparatus according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
Fig. 1 is a block diagram of an example computing device 100 arranged to implement a method of emotion analysis based on a user diary in accordance with the present invention. In a basic configuration 102, computing device 100 typically includes system memory 106 and one or more processors 104. A memory bus 108 may be used for communication between the processor 104 and the system memory 106.
Depending on the desired configuration, the processor 104 may be any type of processor. The processor 104 may include one or more levels of cache, such as a level one cache 110 and a level two cache 112, a processor core 114, and registers 116. The example processor core 114 may include an arithmetic logic unit (ALU), a floating point unit (FPU), a digital signal processing core (DSP core), or any combination thereof.
Depending on the desired configuration, system memory 106 may be any type of memory, including but not limited to: volatile memory (such as RAM), non-volatile memory (such as ROM, flash memory, etc.), or any combination thereof. System memory 106 may include an operating system 120, one or more programs 122, and program data 124. In some implementations, the program 122 can be configured to execute instructions on an operating system by one or more processors 104 using program data 124.
Computing device 100 may also include an interface bus 140 that facilitates communication from various interface devices (e.g., output devices 142, peripheral interfaces 144, and communication devices 146) to the basic configuration 102 via the bus/interface controller 130. The example output device 142 includes a graphics processing unit 148 and an audio processing unit 150. They may be configured to facilitate communication with various external devices, such as a display terminal or speakers, via one or more a/V ports 152. Example peripheral interfaces 144 may include a serial interface controller 154 and a parallel interface controller 156, which may be configured to facilitate communication with external devices such as input devices (e.g., keyboard, mouse, pen, voice input device, touch input device) or other peripherals (e.g., printer, scanner, etc.) via one or more I/O ports 158. An example communication device 146 may include a network controller 160, which may be arranged to facilitate communications with one or more other computing devices 162 over a network communication link via one or more communication ports 164.
A network communication link may be one example of a communication medium. Communication media may typically be embodied by computer readable instructions, data structures or program modules in a modulated data signal, such as a carrier wave or other transport mechanism, and may include any information delivery media. A "modulated data signal" may be a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of non-limiting example, communication media may include wired media such as a wired network or direct-wired connection, and various wireless media such as acoustic, radio frequency (RF), microwave, infrared (IR) or other wireless media. The term computer readable media as used herein may include both storage media and communication media.
Specifically, the computing device 100 may be implemented as a personal computer including desktop and notebook configurations, as a tower or rack server device, or as an industrial personal computer device.
Computing device 100 may also be implemented as part of a small-form-factor portable (or mobile) electronic device, such as a cellular telephone, a personal digital assistant (PDA), a personal media player device, a wireless web-watch device, a personal headset device, an application-specific device, or a hybrid device that includes any of the above functions.
Wherein the one or more programs 122 of computing device 100 include instructions for performing a method of emotion analysis based on a user diary according to an embodiment of the present invention.
Referring to fig. 2, a method 200 for emotion analysis based on a user diary provided by the embodiment of the present invention starts at step S210, and includes:
s210, the user terminal uploads the first diary data and the second diary data to a server; the first journal data includes a first emotional state filled in by the user; the second diary data does not include the first emotional state filled in by the user.
Since a diary app does not generally force the user to manually select or fill in an emotional state, the user terminal may or may not acquire the first emotional state; in particular, the first emotional state cannot be acquired for a user's old electronic diaries or paper diaries. Therefore, the diary data uploaded by the user terminal may or may not include a first emotional state filled in by the user.
And S220, receiving a second emotion state fed back by the server according to the second diary data.
The server directly returns the first emotion state if the user uploads the first emotion state, and returns a second emotion state obtained by analyzing the diary text by a preset model if the user does not upload the first emotion state.
Optionally, the user terminal receives a first emotional state fed back by the server according to the first diary data, and receives a second emotional state fed back by the server according to the second diary data. And the user terminal uniformly records according to the emotional state data returned by the server.
Optionally, the user terminal receives only the second emotional state fed back by the server according to the second diary data; after the user selects or fills in the first emotional state by himself, the user terminal directly records the first emotional state selected or filled in by the user locally.
And S230, generating a user emotion record according to the first emotion state and the second emotion state.
Optionally, the generated user emotion record is plotted as a chart, so that the user can conveniently observe the daily changes of the emotional state.
S240, issuing warning information when it is detected in the user emotion record that the number of consecutive days of low mood exceeds a first preset value or that the emotion change amplitude exceeds a second preset value.
Optionally, the first emotional state or the second emotional state comprises one of at least three predefined emotional states. Which of the at least three emotional states count as low moods is predefined.
Further, the first emotional state or the second emotional state may comprise one of seven emotional states, or one of ten emotional states; the greater the number of predefined emotional states, the more accurate the results of the emotion analysis.
Optionally, an emotional state may be a specific emotion value, or a predefined special character corresponding to an interval of emotion values.
Optionally, issuing the warning information comprises: sending the warning information to a contact preset by the user.
Optionally, the first preset value may be 3 days, or 5 days.
Optionally, the second preset value may be an emotional state change of 3 or more levels.
For example, suppose there are 5 predefined emotional states: sad, depressed, calm, happy and excited, whose corresponding emotion value intervals are [0,20), [20,40), [40,60), [60,80) and [80,100]. The user can select the emotional state by himself when writing a diary, and the sad and depressed states are set as low moods. When the user does not select an emotional state, the server analyzes the user's emotional state from the diary text; for example, if the emotion value obtained from text analysis is 62, the emotional state is happy. When the user has been in a sad or depressed state for 3 consecutive days, or the user's emotion has jumped across 3 or more levels (for example, from depressed to excited) at least 3 times, the user is considered to have an emotional problem and to need help, and corresponding warning information is issued.
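As an illustration of this example, the following minimal Python sketch maps an emotion value to one of the five states and checks the two warning conditions (3 consecutive low-mood days, or at least 3 day-to-day swings spanning 3 or more levels). The state names, function names and thresholds are illustrative, taken from the example above rather than from any claimed implementation.

```python
STATES = ["sad", "depressed", "calm", "happy", "excited"]   # index = level 0..4
LOW_STATES = {"sad", "depressed"}                            # states predefined as low moods

def value_to_state(score: float) -> str:
    """Map an emotion value in [0, 100] to one of the five states."""
    upper_bounds = [20, 40, 60, 80, 101]                     # upper bound of each interval
    for level, upper in enumerate(upper_bounds):
        if score < upper:
            return STATES[level]
    return STATES[-1]

def should_alert(daily_states: list[str],
                 low_days_threshold: int = 3,
                 swing_levels: int = 3,
                 swing_count_threshold: int = 3) -> bool:
    """daily_states holds one recorded state per day, oldest first."""
    # Condition 1: low mood for `low_days_threshold` consecutive days.
    run = 0
    for state in daily_states:
        run = run + 1 if state in LOW_STATES else 0
        if run >= low_days_threshold:
            return True
    # Condition 2: at least `swing_count_threshold` day-to-day jumps of
    # `swing_levels` or more levels (e.g. from depressed to excited).
    levels = [STATES.index(s) for s in daily_states]
    swings = sum(1 for a, b in zip(levels, levels[1:]) if abs(b - a) >= swing_levels)
    return swings >= swing_count_threshold

# Example: value 62 falls in [60, 80) and therefore maps to "happy".
assert value_to_state(62) == "happy"
```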
Referring to fig. 3, a method 300 for emotion analysis based on a user diary according to an embodiment of the present invention starts at step S310, and includes:
s310, the server obtains the diary data uploaded by the user terminal.
In step S310, the user terminal uploads diary data to the server. The uploaded diary data includes the user's diary text and may also include an emotional state that the user selected in the app; the emotional state selected by the user is defined as the first emotional state. Alternatively, the user may not have selected an emotional state, in which case the terminal uploads only the diary text.
And S320, analyzing a second emotional state corresponding to the diary content contained in the diary data according to the pre-trained model.
In step S320, the model is a pre-trained artificial intelligence model; trained on a large amount of data, it can derive an emotion value from a given text and then map that value to an emotional state, each emotional state corresponding to an interval of emotion values.
S330, if it is detected that the diary data contains a first emotional state filled in by the user and that the first emotional state is inconsistent with the second emotional state, inputting the diary content and the first emotional state into the model for training.
The user terminal preferentially records the first emotional state selected by the user rather than the second emotional state analyzed by the model. Meanwhile, if the first emotional state is found to be inconsistent with the second emotional state, the model is not yet accurate enough and needs to be adapted to the individual user, so the model is given reinforcement training according to the first emotional state uploaded by the user.
Optionally, if it is detected that the diary data includes a first emotional state filled in by the user, the first emotional state is returned to the user terminal, and the user terminal records the emotional state data returned by the server as the final emotional state data.
Optionally, if it is detected that the diary data includes the first emotional state filled in by the user, the first emotional state is not returned to the user terminal, and the diary content and the first emotional state are input into the model for training only when the first emotional state is found to be inconsistent with the second emotional state. The server does not need to return the first emotional state to the user terminal, since the user terminal already has a local record of the first emotional state.
And S340, if the diary data is detected not to contain the first emotional state filled by the user, returning the second emotional state to the user terminal.
When the user terminal uploads only the diary text, the server analyzes the emotional state corresponding to the user's diary text according to the pre-trained model and feeds the emotional state back to the user terminal, so that the user's emotional state is always recorded promptly and accurately and the diary app can respond in time when the user's emotions may be at risk.
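A minimal server-side sketch of this flow (S310 to S340) is shown below. It assumes an HTTP JSON endpoint built with Flask and a pre-trained model object exposing `predict_score` and `fine_tune` methods; the route, field names and the stub `EmotionModel` class are illustrative assumptions, not part of the disclosed implementation.

```python
from flask import Flask, request, jsonify

class EmotionModel:
    # Stand-in for the pre-trained model (see the training sketch in step 2 below).
    def predict_score(self, text: str) -> int:
        return 60                                   # placeholder: "calm"
    def fine_tune(self, text: str, score: int) -> None:
        pass                                        # placeholder for reinforcement training

app = Flask(__name__)
model = EmotionModel()

@app.route("/diary", methods=["POST"])
def receive_diary():
    data = request.get_json()                       # S310: diary data from the terminal
    text = data["text"]
    client_score = data.get("score_client")         # present only for "first diary data"
    server_score = model.predict_score(text)        # S320: the model analyzes the text

    if client_score is not None:                    # S330: the user filled in a state
        if client_score != server_score:            # inconsistent -> reinforce the model
            model.fine_tune(text, client_score)
        return jsonify({"score": client_score})     # optional variant: echo the user's state
    return jsonify({"score": server_score})         # S340: feed the analyzed state back
```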
Specific examples of the present invention are given below.
As shown in fig. 4, the emotion analysis method based on a user diary according to the embodiment of the present invention includes the following steps:
the first method is as follows: the user can use the diary application to record the diary and can select the mood after writing. The selectable mood seen by the user in the application is different descriptors corresponding to a plurality of different emoticons, the user selects the icons, the icons are changed into emotion numerical value corresponding scores by the application client side and are assigned as ScoreClient, and the application client side obtains texts and the ScoreClient from a diary written by the user and records the texts and the ScoreClient in a database.
In one embodiment, the application client is preset with a number of emoticons, and the emotion score of each emoticon matches the server's settings. For example, the client and the server use six descriptors, excited, delighted, calm, down, sad and angry, corresponding respectively to the scores 100 (excited), 80 (delighted), 60 (calm), 40 (down), 20 (sad) and 0 (angry). After the user taps an emoticon, the application client pops up the descriptor expressing the emotional content of that emoticon, so that the user knows which emotion the selected emoticon represents; this ensures that the emoticon chosen by the user accurately expresses the user's emotion and that the data fed into the model for reinforcement training is accurate.
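For illustration, a client-side sketch of this emoticon-to-score mapping might look as follows; the dictionary keys, the `save_diary` helper and the in-memory list standing in for the client database are illustrative assumptions.

```python
from datetime import date

# Six descriptors and scores from the example above.
EMOTICON_SCORES = {
    "excited": 100, "delighted": 80, "calm": 60,
    "down": 40, "sad": 20, "angry": 0,
}

diary_db: list[dict] = []                 # stand-in for the client's local database

def save_diary(text: str, selected_emoticon: str | None) -> dict:
    """Record the diary text, plus ScoreClient when the user picked an emoticon."""
    entry = {"date": date.today().isoformat(), "text": text}
    if selected_emoticon is not None:
        entry["score_client"] = EMOTICON_SCORES[selected_emoticon]
    diary_db.append(entry)
    return entry
```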
Mode 2: the user's previous paper diaries are scanned, and the characters on the paper pages are converted by optical character recognition (OCR) technology into character strings that can be processed by computer code, i.e. the text content mentioned in Mode 1. This approach does not produce a ScoreClient record of the emotion value.
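A minimal sketch of this scanning path is shown below. It assumes the Pillow and pytesseract packages and an installed Tesseract language pack (here `chi_sim`, on the assumption of a Chinese-language paper diary); note that no ScoreClient is produced.

```python
from PIL import Image
import pytesseract

def scan_paper_diary(image_path: str) -> dict:
    """OCR a scanned paper diary page into plain text; no emotion score is recorded."""
    text = pytesseract.image_to_string(Image.open(image_path), lang="chi_sim")
    return {"text": text}                 # text only -- the server will supply the emotion value
```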
Step 2: a machine learning technique is used on the server to pre-train the model.
The basic training principle is as follows: words are classified in advance into several grades from 0 to 100 (this document takes 6 grades as an example), and each grade is assigned a score, with 0 representing a bad emotion and 100 a good emotion, for example: excited (100) - delighted (80) - calm (60) - down (40) - sad (20) - angry (0). Of course, the vocabulary is not limited to these words, and the more sufficient the pre-training, the more accurate the analysis.
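One possible way to realize such pre-training, sketched under the assumption of a scikit-learn text-regression pipeline rather than the patent's actual model, is shown below. The tiny training set is purely illustrative; a real deployment would pre-train on a large labelled corpus.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Illustrative labelled examples on the 0-100 scale described above.
train_texts = [
    "I won the competition today, I am so excited!",
    "A pleasant walk in the park with friends.",
    "Nothing special happened, just an ordinary day.",
    "Work was frustrating and I felt stuck all afternoon.",
    "I cried after the phone call, everything feels hopeless.",
]
train_scores = [100, 80, 60, 40, 20]

model = make_pipeline(TfidfVectorizer(), Ridge())
model.fit(train_texts, train_scores)

def predict_score(text: str) -> float:
    """Clamp the regression output to the 0-100 emotion scale."""
    return float(min(100.0, max(0.0, model.predict([text])[0])))
```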
Step 3: the client uploads the diary text to the server.
Every time the client finishes saving a diary, the diary data is uploaded to the server (the diary data necessarily includes the text data and may include an emotion value ScoreClient).
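A minimal upload sketch for this step is given below; the endpoint URL and the JSON field names are placeholder assumptions chosen to match the server sketch given earlier.

```python
import requests

SERVER_URL = "https://example.com/api/diary"      # placeholder endpoint

def upload_diary(entry: dict) -> float:
    """entry comes from save_diary() or scan_paper_diary(); returns the emotion value."""
    payload = {"text": entry["text"]}
    if "score_client" in entry:                   # ScoreClient is optional
        payload["score_client"] = entry["score_client"]
    response = requests.post(SERVER_URL, json=payload, timeout=10)
    response.raise_for_status()
    return response.json()["score"]               # ScoreClient echoed back, or ScoreServer
```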
Step 4: the server analyzes the diary text with the trained model and returns an emotion value to the client.
If the client uploaded emotion data, the emotion data uploaded by the client is returned to the client; if the client did not upload emotion data, the model's analysis result is returned to the client. Specifically:
4.1. For a diary, the model screens keywords out of the text and scores them; the keyword scores are then accumulated and averaged into a composite score, which is the emotion value of the diary and is assigned as ScoreServer (see the sketch after step 4).
The score of each keyword is set according to the intensity of the emotion it expresses and needs to be continuously learned and revised during training.
4.2. When the diary has no ScoreClient, ScoreServer is returned to the client.
4.3. When the diary has a ScoreClient, ScoreClient is returned to the client.
4.4. When the diary has a ScoreClient, ScoreServer is compared with ScoreClient; if they are inconsistent, the model is given reinforcement training according to ScoreClient, and the original model is corrected.
The more data the user submits, the more sufficient the training, and the more accurate the model's analysis.
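The following sketch, referenced from step 4.1 above, illustrates one way steps 4.1 to 4.4 could be realized: keyword scores from a lexicon are averaged into ScoreServer, which is then reconciled with ScoreClient. The lexicon entries and the tolerance threshold are illustrative assumptions; the patent only specifies that keyword scores are refined during training.

```python
KEYWORD_LEXICON = {
    "excited": 100, "delighted": 80, "wonderful": 85,
    "calm": 60, "tired": 45, "down": 40,
    "sad": 20, "hopeless": 10, "angry": 0,
}

def compute_score_server(diary_text: str) -> float | None:
    """Step 4.1: average the scores of all recognized keywords into ScoreServer."""
    words = diary_text.lower().split()
    hits = [KEYWORD_LEXICON[w] for w in words if w in KEYWORD_LEXICON]
    if not hits:
        return None                                    # no scored keyword found
    return sum(hits) / len(hits)

def reconcile(score_server: float, score_client: float | None,
              tolerance: float = 10.0) -> tuple[str, float]:
    """Steps 4.2-4.4: decide what to return and whether to retrain."""
    if score_client is None:
        return ("return_server", score_server)         # 4.2: no ScoreClient
    if abs(score_server - score_client) > tolerance:   # 4.4: inconsistent -> reinforce model
        return ("retrain_and_return_client", score_client)
    return ("return_client", score_client)             # 4.3: ScoreClient exists and agrees
```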
Step 5: the client updates the emotion value returned by the server into its database and generates a change curve, so that the user can view the emotion curve of all diaries at any time and thereby, to some extent, understand his or her own emotion changes.
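A minimal plotting sketch for this step, assuming matplotlib is available on the client and using illustrative sample data, could be:

```python
import matplotlib.pyplot as plt

dates = ["01-15", "01-16", "01-17", "01-18", "01-19"]
scores = [80, 60, 40, 20, 20]                    # one emotion value per diary entry

plt.plot(dates, scores, marker="o")
plt.ylim(0, 100)
plt.xlabel("Date")
plt.ylabel("Emotion value")
plt.title("Emotion change curve")
plt.savefig("emotion_curve.png")                 # or display it in the app's UI
```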
Step 6: the user can enable an early-warning function on the client. When the mood has been low for many days or the emotion swings are large, a preset emotion-care guardian is notified, and the guardian can take corresponding measures to comfort or help the user so as to avoid problems such as depression. In this way, the emotion-care guardian can still receive early warnings without reading the user's diary content, and the user can be helped as much as possible without any invasion of privacy.
Referring to fig. 5, according to still another aspect of the present invention, there is provided a user terminal device 500 including:
a diary data uploading unit 510, configured to upload the first diary data and the second diary data to a server; the first diary data includes a first emotional state filled in by the user; the second diary data does not include the first emotional state filled in by the user;
a feedback receiving unit 520, configured to receive a second emotional state fed back by the server according to the second diary data;
a recording unit 530 for generating a user emotion record according to the first emotional state and the second emotional state;
and the warning unit 540 is configured to issue warning information when it is detected in the user emotion record that the number of consecutive days of low mood exceeds a first preset value or that the emotion change amplitude exceeds a second preset value.
Optionally, the user terminal device further includes:
a drawing unit, used for plotting the generated user emotion record as a chart.
Optionally, the first emotional state or the second emotional state comprises one of at least three predefined emotional states, among which the low-mood states are predefined.
Optionally, the warning unit 540 is specifically configured to:
send the warning information to a contact preset by the user.
For specific limitations of the user terminal device 500, see the above limitations of the emotion analysis method 200 based on a user diary, which are not repeated here.
Referring to fig. 6, according to still another aspect of an embodiment of the present invention, there is provided a server apparatus 600 including:
a diary data receiving unit 610, configured to obtain diary data uploaded by a user terminal;
an analyzing unit 620, configured to analyze, according to a pre-trained model, a second emotional state corresponding to the diary content included in the diary data;
a first response unit 630, configured to input the diary content and the first emotional state into the model for training if it is detected that the diary data includes the first emotional state filled in by the user and that the first emotional state is inconsistent with the second emotional state;
and a second response unit 640, configured to return the second emotional state to the user terminal if it is detected that the diary data does not include the first emotional state filled by the user.
Optionally, the first emotional state or the second emotional state comprises one of at least three predefined emotional states.
Optionally, the analysis unit 620 is specifically configured to:
extracting keywords in the diary content according to a pre-trained model, and grading the keywords;
integrating scores of the plurality of keywords and determining an emotion value corresponding to the diary content;
and determining a second emotional state according to the emotional value and the corresponding relation between the predetermined emotional value and the emotional state.
For specific limitations of the server device 600, see the above limitations of the emotion analysis method 300 based on a user diary, which are not repeated here.
According to still another aspect of an embodiment of the present invention, there is provided an emotion analysis system based on a user diary, including:
the user terminal equipment is used for uploading the first diary data and the second diary data to the server, the first diary data including a first emotional state filled in by the user and the second diary data not including a first emotional state filled in by the user; receiving the first emotional state fed back by the server according to the first diary data and the second emotional state fed back by the server according to the second diary data; generating a user emotion record according to the first emotional state and the second emotional state; and issuing warning information when it is detected in the user emotion record that the number of consecutive days of low mood exceeds a first preset value or that the emotion change amplitude exceeds a second preset value;
the server equipment is used for acquiring diary data uploaded by the user terminal; analyzing a second emotional state corresponding to the diary content contained in the diary data according to a pre-trained model; if the diary data is detected to contain a first emotional state filled in by the user, returning the first emotional state to the user terminal and, if the first emotional state is detected to be inconsistent with the second emotional state, inputting the diary content and the first emotional state into the model for training; and if the diary data is detected not to contain a first emotional state filled in by the user, returning the second emotional state to the user terminal.
In conclusion, a diary application designed according to the invention analyzes each diary of the user to obtain an emotion value and marks the user's emotion changes with a curve, so that the user can review his or her emotion changes at any time. When a preset threshold condition is reached, the application alerts the user to a possible emotional problem.
In addition, the application can also notify contacts preset by the user, such as family members, friends and doctors, so that the relevant people can still learn about the user's emotional state and be informed while the diary content is kept secret.
It should be understood that the various techniques described herein may be implemented in connection with hardware or software or, alternatively, with a combination of both. Thus, the methods and apparatus of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Wherein the memory is configured to store program code; the processor is configured to perform the various methods of the present invention according to instructions in the program code stored in the memory.
By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media store information such as computer readable instructions, data structures, program modules or other data. Communication media typically embody computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism, and include any information delivery media. Combinations of any of the above are also included within the scope of computer readable media.
It should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the invention and aiding in the understanding of one or more of the various disclosed aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, disclosed aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or components of the devices in the examples disclosed herein may be arranged in a device as described in this embodiment or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
Furthermore, some of the described embodiments are described herein as a method or combination of method elements that can be performed by a processor of a computer system or by other means of performing the described functions. A processor having the necessary instructions for carrying out the method or method elements thus forms a means for carrying out the method or method elements. Further, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is used to implement the functions performed by the elements for the purposes of this disclosure.
As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.
Claims (10)
1. A method for emotion analysis based on a user diary, comprising:
the user terminal uploads the first diary data and the second diary data to the server; the first diary data includes a first emotional state filled in by a user; the second diary data does not include the first emotional state filled in by the user;
receiving a second emotional state fed back by the server according to the second diary data;
generating a user emotion record according to the first emotion state and the second emotion state;
and issuing warning information when it is detected in the user emotion record that the number of consecutive days of low mood exceeds a first preset value or that the emotion change amplitude exceeds a second preset value.
2. The method of claim 1, further comprising:
plotting the generated user emotion record as a chart.
3. The method of claim 1, wherein the first emotional state or the second emotional state comprises one of at least three predefined emotional states.
4. The method of claim 1, wherein said issuing warning information comprises:
sending the warning information to a contact preset by the user.
5. A method for emotion analysis based on a user diary, comprising:
the server acquires diary data uploaded by the user terminal;
analyzing a second emotional state corresponding to the diary content contained in the diary data according to a pre-trained model;
if the diary data is detected to contain a first emotional state filled by a user and the first emotional state is detected to be inconsistent with the second emotional state, inputting the diary content and the first emotional state into the model for training;
and if the diary data is detected not to contain the first emotional state filled by the user, returning the second emotional state to the user terminal.
6. The method of claim 5, wherein the first emotional state or the second emotional state comprises one of at least three predefined emotional states.
7. The method of claim 5, wherein analyzing the diary data for a second emotional state corresponding to the diary content according to a pre-trained model comprises:
extracting keywords in the diary content according to a pre-trained model, and grading the keywords;
integrating scores of the keywords and determining an emotion value corresponding to the diary content;
and determining the second emotional state according to the emotional value and the corresponding relation between the predetermined emotional value and the emotional state.
8. A user terminal device, comprising:
the diary data uploading unit is used for uploading the first diary data and the second diary data to the server; the first diary data includes a first emotional state filled in by a user; the second diary data does not include the first emotional state filled in by the user;
a feedback receiving unit, configured to receive a second emotional state fed back by the server according to the second diary data;
the recording unit is used for generating a user emotion record according to the first emotion state and the second emotion state;
and the warning unit is used for issuing warning information when it is detected in the user emotion record that the number of consecutive days of low mood exceeds a first preset value or that the emotion change amplitude exceeds a second preset value.
9. A server device, comprising:
the diary data receiving unit is used for acquiring diary data uploaded by the user terminal;
the analysis unit is used for analyzing a second emotional state corresponding to the diary content contained in the diary data according to a pre-trained model;
a first response unit, configured to, if it is detected that the diary data includes a first emotional state filled by a user and it is detected that the first emotional state is inconsistent with the second emotional state, input the diary content and the first emotional state into the model for training;
and the second response unit is used for returning the second emotional state to the user terminal if the diary data is detected not to contain the first emotional state filled by the user.
10. A system for emotion analysis based on a user diary, comprising:
a server device according to claim 8 and a user terminal device according to claim 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010071623.3A CN111326222A (en) | 2020-01-21 | 2020-01-21 | Emotion analysis method, device and system based on user diary |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111326222A (en) | 2020-06-23 |
Family
ID=71171014
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010071623.3A Pending CN111326222A (en) | 2020-01-21 | 2020-01-21 | Emotion analysis method, device and system based on user diary |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111326222A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101000632A (en) * | 2007-01-11 | 2007-07-18 | 上海交通大学 | Blog search and browsing system of intention driven |
CN103565445A (en) * | 2012-08-09 | 2014-02-12 | 英华达(上海)科技有限公司 | Emotion assessment service system and emotion assessment service method |
WO2018043939A1 (en) * | 2016-09-01 | 2018-03-08 | 성기봉 | Personal record management system in which records are automatically classified and stored |
CN107609009A (en) * | 2017-07-26 | 2018-01-19 | 北京大学深圳研究院 | Text emotion analysis method, device, storage medium and computer equipment |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112891705A (en) * | 2021-01-21 | 2021-06-04 | 中国人民解放军东部战区总医院 | Emotion therapeutic instrument based on electronic diary |
CN112766747A (en) * | 2021-01-22 | 2021-05-07 | 清华大学 | Suicide risk detection method based on social network media posting information |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 2020-06-23