CN115658908A - Five-personality perception method and system based on conversation interaction process - Google Patents
- Publication number
- CN115658908A CN115658908A CN202211700508.3A CN202211700508A CN115658908A CN 115658908 A CN115658908 A CN 115658908A CN 202211700508 A CN202211700508 A CN 202211700508A CN 115658908 A CN115658908 A CN 115658908A
- Authority
- CN
- China
- Prior art keywords
- dialogue
- personality
- data
- user
- speaker
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Landscapes
- Machine Translation (AREA)
Abstract
The invention discloses a five-personality perception method and system based on a dialogue interaction process. It relates to intelligent dialogue technology and addresses the prior-art problem that personality analysis depends on long-form (chapter-level) content. The system comprises a user dialogue data acquisition module; a user dialogue data processing module, which separately represents the dialogue content and the dialogue characters in the user dialogue data and processes the data into computable vectors; a five-personality analysis module, which performs personality-vector updating and natural language understanding on the processed data, the two jointly forming the input data for new dialogue generation; and a dialogue generation module. A dialogue-based five-personality analysis model is provided, which substantially improves analysis accuracy at the dialogue level compared with traditional five-personality analysis models. Character personality features are used to optimize the personality expression of the dialogue without degrading dialogue quality.
Description
Technical Field
The invention relates to intelligent dialogue technology, and in particular to a five-personality perception method and system based on a dialogue interaction process.
Background
Depending on the application, man-machine dialogue systems fall roughly into two categories: task-oriented systems and non-task-oriented systems. Task-oriented systems mainly help users complete specific tasks, such as finding products or booking accommodations and restaurants. Non-task-oriented systems mainly provide reasonable responses and entertainment during interaction with humans.
In recent years, automatic personality recognition has attracted increasing attention in the field of affective computing. Human personality has an enormous impact on everyday life, relating to people's preferences and choices. Personality analysis models therefore have important practical application value and perform well in fields such as recommendation systems and job selection. In psychology, the five-personality (Big Five) theory has, as a classic personality theory, exerted great influence in academia and in applications for half a century. According to the five-personality classification criteria, personality traits can be divided into five categories: extraversion (Extraversion), agreeableness (Agreeableness), conscientiousness (Conscientiousness), neuroticism (Neuroticism) and openness (Openness).
Prior-art methods focus on the five-personality perception task at the chapter (document) level, and far less on five-personality perception within a conversation. Because chapter-level text contains more, and more complete, information, models built at that level perform poorly when migrated to the dialogue interaction process. Moreover, chapter-level text is more literary and far less colloquial than man-machine dialogue, so the five-personality perception task in dialogue interaction is more difficult.
Therefore, it is highly necessary to develop a five-personality perception method and system based on the dialogue interaction process, which first fills the gap of five-personality perception at the dialogue level and second meets users' requirements for intelligent man-machine systems.
Disclosure of Invention
The invention aims to provide a five-personality perception method and system based on the dialogue interaction process, so as to solve the problems existing in the prior art.
The invention discloses a five personality perception system based on a conversation interaction process, which comprises:
the user dialogue data acquisition module, used for acquiring user dialogue data;
the user dialogue data processing module, used for separately representing the dialogue content and the dialogue characters in the user dialogue data, temporarily storing them on the server, and processing the user dialogue data into computable vector representations;
the five-personality analysis module, used for performing personality-vector updating and natural language understanding on the processed user dialogue data, the two jointly forming the input data for new dialogue generation;
and the dialogue generation module, used for outputting a targeted dialogue reply according to the new dialogue-generation input data.
The dialogue content is characterized as U = {u_1, u_2, ..., u_{m+n}}; the dialogue characters are characterized as P = {p_1, p_2, ..., p_{m+n}}, where m represents the number of dialogue turns of the first speaker and n the number of dialogue turns of the second speaker.
The user dialogue data processing module processes the data into calculable sentence vectors by means of word embedding.
The dialogue content vector is characterized as U = {u_{i,j}}, where each utterance u_{i,j} = (w_1, w_2, ..., w_{k_j}), i denotes the speaker index, j denotes the sentence index of that speaker, and k_j the number of words in the corresponding sentence. The dialogue character vector is characterized as P = {S_1, S_2}, where S_1 represents the word vector of the first speaker token and S_2 the word vector of the second speaker token.
The five-personality perception method based on the dialogue interaction process performs five-personality perception using the above system.
The five-personality perception method and system based on the dialogue interaction process have the following advantages. First, a dialogue-based five-personality analysis model is provided, which substantially improves analysis accuracy at the dialogue level compared with traditional five-personality analysis models, and can also play an important role in scenarios outside man-machine dialogue systems, such as product recommendation and customer service. Second, character personality features are used to optimize the personality expression of the dialogue without degrading dialogue quality.
Drawings
Fig. 1 is a schematic flow chart of a five personality perception method based on a dialogue interaction process in the present invention.
Detailed Description
As shown in fig. 1, the five-personality perception method based on the conversation interaction process according to the present invention is to perform five-personality perception analysis by using the system. The five personality perception system based on the conversation interaction process comprises a user conversation data acquisition module, a user conversation data processing module, a five personality analysis module and a conversation generation module. The specific working steps are as follows:
step 1, a user dialogue data acquisition module constructs, inquires and modifies a relational database through a flash framework.
Step 2, the user dialogue data processing module initiates a request to the user dialogue data acquisition module to acquire user dialogue data. The dialogue content is characterized as U = {u_1, ..., u_{m+n}} and the dialogue characters as P = {p_1, ..., p_{m+n}}, where m represents the number of dialogue turns of the first speaker and n that of the second speaker. The requested data are temporarily stored on the server side.
Step 3, the user dialogue data processing module processes the data stored on the server side into calculable sentence vectors by means of word embedding. The processed dialogue content vector is characterized as U = {u_{i,j}}, where u_{i,j} = (w_1, ..., w_{k_j}), i denotes the speaker index, j the sentence index, and k_j the number of words in the corresponding sentence. The processed dialogue character vector is characterized as P = {S_1, S_2}, where S_1 represents the word vector of the first speaker token and S_2 that of the second speaker token.
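The word-embedding step can be illustrated with a minimal NumPy sketch. A toy vocabulary and a random embedding table stand in for the trained embedding model, which the patent does not specify; mean pooling over the k_j word vectors of an utterance is one simple way to form a sentence vector.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy embedding table (assumption): in the described system these vectors
# would come from a trained word-embedding model.
vocab = {"hi": 0, "how": 1, "are": 2, "you": 3, "fine": 4, "thanks": 5}
E = rng.normal(size=(len(vocab), 8))  # 8-dimensional word vectors

def sentence_vector(words):
    # u_{i,j}: mean of the k_j word vectors of one utterance.
    idx = [vocab[w] for w in words if w in vocab]
    return E[idx].mean(axis=0)

sent_vecs = [
    sentence_vector(["hi", "how", "are", "you"]),
    sentence_vector(["fine", "thanks"]),
]
```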
Step 4, the processed user dialogue data are input into the five-personality analysis module, which updates the characters' personality vectors and performs natural language understanding; together these form the personalized input of the dialogue generation module.
Step 5, the dialogue generation module outputs a targeted reply according to the personalized input.
Specifically, the step 2 includes the following substeps:
Step 21, after the user dialogue data acquisition module receives the request, the user's dialogue data are obtained and stored in a DataFrame structure, redundant information is cleaned, and the raw data are obtained in the form (utterance, speaker, person_for_bigfive), where utterance represents the dialogue content, speaker the corresponding speaker, and person_for_bigfive the target of five-personality classification.
Step 22, the raw data are extracted and sorted: the utterance column is represented as U and the speaker column as P. U and P are in one-to-one correspondence.
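The column extraction of step 22 can be sketched as follows; the sample records are hypothetical, and plain dictionaries stand in for the DataFrame structure.

```python
# Raw records as fetched by the acquisition module (hypothetical sample).
raw = [
    {"utterance": "Hi, how are you?", "speaker": "A", "person_for_bigfive": "A"},
    {"utterance": "Fine, thanks.",    "speaker": "B", "person_for_bigfive": "A"},
]

# Column-wise extraction: U holds the utterances and P the matching
# speakers, kept in one-to-one correspondence by index.
U = [r["utterance"] for r in raw]
P = [r["speaker"] for r in raw]
assert len(U) == len(P)
```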
Specifically, the step 3 includes the following substeps:
Step 32, word vector coding is performed on the word-segmented data: each word is mapped to its word vector, and the final input sample U is obtained.
Step 33, person vector coding is performed on P. The speaker identified as person_for_bigfive is marked as [speaker2], and the other speaker is treated as [speaker1]. P = {S_1, S_2} is obtained, where S_1 represents the coding corresponding to [speaker1] and S_2 the coding corresponding to [speaker2]. This operation ensures that the personality obtained in the subsequent five-personality analysis is that of the target speaker.
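The speaker tagging of step 33 amounts to a simple relabeling, sketched below; the exact token strings are illustrative.

```python
def speaker_tokens(speakers, target):
    # The five-personality target speaker is tagged [speaker2];
    # every other speaker is tagged [speaker1].
    return ["[speaker2]" if s == target else "[speaker1]" for s in speakers]

# Three turns, with "A" as the person_for_bigfive target.
tags = speaker_tokens(["A", "B", "A"], target="A")
```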
Specifically, the step 4 includes the following substeps:
Step 41, the five-personality analysis module is formed by integrating a five-personality classification model and a natural language understanding model.
Step 42, a dialogue-personality encoder (DP Decoder, hereinafter abbreviated DPD) and a character personality updater (PF Machine, hereinafter abbreviated PFM) are defined in the five-personality classification model (BF Model, hereinafter abbreviated BFM). The dialogue-personality encoder adopts a multi-head attention mechanism (Multi-Head Attention) to focus on features with high dialogue-personality correlation, and applies layer normalization to the residual connection of these features. The formula is as follows:

LN(Q_MHA) = (Q_MHA - μ_L) / sqrt(σ_L² + ε)

where Q_MHA is the output vector of the sample after the multi-head attention mechanism, μ_L represents the mean of Q_MHA, σ_L² represents the variance of Q_MHA, and ε is a small constant for numerical stability. The five-personality label of the character is obtained through the softmax function:

softmax(z_i) = exp(z_i) / Σ_{c=1..C} exp(z_c)

where z_i is the output value of the i-th node and C is the number of output nodes. The loss function for training is defined as the cross-entropy loss:

L = - Σ_{i=1..C} y_i · log(ŷ_i)

where y_i is the true label of class i and ŷ_i the predicted probability of class i.
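The attention, layer-normalization, softmax and cross-entropy steps above can be sketched in NumPy. A single scaled dot-product attention head stands in for the multi-head mechanism, and all weights are random stand-ins for trained parameters; the feature dimensions are arbitrary choices for illustration.

```python
import numpy as np

def attention(Q, K, V):
    # Scaled dot-product attention (single head, simplifying the
    # multi-head mechanism named in the text).
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ V

def layer_norm(x, eps=1e-5):
    # LN(x) = (x - mu_L) / sqrt(sigma_L^2 + eps), per feature row.
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def cross_entropy(probs, label):
    # Cross-entropy loss for a one-hot true label.
    return -np.log(probs[label])

rng = np.random.default_rng(1)
X = rng.normal(size=(4, 16))            # 4 dialogue features, 16-dim
Q_MHA = attention(X, X, X)              # attention output
H = layer_norm(X + Q_MHA)               # residual connection + LayerNorm
W_out = rng.normal(size=(16, 5))        # random stand-in classifier weights
probs = softmax(H.mean(axis=0) @ W_out) # five personality classes
loss = cross_entropy(probs, label=2)
```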
the character personality updater is constructed by a GRU network, and the input of the character personality updater is a current character personality vector output by the dialogue-personality encoderAnd the hidden state ht-1 transmitted by the last node in the character personality updater, and the output of the hidden state ht is the updated character personality vector F and the hidden state ht. The updated personality vector of the character is。
Step 43, the natural language understanding model (NLU Model, abbreviated NLUM) adopts the BERT (Bidirectional Encoder Representations from Transformers) pre-training model. Its training objective allows the influence of the preceding text on the following text and that of the following text on the preceding text to be considered simultaneously, so the resulting dialogue feature vector carries better contextual features and contains the deep semantic information of the dialogue.
Step 44, the data processed by the five-personality classification model and the natural language understanding model are output to form the input of the dialogue generation module.
It will be apparent to those skilled in the art that various other changes and modifications may be made in the above-described embodiments and concepts and all such changes and modifications are intended to be within the scope of the appended claims.
Claims (5)
1. A five-personality perception system based on a dialogue interaction process, comprising:
the user dialogue data acquisition module, used for acquiring user dialogue data;
the user dialogue data processing module, used for separately representing the dialogue content and the dialogue characters in the user dialogue data, temporarily storing them on the server, and processing the user dialogue data into computable vector representations;
the five-personality analysis module, used for performing personality-vector updating and natural language understanding on the processed user dialogue data, the two jointly forming the input data for new dialogue generation;
and the dialogue generation module, used for outputting a targeted dialogue reply according to the new dialogue-generation input data.
3. The system according to claim 1, wherein the user dialogue data processing module is configured to process the user dialogue data into a calculable sentence vector by word embedding.
4. The system of claim 3, wherein the dialogue content vector is characterized as U = {u_{i,j}}, where u_{i,j} = (w_1, ..., w_{k_j}), i denotes the speaker index, j denotes the sentence index, and k_j the number of words in the corresponding sentence; and the dialogue character vector is characterized as P = {S_1, S_2}, where S_1 represents the word vector of the first speaker token and S_2 the word vector of the second speaker token.
5. A method for five personality perception based on conversational interaction process, characterized in that five personality perception is performed using the system according to any one of claims 1-4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211700508.3A CN115658908B (en) | 2022-12-29 | 2022-12-29 | Five-personality perception method and system based on conversation interaction process |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115658908A true CN115658908A (en) | 2023-01-31 |
CN115658908B CN115658908B (en) | 2023-04-11 |
Family
ID=85023156
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211700508.3A Active CN115658908B (en) | 2022-12-29 | 2022-12-29 | Five-personality perception method and system based on conversation interaction process |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115658908B (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6731307B1 (en) * | 2000-10-30 | 2004-05-04 | Koninklijke Philips Electronics N.V. | User interface/entertainment device that simulates personal interaction and responds to user's mental state and/or personality |
CN107545028A (en) * | 2017-07-17 | 2018-01-05 | 宁波市智能制造产业研究院 | A kind of data processing method, device and electronic equipment |
CN108846073A (en) * | 2018-06-08 | 2018-11-20 | 青岛里奥机器人技术有限公司 | A kind of man-machine emotion conversational system of personalization |
US20190294693A1 (en) * | 2018-03-20 | 2019-09-26 | International Business Machines Corporation | Recommendation technique using automatic conversation |
CN111460143A (en) * | 2020-03-11 | 2020-07-28 | 华南理工大学 | Emotion recognition model of multi-person conversation system |
CN112380231A (en) * | 2020-11-13 | 2021-02-19 | 四川大学 | Training robot system and method with depressive disorder characteristics |
CN113505208A (en) * | 2021-07-09 | 2021-10-15 | 福州大学 | Intelligent dialogue system integrating multi-path attention mechanism |
CN115294988A (en) * | 2022-07-20 | 2022-11-04 | 北方民族大学 | Voice interaction system and method for collaboration |
CN115329779A (en) * | 2022-08-10 | 2022-11-11 | 天津大学 | Multi-person conversation emotion recognition method |
Non-Patent Citations (1)
Title |
---|
XU Hui et al.: "Personalized dialogue generation incorporating emotional information" * |
Also Published As
Publication number | Publication date |
---|---|
CN115658908B (en) | 2023-04-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111966800B (en) | Emotion dialogue generation method and device and emotion dialogue model training method and device | |
CN113836277A (en) | Machine learning system for digital assistant | |
CN111161726B (en) | Intelligent voice interaction method, device, medium and system | |
CN113987179A (en) | Knowledge enhancement and backtracking loss-based conversational emotion recognition network model, construction method, electronic device and storage medium | |
CN112100350B (en) | Open domain dialogue method for intensifying reply personalized expression | |
CN113065344A (en) | Cross-corpus emotion recognition method based on transfer learning and attention mechanism | |
CN114911932A (en) | Heterogeneous graph structure multi-conversation person emotion analysis method based on theme semantic enhancement | |
CN113360610A (en) | Dialog generation method and system based on Transformer model | |
CN111339772B (en) | Russian text emotion analysis method, electronic device and storage medium | |
JP2024505076A (en) | Generate diverse, natural-looking text-to-speech samples | |
CN115062627A (en) | Method and apparatus for computer-aided uniform system based on artificial intelligence | |
CN114708474A (en) | Image semantic understanding algorithm fusing local and global features | |
CN111368066B (en) | Method, apparatus and computer readable storage medium for obtaining dialogue abstract | |
CN112905776B (en) | Emotional dialogue model construction method, emotional dialogue system and method | |
Zhao et al. | Tibetan Multi-Dialect Speech and Dialect Identity Recognition. | |
CN113486143A (en) | User portrait generation method based on multi-level text representation and model fusion | |
CN117349427A (en) | Artificial intelligence multi-mode content generation system for public opinion event coping | |
CN117251562A (en) | Text abstract generation method based on fact consistency enhancement | |
CN112349294A (en) | Voice processing method and device, computer readable medium and electronic equipment | |
CN116303966A (en) | Dialogue behavior recognition system based on prompt learning | |
CN115658908B (en) | Five-personality perception method and system based on conversation interaction process | |
CN115858756A (en) | Shared emotion man-machine conversation system based on perception emotional tendency | |
CN114898779A (en) | Multi-mode fused speech emotion recognition method and system | |
CN114974310A (en) | Emotion recognition method and device based on artificial intelligence, computer equipment and medium | |
CN114912020A (en) | Multi-sub-target dialogue recommendation method based on user preference graph |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||