CN115658908A - Five-personality perception method and system based on conversation interaction process


Info

Publication number
CN115658908A
Authority
CN
China
Prior art keywords
dialogue
personality
data
user
speaker
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211700508.3A
Other languages
Chinese (zh)
Other versions
CN115658908B (en)
Inventor
韩文静
徐向民
邢晓芬
陈艺荣
帖千枫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN202211700508.3A priority Critical patent/CN115658908B/en
Publication of CN115658908A publication Critical patent/CN115658908A/en
Application granted granted Critical
Publication of CN115658908B publication Critical patent/CN115658908B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Machine Translation (AREA)

Abstract

The invention discloses a Big Five personality perception method and system based on the dialogue interaction process, relating to intelligent dialogue technology, and addresses the prior-art problem that personality analysis depends on long-form (chapter-level) text. The system comprises a user dialogue data acquisition module; a user dialogue data processing module, which separately represents the dialogue content and the dialogue speakers in the user dialogue data and processes the user dialogue data into computable vector representations; a Big Five analysis module, which updates the personality vector and performs natural language understanding on the processed user dialogue data, the two together forming the input for new dialogue generation; and a dialogue generation module. The method provides a dialogue-based Big Five analysis model which, compared with traditional Big Five analysis models, substantially improves analysis accuracy at the dialogue level. By using the speaker's personality features, the personality expression of the dialogue is optimized without degrading dialogue quality.

Description

Five-personality perception method and system based on conversation interaction process
Technical Field
The invention relates to intelligent dialogue technology, and in particular to a Big Five personality perception method and system based on the dialogue interaction process.
Background
Depending on the specific application, human-machine dialogue systems are roughly divided into two categories: task-oriented systems and non-task-oriented systems. The primary purpose of task-oriented systems is to help users complete certain tasks, such as finding products or reserving accommodation and restaurants. The primary purpose of non-task-oriented systems is to provide reasonable responses and entertainment during interaction with humans.
In recent years, automatic personality recognition has gained increasing attention in the field of affective computing. A person's personality has an enormous impact on everyday life, shaping their preferences and choices. Developing personality analysis models therefore has important practical value, and such models perform well in fields such as recommendation systems and job selection. In psychology, the Big Five personality theory has, as a classic personality theory, been highly influential in both academia and applications for over half a century. According to the Big Five classification criteria, personality traits fall into five categories: Extraversion, Agreeableness, Conscientiousness, Neuroticism, and Openness.
Prior-art methods focus on the Big Five perception task at the chapter (document) level, and pay much less attention to Big Five perception within a conversation. Because chapter-level text contains more, and more complete, information, models built at that level perform poorly when migrated to the dialogue interaction process. Moreover, chapter-level text is more literary and far less colloquial than human-machine dialogue, which makes the Big Five perception task during dialogue interaction considerably harder.
Therefore, it is highly desirable to develop a Big Five personality perception method and system based on the dialogue interaction process, which both fills the gap in Big Five perception at the dialogue level and meets users' requirements for intelligent human-machine systems.
Disclosure of Invention
The invention aims to provide a Big Five personality perception method and system based on the dialogue interaction process, so as to solve the problems in the prior art.
The Big Five personality perception system based on the dialogue interaction process disclosed by the invention comprises:
the user dialogue data acquisition module is used for acquiring user dialogue data;
the user dialogue data processing module is used for separately representing the dialogue content and the dialogue speakers in the user dialogue data, temporarily storing them on the server, and processing the user dialogue data into computable vector representations;
the Big Five analysis module is used for updating the personality vector and performing natural language understanding on the processed user dialogue data, the two together forming the input data for new dialogue generation;
and the dialogue generation module is used for outputting a targeted dialogue reply according to the new dialogue-generation input data.
The dialogue content is represented as U and the dialogue speakers are represented as P, where m is the number of utterances of the first speaker and n is the number of utterances of the second speaker.
The user dialogue data processing module converts the data into computable sentence vectors by means of word embedding.
In the dialogue content vector, i denotes the speaker index, j denotes the sentence index of that speaker, and k_j denotes the number of words in the corresponding sentence; the dialogue speaker vector is represented in terms of S1 and S2, where S1 is the word vector of the first speaker token and S2 is the word vector of the second speaker token.
The Big Five personality perception method based on the dialogue interaction process performs Big Five perception using the above system.
The Big Five personality perception method and system based on the dialogue interaction process have the following advantages. First, a dialogue-based Big Five analysis model is provided that substantially improves analysis accuracy at the dialogue level compared with traditional Big Five analysis models, and the model remains useful in scenarios beyond human-machine dialogue systems, such as product recommendation and customer service. Second, by using the speaker's personality features, the personality expression of the dialogue is optimized without degrading dialogue quality.
Drawings
Fig. 1 is a schematic flow chart of the Big Five personality perception method based on the dialogue interaction process according to the invention.
Detailed Description
As shown in Fig. 1, the Big Five personality perception method based on the dialogue interaction process according to the present invention performs Big Five perception analysis using the system described above. The Big Five personality perception system based on the dialogue interaction process comprises a user dialogue data acquisition module, a user dialogue data processing module, a Big Five analysis module and a dialogue generation module. The specific working steps are as follows:
step 1, a user dialogue data acquisition module constructs, inquires and modifies a relational database through a flash framework.
Step 2: the user dialogue data processing module sends a request to the user dialogue data acquisition module to acquire the user dialogue data. The dialogue content is represented as U and the dialogue speakers are represented as P, where m is the number of utterances of the first speaker and n is the number of utterances of the second speaker. The requested data are temporarily stored on the server side.
Step 3: the user dialogue data processing module converts the data stored on the server side into computable sentence vectors by means of word embedding. In the processed dialogue content vector, i denotes the speaker index, j denotes the sentence index of that speaker, and k_j denotes the number of words in the corresponding sentence. In the processed dialogue speaker vector, S1 is the word vector of the first speaker token and S2 is the word vector of the second speaker token.
Step 4: the processed user dialogue data are input into the Big Five analysis module, which updates the speaker's personality vector and performs natural language understanding; together these form the personalized input of the dialogue generation module.
Step 5: the dialogue generation module outputs a targeted reply according to the personalized input.
Specifically, step 2 includes the following substeps:
Step 21: after the user dialogue data acquisition module receives the request, the user's dialogue data are retrieved and stored in a DataFrame structure, redundant information is cleaned, and the raw data are obtained, wherein the utterance column is the dialogue content, the speaker column is the corresponding speaker, and person_for_bigfive identifies the target of the Big Five classification.
Step 22: the raw data are extracted and sorted; the utterance column is represented as U and the speaker column is represented as P. U and P are in one-to-one correspondence.
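A possible pandas-based sketch of steps 21 and 22 is given below; the column names follow the description, while the specific cleaning rules (dropping empty and duplicate rows) are illustrative assumptions.

```python
# Sketch of steps 21-22: raw records into a DataFrame, cleaning, and extraction
# of the utterance column U and the speaker column P (one-to-one with U).
import pandas as pd

def extract_u_and_p(records: list[dict]) -> tuple[list[str], list[str]]:
    df = pd.DataFrame(records)                       # raw data in a DataFrame structure
    df = df.dropna(subset=["utterance", "speaker"])  # clean incomplete rows
    df = df.drop_duplicates()                        # clean redundant rows
    df["utterance"] = df["utterance"].str.strip()

    U = df["utterance"].tolist()   # dialogue content, in conversation order
    P = df["speaker"].tolist()     # speaker of each utterance
    assert len(U) == len(P)        # U and P are in one-to-one correspondence
    return U, P
```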
Specifically, step 3 includes the following substeps:
Step 31: perform word segmentation on each utterance in U.
Step 32: perform word-vector encoding on the segmented data, mapping each word to its word vector, and obtain the final input samples.
Step 33: perform speaker-vector encoding on P. The speaker identified as person_for_bigfive is encoded as [speaker2], and the other speaker is encoded as [speaker1], giving the speaker codes S1 for [speaker1] and S2 for [speaker2]. This operation ensures that the personality obtained in the subsequent Big Five analysis is that of the target speaker.
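The following sketch illustrates one possible realisation of step 3, assuming a jieba tokenizer and pre-trained gensim word vectors; the embedding source, the model path and the numeric speaker codes are illustrative assumptions.

```python
# Sketch of step 3: word segmentation, word-vector encoding, and speaker encoding.
import jieba
import numpy as np
from gensim.models import KeyedVectors

word_vectors = KeyedVectors.load("word2vec.kv")   # hypothetical pre-trained embeddings

def encode_dialogue(U, P, person_for_bigfive):
    # Step 31: word segmentation of every utterance.
    tokenized = [list(jieba.cut(u)) for u in U]

    # Step 32: map each word to its vector; unknown words fall back to zeros.
    dim = word_vectors.vector_size
    X = [np.stack([word_vectors[w] if w in word_vectors else np.zeros(dim)
                   for w in sent]) for sent in tokenized]

    # Step 33: speaker encoding: the Big Five target becomes [speaker2] (code S2),
    # the other party becomes [speaker1] (code S1).
    S1, S2 = 0, 1                                  # illustrative speaker codes
    S = [S2 if p == person_for_bigfive else S1 for p in P]
    return X, S
```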
Specifically, step 4 includes the following substeps:
Step 41: the Big Five analysis module is formed by integrating a Big Five classification model and a natural language understanding model.
Step 42: a dialogue-personality encoder (DP Encoder, hereinafter DPE) and a personality updater (PF Machine, hereinafter PFM) are defined within the Big Five classification model (BF Model, hereinafter BFM). The dialogue-personality encoder adopts a Multi-Head Attention mechanism, attends to the features in which dialogue and personality are most strongly correlated, and applies Layer Normalization to the connected features, as follows:

LN(Q_MHA) = (Q_MHA - μ_L) / sqrt(σ_L + ε)

where Q_MHA is the output vector of the sample after the multi-head attention mechanism, μ_L is the mean of the sample Q_MHA, σ_L is the variance of the sample Q_MHA, and ε is a small constant for numerical stability. The speaker's Big Five personality labels are obtained with the softmax function:

softmax(z_i) = exp(z_i) / Σ_{c=1}^{C} exp(z_c)

where z_i is the output value of the i-th node and C is the number of output nodes. The loss function used in training is the cross-entropy loss:

Loss = - Σ_{i=1}^{C} y_i · log(softmax(z_i))

where y_i is the ground-truth label of the i-th class.
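A minimal PyTorch sketch of such a dialogue-personality encoder is given below: multi-head attention over the dialogue features, layer normalisation of the residual connection, and a linear head trained with softmax cross-entropy. The feature dimension, number of heads, pooling and number of output nodes are illustrative assumptions.

```python
# Sketch of the dialogue-personality encoder (DPE) described in step 42.
import torch
import torch.nn as nn

class DPEncoder(nn.Module):
    def __init__(self, d_model=768, n_heads=8, n_classes=5):
        super().__init__()
        self.mha = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)        # normalise with sample mean/variance
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x):
        # x: (batch, seq_len, d_model) dialogue features
        q_mha, _ = self.mha(x, x, x)             # Q_MHA: multi-head attention output
        h = self.norm(x + q_mha)                 # layer-normalised connection
        logits = self.classifier(h.mean(dim=1))  # pool over the dialogue
        return logits                            # softmax is applied inside the loss

model = DPEncoder()
criterion = nn.CrossEntropyLoss()                # softmax + cross-entropy loss
features = torch.randn(4, 20, 768)               # toy batch of dialogue vectors
labels = torch.randint(0, 5, (4,))                # toy personality class labels
loss = criterion(model(features), labels)
loss.backward()
```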
the character personality updater is constructed by a GRU network, and the input of the character personality updater is a current character personality vector output by the dialogue-personality encoder
Figure 866973DEST_PATH_IMAGE018
And the hidden state ht-1 transmitted by the last node in the character personality updater, and the output of the hidden state ht is the updated character personality vector F and the hidden state ht. The updated personality vector of the character is
Figure 449264DEST_PATH_IMAGE019
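The personality updater can be sketched with a GRU cell as follows; the vector dimension and the linear output projection are illustrative assumptions.

```python
# Sketch of the personality updater (PFM): a GRU cell that consumes the current
# personality vector and the previous hidden state h_{t-1}, and emits the
# updated personality vector F together with the new hidden state h_t.
import torch
import torch.nn as nn

class PersonalityUpdater(nn.Module):
    def __init__(self, dim=128):
        super().__init__()
        self.cell = nn.GRUCell(input_size=dim, hidden_size=dim)
        self.out = nn.Linear(dim, dim)

    def forward(self, f_current, h_prev):
        h_t = self.cell(f_current, h_prev)   # update hidden state from current personality
        F = self.out(h_t)                    # updated speaker personality vector
        return F, h_t

updater = PersonalityUpdater()
h = torch.zeros(1, 128)                      # initial hidden state
f = torch.randn(1, 128)                      # personality vector from the DP encoder
F, h = updater(f, h)
```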
Step 43: the natural language understanding model (NLU Model, hereinafter NLUM) adopts the BERT (Bidirectional Encoder Representations from Transformers) pre-trained model. With its pre-training objective as the training objective function, the influence of the preceding text on the following text and of the following text on the preceding text can be considered simultaneously, so the obtained dialogue feature vector contains richer contextual features and the deep semantic information of the dialogue.
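A sketch of such an NLU module using the Hugging Face transformers implementation of BERT is shown below; the checkpoint name (bert-base-chinese) and the use of the [CLS] vector as the utterance feature are illustrative assumptions.

```python
# Sketch of the NLUM: encode dialogue utterances with a pre-trained BERT model.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
bert = BertModel.from_pretrained("bert-base-chinese")

def dialogue_features(utterances: list[str]) -> torch.Tensor:
    """Encode a dialogue into contextual feature vectors (one per utterance)."""
    batch = tokenizer(utterances, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = bert(**batch)
    return out.last_hidden_state[:, 0]   # [CLS] vector as the utterance feature
```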
Step 44: the outputs of the Big Five classification model and the natural language understanding model together form the input of the dialogue generation module.
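One possible way to combine the two outputs, sketched under the assumption of simple mean pooling and concatenation, is:

```python
# Sketch of step 44: fuse the personality vector (from the BFM) with the
# dialogue features (from the NLUM) into the personalised generator input.
import torch

def build_generator_input(personality_vec: torch.Tensor,
                          dialogue_feats: torch.Tensor) -> torch.Tensor:
    """personality_vec: (d_p,); dialogue_feats: (n_utterances, d_h)."""
    pooled = dialogue_feats.mean(dim=0)                    # summarise the dialogue
    return torch.cat([pooled, personality_vec], dim=-1)    # personalised generator input
```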
It will be apparent to those skilled in the art that various other changes and modifications may be made in the above-described embodiments and concepts and all such changes and modifications are intended to be within the scope of the appended claims.

Claims (5)

1. A Big Five personality perception system based on a dialogue interaction process, comprising:
the user dialogue data acquisition module is used for acquiring user dialogue data;
the user dialogue data processing module is used for separately representing the dialogue content and the dialogue speakers in the user dialogue data, temporarily storing them on the server, and processing the user dialogue data into computable vector representations;
the Big Five analysis module is used for updating the personality vector and performing natural language understanding on the processed user dialogue data, the two together forming the input data for new dialogue generation;
and the dialogue generation module is used for outputting a targeted dialogue reply according to the new dialogue-generation input data.
2. The system according to claim 1, characterized in that the dialogue content is represented as U and the dialogue speakers are represented as P, wherein m represents the number of utterances of the first speaker and n represents the number of utterances of the second speaker.
3. The system according to claim 1, characterized in that the user dialogue data processing module processes the user dialogue data into computable sentence vectors by means of word embedding.
4. The system according to claim 3, characterized in that, in the dialogue content vector, i denotes the speaker index, j denotes the sentence index of that speaker, and k_j denotes the number of words in the corresponding sentence; and, in the dialogue speaker vector, S1 is the word vector of the first speaker token and S2 is the word vector of the second speaker token.
5. A Big Five personality perception method based on a dialogue interaction process, characterized in that Big Five personality perception is performed using the system according to any one of claims 1-4.
CN202211700508.3A 2022-12-29 2022-12-29 Five-personality perception method and system based on conversation interaction process Active CN115658908B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211700508.3A CN115658908B (en) 2022-12-29 2022-12-29 Five-personality perception method and system based on conversation interaction process

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211700508.3A CN115658908B (en) 2022-12-29 2022-12-29 Five-personality perception method and system based on conversation interaction process

Publications (2)

Publication Number Publication Date
CN115658908A true CN115658908A (en) 2023-01-31
CN115658908B CN115658908B (en) 2023-04-11

Family

ID=85023156

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211700508.3A Active CN115658908B (en) 2022-12-29 2022-12-29 Five-personality perception method and system based on conversation interaction process

Country Status (1)

Country Link
CN (1) CN115658908B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6731307B1 (en) * 2000-10-30 2004-05-04 Koninklijke Philips Electronics N.V. User interface/entertainment device that simulates personal interaction and responds to user's mental state and/or personality
CN107545028A (en) * 2017-07-17 2018-01-05 宁波市智能制造产业研究院 A kind of data processing method, device and electronic equipment
US20190294693A1 (en) * 2018-03-20 2019-09-26 International Business Machines Corporation Recommendation technique using automatic conversation
CN108846073A (en) * 2018-06-08 2018-11-20 青岛里奥机器人技术有限公司 A kind of man-machine emotion conversational system of personalization
CN111460143A (en) * 2020-03-11 2020-07-28 华南理工大学 Emotion recognition model of multi-person conversation system
CN112380231A (en) * 2020-11-13 2021-02-19 四川大学 Training robot system and method with depressive disorder characteristics
CN113505208A (en) * 2021-07-09 2021-10-15 福州大学 Intelligent dialogue system integrating multi-path attention mechanism
CN115294988A (en) * 2022-07-20 2022-11-04 北方民族大学 Voice interaction system and method for collaboration
CN115329779A (en) * 2022-08-10 2022-11-11 天津大学 Multi-person conversation emotion recognition method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
徐晖 et al.: "Personalized dialogue generation incorporating emotional information" (结合情感信息的个性化对话生成) *

Also Published As

Publication number Publication date
CN115658908B (en) 2023-04-11

Similar Documents

Publication Publication Date Title
CN111966800B (en) Emotion dialogue generation method and device and emotion dialogue model training method and device
CN113836277A (en) Machine learning system for digital assistant
CN111161726B (en) Intelligent voice interaction method, device, medium and system
CN113987179A (en) Knowledge enhancement and backtracking loss-based conversational emotion recognition network model, construction method, electronic device and storage medium
CN112100350B (en) Open domain dialogue method for intensifying reply personalized expression
CN113065344A (en) Cross-corpus emotion recognition method based on transfer learning and attention mechanism
CN114911932A (en) Heterogeneous graph structure multi-conversation person emotion analysis method based on theme semantic enhancement
CN113360610A (en) Dialog generation method and system based on Transformer model
CN111339772B (en) Russian text emotion analysis method, electronic device and storage medium
JP2024505076A (en) Generate diverse, natural-looking text-to-speech samples
CN115062627A (en) Method and apparatus for computer-aided uniform system based on artificial intelligence
CN114708474A (en) Image semantic understanding algorithm fusing local and global features
CN111368066B (en) Method, apparatus and computer readable storage medium for obtaining dialogue abstract
CN112905776B (en) Emotional dialogue model construction method, emotional dialogue system and method
Zhao et al. Tibetan Multi-Dialect Speech and Dialect Identity Recognition.
CN113486143A (en) User portrait generation method based on multi-level text representation and model fusion
CN117349427A (en) Artificial intelligence multi-mode content generation system for public opinion event coping
CN117251562A (en) Text abstract generation method based on fact consistency enhancement
CN112349294A (en) Voice processing method and device, computer readable medium and electronic equipment
CN116303966A (en) Dialogue behavior recognition system based on prompt learning
CN115658908B (en) Five-personality perception method and system based on conversation interaction process
CN115858756A (en) Shared emotion man-machine conversation system based on perception emotional tendency
CN114898779A (en) Multi-mode fused speech emotion recognition method and system
CN114974310A (en) Emotion recognition method and device based on artificial intelligence, computer equipment and medium
CN114912020A (en) Multi-sub-target dialogue recommendation method based on user preference graph

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant