CN114418115B - Co-emotion conversation training method, device and equipment for psychological consultants and storage medium - Google Patents

Co-emotion conversation training method, device and equipment for psychological consultants and storage medium Download PDF

Info

Publication number
CN114418115B
CN114418115B CN202210028103.3A
Authority
CN
China
Prior art keywords
emotion
text
response
training
psychological
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210028103.3A
Other languages
Chinese (zh)
Other versions
CN114418115A (en)
Inventor
任志洪
罗文俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Central China Normal University
Original Assignee
Central China Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Central China Normal University filed Critical Central China Normal University
Priority to CN202210028103.3A priority Critical patent/CN114418115B/en
Publication of CN114418115A publication Critical patent/CN114418115A/en
Application granted granted Critical
Publication of CN114418115B publication Critical patent/CN114418115B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/20Natural language analysis
    • G06F40/279Recognition of textual entities
    • G06F40/284Lexical analysis, e.g. tokenisation or collocates
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/30Semantic analysis
    • G06F40/35Discourse or dialogue representation
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The application relates to the technical field of artificial intelligence, and in particular to a co-emotion conversation training method, device and equipment for psychological consultants, and a storage medium. The method comprises the following steps: asking a psychological consultant questions and acquiring the consultant's co-emotion response text; performing text processing and vectorization on the co-emotion response text to generate vectorized text features; and inputting the vectorized text features into a pre-trained machine learning model for exploratory technology classification and feedback to obtain a co-emotion technology category and a similarity, determining the consultant's current co-emotion level based on the category and the similarity, and continuing or ending the co-emotion conversation training according to the consultant's training target. This solves the problems in the related art that the training of psychological consultants depends heavily on expert supervision, the level of deliberate practice in training is low, and consultants' mastery of co-emotion interview skills is restricted.

Description

Co-emotion conversation training method, device and equipment for psychological consultants and storage medium
Technical Field
The application relates to the technical field of natural language processing, and in particular to a co-emotion conversation training method, device and equipment for psychological consultants, and a storage medium.
Background
Co-emotion (empathy) is the process of experiencing another person's emotions or thoughts and actively placing oneself in the world as that person perceives it, honestly seeing the meaning of life and life events through the other person's eyes. The consultation and treatment relationship itself has a therapeutic function; co-emotion is one of the three necessary and sufficient conditions for establishing a good consultation relationship and is widely accepted and adopted as an important factor in psychological consultation.
As an interview technique, co-emotion can be trained. Training that improves the co-emotion level of novice consultants is highly necessary for cultivating psychological consultants and improving the effect of psychological consultation.
Innovative experiential interventions, a mode of co-emotion training that breaks with convention by bypassing verbal-technique training and acting directly on the trainee's emotional experience, have attracted attention in recent years; limited by training conditions, however, the traditional micro-skills training method remains the mainstream way of training psychological consultants.
Although scholars have systematically criticized the micro-skills training approach, it is undeniably the most widely used training method and has been proven effective by a large number of empirical studies, especially for basic interview skills. The three-stage helping skills model of Hill and colleagues integrates Carkhuff's early human relations training model, Ivey's microcounseling model and Kagan's interpersonal process recall model. The helping skills of the exploration stage are based on Rogers' client-centered theory and deeply embody the principle of co-emotion.
However, the training of psychological consultants in the related art depends heavily on expert supervision, the level of deliberate practice in training is low, consultants' mastery of co-emotion interview skills is restricted, the training effect suffers, and training efficiency is low.
Disclosure of Invention
The application provides a co-emotion conversation training method, device, equipment and storage medium for psychological consultants, to solve the problems in the related art that the training of psychological consultants depends heavily on expert supervision, the level of deliberate practice is low, consultants' mastery of co-emotion interview skills is restricted, the training effect is poor, and training efficiency is low.
An embodiment of the first aspect of the present application provides a co-emotion conversation training method for psychological consultants, including the following steps: asking a psychological consultant questions and acquiring the consultant's co-emotion response text; performing text processing and vectorization on the co-emotion response text to generate vectorized text features; and inputting the vectorized text features into a pre-trained machine learning model for exploratory technology classification and feedback to obtain a co-emotion technology category and a similarity, determining the psychological consultant's current co-emotion level based on the co-emotion technology category and the similarity, and continuing or ending the co-emotion conversation training according to the psychological consultant's training target.
Further, before the text features are input into the pre-trained machine learning model for exploratory technology classification and feedback, the method further includes: acquiring training samples from real interviews using the co-emotion opportunities defined by a co-emotion communication coding system; and training the machine learning model for exploratory technology classification and feedback on the training samples, where the co-emotion opportunities include direct expression of emotion, direct expression of challenge, and direct expression of progress.
Further, asking the psychological consultant questions and acquiring the consultant's co-emotion response text includes: obtaining the psychological consultant's co-emotion response and converting the co-emotion response into co-emotion response text in a preset manner.
Further, performing text processing and vectorization on the co-emotion response text to generate vectorized text features includes: preprocessing the co-emotion response text and performing word segmentation on the preprocessed co-emotion response text to obtain the text features; and performing term frequency-inverse document frequency vectorization on the text features to generate the vectorized text features.
Further, inputting the vectorized text features into the pre-trained machine learning model for exploratory technology classification and feedback to obtain the co-emotion technology category and the similarity includes: obtaining the actual technology classification of the co-emotion response text according to a machine learning algorithm. Determining the psychological consultant's current co-emotion level based on the co-emotion technology category and the similarity includes: obtaining the co-emotion level of the psychological consultant's response according to the similarity between the consultant's co-emotion response text and a preset text.
An embodiment of the second aspect of the present application provides a co-emotion conversation training device for psychological consultants, including: an acquisition module, configured to ask a psychological consultant questions and acquire the consultant's co-emotion response text; a processing module, configured to perform text processing and vectorization on the co-emotion response text and generate vectorized text features; and a training module, configured to input the vectorized text features into a pre-trained machine learning model for exploratory technology classification and feedback to obtain a co-emotion technology category and a similarity, determine the psychological consultant's current co-emotion level based on the co-emotion technology category and the similarity, and continue or end the co-emotion interview training according to the psychological consultant's training target.
Further, the device also includes: a model construction module, configured to acquire training samples from real interviews using the co-emotion opportunities defined by a co-emotion communication coding system before the text features are input into the pre-trained machine learning model for exploratory technology classification and feedback, and to train the machine learning model for exploratory technology classification and feedback on the training samples, where the co-emotion opportunities include direct expression of emotion, direct expression of challenge, and direct expression of progress.
Further, the acquisition module is configured to obtain the psychological consultant's co-emotion response and convert the co-emotion response into co-emotion response text in a preset manner; the processing module is configured to preprocess the co-emotion response text, perform word segmentation on the preprocessed co-emotion response text to obtain the text features, and perform term frequency-inverse document frequency vectorization on the text features to generate the vectorized text features; the training module is configured to obtain the actual technology classification of the co-emotion response text according to a machine learning algorithm, and to obtain the co-emotion level of the psychological consultant's response according to the similarity between the consultant's co-emotion response text and a preset text.
An embodiment of the third aspect of the present application provides an electronic device, including: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor executes the program to implement the co-emotion conversation training method for psychological consultants according to the above embodiment.
An embodiment of the fourth aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the co-emotion conversation training method for psychological consultants of the above embodiment.
Therefore, the application has at least the following beneficial effects:
By programming a machine model that automatically classifies user input and scores its co-emotion level, the category of co-emotion technology used in a consultation conversation can be effectively predicted, and the co-emotion level of a user response is reasonably reflected by its similarity to expert responses, so that a psychological consultant's conversation skills can be trained automatically and effectively without expert supervision, the co-emotion level of responses in consultation conversations is improved, and the training effect and efficiency are effectively improved. This solves the problems in the related art that the training of psychological consultants depends heavily on expert supervision, the level of deliberate practice is low, consultants' mastery of co-emotion interview skills is restricted, the training effect is poor, and training efficiency is low.
Additional aspects and advantages of the application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the application.
Drawings
The foregoing and/or additional aspects and advantages of the application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a flowchart of a co-emotion conversation training method for psychological consultants according to an embodiment of the present application;
FIG. 2 is a machine learning model development flow diagram for exploratory technique classification and feedback provided in accordance with an embodiment of the present application;
FIG. 3 is an exemplary diagram of an application of a machine learning model for exploratory technique classification and feedback provided in accordance with an embodiment of the present application;
FIG. 4 is a block diagram of a co-emotion conversation training device for psychological consultants according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative and intended to explain the present application and should not be construed as limiting the application.
Co-emotion can be trained; improving the co-emotion level of novice consultants is highly necessary for cultivating psychological consultants and improving the effect of psychological consultation. The training of psychological consultants is currently dominated by the traditional micro-skills training method, which is the most widely applied approach and has been proven effective by a large number of empirical studies, especially for basic interview skills. The three-stage helping skills model of Hill and colleagues integrates earlier consultation-skills models; the helping skills of its exploration stage are based on client-centered theory and deeply embody the principle of co-emotion.
NLP (Natural Language Processing) aims to understand human language and was first applied in the field of machine translation. A typical NLP workflow includes corpus acquisition, corpus preprocessing (data cleaning, word segmentation, stop-word removal, and the like), feature selection, model training (for example with an SVM), and model evaluation. Machine learning can be roughly divided into supervised and unsupervised learning according to whether the raw data needs to be manually labeled, and in recent years it has been increasingly applied in clinical psychology and psychiatry. In the field of psychological consultation, the combination of machine learning and natural language processing gives researchers a way to analyze consultation conversation texts automatically and in depth.
Co-emotion interaction in medical communication can be detected automatically from speech features and lexical features, and expressed co-emotion can then be coded into three levels of response, namely strong/weak interpretation, strong/weak exploration, and strong/weak emotion reflection; mechanisms of expression following co-emotion reflection have also been identified on a large-scale mental health support platform. Motivational interviewing (MI) is a widely used consultation modality for substance-abuse treatment in which the consultant's co-emotion level can be behaviorally coded in detail; with psycholinguistically inspired lexical features (LIWC features, n-grams, and the like), the visitor-perceived co-emotion of consultants in MI can be predicted, or an observer-rated consultant co-emotion level can be predicted using an SVM (Support Vector Machine).
However, the current training of novice psychological consultants depends heavily on expert supervision, the level of deliberate practice is low, and novice consultants' mastery of co-emotion interview skills is restricted; moreover, machine learning methods have not been applied in the context of Chinese clinical consultation.
To solve the above problems, the shortcomings of micro-skills training are supplemented and extended with a machine learning method so that consultants can be cultivated more efficiently. The application provides a new solution: the ECCS coding system from the medical field is introduced to select samples from real interviews, specific attention is paid to five exploratory technologies, and a machine learning model for exploratory technology classification and feedback is built to train the interview skills of novice consultants and improve the co-emotion level of their responses in consultation interviews.
The co-emotion conversation training method, device, equipment and storage medium for psychological consultants of the embodiments of the present application are described below with reference to the accompanying drawings. Aiming at the problems in the related art that the training of psychological consultants depends heavily on expert supervision, the level of deliberate practice is low, consultants' mastery of co-emotion interview skills is restricted, the training effect is poor, and training efficiency is low, the application provides a co-emotion conversation training method for psychological consultants that solves these problems.
Specifically, fig. 1 is a flow chart of a co-emotion consultation training method of a psychological consultant according to an embodiment of the present application.
As shown in fig. 1, the co-emotion conversation training method of the psychological consultant includes the following steps:
In step S101, a question is asked of the psychological consultant, and the psychological consultant's co-emotion response text is acquired.
In this embodiment, the co-emotion response of the psychological consultant is obtained, and the co-emotion response is converted into a co-emotion response text in a preset manner.
The preset mode may include a mode of voice conversion, and the like, which is not limited in particular.
It can be appreciated that, in the embodiment of the present application, the psychological consultant's text response to a sample can be obtained in various manners, for example by converting the consultant's speech response into text, or by directly obtaining a text reply entered by the consultant through an input device; this is not particularly limited.
In step S102, text processing is performed on the co-emotion response text, vectorization processing is performed, and vectorized text features are generated.
In this embodiment, performing text processing and vectorization on the co-emotion response text to generate vectorized text features includes: preprocessing the co-emotion response text and performing word segmentation on the preprocessed co-emotion response text to obtain the text features; and performing term frequency-inverse document frequency vectorization on the text features to generate the vectorized text features.
The preprocessing may include removing duplicates, removing empty entries, and removing characters other than Chinese and English characters and numbers from the co-emotion response text.
It can be understood that, after the text is preprocessed, the embodiment of the application can use a word segmentation tool, such as the Chinese word segmentation tool Jieba, to segment the preprocessed co-emotion response text, and then perform TF-IDF (term frequency-inverse document frequency) vectorization on the segmented text, converting the text into numbers that a computer can process.
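As an illustration of this step, the following sketch segments responses with Jieba and vectorizes them with scikit-learn's TF-IDF implementation; the sample sentences and the choice of vectorizer are assumptions for illustration, not the patent's published code.

```python
# Illustrative sketch of step S102 (assumed): segment co-emotion responses with
# Jieba, then TF-IDF-vectorize the tokens.
import jieba
from sklearn.feature_extraction.text import TfidfVectorizer

def segment(text: str) -> str:
    # Jieba yields tokens; join them with spaces so the vectorizer
    # can treat each token as a separate "word".
    return " ".join(jieba.cut(text))

corpus = [
    "听起来这件事让你感到很委屈",  # hypothetical expert co-emotion response
    "你因为这件事觉得很难过",      # hypothetical user co-emotion response
]
# token_pattern keeps single-character Chinese tokens that the default would drop.
vectorizer = TfidfVectorizer(token_pattern=r"(?u)\b\w+\b")
tfidf_matrix = vectorizer.fit_transform([segment(t) for t in corpus])
print(tfidf_matrix.shape)  # (number of responses, vocabulary size)
```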
In step S103, the vectorized text features are input into a pre-trained machine learning model for exploratory technology classification and feedback to obtain a co-emotion technology category and a similarity, the current co-emotion level of the psychological consultant is determined based on the co-emotion technology category and the similarity, and the co-emotion interview training is continued or ended according to the training target of the psychological consultant.
The training target may be specifically set according to the psychological consultant's training requirements and is used to determine whether the consultant continues to practice repeatedly or ends the training.
It can be understood that the embodiment of the application programs a machine model that automatically classifies user input by technology and scores its co-emotion level; a machine learning algorithm can effectively predict the category of co-emotion technology in a consultation conversation, and the similarity between the user response and the expert response can reasonably reflect the co-emotion level of the user response, so the consultant can practice repeatedly at any time to improve the co-emotion level. It should be noted that other machine learning models, including deep learning models, may also be employed in embodiments of the present application to classify the exploratory technologies.
In this embodiment, inputting the vectorized text features into the pre-trained machine learning model for exploratory technology classification and feedback to obtain the co-emotion technology category and the similarity includes: obtaining the actual technology classification of the co-emotion response text according to a machine learning algorithm. Determining the current co-emotion level of the psychological consultant based on the co-emotion technology category and the similarity includes: obtaining the co-emotion level of the psychological consultant's response according to the similarity between the consultant's co-emotion response text and the preset text.
The preset text features may be features obtained based on expert response text.
It can be understood that the embodiment of the application can use Chinese natural language processing technology and calculate the similarity between the user input and the expert input, which provides an approximate way of automatically evaluating the co-emotion level of the text input by the user, so that co-emotion level evaluation no longer depends on the work of senior experts in the field and becomes more operable.
In this embodiment, before the text features are input into the pre-trained machine learning model for exploratory technology classification and feedback, the method further includes: acquiring training samples from real interviews using the co-emotion opportunities defined by a co-emotion communication coding system; and training the machine learning model for exploratory technology classification and feedback on the training samples, where the co-emotion opportunities include direct expression of emotion, direct expression of challenge, and direct expression of progress.
It can be understood that, to fill the gap of machine-learning-based co-emotion training programs in the field of Chinese psychological consultation, the embodiment of the application provides the method for constructing a machine learning model for exploratory technology classification and feedback shown in FIG. 2, which is used to train and improve the co-emotion level of novice consultants. Specifically, the machine learning model is constructed as follows:
step one, co-emotion opportunity marking: the embodiment of the application introduces an ECCS (Empathic Communication Code System, co-emotion communication coding system) in the medical field, and adopts a co-emotion opportunity marking mode to extract samples from first hand data of consultation conversation, namely, manuscripts. ECCS defines three co-estrus opportunities, direct expression of emotion, direct expression of challenges, and direct expression of progression, respectively. Training psychology researchers, marking the original interview manuscripts by dialog turns after marking consistency meets requirements, and selecting interviews with high consistency for discussion to reach agreement; then, the cosmophilic opportunity is rewritten, and the aim is to remove the expression of excessive spoken language under the conditions of ensuring independent semantics and smooth expression; and selecting the co-emotion opportunity of the rewritten word number which is about 90 words as a research sample.
It should be noted that the co-emotion communication coding system is a coding system for measuring co-emotion communication behavior in doctor-patient communication and includes two parts: identifying the co-emotion opportunities created by the patient, and coding the medical staff's responses to those co-emotion opportunities. A co-emotion opportunity is operationally defined in the ECCS as the patient's direct expression of emotion, progress or challenge. Although consultation sessions contain more potential co-emotion opportunities that provide only indirect clues, even expert consultants find it difficult to identify what the client leaves "unspoken", and one of the model design objectives of the embodiments of the present application is to train novice consultants, so attention is paid to the well-defined "direct" co-emotion opportunities. The definitions and examples of the three co-emotion opportunities are shown in Table 1.
TABLE 1
Step two, writing co-emotion responses: the main technologies of the exploration stage in Hill's three-stage helping skills model are selected, namely restatement, open questions about ideas, emotion reflection, emotion exposure, and open questions about emotions. Doctoral students specializing in clinical and counseling psychology are trained and asked to apply the five technologies to write text responses to the samples. A supervisor registered with the Chinese Psychological Society checks the quality of the responses.
It should be noted that the embodiment of the application can adopt a machine learning method from the field of artificial intelligence to automatically classify the exploratory technologies in the HSS (Helping Skills System), thereby avoiding the cumbersome manual coding process and improving the research methods available for consultation conversation texts; interview technologies other than the exploratory technologies in the helping skills model may also be used, such as the technologies for promoting insight and action.
Specifically, the exploration stage of the helping skills model is based on client-centered therapeutic theory and aims to understand the client's consultation goals: restatement and open questions about ideas can help the client explore ideas, while emotion reflection, emotion exposure and open questions about emotions can encourage the client to experience and express emotions. Increasing the proportion of exploratory skills used in consultation interviews is also taken as a goal of training psychological-consultation helping skills. The definitions and examples of the five exploratory technologies, together with the consultant's general intent for each, based on the definitions of the Helping Skills System, are shown in Table 2.
TABLE 2
Step three, data preprocessing: preliminary processing of the co-emotion responses (both expert and user co-emotion responses), including removal of duplicates, removal of empty entries, and removal of characters other than Chinese and English characters and digits, may be performed with Python 3.9.6; the preliminarily processed co-emotion response text is then segmented with Jieba, a commonly used Chinese word segmentation tool.
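A minimal sketch of this preprocessing is given below; the regular expression and the tiny sample list are illustrative assumptions rather than the patent's published code.

```python
# Illustrative preprocessing sketch (assumed): de-duplicate, drop empty entries,
# keep only Chinese characters, English letters and digits, then segment with Jieba.
import re
import jieba

def clean(text: str) -> str:
    # Keep CJK characters, ASCII letters and digits; strip everything else.
    return re.sub(r"[^\u4e00-\u9fa5A-Za-z0-9]", "", text)

responses = ["我能感觉到你很伤心。", "我能感觉到你很伤心。", "   ", "你似乎有些生气？"]
cleaned = [clean(r) for r in responses]
cleaned = [r for r in cleaned if r]        # remove entries that became empty
cleaned = list(dict.fromkeys(cleaned))     # remove duplicates, preserving order
segmented = [list(jieba.cut(r)) for r in cleaned]
print(segmented)
```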
Step four, text vectorization: term frequency-inverse document frequency vectorization is performed on the segmented text, converting the text into numbers that a computer can process.
Step five, similarity calculation: the cosine distance between the texts of the user response and the expert response is calculated with the similarity functions in the gensim module and used as the similarity between the user response and the expert response.
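The sketch below shows one plausible realization of this step with gensim's Dictionary, TfidfModel and MatrixSimilarity utilities; the patent only states that a gensim similarity function is applied to the vectorized texts, so the exact call chain and the example responses are assumptions.

```python
# Hedged sketch of step five (assumed pipeline): cosine similarity between a user
# response and a set of expert responses over TF-IDF vectors, using gensim.
import jieba
from gensim import corpora, models, similarities

expert_responses = ["听起来你对这个结果感到很失望", "你似乎在为这段关系感到担心"]  # hypothetical
user_response = "你好像对这个结果挺失望的"                                        # hypothetical

expert_tokens = [list(jieba.cut(t)) for t in expert_responses]
dictionary = corpora.Dictionary(expert_tokens)
expert_bow = [dictionary.doc2bow(t) for t in expert_tokens]

tfidf = models.TfidfModel(expert_bow)
index = similarities.MatrixSimilarity(tfidf[expert_bow], num_features=len(dictionary))

user_bow = dictionary.doc2bow(list(jieba.cut(user_response)))
sims = index[tfidf[user_bow]]   # cosine similarity to each expert response
print(float(max(sims)))         # taken here as the similarity score fed back to the user
```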
Step six, machine learning classification: programming with Python 3.9.6, the samples undergo similar preliminary data processing, TF-IDF vectorization is performed on the segmented text, and the resulting text features are used to train common machine learning models such as a multinomial naive Bayes classifier, a logistic regression classifier, a random forest classifier and a linear support vector classifier, verifying the classification accuracy of the models on the five exploratory technologies.
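The following sketch illustrates step six with scikit-learn; the four classifier types match those named above, while the miniature corpus, its labels and the hold-out prediction are purely illustrative assumptions (a real run would use the labeled research samples and report cross-validated accuracy over the five exploratory technologies).

```python
# Hedged sketch of step six (assumed, not the patent's published code): fit several
# common classifiers on TF-IDF features of Jieba-segmented responses.
import jieba
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

def seg(text):
    # Space-join Jieba tokens so TfidfVectorizer can split on whitespace.
    return " ".join(jieba.cut(text))

texts = [seg(t) for t in [
    "你是说这件事让你很被动",    # hypothetical restatement
    "你当时有什么样的感受",      # hypothetical open question about emotions
    "听起来你感到非常委屈",      # hypothetical emotion reflection
    "你是说同事没有支持你",      # hypothetical restatement
]]
labels = ["restatement", "open_question_emotion", "emotion_reflection", "restatement"]

for clf in (MultinomialNB(), LogisticRegression(max_iter=1000),
            RandomForestClassifier(), LinearSVC()):
    pipe = make_pipeline(TfidfVectorizer(token_pattern=r"(?u)\b\w+\b"), clf)
    pipe.fit(texts, labels)
    print(type(clf).__name__, pipe.predict([seg("你的意思是这件事让你很无奈")]))
```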
Step seven, embedding in an applet: the applet is developed with conventional developer tools, a cloud server is adopted, and a MySQL database is configured on the cloud server for data storage and retrieval. In addition, the applet integrates a speech recognition function, so that the user's voice input can be converted into text. The cloud server is built with a Python environment; the applet sends the text to the server with a POST request, the Python algorithm processes it, and the result is returned in JSON format and displayed in the applet.
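A minimal sketch of the server side is shown below, assuming a Flask application; the patent only specifies a Python environment, a POST request from the applet and a JSON reply, so the framework, route name and the two placeholder helper functions are illustrative assumptions.

```python
# Hedged sketch of step seven's server side (framework and route are assumptions).
from flask import Flask, request, jsonify

app = Flask(__name__)

def classify_technique(text: str) -> str:
    # Placeholder: in the real system this would call the classifier trained in step six.
    return "emotion_reflection"

def similarity_to_experts(text: str) -> float:
    # Placeholder: in the real system this would query the gensim similarity index of step five.
    return 0.0

@app.route("/empathy", methods=["POST"])
def empathy():
    text = request.get_json(force=True).get("text", "")
    return jsonify({
        "technique": classify_technique(text),       # predicted co-emotion technology category
        "similarity": similarity_to_experts(text),   # similarity to expert responses
    })

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```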
In a specific application, as shown in FIG. 3, when a user inputs a co-emotion response, the user's voice input is transcribed into text in the background and compared with the existing expert responses, and the model outputs the similarity between the user response and the expert responses together with the automatically predicted co-emotion technology classification. Through deliberate practice of specific exploratory technologies, the co-emotion level of novice consultants is improved.
According to the co-emotion conversation training method for psychological consultants provided by the embodiment of the application, a machine model that automatically classifies user input by technology and scores its co-emotion level is programmed, so the category of co-emotion technology used in a consultation conversation can be effectively predicted and the co-emotion level of a user response is reasonably reflected by its similarity to expert responses; a psychological consultant's conversation skills can thus be trained automatically and effectively without expert supervision, the co-emotion level of responses in consultation conversations is improved, and the training effect and efficiency are effectively improved.
Next, a co-emotion conversation training device for psychological consultants according to an embodiment of the present application is described with reference to the accompanying drawings.
FIG. 4 is a block diagram of a co-emotion conversation training device for psychological consultants according to an embodiment of the present application.
As shown in fig. 4, the co-emotion conversation training device 10 of the psychological consultant includes: an acquisition module 100, a processing module 200, and a training module 300.
The acquisition module 100 is configured to ask a psychological consultant questions and acquire the consultant's co-emotion response text; the processing module 200 is configured to perform text processing and vectorization on the co-emotion response text and generate vectorized text features; the training module 300 is configured to input the vectorized text features into a pre-trained machine learning model for exploratory technology classification and feedback to obtain a co-emotion technology category and a similarity, determine the psychological consultant's current co-emotion level based on the co-emotion technology category and the similarity, and continue or end the co-emotion conversation training according to the psychological consultant's training target.
Further, the apparatus 10 according to the embodiment of the present application further includes: and a model building module. The model construction module is used for acquiring training samples in real interviews by utilizing co-emotion opportunities defined by a co-emotion communication coding system before inputting text features into a pre-trained exploratory technology classification and feedback machine learning model; machine learning models of exploratory technology classification and feedback are trained based on training samples, wherein co-emotion opportunities include direct expression of emotion, direct expression of challenges, and direct expression of progression.
Further, the acquisition module 100 is configured to obtain the psychological consultant's co-emotion response and convert it into co-emotion response text in a preset manner; the processing module 200 is configured to preprocess the co-emotion response text, segment the preprocessed co-emotion response text to obtain text features, and perform term frequency-inverse document frequency vectorization on the text features to generate vectorized text features; the training module 300 obtains the actual technology classification of the co-emotion response text according to a machine learning algorithm and obtains the co-emotion level of the psychological consultant's response according to the similarity between the consultant's co-emotion response text and a preset text.
It should be noted that the explanation of the embodiment of the co-emotion conversation training method for psychological consultants is also applicable to the co-emotion conversation training device for psychological consultants in this embodiment, and is not repeated herein.
According to the co-emotion conversation training device for psychological consultants provided by the embodiment of the application, a machine model that automatically classifies user input by technology and scores its co-emotion level is programmed, so the category of co-emotion technology used in a consultation conversation can be effectively predicted and the co-emotion level of a user response is reasonably reflected by its similarity to expert responses; a psychological consultant's conversation skills can thus be trained automatically and effectively without expert supervision, the co-emotion level of responses in consultation conversations is improved, and the training effect and efficiency are effectively improved.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device may include:
memory 501, processor 502, and a computer program stored on memory 501 and executable on processor 502.
The processor 502, when executing the program, implements the co-emotion conversation training method for psychological consultants provided in the above embodiments.
Further, the electronic device further includes:
a communication interface 503 for communication between the memory 501 and the processor 502.
Memory 501 for storing a computer program executable on processor 502.
The memory 501 may include high-speed RAM memory and may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
If the memory 501, the processor 502 and the communication interface 503 are implemented independently, the communication interface 503, the memory 501 and the processor 502 may be connected to each other via a bus and communicate with each other. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in FIG. 5, but this does not mean that there is only one bus or one type of bus.
Alternatively, in a specific implementation, if the memory 501, the processor 502, and the communication interface 503 are integrated on a chip, the memory 501, the processor 502, and the communication interface 503 may perform communication with each other through internal interfaces.
The processor 502 may be a central processing unit (Central Processing Unit, abbreviated as CPU) or an application specific integrated circuit (Application Specific Integrated Circuit, abbreviated as ASIC) or one or more integrated circuits configured to implement embodiments of the present application.
The embodiment of the present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the co-emotion conversation training method for psychological consultants as described above.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or N embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present application, "N" means at least two, for example, two, three, etc., unless specifically defined otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more N executable instructions for implementing specific logical functions or steps of the process, and further implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present application.
Logic and/or steps represented in the flowcharts or otherwise described herein, e.g., a ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or N wires, a portable computer cartridge (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). In addition, the computer readable medium may even be paper or other suitable medium on which the program is printed, as the program may be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the N steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. As with the other embodiments, if implemented in hardware, may be implemented using any one or combination of the following techniques, as is well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application specific integrated circuits having suitable combinational logic gates, programmable Gate Arrays (PGAs), field Programmable Gate Arrays (FPGAs), and the like.
Those of ordinary skill in the art will appreciate that all or a portion of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, where the program may be stored in a computer readable storage medium, and where the program, when executed, includes one or a combination of the steps of the method embodiments.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a computer readable storage medium if implemented in the form of software functional modules and sold or used as a stand-alone product.
The above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, or the like. While embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the application, and that variations, modifications, alternatives and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the application.

Claims (6)

1. A co-emotion conversation training method for psychological consultants, characterized by comprising the following steps:
asking questions of psychological consultants and acquiring co-emotion response texts of the psychological consultants;
performing text processing on the co-emotion response text, performing vectorization processing, and generating vectorized text features; and
inputting the vectorized text features into a pre-trained exploratory technology classification and feedback machine learning model to obtain a co-emotion technology category and similarity, determining the current co-emotion level of the psychological consultant based on the co-emotion technology category and the similarity, and continuing to conduct co-emotion interview training or ending co-emotion interview training according to the training target of the psychological consultant;
before inputting the text features into the pre-trained exploratory technology classification and feedback machine learning model, further comprising:
acquiring training samples in real interviews by utilizing co-emotion opportunities defined by a co-emotion communication coding system;
training a machine learning model of the exploratory technique classification and feedback based on the training sample, wherein the co-emotion opportunities include direct expression of emotion, direct expression of challenges, and direct expression of progress;
inputting the vectorized text features to a pre-trained exploratory technology classification and feedback machine learning model to obtain a co-emotion technology class and similarity, wherein the method comprises the following steps: obtaining the actual technical classification of the co-emotion response text according to a machine learning algorithm;
the determining the current co-emotion level of the psychological consultant based on the co-emotion technology category and the similarity includes: obtaining the co-emotion level responded by the psychological consultant according to the similarity between the co-emotion response text of the psychological consultant and the preset text;
the method comprises the steps of obtaining training samples in real interviews by utilizing co-emotion opportunities defined by a co-emotion communication coding system, training a machine learning model of exploratory technology classification and feedback based on the training samples, and comprising the following steps: co-estrus opportunity tagging: introducing a co-emotion communication coding system, extracting a sample from first hand data of consultation conversation, marking and rewriting co-emotion opportunities, and selecting the co-emotion opportunities with the rewritten number of words being a preset number of words as training samples; the co-emotion communication coding system comprises the steps of identifying co-emotion opportunities created by patients, and coding the response of medical staff to the co-emotion opportunities, wherein the coded response of the medical staff to the co-emotion opportunities is used as a co-emotion response text of an expert;
co-emotion response writing: performing text responses to the samples by utilizing the technologies of the exploration stage in the three-stage helping skills model to obtain the co-emotion response text of the user;
data preprocessing: preprocessing the co-emotion response text of the user and the co-emotion response text of the expert respectively, and performing word segmentation on the preprocessed co-emotion response text;
text vectorization: performing term frequency-inverse document frequency vectorization on the segmented text respectively, converting the text into numbers that a computer can process, so as to obtain the text features of the user response and the text features of the expert response respectively;
similarity calculation: calculating cosine distance between text features of the user response and the expert response, and taking the cosine distance as similarity between the user response and the expert response;
machine learning classification: training on the training samples to obtain a classification result, wherein the classes comprise: restatement, open questions about ideas, emotion reflection, emotion exposure, and open questions about emotions.
2. The method of claim 1, wherein the asking the psychological consultant questions and acquiring the co-emotion response text of the psychological consultant comprises:
and obtaining the co-emotion response of the psychological consultant, and converting the co-emotion response into a co-emotion response text in a preset mode.
3. The method of claim 1, wherein the text processing and vectorizing the co-emotion response text to generate vectorized text features comprises:
preprocessing the co-emotion response text, and performing word segmentation processing on the preprocessed co-emotion response text to obtain the text characteristics;
and performing term frequency-inverse document frequency vectorization on the text features to generate the vectorized text features.
4. A co-emotion conversation training device for psychological consultants, comprising:
the acquisition module is used for asking questions of the psychological consultants and acquiring the co-emotion response text of the psychological consultants;
the processing module is used for carrying out text processing on the co-emotion response text, carrying out vectorization processing on the text, and generating vectorized text characteristics; and
the training module is used for inputting the vectorized text characteristics into a pre-trained exploratory technology classification and feedback machine learning model to obtain a co-emotion technology category and similarity, determining the current co-emotion level of the psychological consultant based on the co-emotion technology category and the similarity, and continuing to conduct co-emotion interview training or ending co-emotion interview training according to the training target of the psychological consultant;
the model construction module is used for acquiring training samples in real interviews by utilizing co-emotion opportunities defined by a co-emotion communication coding system before inputting the text characteristics into a pre-trained exploratory technology classification and feedback machine learning model; training a machine learning model of the exploratory technique classification and feedback based on the training samples; wherein the co-morbid opportunity comprises direct expression of emotion, direct expression of challenge and direct expression of progression;
the acquisition module is used for acquiring the co-emotion response of the psychological consultant and converting the co-emotion response into a co-emotion response text in a preset mode;
the processing module is configured to preprocess the co-emotion response text, perform word segmentation on the preprocessed co-emotion response text to obtain text features, and perform term frequency-inverse document frequency vectorization on the text features to generate vectorized text features;
the training module is used for obtaining the actual technical classification of the co-emotion response text according to a machine learning algorithm, and obtaining the co-emotion level responded by the psychological consultant according to the similarity between the co-emotion response text of the psychological consultant and a preset text;
the method comprises the steps of obtaining training samples in real interviews by utilizing co-emotion opportunities defined by a co-emotion communication coding system, training a machine learning model of exploratory technology classification and feedback based on the training samples, and comprising the following steps: co-estrus opportunity tagging: introducing a co-emotion communication coding system, extracting a sample from first hand data of consultation conversation, marking and rewriting co-emotion opportunities, and selecting the co-emotion opportunities with the rewritten number of words being a preset number of words as training samples; the co-emotion communication coding system comprises the steps of identifying co-emotion opportunities created by patients, and coding the response of medical staff to the co-emotion opportunities, wherein the coded response of the medical staff to the co-emotion opportunities is used as a co-emotion response text of an expert;
co-emotion response writing: performing text responses to the samples by utilizing the technologies of the exploration stage in the three-stage helping skills model to obtain the co-emotion response text of the user;
data preprocessing: preprocessing the co-emotion response text of the user and the co-emotion response text of the expert respectively, and performing word segmentation on the preprocessed co-emotion response text;
text vectorization: performing term frequency-inverse document frequency vectorization on the segmented text respectively, converting the text into numbers that a computer can process, so as to obtain the text features of the user response and the text features of the expert response respectively;
similarity calculation: calculating cosine distance between text features of the user response and the expert response, and taking the cosine distance as similarity between the user response and the expert response;
machine learning classification: training on the training samples to obtain a classification result, wherein the classes comprise: restatement, open questions about ideas, emotion reflection, emotion exposure, and open questions about emotions.
5. An electronic device, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor executing the program to implement the co-emotion conversation training method for psychological consultants of any one of claims 1-3.
6. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the co-emotion conversation training method for psychological consultants of any one of claims 1-3.
CN202210028103.3A 2022-01-11 2022-01-11 Co-emotion conversation training method, device and equipment for psychological consultants and storage medium Active CN114418115B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210028103.3A CN114418115B (en) 2022-01-11 2022-01-11 Co-emotion conversation training method, device and equipment for psychological consultants and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210028103.3A CN114418115B (en) 2022-01-11 2022-01-11 Co-emotion conversation training method, device and equipment for psychological consultants and storage medium

Publications (2)

Publication Number Publication Date
CN114418115A CN114418115A (en) 2022-04-29
CN114418115B true CN114418115B (en) 2023-09-12

Family

ID=81274023

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210028103.3A Active CN114418115B (en) 2022-01-11 2022-01-11 Co-emotion conversation training method, device and equipment for psychological consultants and storage medium

Country Status (1)

Country Link
CN (1) CN114418115B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2021105938A4 (en) * 2021-08-19 2021-12-09 Choudhary, Deepak MR Automatic and dynamic contextual analysis of sentiment of social content and feedback reviews based on machine learning model

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9300790B2 (en) * 2005-06-24 2016-03-29 Securus Technologies, Inc. Multi-party conversation analyzer and logger
CN104050361B (en) * 2014-06-04 2017-06-23 杭州华亭科技有限公司 A kind of intellectual analysis method for early warning of prison prisoner danger sexual orientation
CN106156850A (en) * 2015-04-24 2016-11-23 江苏卓顿信息科技有限公司 A kind of psychological consultant's robot system based on cloud computing
CN108021703B (en) * 2017-12-26 2021-12-24 广西师范大学 Conversation type intelligent teaching system
CN108596523A (en) * 2018-05-29 2018-09-28 黑龙江省经济管理干部学院 One kind being used for the outcome-based teaching system of teachers ' teaching
CN109523853A (en) * 2018-12-05 2019-03-26 陈庆云 A kind of psychological consultation practice ability training auxiliary system
CN109805944B (en) * 2019-01-02 2021-10-29 华中师范大学 Children's ability analytic system that shares feelings
CN111292835A (en) * 2020-03-04 2020-06-16 上海市精神卫生中心(上海市心理咨询培训中心) Substance addiction patient psychological intervention method, system and storage device
CN112580953A (en) * 2020-12-11 2021-03-30 深圳市乐知网络科技有限公司 Consulting person ability evaluation method
CN112581015B (en) * 2020-12-28 2024-02-09 北京智能工场科技有限公司 Consultant quality assessment system and assessment method based on AI (advanced technology attachment) test
CN113407677B (en) * 2021-06-28 2023-11-14 北京百度网讯科技有限公司 Method, apparatus, device and storage medium for evaluating consultation dialogue quality

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2021105938A4 (en) * 2021-08-19 2021-12-09 Choudhary, Deepak MR Automatic and dynamic contextual analysis of sentiment of social content and feedback reviews based on machine learning model

Also Published As

Publication number Publication date
CN114418115A (en) 2022-04-29

Similar Documents

Publication Publication Date Title
Schuller et al. A review on five recent and near-future developments in computational processing of emotion in the human voice
Saffran Statistical language learning: Mechanisms and constraints
Hasan et al. A study of the effectiveness of machine learning methods for classification of clinical interview fragments into a large number of categories
CN108491486B (en) Method, device, terminal equipment and storage medium for simulating patient inquiry dialogue
US20060190804A1 (en) Writing and reading aid system
Kumar et al. A deep learning approaches and fastai text classification to predict 25 medical diseases from medical speech utterances, transcription and intent
US20150248397A1 (en) Computer-Implemented Systems and Methods for Measuring Discourse Coherence
Roberts et al. When parallel processing in visual word recognition is not enough: New evidence from naming
CN111145903A (en) Method and device for acquiring vertigo inquiry text, electronic equipment and inquiry system
CN116616770A (en) Multimode depression screening and evaluating method and system based on voice semantic analysis
Schmalz et al. Quantifying the reliance on different sublexical correspondences in German and English
Cross et al. Mini Pinyin: A modified miniature language for studying language learning and incremental sentence processing
Wang et al. RETRACTED: Research on automatic evaluation method of Mandarin Chinese pronunciation based on 5G network and FPGA
Ross A new frontier: AI and ancient language pedagogy
Avishka et al. Mobile app to support people with dyslexia and dysgraphia
Danner et al. Advancing Mental Health Diagnostics: GPT-Based Method for Depression Detection
CN114418115B (en) Co-emotion conversation training method, device and equipment for psychological consultants and storage medium
Nguyen et al. Designing ai-based conversational agent for diabetes care in a multilingual context
Muthumal et al. Mobile and Simulation-based Approach to reduce the Dyslexia with children Learning Disabilities
Agrima et al. Emotion recognition from syllabic units using k-nearest-neighbor classification and energy distribution
CN114300127A (en) Method, device, equipment and storage medium for inquiry processing
Zhou et al. Automatic patient note assessment without strong supervision
Vuyyuru et al. A Transformer-CNN Hybrid Model for Cognitive Behavioral Therapy in Psychological Assessment and Intervention for Enhanced Diagnostic Accuracy and Treatment Efficiency
Allouche Assisting children with special needs in their daily interaction with other people
Barros et al. Understanding public speakers’ performance: First contributions to support a computational approach

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant