CN112860995A - Interaction method, device, client, server and storage medium - Google Patents

Interaction method, device, client, server and storage medium

Info

Publication number
CN112860995A
CN112860995A
Authority
CN
China
Prior art keywords: search, emotion, search word, target, confidence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110166690.8A
Other languages
Chinese (zh)
Inventor
袁杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202110166690.8A
Publication of CN112860995A
Legal status: Pending

Classifications

    • G06F16/9535: Information retrieval; querying by web search engines; search customisation based on user profiles and personalisation
    • G06F40/30: Handling natural language data; semantic analysis
    • G16H20/70: ICT specially adapted for therapies or health-improving plans relating to mental therapies, e.g. psychological therapy or autogenous training
    • G16H80/00: ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Epidemiology (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • Primary Health Care (AREA)
  • Psychiatry (AREA)
  • Developmental Disabilities (AREA)
  • Hospice & Palliative Care (AREA)
  • Child & Adolescent Psychology (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Linguistics (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an interaction method, an interaction device, a client, a server and a storage medium, and relates to the technical field of artificial intelligence, in particular to the fields of natural language processing and deep learning. The specific implementation scheme is as follows: in response to a user operation performed on a search page, a first search word is obtained and sent to a server; the server generates a search word sequence from the first search word and a plurality of second search words adopted in historical searches, and performs emotion recognition on the search word sequence with a trained emotion recognition model to obtain the confidence of a target emotion; after obtaining the confidence of the target emotion from the server, the client displays a guidance animation corresponding to the target emotion on the search page when that confidence is greater than a confidence threshold. In this way, states such as a user's negative emotions can be discovered early from the search words the user inputs, so that the negative emotions can be guided.

Description

Interaction method, device, client, server and storage medium
Technical Field
The application discloses an interaction method, an interaction device, a client, a server and a storage medium, and relates to the technical field of artificial intelligence, in particular to the fields of natural language processing and deep learning.
Background
Material conditions in today's society are developing rapidly, but the inner world of the individual faces a crisis, and more and more young people slide toward depression and even self-harm. As a service that users access at high frequency every day, a search engine fits this scene: it records many traces of a user's inner struggles, can offer the user humanistic care in specific scenarios while intruding on the user's privacy as little as possible, and in doing so lets the internet platform fulfill its social responsibility.
Disclosure of Invention
The application provides an interaction method, an interaction apparatus, a client, a server and a storage medium.
According to an aspect of the present application, there is provided an interaction method, including:
responding to user operation executed on a search page to obtain a first search word;
sending the first search word to a server; the first search word is used for generating a search word sequence with a plurality of second search words adopted by historical search, and emotion recognition is carried out on the search word sequence by adopting a trained emotion recognition model so as to obtain the confidence coefficient of a target emotion;
obtaining a confidence level of the target emotion from the server;
and displaying a guiding animation corresponding to the target emotion on the search page under the condition that the confidence of the target emotion is greater than a confidence threshold.
According to another aspect of the present application, there is provided another interaction method, including:
acquiring a first search word sent by a client in response to user operation of a search page;
querying the history record of the client to obtain a plurality of second search terms adopted by history search;
generating a search word sequence according to the first search word and the plurality of second search words;
performing emotion recognition on the search word sequence by adopting a trained emotion recognition model to obtain the confidence coefficient of the target emotion;
and sending the confidence to the client, so that the client displays a guiding animation corresponding to the target emotion on the search page under the condition that the confidence is greater than a confidence threshold.
According to another aspect of the present application, there is provided an interaction apparatus, including:
the response module is used for responding to user operation executed on the search page to obtain a first search word;
the sending module is used for sending the first search word to a server; the first search word is used for generating a search word sequence with a plurality of second search words adopted by historical search, and emotion recognition is carried out on the search word sequence by adopting a trained emotion recognition model so as to obtain the confidence coefficient of a target emotion;
an obtaining module for obtaining the confidence of the target emotion from the server;
and the display module is used for displaying the guiding animation corresponding to the target emotion on the search page under the condition that the confidence coefficient of the target emotion is greater than the confidence coefficient threshold value.
According to another aspect of the present application, there is provided another interaction apparatus, including:
the acquisition module is used for acquiring a first search word sent by a client in response to the user operation of a search page;
the query module is used for querying the historical records of the client to obtain a plurality of second search terms adopted by historical search;
the generating module is used for generating a search word sequence according to the first search word and the plurality of second search words;
the recognition module is used for recognizing the emotion of the search word sequence by adopting a trained emotion recognition model so as to obtain the confidence coefficient of the target emotion;
and the sending module is used for sending the confidence to the client so that the client displays the guiding animation corresponding to the target emotion on the search page under the condition that the confidence is greater than a confidence threshold.
According to another aspect of the present application, there is provided a client comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of interaction of the embodiments of the above-described aspect.
According to another aspect of the present application, there is provided a server comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of interacting as described in the above further aspect embodiment.
According to another aspect of the present application, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing the computer to perform the interaction method of the above embodiment.
According to another aspect of the present application, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the interaction method of the above embodiments.
Embodiments of the above application have the following advantage or benefit: states such as a user's negative emotions are discovered early from the search words the user inputs, so that the negative emotions can be guided.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present application, nor do they limit the scope of the present application. Other features of the present application will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
fig. 1 is a schematic flowchart of an interaction method according to an embodiment of the present application;
fig. 2 is a schematic flowchart of an interaction method according to a second embodiment of the present application;
fig. 3 is a schematic flowchart of an interaction method provided in the third embodiment of the present application;
fig. 4 is a schematic flowchart of an interaction method according to a fourth embodiment of the present application;
fig. 5 is a schematic flowchart of an interaction method according to a fifth embodiment of the present application;
fig. 6 is a schematic structural diagram of an interaction apparatus according to a sixth embodiment of the present application;
fig. 7 is a schematic structural diagram of an interaction apparatus according to a seventh embodiment of the present application;
fig. 8 is a block diagram of a client for implementing the interaction method of the embodiment of the present application.
Detailed Description
The following description of the exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments of the application for the understanding of the same, which are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Existing search engines can perform big-data mining on users' massive search logs, build user profiles according to business requirements, and apply the mining results to specific occasions, such as personalized advertisement recommendation.
In existing user behavior analysis based on data mining, most applications are advertisement recommendation, which works well commercially. However, as the internet develops comprehensively and becomes woven into every aspect of society, most internet platforms remain particularly lacking in humanistic construction.
In this application, semantic recognition and emotion analysis are performed on the search words a user inputs into a search engine to determine the confidence of the user's current target emotion, and when that confidence is greater than a confidence threshold, a guidance picture corresponding to the target emotion is displayed on the search page.
An interaction method, an apparatus, a device, and a storage medium according to embodiments of the present application are described below with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of an interaction method according to an embodiment of the present application.
The embodiment of the present application is exemplified by the interaction method being configured in an interaction device, and the interaction device may be applied in any client, so that the client may execute an interaction function.
The client may be a personal computer (PC), a cloud device, a mobile device, and the like; the mobile device may be a hardware device running any of various operating systems, such as a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or an in-vehicle device.
As shown in fig. 1, the interaction method, executed by a client, may include the following steps:
step 101, in response to a user operation performed on a search page, a first search term is obtained.
The search page may be a search page of various search engines, and the search engine is not limited in this application.
For ease of distinction, this application refers to the search word the user currently inputs on the search page as the first search word, and to search words the user input at historical times as second search words. Of course, other naming conventions may be used, and no limitation is imposed here.
In the embodiment of the application, after the user inputs the search word on the search page, the client responds to the input operation executed by the user on the search page, and can acquire the first search word input by the user.
It should be explained that the user operation may be an operation of inputting a search term by a voice input manner, an operation of inputting a search term by a manual input manner, and the like, and is not limited herein.
Step 102, sending the first search term to a server.
The first search word is used, together with a plurality of second search words adopted in historical searches, to generate a search word sequence.
In the embodiment of the application, the client responds to the user operation input by the user on the search page, and after the first search word is obtained, the first search word can be sent to the server.
The server can obtain, from a historical search behavior log, a plurality of second search terms the user adopted on the search page at historical times; the historical search behavior log stores the search words the user input on the search page over historical time. Alternatively, the server can query the client's history record to obtain the plurality of second search terms used in the user's historical searches.
After receiving the first search word sent by the client, the server may generate a search word sequence from the first search word and the plurality of second search words.
It should be noted that, when a search word sequence is generated according to the first search word and the plurality of second search words, the ordering of the first search word and the plurality of second search words is not limited, and the first search word and the plurality of second search words may be ordered in any order.
Further, the server performs emotion recognition on the search word sequence by adopting the trained emotion recognition model to obtain the confidence coefficient of the target emotion.
The emotion recognition model is trained on a large number of training samples and can accurately perform emotion recognition on search terms to obtain the confidence of the target emotion. The training samples include text describing the target emotion.
The training samples in the present application may be search terms containing emotions stored by the server, or search terms containing emotions input by users, and the like, which is not limited here.
Step 103, obtaining confidence of the target emotion from the server.
In the embodiment of the application, the server adopts the trained emotion recognition model to perform emotion recognition on the search word sequence so as to obtain the confidence coefficient of the target emotion, and then the confidence coefficient of the target emotion can be sent to the client, so that the client can acquire the confidence coefficient of the target emotion from the server.
And 104, displaying a guiding animation corresponding to the target emotion on the search page under the condition that the confidence coefficient of the target emotion is greater than the confidence coefficient threshold value.
The confidence threshold is a preset confidence value.
As a possible case, the confidence threshold may be a confidence value set by the client in response to a user operation.
It will be appreciated that the target emotion may be divided into a plurality of levels, with different levels corresponding to different confidence thresholds, which the user may set according to his or her own needs.
As an example, assuming the target emotion is depression, depression may be divided into three levels: low, medium and high. Each level has a corresponding confidence threshold, which the user can set according to his or her own needs.
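A purely illustrative sketch of such per-level thresholds follows; the numeric values and names are assumptions, not taken from the application.

```python
# Hypothetical per-level confidence thresholds for a target emotion
# (following the low / medium / high depression example above).
CONFIDENCE_THRESHOLDS = {
    "low": 0.5,
    "medium": 0.7,
    "high": 0.9,
}

def set_threshold(level: str, value: float) -> None:
    """Let the user adjust the threshold for a given level."""
    CONFIDENCE_THRESHOLDS[level] = value
```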
In the embodiment of the application, after the client side obtains the confidence degree of the target emotion from the server, the confidence degree of the target emotion is compared with the confidence degree threshold value, and the guiding animation corresponding to the target emotion is displayed on the searching page under the condition that the confidence degree of the target emotion is determined to be larger than the confidence degree threshold value.
As an example, assuming that the target emotion is a negative emotion, in the case where the confidence of determining the negative emotion is greater than the confidence threshold, a guidance picture corresponding to the negative emotion may be displayed on the search page.
In another possible case, after the client acquires the confidence degree of the target emotion from the server, the confidence degree of the target emotion is compared with a confidence degree threshold value, and under the condition that the confidence degree of the target emotion is determined to be smaller than or equal to the confidence degree threshold value, the search page jumps to a display page for displaying the search result according to the search terms.
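A minimal client-side sketch of steps 102 through 104 follows, assuming the server exposes an HTTP endpoint that returns a single confidence value; the URL, response field, and function names are hypothetical.

```python
import requests  # common third-party HTTP client; the application does not fix a transport

SERVER_URL = "https://search.example.com/api/emotion"  # hypothetical endpoint

def on_search(first_search_word: str, threshold: float) -> str:
    # Step 102: send the first search word to the server.
    resp = requests.post(SERVER_URL, json={"query": first_search_word}, timeout=5)
    resp.raise_for_status()
    # Step 103: obtain the confidence of the target emotion from the server.
    confidence = resp.json()["confidence"]  # hypothetical response field
    # Step 104: compare against the confidence threshold.
    if confidence > threshold:
        return "show_guidance_animation"
    return "show_search_results"
```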
With the interaction method of the embodiments of the application, a first search word is obtained in response to a user operation performed on the search page and is sent to the server; the server generates a search word sequence from the first search word and a plurality of second search words adopted in historical searches, and performs emotion recognition on the sequence with a trained emotion recognition model to obtain the confidence of the target emotion. After obtaining that confidence from the server, the client displays the guidance animation corresponding to the target emotion on the search page when the confidence is greater than the confidence threshold. In this way, states such as a user's negative emotions can be discovered early from the search words the user inputs, so that the negative emotions can be guided.
On the basis of the embodiment, after the search page displays the guide animation corresponding to the target emotion, the user can be guided to enter the emotion treatment communication page so as to intervene in states such as negative emotion of the user. Referring to fig. 2 for details, fig. 2 is a schematic flowchart of an interaction method according to a second embodiment of the present application.
As shown in fig. 2, the interaction method may include the following steps:
step 201, responding to a user operation executed on a search page, obtaining a first search term.
Step 202, sending a first search term to a server; the first search word is used for generating a search word sequence with a plurality of second search words adopted by historical search, and emotion recognition is carried out on the search word sequence by adopting a trained emotion recognition model so as to obtain the confidence coefficient of the target emotion.
Step 203, obtaining confidence of the target emotion from the server.
And 204, displaying a guiding animation corresponding to the target emotion on the search page under the condition that the confidence coefficient of the target emotion is greater than the confidence coefficient threshold value.
In the embodiment of the present application, the implementation process of step 201 to step 204 may refer to the implementation process of step 101 to step 104 in the above embodiment, which is not described herein again.
Step 205, displaying the target control.
The target control is a control which is displayed at the client and used for interaction between the user and the client. The target control may be an interesting animated character, or may be a control displayed in other forms, which is not limited herein.
As an example, in the case where the confidence of the target emotion is determined to be greater than the confidence threshold, an Easter-egg robot may be displayed on the search page, which guides the user to the therapeutic communication page in a cartoon-like manner.
And step 206, responding to the triggering operation of the target control, and displaying a target page corresponding to the target emotion.
In the embodiment of the application, after the client displays the target control, the client can prompt the user to click the target control in a text and/or voice mode so as to respond to the triggering operation of the user on the target control and display the target page corresponding to the target emotion.
In one possible case, in response to the triggering operation on the target control, the client can display, on the target page, a presentation area for presenting knowledge content of the target emotion. In this way, the user can learn about his or her own emotion and thus pay attention to physical and psychological health.
As an example, assuming that the target emotion is depression, a presentation area for presenting knowledge content of depression may be displayed on the target page. For example, the manifestation of depression, simple treatment, etc. may be displayed in the display area.
In another possible case, in response to the triggering operation on the target control, the client may display, on the target page, an interaction area for invoking the conversation service corresponding to the target emotion. In this way, the user can converse with a robot or a psychotherapist, among others, to relieve negative emotions.
Optionally, an interaction area for an artificial intelligence conversation service can be displayed on the target page, so that the user can communicate with an intelligent robot through the conversation service shown in the interaction area.
Optionally, an interaction area for a manual conversation service can be displayed on the target page, so that the user can communicate with a human through the manual conversation service shown in the interaction area.
It should be noted that, when the user communicates with the intelligent robot or the human, the user may communicate in a text input manner, in a voice manner, in a video manner, and the like, which is not limited herein.
In another possible case, in response to the triggering operation of the target control, the client may also simultaneously display, on the target page, a presentation area for presenting knowledge content of the target emotion and an interaction area for invoking a conversation service corresponding to the target emotion.
As an example, assuming that the target emotion is depression, an interaction area for invoking a dialogue service for depression and a presentation area for presenting knowledge content of depression may also be simultaneously displayed on the target page.
In the embodiment of the application, after the search page displays the guidance animation corresponding to the target emotion, a target control can be displayed so that, in response to a triggering operation on the target control, a target page corresponding to the target emotion is displayed. The user's negative emotion can then be relieved by communicating with the user on the target page.
In order to implement the above embodiments, the present application proposes another interaction method.
Fig. 3 is a schematic flowchart of an interaction method provided in the third embodiment of the present application.
As shown in fig. 3, the interaction method, executed by the server, may include the following steps:
step 301, obtaining a first search term sent by a client in response to a user operation of a search page.
The search page may be a search page of various search engines, and the search engine is not limited in this application.
In the embodiment of the application, after a user inputs a search word on a search page, a client responds to an input operation executed by the user on the search page, and can acquire a first search word input by the user, and then the client sends the first search word to a server, so that the server acquires the first search word sent by the client.
It should be explained that the user operation may be an operation of inputting a search term by a voice input manner, an operation of inputting a search term by a manual input manner, and the like, and is not limited herein.
Step 302, querying the history record of the client to obtain a plurality of second search terms used in the history search.
In the embodiment of the application, the server can query the historical search behavior log of the client to acquire a plurality of second search terms used in historical search of the user from the historical search behavior log.
Step 303, generating a search word sequence according to the first search word and the plurality of second search words.
The search word sequence refers to a sequence obtained by ordering a plurality of search words.
In the embodiment of the application, after the server obtains the plurality of search terms used by the historical search of the user, the server can generate the search term sequence according to the first search term and the plurality of second search terms.
It should be noted that, when a search word sequence is generated according to the first search word and the plurality of second search words, the ordering of the first search word and the plurality of second search words is not limited, and the first search word and the plurality of second search words may be ordered in any order.
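A minimal sketch of steps 302 and 303 follows, assuming the historical search behavior log is a simple in-memory mapping from a client identifier to past search words; all names are illustrative.

```python
from typing import Dict, List

# Hypothetical historical search behavior log: client id -> past search words.
HISTORY_LOG: Dict[str, List[str]] = {}

def build_search_word_sequence(client_id: str, first_search_word: str) -> List[str]:
    """Step 302: query the client's history for the second search words.
    Step 303: generate the search word sequence; since the application does
    not constrain the ordering, the history order is kept and the new
    (first) search word is appended."""
    second_search_words = HISTORY_LOG.get(client_id, [])
    return second_search_words + [first_search_word]
```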
And 304, performing emotion recognition on the search word sequence by adopting the trained emotion recognition model to obtain the confidence coefficient of the target emotion.
The emotion recognition model is trained on a large number of training samples and can accurately perform emotion recognition on search terms to obtain the confidence of the target emotion. The training samples include text describing the target emotion.
The training samples in the present application may be text describing the target emotion stored by the server, or text describing the target emotion input by users, and the like, which is not limited here.
In the embodiment of the application, after the emotion recognition model has been trained on a large number of training samples, the trained model can be used to perform emotion recognition on the search word sequence to obtain the confidence of the target emotion. Training in this way improves the accuracy of the model's emotion recognition on search word sequences.
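An illustrative training sketch follows, assuming the two-layer recognizer sketched in the fourth embodiment below and binary labels marking whether a sample describes the target emotion; the loss choice, shapes, and names are assumptions.

```python
import torch
import torch.nn as nn

def train_step(model: nn.Module, optimizer: torch.optim.Optimizer,
               token_ids: torch.Tensor, offsets: torch.Tensor,
               labels: torch.Tensor) -> float:
    """One update on a batch of training samples: search word sequences
    encoded as token ids, labelled 1 if the text describes the target emotion."""
    criterion = nn.BCELoss()  # the model outputs a confidence in [0, 1]
    optimizer.zero_grad()
    confidence = model(token_ids, offsets).squeeze(1)
    loss = criterion(confidence, labels.float())
    loss.backward()
    optimizer.step()
    return loss.item()
```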
And 305, sending the confidence level to the client so that the client displays the guiding animation corresponding to the target emotion on the search page under the condition that the confidence level is greater than the confidence level threshold value.
The confidence threshold is a preset confidence value.
As a possible case, the confidence threshold may be a confidence value set by the client in response to a user operation.
It will be appreciated that the target emotion may be divided into a plurality of levels, with different levels corresponding to different confidence thresholds, which the user may set according to his or her own needs.
As an example, assuming the target emotion is depression, depression may be divided into three levels: low, medium and high. Each level has a corresponding confidence threshold, which the user can set according to his or her own needs.
In the embodiment of the application, the server adopts the trained emotion recognition model to perform emotion recognition on the search word sequence, and after the confidence coefficient of the target emotion is obtained, the confidence coefficient of the target emotion can be sent to the client. After receiving the confidence level of the target emotion, the client compares the confidence level with a confidence level threshold value.
In a possible case, in a case that the confidence degree of the target emotion is determined to be greater than the confidence degree threshold value, a guiding animation corresponding to the target emotion is displayed on the searching page.
In another possible case, in a case where it is determined that the confidence of the target emotion is less than or equal to the confidence threshold, the search page jumps to a display page on which the search result is displayed according to the search word.
With the interaction method of the embodiments of the application, after obtaining the first search word sent by the client in response to a user operation on the search page, the server queries the client's history record to obtain a plurality of second search words adopted in historical searches, generates a search word sequence from the first search word and the second search words, and performs emotion recognition on the sequence with a trained emotion recognition model to obtain the confidence of the target emotion. The confidence is then sent to the client, so that the client displays the guidance animation corresponding to the target emotion on the search page when the confidence is greater than the confidence threshold. In this way, states such as a user's negative emotions can be discovered early from the search words the user inputs, so that the negative emotions can be guided.
In the above embodiments, it has been mentioned that emotion recognition is performed on a search word sequence using an emotion recognition model to obtain a confidence of a target emotion. Referring to fig. 4 for details, fig. 4 is a schematic flowchart of an interaction method according to a fourth embodiment of the present application.
As shown in fig. 4, the interaction method, executed by the server, may include the following steps:
step 401, inputting the search word sequence into an emotion recognition model, and performing semantic feature extraction on the search word sequence by using a feature extraction layer of the emotion recognition model to obtain semantic features of the search word sequence.
The feature extraction layer is used for extracting semantic features of the search word sequence to obtain the semantic features of the search word sequence.
In the embodiment of the application, when the trained emotion recognition model is used for emotion recognition of the search word sequence, the feature extraction layer of the emotion recognition model can be used for semantic feature extraction of the search word sequence to obtain the semantic features of the search words in the search word sequence.
Step 402, classifying semantic features by using a classification layer of the emotion recognition model to obtain a confidence coefficient belonging to the target emotion.
The classification layer is used for classifying the semantic features extracted by the feature extraction layer and scoring each search term.
In the embodiment of the application, the semantic features of the search word sequence are extracted by adopting the feature extraction layer of the emotion recognition model, after the semantic features of the search word sequence are obtained, the semantic features can be classified by adopting the classification layer so as to determine the target emotion to which each search word in the search word sequence belongs, and further, the confidence coefficient of the target emotion is obtained.
It should be explained that the feature extraction layer and the classification layer of the emotion recognition model are both trained by using training samples, and semantic features of the search word sequence and confidence degrees belonging to target emotions can be accurately extracted and obtained.
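A minimal sketch of the two-stage structure described above follows, assuming a bag-of-embeddings feature extractor and a linear classification layer; the architecture, sizes and names are assumptions, since the application does not fix a concrete network.

```python
import torch
import torch.nn as nn

class EmotionRecognizer(nn.Module):
    """Feature extraction layer (step 401) followed by a classification
    layer (step 402) yielding the confidence of the target emotion."""

    def __init__(self, vocab_size: int = 10000, embed_dim: int = 128):
        super().__init__()
        # Feature extraction: semantic features of the search word sequence.
        self.feature_extractor = nn.EmbeddingBag(vocab_size, embed_dim)
        # Classification: score the semantic features for the target emotion.
        self.classifier = nn.Linear(embed_dim, 1)

    def forward(self, token_ids: torch.Tensor,
                offsets: torch.Tensor) -> torch.Tensor:
        features = self.feature_extractor(token_ids, offsets)
        return torch.sigmoid(self.classifier(features))  # confidence in [0, 1]

# Usage sketch: score one search word sequence given as token ids.
model = EmotionRecognizer()
ids = torch.tensor([3, 17, 42])  # hypothetical token ids for the sequence
offsets = torch.tensor([0])      # a single sequence in the batch
confidence = model(ids, offsets).item()
```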
On the basis of the above embodiments, the embodiments of the present application provide an interaction method.
Fig. 5 is a schematic flowchart of an interaction method provided in the fifth embodiment of the present application.
As shown in fig. 5, the interaction method may include the following steps:
in step 501, a client responds to a user operation executed on a search page to obtain a first search term.
Step 502, the client sends a first search term to the server.
In step 503, the server queries the history of the client to obtain a plurality of second search terms used in the history search.
In step 504, the server generates a search word sequence according to the first search word and the plurality of second search words.
And 505, the server performs emotion recognition on the search word sequence by adopting the trained emotion recognition model to obtain the confidence coefficient of the target emotion.
Step 506, the server sends the confidence of the target emotion to the client.
In step 507, the client displays the guiding animation corresponding to the target emotion on the search page under the condition that the confidence coefficient is determined to be greater than the confidence coefficient threshold value.
Step 508, displaying the target control at the client.
In step 509, the client displays a target page corresponding to the target emotion in response to the trigger operation of the user on the target control.
It should be noted that, for the specific implementation process of the steps 501 to 509, reference may be made to the implementation process of each step in the foregoing embodiment, and details are not described here.
In order to implement the above embodiments, an interactive device is provided in an embodiment of the present application.
Fig. 6 is a schematic structural diagram of an interaction device according to a sixth embodiment of the present application.
As shown in fig. 6, the interactive apparatus 600 may include: a response module 610, a sending module 620, an obtaining module 630, and a presentation module 640.
The response module 610 is configured to obtain the first search term in response to a user operation performed on the search page.
A sending module 620, configured to send the first search term to a server; the first search word is used for generating a search word sequence with a plurality of second search words adopted by historical search, and emotion recognition is carried out on the search word sequence by adopting a trained emotion recognition model so as to obtain the confidence coefficient of the target emotion.
An obtaining module 630, configured to obtain the confidence level of the target emotion from the server.
And the presentation module 640 is configured to display a guidance animation corresponding to the target emotion on the search page when the confidence of the target emotion is greater than the confidence threshold.
In a possible case, the interaction apparatus 600 may further include:
the first display module is used for displaying the target control;
and the second display module is used for responding to the triggering operation of the target control and displaying the target page corresponding to the target emotion.
In another possible scenario, the target page includes at least one of the following:
the display area is used for displaying the knowledge content of the target emotion;
and the interaction area is used for calling the dialogue service corresponding to the target emotion.
In another possible case, the interaction apparatus 600 may further include:
and the setting module is used for setting the confidence coefficient threshold value in response to user operation.
It should be noted that the explanation of the interaction method embodiment in fig. 1 and fig. 2 is also applicable to the interaction apparatus, and is not repeated here.
With the interaction device of the embodiments of the application, a first search word is obtained in response to a user operation performed on the search page and is sent to the server, so that the server generates a search word sequence from the first search word and a plurality of second search words adopted in historical searches and performs emotion recognition on the sequence with a trained emotion recognition model to obtain the confidence of the target emotion. After the confidence of the target emotion is obtained from the server, the guidance animation corresponding to the target emotion is displayed on the search page when that confidence is greater than the confidence threshold. In this way, states such as a user's negative emotions can be discovered early from the search words the user inputs, so that the negative emotions can be guided.
In order to implement the above embodiments, the present application proposes another interaction device.
Fig. 7 is a schematic structural diagram of an interaction device according to a seventh embodiment of the present application.
As shown in fig. 7, the interactive apparatus 700 may include: an acquisition module 710, a query module 720, a generation module 730, an identification module 740, and a sending module 750.
The obtaining module 710 is configured to obtain a first search term sent by the client in response to a user operation of the search page.
And the query module 720 is configured to query the history record of the client to obtain a plurality of second search terms used in the history search.
The generating module 730 is configured to generate a search word sequence according to the first search word and the plurality of second search words.
And the recognition module 740 is configured to perform emotion recognition on the search word sequence by using the trained emotion recognition model to obtain a confidence of the target emotion.
And a sending module 750, configured to send the confidence level to the client, so that the client displays a guidance animation corresponding to the target emotion on the search page when the confidence level is greater than the confidence level threshold.
The identifying module 740, where possible, may be further configured to:
inputting the search word sequence into an emotion recognition model, and performing semantic feature extraction on the search word sequence by adopting a feature extraction layer of the emotion recognition model to obtain semantic features of the search word sequence;
and classifying semantic features by adopting a classification layer of the emotion recognition model to obtain confidence degrees belonging to the target emotion.
In another possible case, the emotion recognition model is obtained by training a training sample; wherein the training samples comprise text describing the target emotion.
It should be noted that the explanation of the interaction method embodiment in fig. 3 and fig. 4 is also applicable to the interaction apparatus, and is not repeated here.
With the interaction apparatus of the embodiments of the application, after obtaining the first search word sent by the client in response to a user operation on the search page, the server queries the client's history record to obtain a plurality of second search words adopted in historical searches, generates a search word sequence from the first search word and the second search words, and performs emotion recognition on the sequence with a trained emotion recognition model to obtain the confidence of the target emotion. The confidence is then sent to the client, so that the client displays the guidance animation corresponding to the target emotion on the search page when the confidence is greater than the confidence threshold. In this way, states such as a user's negative emotions can be discovered early from the search words the user inputs, so that the negative emotions can be guided.
There is also provided, in accordance with an embodiment of the present application, a client, a server, a readable storage medium, and a computer program product.
In order to implement the foregoing embodiments, the present application provides a client, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the interaction method of fig. 1 or 2.
In order to implement the above embodiments, the present application provides a server, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the interaction method of fig. 3 or 4.
In order to implement the above embodiments, the present application proposes a non-transitory computer-readable storage medium storing computer instructions for causing the computer to perform the interaction method of fig. 1 or 2, or perform the interaction method of fig. 3 or 4.
In order to implement the above embodiments, the present application proposes a computer program product comprising a computer program which, when executed by a processor, implements the interaction method of fig. 1 or 2, or implements the interaction method described in fig. 3 or 4.
Fig. 8 shows a schematic block diagram of an example client 800 that may be used to implement embodiments of the present application. The client is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The client may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the present application that are described and/or claimed herein.
As shown in fig. 8, the device 800 includes a computing unit 801 that can perform various appropriate actions and processes in accordance with a computer program stored in a ROM (Read-Only Memory) 802 or a computer program loaded from a storage unit 808 into a RAM (Random Access Memory) 803. In the RAM 803, various programs and data required for the operation of the device 800 can also be stored. The computing unit 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An I/O (Input/Output) interface 805 is also connected to the bus 804.
A number of components in the device 800 are connected to the I/O interface 805, including: an input unit 806, such as a keyboard, a mouse, or the like; an output unit 807 such as various types of displays, speakers, and the like; a storage unit 808, such as a magnetic disk, optical disk, or the like; and a communication unit 809 such as a network card, modem, wireless communication transceiver, etc. The communication unit 809 allows the device 800 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The computing unit 801 may be any of various general and/or special purpose processing components with processing and computing capabilities. Some examples of the computing unit 801 include, but are not limited to, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), various dedicated AI (Artificial Intelligence) computing chips, various computing units running machine learning model algorithms, a DSP (Digital Signal Processor), and any suitable processor, controller, microcontroller, and the like. The computing unit 801 performs the various methods and processes described above, such as the interaction method. For example, in some embodiments, the interaction method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 808. In some embodiments, part or all of the computer program can be loaded and/or installed onto the device 800 via the ROM 802 and/or the communication unit 809. When the computer program is loaded into the RAM 803 and executed by the computing unit 801, one or more steps of the interaction method described above may be performed. Alternatively, in other embodiments, the computing unit 801 may be configured to perform the interaction method in any other suitable manner (e.g., by way of firmware).
Various implementations of the systems and techniques described above may be realized in digital electronic circuitry, integrated circuitry, FPGAs (Field-Programmable Gate Arrays), ASICs (Application-Specific Integrated Circuits), ASSPs (Application-Specific Standard Products), SOCs (Systems On Chip), CPLDs (Complex Programmable Logic Devices), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present application may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this application, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a RAM, a ROM, an EPROM (Electrically Programmable Read-Only-Memory) or flash Memory, an optical fiber, a CD-ROM (Compact Disc Read-Only-Memory), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: LAN (Local Area Network), WAN (Wide Area Network), internet, and blockchain Network.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or cloud host, which is a host product in the cloud computing service system and overcomes the defects of difficult management and weak service scalability found in traditional physical hosts and VPS (Virtual Private Server) services.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present application can be achieved, and the present invention is not limited herein.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (18)

1. An interaction method, comprising:
responding to user operation executed on a search page to obtain a first search word;
sending the first search word to a server; the first search word is used for generating a search word sequence with a plurality of second search words adopted by historical search, and emotion recognition is carried out on the search word sequence by adopting a trained emotion recognition model so as to obtain the confidence coefficient of a target emotion;
obtaining a confidence level of the target emotion from the server;
and displaying a guiding animation corresponding to the target emotion on the search page under the condition that the confidence of the target emotion is greater than a confidence threshold.
2. The interaction method according to claim 1, wherein after the search page displays the guidance animation corresponding to the target emotion, further comprising:
displaying a target control;
and responding to the triggering operation of the target control, and displaying a target page corresponding to the target emotion.
3. The interaction method according to claim 2, wherein the target page comprises at least one of the following:
a display area for displaying the knowledge content of the target emotion;
and the interaction area is used for calling the dialogue service corresponding to the target emotion.
4. The interaction method according to any one of claims 1-3, wherein the method further comprises:
setting the confidence threshold in response to a user action.
5. An interaction method, comprising:
acquiring a first search word sent by a client in response to user operation of a search page;
querying the history record of the client to obtain a plurality of second search terms adopted by history search;
generating a search word sequence according to the first search word and the plurality of second search words;
performing emotion recognition on the search word sequence by adopting a trained emotion recognition model to obtain the confidence coefficient of the target emotion;
and sending the confidence to the client, so that the client displays a guiding animation corresponding to the target emotion on the search page under the condition that the confidence is greater than a confidence threshold.
6. The interaction method according to claim 5, wherein the performing emotion recognition on the search word sequence by using the trained emotion recognition model to obtain the confidence of the target emotion comprises:
inputting the search word sequence into the emotion recognition model, and performing semantic feature extraction on the search word sequence by using a feature extraction layer of the emotion recognition model to obtain semantic features of the search word sequence;
and classifying the semantic features by using a classification layer of the emotion recognition model to obtain the confidence of the target emotion.
7. The interaction method according to claim 5 or 6, wherein
the emotion recognition model is obtained by training with training samples, wherein the training samples comprise texts describing the target emotion.
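Claims 6 and 7 describe a two-stage model: a feature extraction layer that produces semantic features, followed by a classification layer that produces the confidence of the target emotion. A non-limiting PyTorch sketch is given below; the bag-of-embeddings encoder, the layer sizes, and the sigmoid output are assumptions, since the claims do not fix a concrete architecture. Per claim 7, such a model would be trained on samples whose texts describe the target emotion, for example with a binary cross-entropy objective (also an assumption).

```python
import torch
import torch.nn as nn


class EmotionRecognitionModel(nn.Module):
    def __init__(self, vocab_size: int = 10000, embed_dim: int = 128):
        super().__init__()
        # Feature extraction layer: maps the tokenized search word sequence
        # to semantic features (claim 6, first step).
        self.feature_extraction_layer = nn.EmbeddingBag(vocab_size, embed_dim)
        # Classification layer: maps semantic features to a single logit for
        # the target emotion (claim 6, second step).
        self.classification_layer = nn.Linear(embed_dim, 1)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, sequence_length) integer tensor.
        semantic_features = self.feature_extraction_layer(token_ids)
        logit = self.classification_layer(semantic_features)
        return torch.sigmoid(logit)  # confidence of the target emotion in [0, 1]
```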
8. An interaction device, comprising:
a response module configured to obtain a first search word in response to a user operation performed on a search page;
a sending module configured to send the first search word to a server, wherein the first search word is used, together with a plurality of second search words adopted in historical searches, to generate a search word sequence, and emotion recognition is performed on the search word sequence by a trained emotion recognition model to obtain a confidence of a target emotion;
an obtaining module configured to obtain the confidence of the target emotion from the server;
and a display module configured to display a guiding animation corresponding to the target emotion on the search page in a case where the confidence of the target emotion is greater than a confidence threshold.
9. The interaction device according to claim 8, further comprising:
a first display module configured to display a target control;
and a second display module configured to display, in response to a triggering operation on the target control, a target page corresponding to the target emotion.
10. The interaction device according to claim 9, wherein the target page comprises at least one of:
a display area for displaying knowledge content of the target emotion;
and an interaction area for invoking a dialogue service corresponding to the target emotion.
11. The interaction device according to any one of claims 8-10, further comprising:
a setting module configured to set the confidence threshold in response to a user operation.
12. An interaction device, comprising:
an acquisition module configured to acquire a first search word sent by a client in response to a user operation on a search page;
a query module configured to query a history record of the client to obtain a plurality of second search words adopted in historical searches;
a generating module configured to generate a search word sequence according to the first search word and the plurality of second search words;
a recognition module configured to perform emotion recognition on the search word sequence by using a trained emotion recognition model to obtain a confidence of a target emotion;
and a sending module configured to send the confidence to the client, so that the client displays a guiding animation corresponding to the target emotion on the search page in a case where the confidence is greater than a confidence threshold.
13. The interaction device according to claim 12, wherein the recognition module is further configured to:
input the search word sequence into the emotion recognition model, and perform semantic feature extraction on the search word sequence by using a feature extraction layer of the emotion recognition model to obtain semantic features of the search word sequence;
and classify the semantic features by using a classification layer of the emotion recognition model to obtain the confidence of the target emotion.
14. The interaction device according to claim 12 or 13, wherein
the emotion recognition model is obtained by training with training samples, wherein the training samples comprise texts describing the target emotion.
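The device claims mirror the method claims module by module. As one illustrative composition, the server-side device of claims 12-14 could wire its modules as below; modeling each module as a callable, and all names, are assumptions made for this sketch only.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class ServerInteractionDevice:
    # One field per module of claim 12; each is modeled here as a callable,
    # though real modules could equally be classes, services, or hardware.
    acquire: Callable[[], str]                                # acquisition module
    query_history: Callable[[], List[str]]                    # query module
    generate_sequence: Callable[[str, List[str]], List[str]]  # generating module
    recognize: Callable[[List[str]], float]                   # recognition module
    send: Callable[[float], None]                             # sending module

    def run(self) -> None:
        first_search_word = self.acquire()
        second_search_words = self.query_history()
        sequence = self.generate_sequence(first_search_word, second_search_words)
        confidence = self.recognize(sequence)
        self.send(confidence)
```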
15. A client, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the interaction method of any one of claims 1-4 or the interaction method of any one of claims 5-7.
16. A server, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the interaction method of any one of claims 5-7.
17. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the interaction method of any one of claims 1-4 or the interaction method of any one of claims 5-7.
18. A computer program product comprising a computer program which, when executed by a processor, implements the interaction method of any one of claims 1-4 or the interaction method of any one of claims 5-7.
CN202110166690.8A 2021-02-04 2021-02-04 Interaction method, device, client, server and storage medium Pending CN112860995A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110166690.8A CN112860995A (en) 2021-02-04 2021-02-04 Interaction method, device, client, server and storage medium

Publications (1)

Publication Number Publication Date
CN112860995A true CN112860995A (en) 2021-05-28

Family

ID=75988905

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110166690.8A Pending CN112860995A (en) 2021-02-04 2021-02-04 Interaction method, device, client, server and storage medium

Country Status (1)

Country Link
CN (1) CN112860995A (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103137043A (en) * 2011-11-23 2013-06-05 财团法人资讯工业策进会 Advertisement display system and advertisement display method in combination with search engine service
CN104063418A (en) * 2014-03-17 2014-09-24 百度在线网络技术(北京)有限公司 Search recommendation method and device
CN105224554A (en) * 2014-06-11 2016-01-06 阿里巴巴集团控股有限公司 Search word is recommended to carry out method, system, server and the intelligent terminal searched for
CN105260416A (en) * 2015-09-25 2016-01-20 百度在线网络技术(北京)有限公司 Voice recognition based searching method and apparatus
CN106649409A (en) * 2015-11-04 2017-05-10 陈包容 Method and apparatus for displaying search result based on scene information
CN105975492A (en) * 2016-04-26 2016-09-28 乐视控股(北京)有限公司 Search term prompt method and device
CN106407287A (en) * 2016-08-29 2017-02-15 宇龙计算机通信科技(深圳)有限公司 Multimedia resource pushing method and system
CN107506349A (en) * 2017-08-04 2017-12-22 卓智网络科技有限公司 A kind of user's negative emotions Forecasting Methodology and system based on network log
CN107515928A (en) * 2017-08-25 2017-12-26 百度在线网络技术(北京)有限公司 A kind of method, apparatus, server, storage medium for judging assets price tendency
CN108280200A (en) * 2018-01-29 2018-07-13 百度在线网络技术(北京)有限公司 Method and apparatus for pushed information
CN108763545A (en) * 2018-05-31 2018-11-06 深圳市零度智控科技有限公司 Negative emotions interference method, device and readable storage medium storing program for executing, terminal device
CN111816211A (en) * 2019-04-09 2020-10-23 Oppo广东移动通信有限公司 Emotion recognition method and device, storage medium and electronic equipment
CN110377726A (en) * 2019-06-05 2019-10-25 特斯联(北京)科技有限公司 A kind of artificial intelligence realization natural language text Emotion identification method and apparatus
CN111651586A (en) * 2020-05-29 2020-09-11 北京小米松果电子有限公司 Rule template generation method for text classification, classification method and device, and medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113572893A (en) * 2021-07-13 2021-10-29 青岛海信移动通信技术股份有限公司 Terminal device, emotion feedback method and storage medium
CN113572893B (en) * 2021-07-13 2023-03-14 青岛海信移动通信技术股份有限公司 Terminal device, emotion feedback method and storage medium
CN114861057A (en) * 2022-05-17 2022-08-05 北京百度网讯科技有限公司 Resource sending method, training of recommendation model and device
CN114861057B (en) * 2022-05-17 2023-05-30 北京百度网讯科技有限公司 Resource sending method, training of recommendation model and device

Similar Documents

Publication Publication Date Title
CN112487173B (en) Man-machine conversation method, device and storage medium
CN115309877B (en) Dialogue generation method, dialogue model training method and device
CN113407850B (en) Method and device for determining and acquiring virtual image and electronic equipment
CN112579909A (en) Object recommendation method and device, computer equipment and medium
EP4113357A1 (en) Method and apparatus for recognizing entity, electronic device and storage medium
CN116501960B (en) Content retrieval method, device, equipment and medium
US20230342667A1 (en) Classification model training method, semantic classification method, device and medium
CN112148850A (en) Dynamic interaction method, server, electronic device and storage medium
CN112860995A (en) Interaction method, device, client, server and storage medium
CN114625855A (en) Method, apparatus, device and medium for generating dialogue information
CN113360001A (en) Input text processing method and device, electronic equipment and storage medium
CN112818227A (en) Content recommendation method and device, electronic equipment and storage medium
CN116521841A (en) Method, device, equipment and medium for generating reply information
CN112559715B (en) Attitude identification method, device, equipment and storage medium
CN112910761A (en) Instant messaging method, device, equipment, storage medium and program product
US20230206007A1 (en) Method for mining conversation content and method for generating conversation content evaluation model
CN117033587A (en) Man-machine interaction method and device, electronic equipment and medium
CN116257690A (en) Resource recommendation method and device, electronic equipment and storage medium
CN112784599B (en) Method and device for generating poem, electronic equipment and storage medium
CN109002498A (en) Interactive method, device, equipment and storage medium
CN114118937A (en) Information recommendation method and device based on task, electronic equipment and storage medium
CN114490986A (en) Computer-implemented data mining method, computer-implemented data mining device, electronic device, and storage medium
CN112817463A (en) Method, equipment and storage medium for acquiring audio data by input method
CN113553413A (en) Dialog state generation method and device, electronic equipment and storage medium
CN113342179A (en) Input text processing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination