CN112487164A - Artificial intelligence interaction method - Google Patents

Artificial intelligence interaction method

Info

Publication number
CN112487164A
Authority
CN
China
Prior art keywords: interactive, statement, interaction, user, content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011389775.4A
Other languages
Chinese (zh)
Inventor
王晓东
梁镇爽
张慧
张扬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Global Tone Communication Technology Co ltd
Original Assignee
Global Tone Communication Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Global Tone Communication Technology Co ltd filed Critical Global Tone Communication Technology Co ltd
Priority to CN202011389775.4A priority Critical patent/CN112487164A/en
Publication of CN112487164A publication Critical patent/CN112487164A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33Querying
    • G06F16/332Query formulation
    • G06F16/3329Natural language query formulation or dialogue systems

Abstract

The embodiments of the invention relate to the technical field of artificial intelligence interaction and, in particular, disclose an artificial intelligence interaction method. In the artificial intelligence interaction method provided by the embodiments of the invention, interactive statements input through an interactive terminal are acquired, the interactive statements comprising a first interactive statement and a second interactive statement; whether the first interactive statement and the second interactive statement are associated is determined; interactive content is then determined according to the user intention of the applicable interactive statement and sent to the interactive terminal for expression. This solves the prior-art problem that the user intention is difficult to determine, which affects the accuracy of the interaction and causes more errors in the results returned to meet the user's needs. Moreover, either the generated interaction result corresponding to the interactive statement or the corresponding search result can be presented to the user as the interactive content, which makes the interaction modes more diverse and increases the accuracy and interest of the interaction.

Description

Artificial intelligence interaction method
Technical Field
The embodiments of the invention relate to the technical field of artificial intelligence interaction, and in particular to an artificial intelligence interaction method.
Background
With the development of artificial intelligence conversation systems, human-computer interaction is being applied in an ever wider range of scenarios, such as common intelligent question-answering systems.
In the process of human-computer interaction, the artificial intelligence conversation system analyzes the content input by the user and determines the corresponding reply content to return to the user. One difficulty in human-computer interaction is predicting the user intention embodied in the user's input; if the user intention cannot be accurately predicted, the determined reply content is unlikely to meet the user's requirement, which degrades the interaction experience.
However, the content a user inputs during human-computer interaction is often brief, so related techniques for determining user intention often fail to determine it. Moreover, when the user does not enter a complete requirement in a single input and needs to supplement or modify it, the previous information usually has to be re-entered, that is, complete requirement information containing the earlier requirement information must be input again. This is cumbersome and inefficient, affects the accuracy of the interaction, and causes more errors in the results returned to meet the user's needs.
Disclosure of Invention
An object of the embodiments of the present invention is to provide an artificial intelligence interaction method so as to solve the problems raised in the background art above. To achieve this purpose, the embodiments of the present invention provide the following technical solution:
an artificial intelligence interaction method, the method being performed by an interaction terminal, the method comprising:
acquiring an interactive sentence input through an interactive terminal; the interactive statements comprise a first interactive statement and a second interactive statement, and the first interactive statement is acquired before the second interactive statement;
judging whether the first interactive statement and the second interactive statement are associated or not;
if the association does not exist, determining interactive content according to the user intention of the second interactive statement, and sending the interactive content to an interactive terminal for expression;
and if the correlation exists, integrating and updating the feature information of the first interactive statement and the second interactive statement to form an integrated and updated third interactive statement, determining interactive content according to the user intention of the third interactive statement, and sending the interactive content to an interactive terminal for expression.
As a further limitation of the technical solution of the embodiment of the present invention, the step of obtaining the interactive statement input through the interactive terminal further includes: and judging the type of the interactive sentence, and identifying the interactive sentence as text information when the interactive sentence is non-text.
As a further limitation of the technical solution of the embodiment of the present invention, the step of determining whether there is an association between the first interactive statement and the second interactive statement includes:
extracting at least one feature information according to the first interactive statement and the second interactive statement;
processing the at least one feature information by using a preset neural network model to determine whether an intention switching/holding relationship exists between the first interactive statement and the second interactive statement;
wherein the non-existing association is an intention switching relationship and the existing association is an intention maintaining relationship.
As a further limitation of the technical solution of the embodiment of the present invention, both the first interactive statement and the second interactive statement carry a user identifier.
As a further limitation of the technical solution of the embodiment of the present invention, the determining the interactive content according to the user intention of the interactive sentence includes:
determining similarity information of the interactive sentences and historical contents in a user feature library, wherein the user feature library is determined according to the user identification;
determining a user intention corresponding to the interactive statement according to the similarity information;
and determining the interactive content corresponding to the interactive statement according to the user intention.
As a further limitation of the technical solution of the embodiment of the present invention, the method further includes:
judging whether an interaction result corresponding to the interaction statement is generated or not;
and determining the interactive information presented to the user according to the judgment result.
As a further limitation of the technical solution of the embodiment of the present invention, the interactive content includes an interactive result corresponding to the generated interactive statement and a search result corresponding to the interactive statement in the search engine.
As a further limitation of the technical solution of the embodiment of the present invention, the step of determining the interactive information presented to the user according to the determination result includes:
when generating an interaction result corresponding to the interaction statement, taking the generated interaction result corresponding to the interaction statement as interaction content presented to a user;
and when the interactive result corresponding to the interactive statement is not generated, taking the search result corresponding to the interactive statement in the search engine as interactive content presented to the user.
As a further limitation of the technical solution of the embodiment of the present invention, the sending the interactive content to the interactive terminal for expression includes: and sending the interactive content to an interactive terminal for display and expression.
As a further limitation of the technical solution of the embodiment of the present invention, the sending the interactive content to the interactive terminal for expression includes: and sending the interactive content to an interactive terminal for playing and expressing.
Compared with the prior art, in the artificial intelligence interaction method provided by the embodiment of the invention, interactive statements input through the interactive terminal are acquired, the interactive statements comprising a first interactive statement and a second interactive statement; whether the first interactive statement and the second interactive statement are associated is determined; interactive content is then determined according to the user intention of the applicable interactive statement and sent to the interactive terminal for expression. This solves the prior-art problem that the user intention is difficult to determine, which affects the accuracy of the interaction and causes more errors in the results returned to meet the user's needs. Furthermore, when an interaction result corresponding to the interactive statement is generated, that generated interaction result is used as the interactive content presented to the user; when no interaction result corresponding to the interactive statement is generated, the search result corresponding to the interactive statement in a search engine is used as the interactive content presented to the user. This makes the interaction modes more diverse and increases the accuracy and interest of the interaction.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to describe the embodiments or the prior art are briefly introduced below; obviously, the drawings described below illustrate only some embodiments of the present invention.
FIG. 1 sets forth an exemplary system architecture diagram of a method of artificial intelligence interaction provided by embodiments of the present invention.
FIG. 2 is a flowchart illustrating an artificial intelligence interaction method according to an embodiment of the present invention.
FIG. 3 is a flow chart of an artificial intelligence interaction method provided by the embodiment of the invention.
FIG. 4 illustrates an exemplary flow chart for determining whether an association exists between the first interactive statement and the second interactive statement.
FIG. 5 illustrates an exemplary flow chart for determining interactive content based on a user's intent of an interactive statement.
FIG. 6 illustrates an exemplary flow chart for determining interaction information to present to a user.
Detailed Description
In order to make the technical problems to be solved, the technical solutions, and the advantageous effects of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it.
In the prior art, the content a user inputs during human-computer interaction is often brief, so related techniques for determining user intention often fail to determine it. Moreover, when the user does not enter a complete requirement in a single input and needs to supplement or modify it, the previous information usually has to be re-entered, that is, complete requirement information containing the earlier requirement information must be input again. This is cumbersome and inefficient, affects the accuracy of the interaction, and causes more errors in the results returned to meet the user's needs.
In the artificial intelligence interaction method provided by the embodiment of the invention, interactive statements input through the interactive terminal are acquired, the interactive statements comprising a first interactive statement and a second interactive statement; whether the first interactive statement and the second interactive statement are associated is determined; interactive content is then determined according to the user intention of the applicable interactive statement and sent to the interactive terminal for expression. This solves the prior-art problem that the user intention is difficult to determine, which affects the accuracy of the interaction and causes more errors in the results returned to meet the user's needs. Furthermore, when an interaction result corresponding to the interactive statement is generated, that generated interaction result is used as the interactive content presented to the user; when no interaction result corresponding to the interactive statement is generated, the search result corresponding to the interactive statement in a search engine is used as the interactive content presented to the user. This makes the interaction modes more diverse and increases the accuracy and interest of the interaction.
It should be noted that, in the present disclosure, the embodiments and features of the embodiments may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
FIG. 1 illustrates an exemplary system architecture 100 to which embodiments of the artificial intelligence interaction methods of the present disclosure may be applied.
As shown in FIG. 1, the system architecture 100 may include an interactive terminal, which may be a smartphone 101, a tablet computer 102, or a notebook computer 103; the system architecture 100 further includes a network 104 and a server 105.
Wherein the network 104 may be a medium to provide a communication link between the interactive terminal and the server 105.
In addition, the network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
Specifically, the user may interact with the server 105 through the network 104 using an interactive terminal to receive or send messages or the like. Various communication client applications, such as a conversation application, a live broadcast application, a search application, an instant messaging tool, a mailbox client, social platform software and the like, can be installed on the interactive terminal.
It can be understood that the interactive terminal may be hardware or software. When the interactive terminal is hardware, it includes but is not limited to the smartphone 101, the tablet computer 102, or the notebook computer 103; a hardware interactive terminal may also be an e-book reader, an MP3 player, an MP4 player, a desktop computer, and the like. When the interactive terminal is software, it may be installed in the electronic devices listed above and may be implemented as multiple pieces of software or software modules, or as a single piece of software or software module. No specific limitation is imposed here.
The server 105 may be a server providing various services, for example a background server supporting interaction-capable applications on the interactive terminal. The server 105 may receive the interactive statement sent by the interactive terminal, process it to obtain the interactive content, and return the interactive content to the interactive terminal for expression.
The server 105 may be hardware or software. When the server 105 is hardware, it may be implemented as a distributed server cluster composed of a plurality of servers, or may be implemented as a single server. When the server is software, it may be implemented as a plurality of software or software modules, or may be implemented as a single software or software module. And is not particularly limited herein.
Referring to FIG. 2, FIG. 2 illustrates a flow diagram of one embodiment of an artificial intelligence interaction method 200.
Referring to fig. 3, fig. 3 is a flow chart illustrating an artificial intelligence interaction method according to an embodiment of the present invention.
This embodiment is mainly described by way of applying the artificial intelligence interaction method to an electronic device with a certain computing capability, where the electronic device may be the server 105 or the interactive terminal shown in FIG. 1.
The artificial intelligence interaction method 200 is executed by an interactive terminal and comprises the following steps:
step 201: acquiring an interactive sentence input through an interactive terminal;
the interactive sentence may be content input by the user to embody the user's intention. In the embodiment of the application, the user input mode corresponding to the interactive statement is not limited, and the user can input the interactive statement in the form of voice input, text input, content option selection and the like on a human-computer interaction interface.
In a preferred embodiment provided by the present invention, the interactive statements include a first interactive statement and a second interactive statement; of course, the interactive statements may also include a third interactive statement, a fourth interactive statement, and so on. For convenience of description, in step 201 provided in the embodiment of the present invention, the interactive statements input through the interactive terminal include a first interactive statement and a second interactive statement.
In addition, in the preferred embodiment provided by the present invention, the first interactive statement and the second interactive statement have a sequence; that is, the interactive statements received by the interactive terminal do not arrive as a single complete input message. For example, after the user inputs the first interactive statement, the second interactive statement is input to the interactive terminal after a certain time interval, thereby forming the first interactive statement and the second interactive statement.
Preferably, in the embodiment of the present invention, the first interactive statement is acquired before the second interactive statement, that is, the interactive terminal first acquires the first interactive statement and then acquires the second interactive statement.
Step 202: judging whether the first interactive statement and the second interactive statement are associated or not;
referring to FIG. 4, FIG. 4 illustrates an exemplary flow chart for determining whether an association exists between a first interactive statement and a second interactive statement.
Preferably, in step 202 provided in the embodiment of the present invention, whether the first interactive statement and the second interactive statement are associated is determined mainly in the following manner; that is, step 202 of determining whether the first interactive statement and the second interactive statement are associated includes:
step 2021: extracting at least one feature information according to the first interactive statement and the second interactive statement;
In step 2021 provided by the embodiment of the present invention, extracting feature information from the interactive statements includes performing feature extraction on the information in the interactive statements. For example, given the first interactive statement "What hotels are nearby?" and a second interactive statement "I want to stay at a five-star hotel!", one piece of feature information of the first interactive statement "What hotels are nearby?" is "hotel", and one piece of feature information of the second interactive statement "I want to stay at a five-star hotel!" is likewise "hotel".
Step 2022: processing the at least one feature information by using a preset neural network model to determine whether an intention switching/holding relationship exists between the first interactive statement and the second interactive statement;
wherein the non-existing association is an intention switching relationship and the existing association is an intention maintaining relationship.
It can be understood that processing the at least one piece of feature information with a preset neural network model involves performing derivative expansion of the feature information within the same domain. For example, given the first interactive statement "What hotels are nearby?" and the second interactive statement "A cheaper one!", one piece of feature information of the first interactive statement "What hotels are nearby?" is "hotel"; derivative expansion of the feature information "hotel" yields derived information including but not limited to "hotel", "price", "location", "cheap", "expensive", and the like, while the feature information of the second interactive statement "A cheaper one!" includes but is not limited to "price" and "cheap".
It can be seen that both statements contain feature information such as "price" and "cheap", so whether an intention switching or intention holding relationship exists between the first interactive statement and the second interactive statement can be determined according to whether they contain the same feature information or the same derived feature information.
In the above example, the same feature information or derived feature information is contained, so the relationship is an intention holding relationship; if no common feature information or derived feature information is contained, the relationship is an intention switching relationship.
Step 203: if the association does not exist, determining interactive content according to the user intention of the second interactive statement, and sending the interactive content to an interactive terminal for expression;
and if the correlation exists, integrating and updating the feature information of the first interactive statement and the second interactive statement to form an integrated and updated third interactive statement, determining interactive content according to the user intention of the third interactive statement, and sending the interactive content to an interactive terminal for expression.
If no association exists, the interactive content is determined on the basis of the second interactive statement according to the feature information of the second interactive statement, and is output to the interactive terminal for expression;
if an association exists, the feature information of the first interactive statement and the second interactive statement is extracted and integrated, a third interactive statement is formed through updating, the interactive content is determined according to the feature information of the third interactive statement, and the interactive content is output to the interactive terminal for expression.
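The branching in step 203 could then be sketched as below, reusing extract_features and is_intent_held from the previous sketch; the union-based merge and the determine_content stub are illustrative assumptions, since the embodiment only requires that the feature information of both statements be integrated and updated into a third interactive statement.

```python
# Illustrative sketch of step 203 (assumes extract_features and is_intent_held
# from the previous sketch). The union merge and the content lookup below are
# assumptions, not the patented implementation.

def merge_statements(first_features: set[str], second_features: set[str]) -> set[str]:
    """Integrate and update feature information into a 'third interactive statement'."""
    return first_features | second_features

def determine_content(features: set[str]) -> str:
    """Stand-in for user-intention recognition plus content retrieval."""
    return "interactive content for: " + ", ".join(sorted(features))

def respond(first: str, second: str) -> str:
    first_features = extract_features(first)
    second_features = extract_features(second)
    if is_intent_held(first, second):
        # Association exists: build the integrated third interactive statement.
        features = merge_statements(first_features, second_features)
    else:
        # No association: rely on the second interactive statement alone.
        features = second_features
    return determine_content(features)  # sent to the interactive terminal for expression
```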
Further, in a preferred embodiment provided by the present invention, the step 201 of acquiring the interactive statement input through the interactive terminal further includes:
and judging the type of the interactive sentence, and identifying the interactive sentence as text information when the interactive sentence is non-text.
Specifically, in the preferred embodiment provided by the present invention, the step of determining the type of the interactive statement includes identifying the interactive statement, that is, identifying whether the interactive statement collected by the interactive terminal is text information.
Further, when the interactive statement is not text, it is recognized as text information. For example, if the interactive statement acquired by the interactive terminal is voice information, the voice information is recognized by a voice recognition module arranged in the interactive terminal and converted into text information by that module.
In addition, the interactive terminal can acquire interactive statements by capturing real-time pictures; an OCR recognition module arranged in the interactive terminal can scan the real-time pictures acquired by the interactive terminal and extract the text information contained in them.
Specifically, graying and binarization, Laplacian sharpening, symmetric mean filtering, and horizontal drawing and thinning are performed on the real-time picture, an anti-aliasing attribute is set, and OCR character recognition is then performed on the real-time picture.
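A minimal sketch of this non-text handling is given below, assuming OpenCV and pytesseract for the picture path and leaving speech recognition as a stub; the kernel size, the sharpening weight, and the omission of the horizontal-drawing/thinning and anti-aliasing steps are simplifications for illustration, not the patented pipeline.

```python
# Illustrative sketch of converting non-text interactive statements to text.
# OpenCV and pytesseract are assumed to be available; parameter choices are
# illustrative, and the horizontal-drawing/thinning and anti-aliasing steps of
# the description are omitted here for brevity.
import cv2
import pytesseract

def speech_to_text(audio_path: str) -> str:
    """Stub for the terminal's built-in voice recognition module."""
    raise NotImplementedError("plug an ASR engine in here")

def picture_to_text(image_path: str) -> str:
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)                      # graying
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)      # binarization
    laplacian = cv2.Laplacian(binary, cv2.CV_64F)
    sharpened = cv2.convertScaleAbs(binary - 0.5 * laplacian)           # Laplacian sharpening
    smoothed = cv2.blur(sharpened, (3, 3))                              # (symmetric) mean filtering
    return pytesseract.image_to_string(smoothed)                        # OCR character recognition
```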
Both the first interactive statement and the second interactive statement carry a user identifier.
In the embodiment of the application, when a user needs to perform human-computer interaction, related interactive statements can be input, and the interactive terminal can obtain the corresponding interactive statements.
The interactive statement carries a user identifier, and the user identifier is used for identifying user identity information.
FIG. 5 illustrates an exemplary flow chart for determining interactive content based on a user's intent of an interactive statement.
FIG. 6 illustrates another exemplary flow chart for determining interaction information for presentation to a user.
The determining of the interactive content according to the user intention of the interactive sentence includes:
determining similarity information of the interactive sentences and historical contents in a user feature library, wherein the user feature library is determined according to the user identification;
determining a user intention corresponding to the interactive statement according to the similarity information;
and determining the interactive content corresponding to the interactive statement according to the user intention.
It can be understood that the user tendencies reflected in a user's historical human-computer interaction behaviors are likely to accord with that user's intention in subsequent human-computer interaction.
Therefore, in the embodiment of the application, based on the user identifier carried by the input interactive content, the user feature library corresponding to the user identifier may be determined according to the historical user behavior related to the user identifier. Wherein, the user characteristic library can comprise historical content related to user identification. The historical content may be used to record historical user behavior associated with the user identification.
In a specific implementation, the user feature library may be determined by obtaining historical user behaviors from the stored records of the user's human-computer interactions, analyzing those historical user behaviors to obtain historical content, and forming the corresponding user feature library.
In addition, when the interactive content is determined, user portrait information corresponding to the user identifier, such as age and gender, can also be taken into consideration, so that the interactive content is tailored to each individual user, improving the user's satisfaction with the interactive content and the user's interactive experience.
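By way of illustration, the similarity-based intent determination described above might look like the following sketch; the per-user feature library layout, the Jaccard similarity, the 0.2 threshold, and the "unknown" fallback are assumptions introduced for this example only.

```python
# Illustrative sketch only: the feature-library layout, Jaccard similarity,
# threshold, and fallback intent are assumptions, not the patented design.

def jaccard(a: set[str], b: set[str]) -> float:
    """Similarity between the interactive statement and a piece of historical content."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Hypothetical user feature library keyed by user identifier; each entry records
# historical content (as feature sets) and the user intention it reflected.
USER_FEATURE_LIBRARY = {
    "user-001": [
        {"history": {"hotel", "price", "beijing"}, "intent": "book_hotel"},
        {"history": {"weather", "today"}, "intent": "ask_weather"},
    ],
}

def determine_intent(user_id: str, statement_features: set[str]) -> str:
    entries = USER_FEATURE_LIBRARY.get(user_id, [])
    scored = [(jaccard(statement_features, e["history"]), e["intent"]) for e in entries]
    best_score, best_intent = max(scored, default=(0.0, "unknown"))
    return best_intent if best_score > 0.2 else "unknown"

if __name__ == "__main__":
    print(determine_intent("user-001", {"hotel", "cheap"}))  # book_hotel
```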
Further, in the embodiment of the present invention, the method further includes:
judging whether an interaction result corresponding to the interaction statement is generated or not;
and determining the interactive information presented to the user according to the judgment result.
It can be understood that the interactive content includes either the generated interaction result corresponding to the interactive statement or the search result corresponding to the interactive statement in a search engine.
Correspondingly, in a preferred embodiment provided by the present invention, the step of determining the interactive information presented to the user according to the determination result includes:
when generating an interaction result corresponding to the interaction statement, taking the generated interaction result corresponding to the interaction statement as interaction content presented to a user;
and when the interactive result corresponding to the interactive statement is not generated, taking the search result corresponding to the interactive statement in the search engine as interactive content presented to the user.
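The fallback between a generated interaction result and a search-engine result could be expressed as in the sketch below; generate_result and search_engine_lookup are hypothetical stand-ins for the dialogue backend and an external search API, not functions defined by the embodiment.

```python
# Illustrative sketch of the fallback logic described above. The two helpers
# are hypothetical stand-ins; only the selection rule mirrors the description.
from typing import Optional

def generate_result(statement: str) -> Optional[str]:
    """Stand-in for the dialogue system; returns None when no interaction
    result corresponding to the interactive statement can be generated."""
    return None

def search_engine_lookup(statement: str) -> str:
    """Stand-in for querying an external search engine."""
    return f"search results for {statement!r}"

def interactive_content(statement: str) -> str:
    result = generate_result(statement)
    # If a result was generated, present it; otherwise present search results.
    return result if result is not None else search_engine_lookup(statement)
```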
Further, in a preferred embodiment provided by the present invention, sending the interactive content to the interactive terminal for expression includes: sending the interactive content to the interactive terminal for display and expression. The interactive terminal is provided with a display screen, and the interactive content fed back after the interaction is displayed on the display screen of the interactive terminal, which is more intuitive.
Further, in another preferred embodiment provided by the present invention, sending the interactive content to the interactive terminal for expression includes: sending the interactive content to the interactive terminal for playing and expression. The interactive terminal is provided with a player module, and the interactive content fed back after the interaction is played through the player module of the interactive terminal, which is more flexible and effective.
In the artificial intelligence interaction method provided by the embodiment of the invention, interactive statements input through the interactive terminal are acquired, the interactive statements comprising a first interactive statement and a second interactive statement; whether the first interactive statement and the second interactive statement are associated is determined; interactive content is then determined according to the user intention of the applicable interactive statement and sent to the interactive terminal for expression. This solves the prior-art problem that the user intention is difficult to determine, which affects the accuracy of the interaction and causes more errors in the results returned to meet the user's needs. Furthermore, when an interaction result corresponding to the interactive statement is generated, that generated interaction result is used as the interactive content presented to the user; when no interaction result corresponding to the interactive statement is generated, the search result corresponding to the interactive statement in a search engine is used as the interactive content presented to the user. This makes the interaction modes more diverse and increases the accuracy and interest of the interaction.
It should be noted that, in the present specification, all the embodiments are described in a progressive manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the apparatus and system embodiments, since they are substantially similar to the method embodiments, they are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for related points. The above-described embodiments of the apparatus and system are merely illustrative, and the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The above description is only one specific embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. An artificial intelligence interaction method, characterized in that the method is executed by an interaction terminal, and the method comprises:
acquiring an interactive sentence input through an interactive terminal; the interactive statements comprise a first interactive statement and a second interactive statement, and the first interactive statement is acquired before the second interactive statement;
judging whether the first interactive statement and the second interactive statement are associated or not;
if the association does not exist, determining interactive content according to the user intention of the second interactive statement, and sending the interactive content to an interactive terminal for expression;
and if the correlation exists, integrating and updating the feature information of the first interactive statement and the second interactive statement to form an integrated and updated third interactive statement, determining interactive content according to the user intention of the third interactive statement, and sending the interactive content to an interactive terminal for expression.
2. The artificial intelligence interaction method of claim 1, wherein the step of obtaining the interactive sentence inputted through the interactive terminal further comprises: and judging the type of the interactive sentence, and identifying the interactive sentence as text information when the interactive sentence is non-text.
3. The artificial intelligence interaction method of claim 2, wherein the step of determining whether an association exists between the first interactive statement and the second interactive statement comprises:
extracting at least one feature information according to the first interactive statement and the second interactive statement;
processing the at least one feature information by using a preset neural network model to determine whether an intention switching/holding relationship exists between the first interactive statement and the second interactive statement;
wherein the non-existing association is an intention switching relationship and the existing association is an intention maintaining relationship.
4. The artificial intelligence interaction method of claim 1 or 3, wherein the first interaction statement and the second interaction statement both carry a user identifier.
5. The artificial intelligence interaction method of claim 4, wherein the determining interactive contents according to the user's intention of the interactive sentence comprises:
determining similarity information of the interactive sentences and historical contents in a user feature library, wherein the user feature library is determined according to the user identification;
determining a user intention corresponding to the interactive statement according to the similarity information;
and determining the interactive content corresponding to the interactive statement according to the user intention.
6. The artificial intelligence interaction method of claim 5, further comprising:
judging whether an interaction result corresponding to the interaction statement is generated or not;
and determining the interactive information presented to the user according to the judgment result.
7. The artificial intelligence interaction method of claim 6, wherein the interaction content comprises interaction results corresponding to the generated interaction sentences and search results corresponding to the interaction sentences in a search engine.
8. The artificial intelligence interaction method of claim 7, wherein the step of determining the interaction information presented to the user according to the determination result comprises:
when generating an interaction result corresponding to the interaction statement, taking the generated interaction result corresponding to the interaction statement as interaction content presented to a user;
and when the interactive result corresponding to the interactive statement is not generated, taking the search result corresponding to the interactive statement in the search engine as interactive content presented to the user.
9. The artificial intelligence interaction method of claim 8, wherein the sending the interaction content to an interaction terminal for expression comprises: and sending the interactive content to an interactive terminal for display and expression.
10. The artificial intelligence interaction method of claim 8, wherein the sending the interaction content to an interaction terminal for expression comprises: and sending the interactive content to an interactive terminal for playing and expressing.
CN202011389775.4A 2020-12-01 2020-12-01 Artificial intelligence interaction method Pending CN112487164A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011389775.4A CN112487164A (en) 2020-12-01 2020-12-01 Artificial intelligence interaction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011389775.4A CN112487164A (en) 2020-12-01 2020-12-01 Artificial intelligence interaction method

Publications (1)

Publication Number Publication Date
CN112487164A (en) 2021-03-12

Family

ID=74938889

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011389775.4A Pending CN112487164A (en) 2020-12-01 2020-12-01 Artificial intelligence interaction method

Country Status (1)

Country Link
CN (1) CN112487164A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106372132A (en) * 2016-08-25 2017-02-01 北京百度网讯科技有限公司 Artificial intelligence-based query intention prediction method and apparatus
CN106383875A (en) * 2016-09-09 2017-02-08 北京百度网讯科技有限公司 Artificial intelligence-based man-machine interaction method and device
CN107133345A (en) * 2017-05-22 2017-09-05 北京百度网讯科技有限公司 Exchange method and device based on artificial intelligence
CN107656996A (en) * 2017-09-19 2018-02-02 北京百度网讯科技有限公司 Man-machine interaction method and device based on artificial intelligence
CN109697282A (en) * 2017-10-20 2019-04-30 阿里巴巴集团控股有限公司 A kind of the user's intension recognizing method and device of sentence

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113934825A (en) * 2021-12-21 2022-01-14 北京云迹科技有限公司 Question answering method and device and electronic equipment
CN113934825B (en) * 2021-12-21 2022-03-08 北京云迹科技有限公司 Question answering method and device and electronic equipment

Similar Documents

Publication Publication Date Title
CN107393541B (en) Information verification method and device
US10984226B2 (en) Method and apparatus for inputting emoticon
CN114578969B (en) Method, apparatus, device and medium for man-machine interaction
CN109543058B (en) Method, electronic device, and computer-readable medium for detecting image
CN108268450B (en) Method and apparatus for generating information
CN107592255B (en) Information display method and equipment
CN112818224B (en) Information recommendation method and device, electronic equipment and readable storage medium
US11244153B2 (en) Method and apparatus for processing information
JP2022088304A (en) Method for processing video, device, electronic device, medium, and computer program
CN113139816A (en) Information processing method, device, electronic equipment and storage medium
CN112929253B (en) Virtual image interaction method and device
CN116150339A (en) Dialogue method, dialogue device, dialogue equipment and dialogue storage medium
CN107885872B (en) Method and device for generating information
CN112487164A (en) Artificial intelligence interaction method
CN114880498B (en) Event information display method and device, equipment and medium
CN114528851B (en) Reply sentence determination method, reply sentence determination device, electronic equipment and storage medium
CN115101069A (en) Voice control method, device, equipment, storage medium and program product
CN115098729A (en) Video processing method, sample generation method, model training method and device
CN114118937A (en) Information recommendation method and device based on task, electronic equipment and storage medium
CN113923477A (en) Video processing method, video processing device, electronic equipment and storage medium
CN109299240B (en) Chat robot knowledge display method and device
CN113515280A (en) Page code generation method and device
CN110958172B (en) Method, device and computer storage medium for recommending friends
CN113505293B (en) Information pushing method and device, electronic equipment and storage medium
CN112328871B (en) Reply generation method, device, equipment and storage medium based on RPA module

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210312)