WO2020007214A1 - Human-computer interaction method, device, system, and storage medium - Google Patents

Human-computer interaction method, device, system, and storage medium

Info

Publication number
WO2020007214A1
WO2020007214A1 PCT/CN2019/092679 CN2019092679W WO2020007214A1 WO 2020007214 A1 WO2020007214 A1 WO 2020007214A1 CN 2019092679 W CN2019092679 W CN 2019092679W WO 2020007214 A1 WO2020007214 A1 WO 2020007214A1
Authority
WO
WIPO (PCT)
Prior art keywords
interaction
guidance
human
target
intention
Prior art date
Application number
PCT/CN2019/092679
Other languages
English (en)
French (fr)
Inventor
许立龙
Original Assignee
阿里巴巴集团控股有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 阿里巴巴集团控股有限公司 (Alibaba Group Holding Limited)
Publication of WO2020007214A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0633Lists, e.g. purchase orders, compilation or processing
    • G06Q30/0635Processing of requisition or of purchase orders
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions

Definitions

  • the present application relates to the field of artificial intelligence technology, and in particular, to a human-computer interaction method, device, system, and storage medium.
  • the process of online shopping and online transactions is mainly the order processing process, that is, the information processing process that takes the order as the processing object.
  • Various aspects of the embodiments of the present application provide a method, a device, a system, and a storage medium for human-computer interaction, so as to reduce the labor cost in the process of human-computer interaction and improve the efficiency of human-computer interaction.
  • the embodiment of the present application provides a human-machine interaction method, which is applicable to a human-machine interaction device and includes:
  • the interaction result of the current round of interaction is obtained according to the interaction guidance information corresponding to the current round of interaction.
  • the embodiment of the present application further provides an interaction method applicable to a management platform, including:
  • each interaction guidance process includes at least one guidance node, and each guidance node corresponds to one piece of interaction guidance information
  • An embodiment of the present application further provides a human-computer interaction device, including: a memory, a processor, and a communication component;
  • the memory is used to store one or more computer instructions
  • the processor is configured to execute one or more computer instructions for:
  • the interaction result of the current round of interaction is obtained according to the interaction guidance information corresponding to the current round of interaction.
  • An embodiment of the present application further provides a computer-readable storage medium storing a computer program, and the computer program, when executed, can implement the steps in the human-machine interaction method on the human-machine interaction device side.
  • An embodiment of the present application further provides a management platform, including: a memory, a processor, and a communication component;
  • the memory is used to store one or more computer instructions
  • the processor is configured to execute one or more computer instructions for:
  • each interaction guidance process includes at least one guidance node, and each guidance node corresponds to one piece of interaction guidance information
  • returning, through the communication component, the interaction guidance information corresponding to the next guidance node to the human-computer interaction device as the interaction guidance information corresponding to the current round of interaction.
  • An embodiment of the present application further provides a computer-readable storage medium storing a computer program, and the computer program, when executed, can implement the steps in the human-machine interaction method on the management platform side.
  • An embodiment of the present application further provides a human-computer interaction system, including: a human-machine interaction device and a management platform;
  • the human-computer interaction device is configured to determine a target interaction intention to which the current round of interaction belongs according to the human-computer interaction requirement; request the management platform to provide interaction guidance information corresponding to the current round of interaction according to the target interaction intention, and receive the interaction guidance information corresponding to the current round of interaction sent by the management platform; and obtain the interaction result of the current round of interaction according to the interaction guidance information corresponding to the current round of interaction;
  • the management platform is configured to obtain, according to a request from the human-computer interaction device, a target interaction guidance process adapted to the target interaction intention from the interaction guidance processes under at least one interaction intention that it maintains; determine, according to the guidance node where the target interaction guidance process is currently located, the interaction guidance information corresponding to the next guidance node; and return the interaction guidance information corresponding to the next guidance node to the human-computer interaction device as the interaction guidance information corresponding to the current round of interaction; wherein each interaction guidance process includes at least one guidance node, and each guidance node corresponds to one piece of interaction guidance information.
  • The management platform maintains the interaction guidance process under each interaction intention, and the human-computer interaction device cooperates with the management platform.
  • The human-computer interaction device determines, according to the human-computer interaction requirement, the target interaction intention to which the current round of interaction belongs, requests the management platform for the interaction guidance information corresponding to the current round of interaction based on the target interaction intention, and then obtains the interaction result of the current round of interaction based on the interaction guidance information corresponding to the current round of interaction sent by the management platform. There is no need to guide the interaction process manually; automatic human-computer interaction is realized, labor costs are saved, and human-computer interaction efficiency is improved.
  • FIG. 1 is a schematic structural diagram of a human-computer interaction system according to an exemplary embodiment of the present application
  • FIG. 2a is a schematic structural diagram of a human-computer interaction system according to another exemplary embodiment of the present application.
  • FIG. 2b is a schematic structural diagram of a human-computer interaction system according to another exemplary embodiment of the present application.
  • FIG. 3 is a schematic diagram of interaction of a human-computer interaction system in an order rights protection scenario according to an exemplary embodiment of the present application
  • FIG. 4 is a method flowchart of a human-computer interaction method according to an exemplary embodiment of the present application.
  • FIG. 5 is a flowchart of a first-round interaction method of a human-computer interaction method according to another exemplary embodiment of the present application.
  • FIG. 6 is a flowchart of a non-first-round interaction method of a human-computer interaction method according to another exemplary embodiment of the present application.
  • FIG. 7 is a method flowchart of a human-computer interaction method according to another exemplary embodiment of the present application.
  • FIG. 8 is a flowchart of a first-round interaction method of a human-computer interaction method according to another exemplary embodiment of the present application.
  • FIG. 9 is a flowchart of a non-first-round interaction method of a human-machine interaction method according to another exemplary embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of a human-machine interaction device according to another exemplary embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of a management platform according to another exemplary embodiment of the present application.
  • The management platform maintains the interaction guidance process under each interaction intention, and the human-computer interaction device cooperates with the management platform.
  • The human-computer interaction device determines, according to the human-computer interaction requirement, the target interaction intention to which this round of interaction belongs, requests the management platform for the interaction guidance information corresponding to this round of interaction based on the target interaction intention, and then obtains the interaction result of this round of interaction according to the interaction guidance information corresponding to this round of interaction sent by the management platform.
  • In this process, there is no need to guide the human-computer interaction process manually; automatic human-computer interaction is realized, which saves labor costs and improves human-computer interaction efficiency.
  • FIG. 1 is a schematic structural diagram of a human-computer interaction system according to an exemplary embodiment of the present application.
  • the human-computer interaction system 10 includes a human-computer interaction device 11 and a management platform 12.
  • the human-computer interaction device 11 refers to a device capable of interacting with a user. According to different human-computer interaction scenarios, the implementation forms of human-computer interaction devices will also be different.
  • In these business scenarios, the human-computer interaction device 11 may be an online customer service device, a customer service robot, or the like.
  • the human-computer interaction device 11 in these business scenarios may include a front end (the end facing the user) and a back end (the end not visible to the user).
  • the implementation form of the front end can be a web page, an application page, or a window, etc.
  • the backend mainly provides functions such as computing, data processing, and human-computer interaction logic control. It can be deployed on servers or robots in business scenarios.
  • the human-machine interaction device 11 may be an electronic device installed in a vehicle or a drone, and these electronic devices may provide a user with a human-machine interaction interface, a voice input interface, and the like.
  • The management platform 12 maintains an interaction guidance process under at least one interaction intention, can communicate with the human-computer interaction device 11, and can guide the human-computer interaction process of the human-computer interaction device 11 based on the maintained interaction guidance process under each interaction intention.
  • This embodiment does not limit the implementation form of the management platform 12, and may be a server device such as a conventional server, a cloud server, a cloud host, or a virtual center.
  • the composition of the server device mainly includes a processor, a hard disk, a memory, a system bus, and the like, and is similar to a general computer architecture.
  • the management platform 12 and the human-computer interaction device 11 may be wirelessly or wiredly connected.
  • The network standard of the mobile network may be 2G (GSM), 2.5G (GPRS), 3G (WCDMA, TD-SCDMA, CDMA2000, UMTS), 4G (LTE), 4G+ (LTE+), WiMAX, etc.
  • the human-computer interaction device 11 may also establish a communication connection with the management platform 12 using communication methods such as WiFi, Bluetooth, and infrared.
  • the human-machine interaction device 11 can automatically complete the human-machine interaction process under the guidance of the management platform 12 without manual guidance. The following sections will introduce in detail how the human-computer interaction device 11 can automatically realize human-computer interaction under the guidance of the management platform 12.
  • the human-machine interaction device 11 can perform human-machine interaction with a user.
  • a complete human-computer interaction process may include at least one round of interaction, and each round of interaction can be regarded as an interaction link in the human-machine interaction process.
  • Each complete human-computer interaction process corresponds to an interaction intent.
  • the human-machine interaction device 11 can automatically complete a human-machine interaction process under an interaction intention with the cooperation of the management platform 12.
  • the human-computer interaction device 11 can automatically complete at least one round of interaction under an interaction intention with the cooperation of the management platform 12.
  • this embodiment describes the process of completing the current round of interaction by the human-machine interaction device 11 as an example.
  • This round of interaction refers to the interaction that needs to be performed or is currently being executed in at least one round of interaction.
  • The human-computer interaction device 11 may determine the interaction intention to which the current round of interaction belongs according to the human-computer interaction requirement. For convenience of description and distinction, the interaction intention to which the current round of interaction belongs is recorded as the target interaction intention. Depending on which round the current round of interaction is, the source of the human-computer interaction requirement will differ. If this round of interaction is the first round of interaction under the target interaction intention, the human-computer interaction requirement may be initiated by the user through the human-computer interaction device 11. If this round of interaction is a non-first round of interaction under the target interaction intention, for example the second round of interaction, the third round of interaction, and so on, the human-computer interaction requirement may be triggered by the interaction result of the previous round of interaction, which is not limited in this embodiment.
  • The human-computer interaction device 11 may request the management platform 12 for the interaction guidance information corresponding to the current round of interaction according to the target interaction intention.
  • The interaction guidance information refers to information for guiding the human-computer interaction process; it may indicate the operation that the human-computer interaction process should perform at the current moment or the next moment. The human-computer interaction device 11 may guide the user to complete the entire interaction process based on the interaction guidance information.
  • the management platform 12 may receive a request sent by the human-computer interaction device 11 and obtain a target interaction guidance process adapted to the target interaction intention from the interaction guidance process under the maintained at least one interaction intention according to the received request.
  • the management platform 12 maintains an interaction guidance process under at least one interaction intent, and different interaction intents may correspond to different human-computer interaction scenarios.
  • the interaction intention corresponding to the online rights protection scenario is online rights protection.
  • the interaction intention corresponding to the online consulting scenario is online consulting.
  • the interaction intention corresponding to the online question and answer scenario is an online question and answer.
  • Each interaction guidance process includes at least one guidance node; each guidance node represents a guidance link in the interaction guidance process, and the number of guidance nodes represents the number of guidance links in the interaction guidance process.
  • The upstream-downstream relationship between the guidance nodes can reflect the sequence of the guidance links in the interaction guidance process.
  • Each guidance node corresponds to one piece of interaction guidance information, which is used to indicate how to guide the human-computer interaction process in the guidance link represented by that guidance node.
  • The management platform 12 can determine the interaction guidance information corresponding to the next guidance node according to the guidance node where the target interaction guidance process is currently located, and return the interaction guidance information corresponding to the next guidance node to the human-computer interaction device 11 as the interaction guidance information corresponding to the current round of interaction.
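  • To make the data model described above concrete, the following Python sketch models an interaction guidance process as an ordered list of guidance nodes, each carrying one piece of interaction guidance information, together with a helper that finds the next guidance node. The class and field names are illustrative assumptions, not terms defined by this application.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class GuidanceNode:
    name: str            # the guidance link this node represents
    guidance_info: dict  # the interaction guidance information for this link


@dataclass
class GuidanceProcess:
    process_id: str            # uniquely identifies the process and its intent
    intent: str                # description of the interaction intention
    nodes: List[GuidanceNode]  # ordered by the upstream-downstream relationship

    def next_node(self, current: Optional[str]) -> Optional[GuidanceNode]:
        """Return the guidance node that follows `current`.
        `current` is None when the process has not started yet."""
        if current is None:
            return self.nodes[0] if self.nodes else None
        for i, node in enumerate(self.nodes):
            if node.name == current and i + 1 < len(self.nodes):
                return self.nodes[i + 1]
        return None  # the current node was the last one; the process has ended
```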
  • the human-computer interaction device 11 receives the interaction guidance information corresponding to the current round of interaction sent by the management platform 12, and can complete the current round of interaction process and obtain the interaction result of the current round of interaction based on the interactive guidance information corresponding to the current round of interaction.
  • In this embodiment, the management platform maintains the interaction guidance process under each interaction intention, and the human-computer interaction device 11 cooperates with the management platform.
  • The human-computer interaction device 11 determines, according to the human-computer interaction requirement, the target interaction intention to which the current round of interaction belongs, requests the management platform 12 for the interaction guidance information corresponding to the current round of interaction based on the target interaction intention, and then obtains the interaction result of the current round of interaction based on the interaction guidance information corresponding to the current round of interaction sent by the management platform 12.
  • Compared with the prior art, which relies on manual guidance, the interaction guidance information requested by the human-computer interaction device 11 from the management platform 12 can automatically guide the human-computer interaction process, so there is no need to guide the interaction process manually, which saves labor costs and improves human-computer interaction efficiency.
  • The following takes the case in which the human-computer interaction device 11 performs the current round of interaction as an example to describe the process by which the human-computer interaction device 11 automatically completes human-computer interaction with the cooperation of the management platform 12. Depending on which round the current round of interaction is, the detailed implementation of this process will differ, and the different rounds are described separately below.
  • If this round of interaction is the first round of interaction under the target interaction intention, the human-computer interaction requirement is initiated by the user through the human-computer interaction device 11.
  • the human-machine interaction device 11 may respond to a user-initiated human-machine interaction requirement, perform intention recognition on the human-machine interaction requirement, and determine a target interaction intention to which the first round of interaction belongs according to the recognition result.
  • the human-machine interaction device 11 may display an interaction interface for a user to input an interaction requirement; the user may enter an interaction requirement on the interaction interface.
  • the user can input interaction requirements on the interaction interface by handwriting, and can also input interaction requirements on the interaction interface through peripherals such as a mouse, a keyboard, a stylus, a recording pen, and a microphone.
  • The human-computer interaction device 11 may further receive voice information provided by the user, the voice information carrying the user's interaction requirement, and perform intent recognition on the voice information to determine the target interaction intention to which this round of interaction belongs.
  • the manner in which the user provides the voice information may be: the user directly utters the voice, or the user plays the voice information through a voice playback device.
  • the user can play the generated or recorded voice information through a mobile phone, computer, voice recorder, etc. This embodiment is not limited.
  • One method of intent recognition for voice information includes: comparing the received voice information with reference voice information in a reference voice database, and determining, from the reference voice database according to the comparison result, the target reference voice information corresponding to the received voice information.
  • the reference voice information 1 corresponds to the interaction intention 1
  • the reference voice information 2 corresponds to the interaction intention 2.
  • the interaction intent corresponding to the target reference voice information can be used as the interaction intention to which this round of interaction belongs.
  • Another method of intent recognition for voice information includes: performing speech recognition on the received voice information to convert the voice information into text information; then performing semantic recognition on the obtained text information, and determining, according to the result of the semantic recognition, the target interaction intention corresponding to the voice information, which is not described in detail here.
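  • As a rough illustration of the second approach, the sketch below stubs out the speech-recognition step and then performs a very small keyword-based semantic recognition. The keyword lists and intent names are invented for illustration and are not part of this application.

```python
from typing import Optional

# Hypothetical keyword lists per interaction intention (illustrative only).
INTENT_KEYWORDS = {
    "order rights protection": ["rights protection", "complain", "after-sale", "refund"],
    "online consulting": ["consult", "ask about"],
}


def speech_to_text(audio: bytes) -> str:
    """Placeholder for a real speech-recognition service call."""
    raise NotImplementedError


def recognize_intent(text: str) -> Optional[str]:
    """Return the interaction intention whose keywords appear in the text."""
    lowered = text.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(keyword in lowered for keyword in keywords):
            return intent
    return None
```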
  • In different application scenarios, the content of the interaction requirement input by the user will also differ.
  • the user can enter words such as "order rights protection" on the interactive interface.
  • the human-computer interaction device 11 may obtain the interaction requirements input by the user on the interaction interface, and then may perform intent recognition on the interaction requirements input by the user to determine the target interaction intention to which the first round of interaction belongs.
  • If the interaction requirement input by the user on the interaction interface is a text character string, the human-computer interaction device 11 may perform semantic recognition on the text to determine the target interaction intention to which the first round of interaction belongs.
  • If the interaction requirement input by the user on the interaction interface is a voice signal, the human-computer interaction device 11 may first perform speech recognition on the voice signal and then perform semantic recognition based on the speech recognition result to determine the target interaction intention to which the first round of interaction belongs.
  • the human-computer interaction device 11 may send the description information of the target interaction intention to the management platform 12.
  • the manner in which the management platform 12 maintains the interaction guidance process under at least one interaction intent may be: maintaining the description information of each interaction intent and the mapping relationship between the interaction guidance processes under each interaction intent.
  • the interaction guidance process under the interaction intent whose description information is M includes PM
  • the interaction guidance process under the interaction intent whose description information is N includes PN.
  • When the management platform 12 receives the request carrying the description information of the target interaction intention, it can select the interaction guidance process under the interaction intention whose description information matches the description information carried in the request sent by the human-computer interaction device 11, as the target interaction guidance process adapted to the target interaction intention.
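  • A minimal sketch of that selection step, assuming the management platform keeps a dictionary keyed by the description information of each interaction intention; the class and method names are illustrative.

```python
from typing import Dict, Optional


class ManagementPlatform:
    def __init__(self) -> None:
        # description information of an interaction intention -> its guidance process
        self.processes: Dict[str, "GuidanceProcess"] = {}

    def register(self, description: str, process: "GuidanceProcess") -> None:
        """Maintain the mapping between an intention description and its process."""
        self.processes[description] = process

    def find_target_process(self, description: str) -> Optional["GuidanceProcess"]:
        """Select the guidance process whose intention description matches the
        description carried in the device's request."""
        return self.processes.get(description)
```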
  • the management platform 12 determines the interaction guidance information corresponding to the next guidance node according to the guidance node where the target interaction guidance process is currently located.
  • In addition, the management platform 12 may send an object acquisition request to the human-computer interaction device 11 to request the identifier of the object to be processed under the target interaction intention.
  • the object to be processed refers to the object to which human-computer interaction is directed under the target interaction intention.
  • For example, the object to be processed may be a commodity; when the target interaction intention is a rights protection intention, the object to be processed may be an order; when the target interaction intention is game parameter configuration, the object to be processed may be a game character or a game prop.
  • After the human-computer interaction device 11 receives the object acquisition request sent by the management platform 12, it can acquire the identifier of the object to be processed under the target interaction intention and send the identifier of the object to be processed to the management platform 12.
  • the human-computer interaction device 11 may display the interactive interface to the user, and prompt the user to input the identifier of the object to be processed on the interactive interface, and then may obtain the identifier of the object to be processed entered by the user on the interactive interface, such as an order number, Product ID or game item name.
  • the manner in which the user inputs the identifier of the object to be processed on the interactive interface may be handwriting input, voice input, keyboard input, input using an input pen, and the like.
  • the management platform 12 receives the identification of the object to be processed sent by the human-computer interaction device 11, and can establish a correspondence between the identification of the object to be processed and the target interactive guidance process.
  • the correspondence relationship may be used to determine a to-be-processed object corresponding to the next interaction requirement according to the correspondence between the identifier of the to-be-processed object and the target interaction guidance process when the interaction requirement associated with the target guidance process is received next time.
  • In addition to maintaining the mapping relationship between interaction intentions and interaction guidance processes, the management platform 12 also maintains a process identifier for each interaction guidance process.
  • Each process identifier is unique: it uniquely identifies an interaction guidance process and can also uniquely identify the interaction intention to which that process belongs. Based on this, after determining the target interaction guidance process, the management platform 12 may send the process identifier of the target interaction guidance process to the human-computer interaction device 11. When the human-computer interaction device 11 receives the process identifier, it can store it locally.
  • In the next round of interaction, the human-computer interaction device 11 may determine the interaction intention to which the next round of interaction belongs based on the process identifier, and may directly request the management platform 12, based on the process identifier, for the interaction guidance information corresponding to the next round of interaction.
  • Optionally, the management platform 12 may send the process identifier of the target interaction guidance process to the human-computer interaction device 11 when sending the interaction guidance information corresponding to the current round of interaction, or after sending that interaction guidance information; it may also send the process identifier of the target interaction guidance process to the human-computer interaction device 11 before sending the object acquisition request, which is not limited in this embodiment.
  • If this round of interaction is a non-first round of interaction under the target interaction intention, for example the second round of interaction, the third round of interaction, and so on, the human-computer interaction requirement can be triggered by the result of the previous round of interaction.
  • The result of the previous round of interaction may indicate that the interaction process under the target interaction intention is not over and that the next round of interaction may continue; the human-computer interaction device 11 may treat the result of the previous round of interaction as the source of the human-computer interaction requirement in order to determine the target interaction intention to which this round of interaction belongs.
  • Depending on the form of the previous round's interaction result, the manner in which the human-computer interaction device 11 determines the target interaction intention to which this round of interaction belongs will also vary.
  • the result of the previous round of interaction may be associated with the process ID corresponding to the interaction intent it belongs to.
  • the human-computer interaction device 11 may determine the interaction intent corresponding to the process ID as the target interaction intent to which this round of interaction belongs.
  • Optionally, the corresponding process identifier may be carried in the previous round's interaction result.
  • The human-computer interaction device 11 may obtain the process identifier from the previous round's interaction result and determine, based on the process identifier, the target interaction intention to which the current round of interaction belongs.
  • Optionally, when the previous round's interaction result is obtained, a process identifier associated with that result may be obtained together; based on this, the human-computer interaction device 11 may determine, according to the obtained process identifier, the target interaction intention to which the current round of interaction belongs.
  • the human-computer interaction device 11 may send the process identifier and the identifier of the object to be processed under the target interaction intention to the management platform 12 to request the management platform 12 for interaction guidance information corresponding to the current round of interaction.
  • the identifier of the object to be processed may be obtained during the first round of interaction. For details, reference may be made to the foregoing embodiment. After the identifier of the object to be processed is obtained during the first round of interaction, it may be saved locally for subsequent interaction.
  • the management platform 12 may select the interaction guidance process identified by the process identifier as the target interaction guidance process from the interaction guidance processes under at least one interaction intent maintained.
  • In the two cases above, the management platform 12 obtains the target interaction guidance process corresponding to the current round of interaction in different ways; it can then obtain the interaction guidance information corresponding to the current round of interaction from the target interaction guidance process in the same way.
  • the target interactive guidance process may include at least one guidance node, and each guidance node represents a guidance link in the interactive guidance process.
  • In different guidance links, the processing state of the object to be processed is different.
  • the processing status of the object to be processed may be used to characterize the progress of human-computer interaction for the object to be processed.
  • For example, suppose the object to be processed needs to undergo three processing states in three different guidance links, such as processing states 1, 2, and 3; these three processing states may respectively correspond to guidance nodes 1, 2, and 3 in the target interaction guidance process.
  • the management platform 12 may determine the current guidance node and the next guidance node where the target interactive guidance process is currently located according to the current processing status of the object to be processed.
  • each guidance node in the target interactive guidance process may correspond to one interactive guidance information.
  • the interactive guidance information may be displayed in various forms.
  • the interactive guidance information may be displayed in the form of a card, or in the form of speech, or in the form of a webpage link, which is not limited in this embodiment.
  • the management platform 12 may determine the interactive guidance information corresponding to the next guidance node. For example, assuming that the guidance node where the target interactive guidance process is currently located is an empty node, the next guidance node is node 1. At this time, the interactive guidance information corresponding to the guidance node 1 may be used as the guidance information corresponding to the current round of interaction. As another example, assuming that the guidance node where the target interactive guidance process is currently located is the guidance node 2, the next guidance node is node 3. At this time, the interactive guidance information corresponding to the guidance node 3 may be used as the guidance information corresponding to this round of interaction.
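  • The sketch below illustrates that lookup under the same naming assumptions as the earlier sketches: a hypothetical mapping from the object's processing state to the guidance node the process is currently at, followed by reuse of the `next_node` helper to pick the next node's guidance information. The state names are made up for illustration.

```python
from typing import Dict, Optional

# Hypothetical mapping from an object's processing state to the guidance node
# that produced that state (None means the process has not started yet).
STATE_TO_NODE: Dict[str, Optional[str]] = {
    "created": None,
    "claim confirmed": "confirm_claim",
    "evidence uploaded": "upload_evidence",
    "evidence reviewed": "review_evidence",
}


def guidance_for_current_round(process: "GuidanceProcess", state: str) -> Optional[dict]:
    """Map the processing state to the current guidance node, then return the
    interaction guidance information of the next guidance node (or None if the
    process has ended)."""
    current = STATE_TO_NODE.get(state)
    next_node = process.next_node(current)
    return next_node.guidance_info if next_node else None
```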
  • the management platform 12 may return the interaction guidance information corresponding to this round of interaction to the human-machine interaction device 11.
  • the human-machine interaction device 11 may obtain the interaction result of the current round of interaction based on the obtained interaction guidance information.
  • In different situations, the manner in which the human-computer interaction device 11 obtains the interaction result of the current round of interaction based on the interaction guidance information corresponding to the current round of interaction will also differ.
  • the human-computer interaction process may be related to the business server in the corresponding scenario. The following part of this application will be described in detail with reference to the human-computer interaction system shown in FIG. 2a.
  • FIG. 2a is a schematic structural diagram of a human-computer interaction system according to another exemplary embodiment of the present application. As shown in FIG. 2 a, in addition to the human-machine interaction device 11 and the management platform 12 described in the above embodiment, the human-machine interaction system 10 further includes a service server 13.
  • Optionally, the human-computer interaction device 11 may display, according to the interaction guidance information, a to-be-filled information page for the user to input the information to be filled in; the data required to display the to-be-filled information page may be provided by the service server 13 or another optional platform.
  • The human-computer interaction device 11 responds to the user's input operation on the to-be-filled information page, obtains the filled-in information page, and sends it to the service server 13. After the service server 13 receives the filled-in information page sent by the human-computer interaction device 11, it can generate the interaction result of the current round of interaction according to the filled-in information page and return it to the human-computer interaction device 11.
  • the information page to be filled in may be an information page for confirming the user's claim.
  • For example, the information page may include items to be filled in by the user, such as the refund reason, whether the item was signed for in person, and contact information.
  • The human-computer interaction device 11 displays the information page for confirming the user's claim to the user, and the user fills in the relevant information and submits it; the human-computer interaction device 11 sends the completed information page to the service server 13; the service server 13 determines, according to the filled-in information page sent by the human-computer interaction device 11, whether to agree to the user's rights protection claim, and returns the rights protection result to the human-computer interaction device 11 as the interaction result of this round of interaction.
  • Optionally, when the human-computer interaction device 11 sends the completed information page to the service server 13, it may associate the filled-in information page with the process identifier and send them to the service server 13.
  • The service server 13 may receive and save the process identifier.
  • The process identifier may serve as an indication that the interaction process is not over, and the service server 13 may send the process identifier and the interaction result to the human-computer interaction device 11 to inform the human-computer interaction device 11 to start the next round of interaction.
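  • Under the same naming assumptions, the interaction result returned to the human-computer interaction device might look like the sketch below: it either carries the process identifier (the interaction is not over and another round should start) or an end-of-interaction flag. The field names are placeholders.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class InteractionResult:
    payload: dict                     # e.g. whether the rights claim was accepted
    process_id: Optional[str] = None  # present: the interaction process is not over
    finished: bool = False            # True: end the human-computer interaction


def should_continue(result: InteractionResult) -> bool:
    """True if the device should request guidance for another round of interaction."""
    return not result.finished and result.process_id is not None
```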
  • the human-machine interactive system 10 further includes a rendering platform 14.
  • During the human-computer interaction process, page rendering operations may be involved.
  • After the human-computer interaction device 11 requests the interaction guidance information of the current round from the management platform 12, it may request the rendering platform 14 to render the interaction guidance information into a guidance information page, and then display the rendered guidance information page to the user.
  • The guidance information page contains the interaction guidance information.
  • When the human-computer interaction device 11 displays the to-be-filled information page according to the interaction guidance information, it may request the rendering platform 14 to render the to-be-filled information page, and then display the rendered to-be-filled information page to the user.
  • other operations related to page rendering included in the actual interaction process can also be implemented by the rendering platform 14, which are not described herein again.
  • the user can perform single or multiple rounds of interaction with the human-machine interaction device 11, and realize the final interaction intention based on the single or multiple rounds of interaction process.
  • In this way, a simple or complex interaction intention can be fulfilled without additional human involvement, saving labor costs and improving the efficiency of human-computer interaction.
  • the human-computer interaction device 11 may include an interactive front-end device and an interactive back-end device.
  • the interactive front-end device may be a device that directly contacts the user such as a smart phone, a smart speaker, a personal computer, a wearable device, and a tablet computer.
  • the user can initiate an interaction request, perform an interactive process, and receive an interaction result through the interactive front-end device.
  • the interactive backend device may be an interactive server or an application installed on the interactive server, which can process the interaction request received by the interactive frontend and communicate with the management platform 12 to assist the interactive frontend to perform the interaction process and receive the interaction result based on the management platform 12.
  • the management platform 12 may be implemented by a SOP (Standard Operating Procedure) platform.
  • Optionally, different interaction guidance processes can be created on the SOP platform, and the correspondence between different interaction intentions and interaction guidance processes can be generated; the SOP platform can perform unified maintenance on the created interaction guidance processes and correspondence relationships for subsequent invocation.
  • Using the SOP platform as a unified process management platform facilitates the management of different interaction guidance processes, facilitates communication or integration between the interaction guidance processes and other devices or servers, and improves interaction efficiency.
  • In different application scenarios, the human-computer interaction process and the interaction intention will also differ.
  • In the following, the human-computer interaction process in which a user initiates rights protection for an order in the e-commerce field (the interaction intention being the order rights protection intention) is used as an example to describe in detail the human-computer interaction scheme provided in the embodiments of this application.
  • In the prior art, a user can initiate a rights protection request online through a rights protection tool provided by a shopping platform, and the request is then transferred from the online customer service to a human customer service agent.
  • The human customer service agent follows the user's rights protection request and guides the user to complete the rights protection operation.
  • Such a rights protection scheme based on the prior art requires a large amount of manual participation and incurs high labor costs.
  • When rights protection processing is performed based on the human-computer interaction method provided in the embodiments of the present application, the user can be automatically guided to complete the rights protection operation through the cooperation between the human-computer interaction device 11 and the management platform 12, without the involvement of human customer service.
  • The following section further explains the human-computer interaction scheme provided by the embodiments of the present application, taking a specific rights protection scenario as an example in conjunction with FIG. 3.
  • the user may initiate a human-machine interaction requirement through the human-machine interaction device 11, and the demand may be specifically expressed as a rights protection request.
  • the user may enter text or voice content such as "I want to complain to the seller" and "I want to apply for after-sale" in the interactive interface provided by the human-computer interaction device 11 to request entry to the first round of human-computer interaction.
  • After the human-computer interaction device 11 obtains the human-computer interaction requirement initiated by the user, it can perform speech recognition and/or semantic recognition on the requirement to determine the target interaction intention to which the first round of human-computer interaction belongs. For example, semantic recognition can be performed on the human-computer interaction requirement "I want to apply for after-sale" to determine that the target interaction intention of the first round of human-computer interaction is "order rights protection".
  • The human-computer interaction device 11 can send the management platform 12 a request carrying description information related to the order rights protection intention.
  • When the management platform 12 receives the request carrying the description information related to the order rights protection intention, it can determine a target interaction guidance process adapted to the order rights protection intention.
  • The target interaction guidance process may be the order rights protection guidance process. It is assumed that the order rights protection guidance process includes four rights protection guidance nodes: "confirm buyer's claim", "guide buyer to upload evidence", "review evidence images", and "judgment"; each rights protection guidance node corresponds to one piece of rights protection guidance information related to the order rights protection operation.
  • the rights protection guidance information corresponding to "Confirm Buyer's Claim” can include guidance information such as “Refund Reason”, "Is it signed?", “Contact Information” and so on.
  • the rights protection guidance information corresponding to "Guiding Buyers to Upload Evidence” can include guidance information such as "upload product pictures” and "upload order status.”
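  • Putting the pieces together, the order rights protection guidance process described above could be expressed as data and registered with the platform sketch given earlier; the identifiers and the contents of the guidance information below are paraphrased or invented for illustration.

```python
order_rights_process = GuidanceProcess(
    process_id="proc-order-rights-001",  # hypothetical process identifier
    intent="order rights protection",
    nodes=[
        GuidanceNode("confirm_claim",
                     {"fields": ["refund reason", "signed for?", "contact information"]}),
        GuidanceNode("upload_evidence",
                     {"fields": ["product pictures", "order status"]}),
        GuidanceNode("review_evidence", {"action": "review the uploaded evidence images"}),
        GuidanceNode("judgment", {"action": "decide on the rights protection claim"}),
    ],
)

platform = ManagementPlatform()
platform.register("order rights protection", order_rights_process)
```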
  • After the management platform 12 determines the order rights protection guidance process adapted to the order rights protection intention, it can send an object acquisition request to the human-computer interaction device 11 to request the identifier of the pending order under the order rights protection intention.
  • In order rights protection, rights protection is initiated for an order, and the pending order is the aforementioned object to be processed.
  • the identification of the pending order may include, but is not limited to, an order number, a payment serial number, buyer information, or a delivery address.
  • the human-computer interaction device 11 may obtain the identification of the order to be processed under the order rights maintenance intention upon receiving the order acquisition request sent by the management platform 12.
  • the human-computer interaction device 11 may obtain the identifier of the pending order from the locally pre-stored order information.
  • the human-computer interaction device 11 may obtain an identifier of an order to be processed from a database in which order information is stored.
  • Alternatively, the human-computer interaction device 11 may request the user to manually provide the identifier of the pending order. In this embodiment, the management platform 12 may send an order selector to the human-computer interaction device 11; after receiving the order selector, the human-computer interaction device 11 can display it so that the user can select a pending order through the order selector, allowing the human-computer interaction device 11 to obtain information such as the identifier of the pending order.
  • the human-computer interaction device 11 may send the identification of the pending order selected by the user to the management platform 12.
  • The order rights protection guidance process maintained by the management platform 12 includes four rights protection guidance nodes, that is, four rights protection links; the user and the human-computer interaction device 11 need to be guided to complete the rights protection in the order of these four links.
  • the management platform 12 determines that it is necessary to start the order rights guidance process for the pending order, and establishes a correspondence between the identity of the pending order and the order rights guidance process.
  • The management platform 12 determines the rights protection guidance information corresponding to the first rights protection guidance node in the order rights protection guidance process, for example "confirm buyer's claim", as the rights protection guidance information corresponding to this round of interaction, and sends the rights protection guidance information to the human-computer interaction device 11.
  • the rights protection guidance information may be ID (identification) information of an "information card” used to guide the rights protection process, or may be semantic information used to guide the rights protection process, which is not limited in this embodiment.
  • When the human-computer interaction device 11 receives the ID of the "information card" sent by the management platform 12, it sends the "information card" ID to the rendering platform to request the rendering platform to render the information card identified by that ID, and, after obtaining the rendering data of the information card returned by the rendering platform, displays the information card.
  • the information card may display an operable button and prompt information about what operation is triggered by the button. After reading the prompt information, the user can trigger the button on the information card to perform the operation indicated by the information card.
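  • A small sketch of that exchange, assuming a hypothetical HTTP endpoint on the rendering platform; the URL, request fields, and response fields are placeholders rather than an interface defined by this application.

```python
import requests  # any HTTP client would do; requests is used here for brevity

RENDER_URL = "https://rendering.example.com/cards/render"  # placeholder endpoint


def display_guidance_card(card_id: str) -> None:
    """Send the information card ID to the rendering platform, then show the
    rendered card (prompt text plus an operable button) to the user."""
    response = requests.post(RENDER_URL, json={"card_id": card_id}, timeout=5)
    response.raise_for_status()
    card = response.json()
    print(card.get("prompt", ""))                  # prompt describing the operation
    print(f"[{card.get('button_label', 'OK')}]")   # the operable button
```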
  • The human-computer interaction device 11 may request the service server 13 to load the information page to be filled in, and display the information page to be filled in when the service server 13 returns the data of the information page to be filled in.
  • For example, the to-be-filled information page may be a form containing a number of items to be entered, such as an upload area for evidence of quality issues with the order, an upload area for screenshots of conversations with the seller, and an input area for the refund amount, for the user to fill in.
  • the human-computer interaction device 11 can obtain the filled out information page and send it to the service server 13.
  • the human-computer interaction device 11 may associate the filled-in information page with the process identifier and send it to the service server 13.
  • After receiving the filled-in information page sent by the human-computer interaction device 11, the service server 13 generates the interaction result of the current round of interaction according to the filled-in information page and returns the interaction result to the human-computer interaction device 11.
  • the service server 13 may return the process identifier of the order rights guidance process corresponding to the previous round of interaction to the human-machine interaction device 11 to instruct the human-machine interaction device 11 to continue the next round of human-machine interaction process.
  • The service server 13 may also return an interaction end identifier to the human-computer interaction device 11 to instruct the human-computer interaction device 11 to end the human-computer interaction process.
  • Upon receiving the process identifier, the human-computer interaction device 11 may determine that the interaction process under the target interaction intention is not over, and may continue to perform the next round of interaction.
  • the human-computer interaction device 11 may determine, based on the process identifier, that the target interaction intention to which this round of interaction belongs is the order's rights maintenance intention.
  • The human-computer interaction device 11 may send the process identifier and the identifier of the pending order under the order rights protection intention to the management platform 12.
  • After receiving the request, the management platform 12 may determine, from the interaction guidance processes under at least one interaction intention that it maintains, that the process used for this round of interaction is the order rights protection guidance process; based on the fact that the process has currently proceeded to the "confirm buyer's claim" node, it determines that the next rights protection guidance node to enter is "guide buyer to upload evidence", and returns the rights protection guidance information corresponding to the "guide buyer to upload evidence" node to the human-computer interaction device 11.
  • The human-computer interaction device 11 can then obtain the interaction result corresponding to the "guide buyer to upload evidence" node in the manner described above, and continue to request from the management platform 12 the rights protection guidance information corresponding to the next rights protection guidance node, until the rights protection process ends.
  • FIG. 4 is a method flowchart of a human-computer interaction method provided by an exemplary embodiment of the present application. This embodiment can be implemented based on the human-computer interaction system described in the foregoing embodiments and is described from the perspective of the human-computer interaction device. As shown in FIG. 4, the method includes:
  • Step 401 Determine the target interaction intention to which the current round of interaction belongs according to the human-computer interaction requirements.
  • Step 402 Request the interaction guidance information corresponding to the current round of interaction from the management platform according to the target interaction intention, and the management platform maintains the interaction guidance process under each interaction intention.
  • Step 403 Receive interaction guidance information corresponding to the current round of interaction sent by the management platform.
  • Step 404 Obtain the interaction result of the current round of interaction according to the interaction guidance information corresponding to the current round of interaction.
  • The human-computer interaction device may perform human-computer interaction with the user and determine the interaction intention to which the current round of interaction belongs according to the human-computer interaction requirement. For convenience of description and distinction, the interaction intention to which the current round of interaction belongs is recorded as the target interaction intention. Depending on which round the current round of interaction is, the source of the human-computer interaction requirement will differ.
  • If this round of interaction is the first round of interaction under the target interaction intention, the human-computer interaction requirement may be initiated by the user through the human-computer interaction device; if this round of interaction is a non-first round of interaction under the target interaction intention, for example the second round of interaction, the third round of interaction, and so on, the human-computer interaction requirement may be triggered by the interaction result of the previous round, which is not limited in this embodiment.
  • The human-computer interaction device may communicate with the management platform. After determining the target interaction intention to which the current round of interaction belongs, the human-computer interaction device may request the management platform for the interaction guidance information corresponding to the current round of interaction according to the target interaction intention, so that the management platform obtains a target interaction guidance process adapted to the target interaction intention from the interaction guidance processes under at least one interaction intention that it maintains.
  • the interactive guidance information refers to information for guiding the human-computer interaction process, and the information may indicate the operation that the human-machine interaction process should perform at the current or next moment.
  • In step 403, after the human-computer interaction device obtains the interaction guidance information, it can guide the user to complete the entire interaction process based on the interaction guidance information.
  • In this embodiment, the human-computer interaction device determines the target interaction intention to which the current round of interaction belongs according to the human-computer interaction requirement, requests the management platform for the interaction guidance information corresponding to the current round of interaction based on the target interaction intention, and then obtains the interaction result of the current round of interaction according to the interaction guidance information sent by the management platform. There is no need to guide the interaction process manually; automatic human-computer interaction is realized, labor costs are saved, and human-computer interaction efficiency is improved.
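  • As a recap of steps 401 to 404, the following sketch runs the device-side loop: determine the target interaction intention, request the interaction guidance information for this round from the management platform, act on it, and repeat while the interaction result indicates that the process is not over. The `platform_client` and `device` objects are placeholders, and the helpers `recognize_intent` and `should_continue` refer to the earlier illustrative sketches.

```python
from typing import Optional


def run_interaction(device, platform_client, user_request: str) -> None:
    """Illustrative device-side loop over steps 401-404."""
    intent = recognize_intent(user_request)   # step 401 (first round)
    process_id: Optional[str] = None
    object_id: Optional[str] = None
    while True:
        # Step 402: request guidance for this round (by intention description in
        # the first round, by process identifier in later rounds).
        guidance, process_id, object_id = platform_client.request_guidance(
            intent=intent, process_id=process_id, object_id=object_id)
        if guidance is None:
            break  # no further guidance node: the guidance process has ended
        # Steps 403-404: act on the guidance and collect this round's result.
        result = device.execute_round(guidance)
        if not should_continue(result):
            break
```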
  • In some exemplary embodiments, the current round of interaction is the first round of interaction under the target interaction intention, and the human-computer interaction requirement may be initiated by the user through the human-computer interaction device; in other exemplary embodiments, the current round of interaction is a non-first round of interaction under the target interaction intention, for example the second round of interaction, the third round of interaction, and so on.
  • In the latter case, the human-computer interaction requirement may be triggered by the interaction result of the previous round of interaction.
  • FIG. 5 is a flowchart of a first-round interaction method of a human-computer interaction method according to another exemplary embodiment of the present application. This embodiment may be implemented based on the human-computer interaction system shown in FIG. 1 to FIG. 3 and is mainly described from the perspective of the human-computer interaction device. As shown in FIG. 5, the method includes:
  • Step 501 Display an interactive interface for a user to input interactive requirements.
  • Step 502 Perform intent recognition on the interaction requirements to determine the target interaction intention to which the first round of interaction belongs.
  • Step 503 Send the description information of the target interaction intention to the management platform for the management platform to determine the target interaction guidance process adapted to the target interaction intention; the management platform maintains the interaction guidance process under each interaction intention.
  • Step 504 Receive an object acquisition request sent by the management platform, and acquire an identifier of an object to be processed under a target interaction intention.
  • Step 505 Send the identifier of the object to be processed under the target interaction intention to the management platform, so that the management platform establishes a correspondence between the identifier of the object to be processed and the target interaction guidance process.
  • Step 506 Receive interaction guidance information corresponding to the first round of interaction sent by the management platform.
  • Step 507 Obtain the interaction result of the first round of interaction according to the interaction guidance information corresponding to the first round of interaction.
  • In step 501, optionally, the human-computer interaction device may display an interaction interface for the user to input an interaction requirement, and the user may enter the interaction requirement on that interface.
  • Depending on the human-computer interaction scenario, the content of the interaction requirement input by the user will also differ. For example, in a rights-protection scenario, the user can enter words such as "order rights protection".
  • In step 502, after obtaining the interaction requirement input by the user on the interaction interface, the human-computer interaction device may perform intent recognition on it to determine the target interaction intention to which the first round of interaction belongs. Optionally, if the interaction requirement input on the interaction interface is text, the device may perform semantic recognition on the text to determine the target interaction intention to which the first round of interaction belongs. Optionally, if the interaction requirement input by the user is a voice signal, speech recognition may first be performed on the voice signal, and semantic recognition is then performed on the speech recognition result to determine the target interaction intention to which the first round of interaction belongs.
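As a rough illustration of the intent-recognition branch in step 502, the sketch below uses simple keyword matching for text input and a placeholder speech-to-text stage for voice input. The keyword table and the transcribe() stub are assumptions made purely for illustration and are not part of the claimed method.

```python
# Illustrative intent recognition: keyword-based semantic matching for text,
# with an assumed speech-to-text step for voice input.
INTENT_KEYWORDS = {
    "order_rights_protection": ["rights protection", "refund", "complaint", "after-sales"],
    "product_consultation": ["price", "size", "in stock"],
}


def recognize_text_intent(text: str) -> str:
    lowered = text.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(keyword in lowered for keyword in keywords):
            return intent
    return "unknown"


def transcribe(voice_signal: bytes) -> str:
    # Placeholder: a real implementation would call a speech-recognition engine.
    return "I want to apply for after-sales service"


def recognize_voice_intent(voice_signal: bytes) -> str:
    # Voice input is first converted to text, then goes through the same semantic matching.
    return recognize_text_intent(transcribe(voice_signal))


print(recognize_text_intent("order rights protection"))   # -> order_rights_protection
print(recognize_voice_intent(b"..."))                      # -> order_rights_protection
```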
  • In step 503, after the human-computer interaction device determines the target interaction intent to which the first round of interaction belongs, it sends the description information of the target interaction intent to the management platform, so that the management platform determines the target interaction guidance process adapted to the target interaction intent.
  • the object to be processed refers to an object to which human-computer interaction is directed under the target interaction intention.
  • the human-computer interaction device may display the interactive interface to the user, and prompt the user to input the identifier of the object to be processed on the interactive interface, thereby obtaining the identifier of the object to be processed that the user enters on the interactive interface, such as an order number, a product ID or game item name.
  • the manner in which the user inputs the identifier of the object to be processed on the interactive interface may be handwriting input, voice input, keyboard input, input using an input pen, and the like.
  • In step 505, optionally, after the identifier of the object to be processed is determined, the identifier is sent to the management platform.
  • In steps 506 to 507, after the interaction guidance information corresponding to the first round of interaction sent by the management platform is received, the interaction result of the first round of interaction is obtained based on that interaction guidance information.
  • In this embodiment, the human-computer interaction device cooperates with the management platform to realize automatic human-computer interaction, which saves labor costs and improves human-computer interaction efficiency.
  • In addition, the human-computer interaction device displays an interaction interface to obtain the interaction requirement input by the user and performs intent recognition on that requirement, so it can accurately grasp the user's interaction intention, which helps it respond accurately to the user's interaction needs.
  • In some exemplary embodiments, after step 503, the method may further include the step of receiving the process identifier of the target interaction guidance process returned by the management platform.
  • In some optional embodiments, there are further rounds of interaction after the first round, for example a second round, a third round, or more. In this case, the human-computer interaction device may determine the interaction intent to which the next round of interaction belongs based on the process identifier received in the previous round, and may directly request the interaction guidance information corresponding to the next round of interaction from the management platform based on that process identifier.
  • The following embodiment describes in detail, with reference to FIG. 6, how to request the interaction guidance information corresponding to the next round of interaction from the management platform according to the process identifier received in the previous round.
  • FIG. 6 is a flowchart of a non-first-round interaction method of a human-computer interaction method according to another exemplary embodiment of the present application. This embodiment may be implemented based on the human-computer interaction system shown in FIG. 1 to FIG. 3 and is mainly described from the perspective of the human-computer interaction device. As shown in FIG. 6, the method includes:
  • Step 601 Determine the interaction intention corresponding to the process identifier as the target interaction intention to which the current round of interaction belongs according to the process identifier associated with the previous round of interaction results.
  • Step 602 Send the process identifier and the identifier of the object to be processed under the target interaction intention to the management platform for the management platform to obtain the interaction guidance information corresponding to the current round of interaction from the target interaction guidance process identified by the process identifier.
  • Step 603 Receive interaction guidance information corresponding to the current round of interaction sent by the management platform.
  • Step 604 Obtain the interaction result of the current round of interaction according to the interaction guidance information corresponding to this round of interaction.
  • Different from the foregoing embodiment, in this embodiment the human-computer interaction device may determine the target interaction intention to which the current round of interaction belongs according to the process identifier obtained in the previous round of interaction, and send that process identifier together with the identifier of the object to be processed under the target interaction intention to the management platform, so as to request the interaction guidance information corresponding to the current round of interaction.
  • the identifier of the object to be processed may be obtained during the first round of interaction. For details, reference may be made to the foregoing embodiment. After the identifier of the object to be processed is obtained during the first round of interaction, it may be saved locally for subsequent interaction.
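The locally saved identifiers mentioned above can be kept in a small piece of device-side session state. The sketch below is only an assumed shape for that state; the SessionState class and its field names are not defined by this application.

```python
# Illustrative device-side session state: the object identifier is collected once in
# the first round and reused, together with the process identifier, in later rounds.
from dataclasses import dataclass
from typing import Optional


@dataclass
class SessionState:
    object_id: Optional[str] = None   # e.g. an order number obtained in the first round
    flow_id: Optional[str] = None     # process identifier returned by the management platform

    def remember_first_round(self, object_id: str, flow_id: str) -> None:
        self.object_id, self.flow_id = object_id, flow_id

    def next_round_request(self) -> dict:
        # Non-first-round request body: process identifier plus object identifier.
        assert self.object_id and self.flow_id, "the first round has not completed yet"
        return {"flow_id": self.flow_id, "object_id": self.object_id}


state = SessionState()
state.remember_first_round("order-42", "flow-001")
print(state.next_round_request())
```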
  • In this embodiment, for non-first-round interaction in a human-computer interaction scenario, requesting the interaction guidance information corresponding to the current round of interaction based on the process identifier associated with the previous round's result makes multiple rounds of interaction possible, and all of those rounds belong to the target interaction guidance process adapted to the same target interaction intent. As a result, the complex interaction process comprising multiple rounds of interaction under the target interaction intent can be completed without manual guidance, making human-computer interaction more intelligent.
  • In some exemplary embodiments, one way of obtaining the interaction result of the current round of interaction according to the interaction guidance information corresponding to the current round of interaction, as recorded in steps 404, 507, and 604 of the above embodiments, may include: displaying a to-be-filled information page according to the interaction guidance information corresponding to the current round of interaction; in response to the user's input operation on the to-be-filled information page, obtaining the filled-in information page and sending it to the business server; and receiving the interaction result of the current round of interaction returned by the business server according to the filled-in information page.
  • In some exemplary embodiments, one way of obtaining the filled-in information page and sending it to the business server includes: associating the filled-in information page with the process identifier and then sending it to the business server.
  • In some exemplary embodiments, one way of receiving the interaction result of the current round of interaction returned by the business server according to the filled-in information page includes: receiving the interaction result of the current round of interaction and the process identifier returned by the business server, where the interaction result of the current round of interaction is associated with the process identifier.
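The three paragraphs above can be pictured as a single submit/return exchange with the business server. The sketch below is an assumed, simplified version of that exchange; the FilledPage fields and the BusinessServer decision logic are illustrative only.

```python
# Illustrative exchange with the business server: the filled-in information page is
# associated with the process identifier before submission, and the returned interaction
# result carries the same identifier so the device knows which flow it belongs to.
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class FilledPage:
    flow_id: str                              # process identifier of the guidance flow
    object_id: str                            # e.g. the order number
    fields: Dict[str, str] = field(default_factory=dict)


class BusinessServer:
    def handle(self, page: FilledPage) -> Dict[str, str]:
        # A real server would validate the submission and apply business rules here.
        approved = bool(page.fields.get("refund_reason"))
        return {
            "flow_id": page.flow_id,          # echoed back, associated with the result
            "result": "approved" if approved else "rejected",
        }


page = FilledPage(flow_id="flow-001", object_id="order-42",
                  fields={"refund_reason": "damaged item", "signed_by_self": "yes"})
print(BusinessServer().handle(page))
```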
  • FIG. 7 is a flowchart of a human-computer interaction method according to another exemplary embodiment of the present application. This embodiment may be implemented based on the human-computer interaction system shown in FIG. 1 to FIG. 3 and is mainly described from the perspective of the management platform. As shown in FIG. 7, the method includes:
  • Step 701 Receive a request sent by a human-computer interaction device, where the request carries the target interaction intention to which the current round of interaction of the human-computer interaction device belongs.
  • Step 702 Obtain, from the interaction guidance processes under at least one interaction intention, a target interaction guidance process adapted to the target interaction intention. Each interaction guidance process includes at least one guidance node, and each guidance node corresponds to one piece of interaction guidance information.
  • Step 703 Determine the interactive guidance information corresponding to the next guidance node according to the guidance node where the target interactive guidance process is currently located.
  • Step 704 Return the interaction guidance information corresponding to the next guidance node to the human-machine interaction device as the interaction guidance information corresponding to the current round of interaction.
  • the management platform maintains an interaction guidance process under at least one interaction intent, and different interaction intents may correspond to different human-computer interaction scenarios.
  • the management platform may receive a request sent by the human-computer interaction device, and obtain a target interaction guidance process adapted to the target interaction intention from the interaction guidance process under at least one interaction intention maintained according to the received request.
  • Each interaction guidance process includes at least one guidance node, and each guidance node represents one guidance link in the interaction guidance process; the number of guidance nodes represents the number of guidance links in the process, and the upstream-downstream relationship between guidance nodes reflects the order of those guidance links. Each guidance node corresponds to one piece of interaction guidance information, which indicates how to guide the human-computer interaction process in the guidance link represented by that node.
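One plausible in-memory representation of such a guidance process is an ordered list of nodes, where the list order encodes the upstream-downstream relationship and each node carries its piece of guidance information. The sketch below is only an assumed data structure; the node names follow the order rights-protection example given elsewhere in this application.

```python
# Illustrative data structure for an interaction guidance process and its guidance nodes.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class GuidanceNode:
    name: str
    guidance_info: str         # e.g. an information-card ID or a scripted prompt


@dataclass
class GuidanceFlow:
    flow_id: str
    intent_description: str
    nodes: List[GuidanceNode]  # list order = upstream-downstream order of guidance links

    def next_node(self, current: Optional[str]) -> Optional[GuidanceNode]:
        """Return the node after `current`, or the first node if no node is active yet."""
        if current is None:
            return self.nodes[0] if self.nodes else None
        index = [node.name for node in self.nodes].index(current)
        return self.nodes[index + 1] if index + 1 < len(self.nodes) else None


rights_flow = GuidanceFlow(
    flow_id="flow-001",
    intent_description="order rights protection",
    nodes=[GuidanceNode("confirm_buyer_request", "card:confirm_request"),
           GuidanceNode("guide_evidence_upload", "card:upload_evidence"),
           GuidanceNode("review_evidence", "card:review_evidence"),
           GuidanceNode("decide", "card:decision")],
)
print(rights_flow.next_node("confirm_buyer_request").name)   # -> guide_evidence_upload
```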
  • After determining the target interaction guidance process, the management platform may determine the interaction guidance information corresponding to the next guidance node according to the guidance node where the target interaction guidance process is currently located, and return the interaction guidance information corresponding to that next guidance node to the human-computer interaction device as the interaction guidance information corresponding to the current round of interaction.
  • In this embodiment, in a human-computer interaction scenario, when the management platform receives the target interaction intent, it determines the target interaction guidance process adapted to that intent from the interaction guidance processes under the at least one maintained interaction intent, and determines the interaction guidance information corresponding to the next guidance node based on the guidance node where the target interaction guidance process is currently located, so that the human-computer interaction device can obtain the interaction result of the current round of interaction based on that information. The human-computer interaction process therefore does not need to be guided manually, which realizes automatic human-computer interaction, saves labor costs, and improves human-computer interaction efficiency.
  • In some exemplary embodiments, if the current round of interaction is the first round of interaction under the target interaction intent, the request sent by the human-computer interaction device and received by the management platform carries the description information of the target interaction intent; in other exemplary embodiments, the current round of interaction is a non-first round of interaction under the target interaction intent, for example the second round, the third round, and so on, and the request received by the management platform carries the process identifier. The two cases are described in detail below with reference to FIG. 8 and FIG. 9, respectively.
  • FIG. 8 is a flowchart of a first-round interaction method of a human-computer interaction method according to another exemplary embodiment of the present application. This embodiment may be implemented based on the human-computer interaction system shown in FIG. 1 to FIG. 3 and is mainly described from the perspective of the management platform. As shown in FIG. 8, the method includes:
  • Step 801 Receive a request sent by a human-computer interaction device, where the request carries description information of the target interaction intention.
  • Step 802 Select the interaction guidance process under the interaction intent whose description information matches the description information carried in the request as the target interaction guidance process. Each interaction guidance process includes at least one guidance node, and each guidance node corresponds to one piece of interaction guidance information.
  • Step 803 Send an object acquisition request to the human-computer interaction device to request the identification of the object to be processed under the target interaction intent.
  • Step 804 Receive the identifier of the object to be processed sent by the human-computer interaction device, and establish a correspondence between the identifier of the object to be processed and the target interaction guidance process.
  • Step 805 Determine the interactive guidance information corresponding to the next guidance node according to the guidance node where the target interactive guidance process is currently located.
  • Step 806 Return the interaction guidance information corresponding to the next guidance node to the human-machine interaction device as the interaction guidance information corresponding to the current round of interaction.
  • In this embodiment, optionally, the manner in which the management platform maintains the interaction guidance processes under at least one interaction intent may be: maintaining a mapping relationship between the description information of each interaction intent and the interaction guidance process under that intent. For example, the interaction guidance process under the interaction intent whose description information is M includes PM, and the interaction guidance process under the interaction intent whose description information is N includes PN.
  • When the management platform receives the request carrying the description information of the target interaction intent, it can select the interaction guidance process under the interaction intent whose description information matches the description information carried in the request sent by the human-computer interaction device as the target interaction guidance process adapted to the target interaction intent.
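A very small sketch of the "M → PM, N → PN" mapping described above is shown below; storing it as a dictionary keyed by description information is an assumption for illustration, not a requirement of the method.

```python
# Illustrative mapping maintained by the management platform: description information
# of each interaction intent -> the guidance process under that intent.
FLOWS_BY_DESCRIPTION = {
    "M": "PM",   # the guidance process under the intent described as M
    "N": "PN",   # the guidance process under the intent described as N
}


def select_target_flow(description_in_request: str) -> str:
    """First-round selection: match the carried description against the maintained intents."""
    return FLOWS_BY_DESCRIPTION[description_in_request]


print(select_target_flow("M"))   # -> PM
```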
  • Optionally, the correspondence between the identifier of the object to be processed and the target interaction guidance process established by the management platform in this embodiment may be used, when an interaction requirement associated with the target guidance process is received next time, to determine the object to be processed corresponding to that interaction requirement according to the correspondence.
  • In some exemplary embodiments, after step 802, the method may further include the step of sending the process identifier of the target interaction guidance process to the human-computer interaction device, so that the human-computer interaction device can request the interaction guidance information corresponding to the next round of interaction based on that process identifier.
  • the human-computer interaction device may determine the interaction intention to which the next round of interaction belongs based on the process identifier sent by the management platform, and may directly request the management platform for interaction guidance information corresponding to the next round of interaction based on the process identifier.
  • The following embodiment describes in detail, with reference to FIG. 9, how the management platform determines the interaction guidance information corresponding to the next round of interaction according to the received process identifier.
  • FIG. 9 is a flowchart of a non-first-round interaction method of a human-computer interaction method according to another exemplary embodiment of the present application. This embodiment may be implemented based on the human-computer interaction system shown in FIG. 1 to FIG. 3 and is mainly described from the perspective of the management platform. As shown in FIG. 9, the method includes:
  • Step 901 Receive a request sent by a human-computer interaction device, and the request carries a process identifier.
  • Step 902 Obtain, from the interaction guidance processes under at least one interaction intent, the interaction guidance process identified by the process identifier as the target interaction guidance process. Each interaction guidance process includes at least one guidance node, and each guidance node corresponds to one piece of interaction guidance information.
  • Step 903 Determine the interactive guidance information corresponding to the next guidance node according to the guidance node where the target interactive guidance process is currently located.
  • Step 904 Return the interactive guidance information corresponding to the next guidance node to the human-computer interaction device as the interactive guidance information corresponding to the current round of interaction.
  • Different from the foregoing embodiment, in this embodiment the request received by the management platform carries a process identifier, and the management platform may obtain the interaction guidance process identified by that process identifier, from the interaction guidance processes under the at least one interaction intent it maintains, as the target interaction guidance process. Furthermore, based on the process identifier, multiple rounds of interaction can be realized, all of which belong to the target interaction guidance process adapted to the same target interaction intent; without manual guidance, the complex interaction process comprising multiple rounds of interaction under the target interaction intent can be completed, making human-computer interaction more intelligent.
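Putting the first-round (FIG. 8) and non-first-round (FIG. 9) cases together, the management platform effectively dispatches on whether the incoming request carries description information or a process identifier, and then advances the flow by one guidance node. The sketch below is a self-contained, assumed illustration of that dispatch; none of the names are defined by this application.

```python
# Illustrative platform-side dispatch: a first-round request carries the intent
# description, a later request carries the process identifier; both end with the
# guidance information of the next guidance node being returned.
from typing import Dict, List, Optional, Tuple

# flow_id -> (intent description, ordered guidance nodes as (name, guidance_info))
FLOWS: Dict[str, Tuple[str, List[Tuple[str, str]]]] = {
    "flow-001": ("order rights protection",
                 [("confirm_buyer_request", "card:confirm_request"),
                  ("guide_evidence_upload", "card:upload_evidence")]),
}
CURRENT_NODE: Dict[str, Optional[str]] = {}   # object_id -> name of the node reached so far


def handle_request(request: Dict[str, str]) -> Dict[str, str]:
    if "flow_id" in request:                                   # non-first round (FIG. 9)
        flow_id = request["flow_id"]
    else:                                                      # first round (FIG. 8)
        flow_id = next(fid for fid, (desc, _) in FLOWS.items()
                       if desc == request["intent_description"])
        CURRENT_NODE.setdefault(request["object_id"], None)    # object <-> flow correspondence
    _, nodes = FLOWS[flow_id]
    names = [name for name, _ in nodes]
    current = CURRENT_NODE.get(request["object_id"])
    next_index = 0 if current is None else names.index(current) + 1
    name, guidance_info = nodes[next_index]                    # next guidance node
    CURRENT_NODE[request["object_id"]] = name
    return {"flow_id": flow_id, "guidance_info": guidance_info}


print(handle_request({"intent_description": "order rights protection", "object_id": "order-42"}))
print(handle_request({"flow_id": "flow-001", "object_id": "order-42"}))
```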
  • As shown in FIG. 10, the human-computer interaction device 11 may include: a memory 110, a processor 111, a communication component 112, an electronic display screen 113, an audio component 114, and a power supply component 115.
  • the memory 110 may be configured to store various other data to support operations on the human-machine interaction device 11. Examples of these data include instructions, contact data, phone book data, messages, pictures, videos, etc. for any application or method for operating on the human-machine interactive device 11.
  • Memory can be implemented by any type of volatile or non-volatile storage devices or combinations thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable Read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk.
  • the memory 110 is configured to store one or more computer instructions.
  • The processor 111 is coupled to the memory 110 and is configured to execute the one or more computer instructions in the memory 110 so as to: determine, according to a human-computer interaction requirement, the target interaction intention to which the current round of interaction belongs; request, from the management platform according to the target interaction intention, the interaction guidance information corresponding to the current round of interaction, the management platform maintaining the interaction guidance process under each interaction intent; receive, through the communication component 112, the interaction guidance information corresponding to the current round of interaction sent by the management platform; and obtain the interaction result of the current round of interaction according to the interaction guidance information corresponding to the current round of interaction.
  • Optionally, when determining the target interaction intention to which the current round of interaction belongs according to the human-computer interaction requirement, the processor 111 is specifically configured to: display an interaction interface for the user to input the interaction requirement, and perform intent recognition on the interaction requirement to determine the target interaction intention to which the current round of interaction belongs; or receive voice information provided by the user, the voice information carrying the user's interaction requirement, and perform intent recognition on the voice information to determine the target interaction intention to which the current round of interaction belongs.
  • Optionally, when requesting, from the management platform according to the target interaction intention, the interaction guidance information corresponding to the current round of interaction, the processor 111 is specifically configured to: send the description information of the target interaction intention to the management platform, so that the management platform determines a target interaction guidance process adapted to the target interaction intent.
  • Optionally, before receiving, through the communication component 112, the interaction guidance information corresponding to the current round of interaction sent by the management platform, the processor 111 is further configured to: receive, through the communication component 112, an object acquisition request sent by the management platform; obtain the identifier of the object to be processed under the target interaction intention; and send the identifier of the object to be processed under the target interaction intention to the management platform through the communication component 112, so that the management platform establishes a correspondence between the identifier of the object to be processed and the target interaction guidance process.
  • Optionally, the processor 111 is further configured to receive, through the communication component 112, the process identifier of the target interaction guidance process returned by the management platform.
  • Optionally, when determining the target interaction intention to which the current round of interaction belongs according to the human-computer interaction requirement, the processor 111 is specifically configured to: determine, according to the process identifier associated with the previous round's interaction result, the interaction intent corresponding to that process identifier as the target interaction intent to which the current round of interaction belongs.
  • Optionally, when requesting, from the management platform according to the target interaction intention, the interaction guidance information corresponding to the current round of interaction, the processor 111 is specifically configured to: send, through the communication component 112, the process identifier and the identifier of the object to be processed under the target interaction intent to the management platform, so that the management platform obtains the interaction guidance information corresponding to the current round of interaction from the target interaction guidance process identified by the process identifier.
  • Optionally, the processor 111 is further configured to: when it is determined, according to the interaction result of the current round of interaction, that there is a next round of interaction, request, from the management platform according to the process identifier and the identifier of the object to be processed under the target interaction intent, the interaction guidance information corresponding to the next round of interaction.
  • Optionally, when obtaining the interaction result of the current round of interaction according to the interaction guidance information corresponding to the current round of interaction, the processor 111 is specifically configured to: display a to-be-filled information page according to the interaction guidance information corresponding to the current round of interaction; in response to the user's input operation on the to-be-filled information page, obtain the filled-in information page and send it to the business server; and receive the interaction result of the current round of interaction returned by the business server according to the filled-in information page.
  • Optionally, when obtaining the filled-in information page and sending it to the business server in response to the user's input operation on the to-be-filled information page, the processor 111 is specifically configured to: associate the filled-in information page with the process identifier and then send it to the business server; when receiving the interaction result of the current round of interaction returned by the business server according to the filled-in information page, the processor 111 is specifically configured to: receive, through the communication component 112, the interaction result of the current round of interaction and the process identifier returned by the business server, where the interaction result of the current round of interaction is associated with the process identifier.
  • the target interaction intent to which the current round of interaction belongs includes: an intent to protect rights for an order.
  • the electronic display screen 113 is configured to display an information page to be filled in and an interaction result.
  • the electronic display 113 includes a liquid crystal display (LCD) and a touch panel (TP). If the electronic display screen 113 includes a touch panel, the electronic display screen 113 may be implemented as a touch screen to receive an input signal from a user.
  • the touch panel includes one or more touch sensors to sense touch, swipe, and gestures on the touch panel. A touch sensor can not only sense the boundaries of a touch or slide action, but also detect the duration and pressure associated with a touch or slide operation.
  • The audio component 114 is configured to output and/or input audio signals.
  • the audio component 114 includes a microphone (MIC).
  • the microphone is configured to receive an external audio signal.
  • the received audio signal may be further stored in the memory 110 or transmitted via the communication component 112.
  • When the processor 111 determines the target interaction intention of the current round of interaction according to the human-computer interaction requirement, it can receive the voice signal input by the user through the microphone of the audio component 114, and perform speech recognition and semantic recognition on the voice signal to determine the target interaction intent.
  • the audio component 114 further includes a speaker for outputting audio signals.
  • the power supply component 115 is configured to provide power to various components of the human-computer interaction device 11.
  • the power components may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the human-machine interaction device 11.
  • In this embodiment, the human-computer interaction device 11 determines the target interaction intention to which the current round of interaction belongs based on the human-computer interaction requirement, requests the interaction guidance information corresponding to the current round of interaction from the management platform based on the target interaction intention, and obtains the interaction result of the current round of interaction according to the interaction guidance information corresponding to the current round of interaction sent by the management platform. The interaction process does not need to be guided manually, which realizes automatic human-computer interaction, saves labor costs, and improves human-computer interaction efficiency.
  • Correspondingly, an embodiment of the present application further provides a computer-readable storage medium storing a computer program. When the computer program is executed, the steps that can be performed by the human-computer interaction device in the foregoing method embodiments can be implemented.
  • As shown in FIG. 11, the management platform 12 may include: a memory 120, a processor 121, a communication component 122, and a power supply component 123.
  • the memory 120 may be configured to store various other data to support operations on the management platform 12. Examples of such data include instructions for any application or method operating on the management platform 12, contact data, phone book data, messages, pictures, videos, and the like.
  • Memory can be implemented by any type of volatile or non-volatile storage devices or combinations thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable Read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk.
  • The memory 120 is configured to store one or more computer instructions.
  • The processor 121 is coupled to the memory 120 and is configured to execute the one or more computer instructions in the memory 120 so as to: receive, through the communication component 122, a request sent by the human-computer interaction device, the request carrying the target interaction intent to which the current round of interaction of the human-computer interaction device belongs; obtain, from the interaction guidance processes under at least one interaction intent, a target interaction guidance process adapted to the target interaction intent, where each interaction guidance process includes at least one guidance node and each guidance node corresponds to one piece of interaction guidance information; determine the interaction guidance information corresponding to the next guidance node according to the guidance node where the target interaction guidance process is currently located; and return, through the communication component 122, the interaction guidance information corresponding to the next guidance node to the human-computer interaction device as the interaction guidance information corresponding to the current round of interaction.
  • Optionally, the request carries a process identifier; when obtaining, from the interaction guidance processes under at least one interaction intent, the target interaction guidance process adapted to the target interaction intent, the processor 121 is specifically configured to: obtain, from the interaction guidance processes under the at least one interaction intent, the interaction guidance process identified by the process identifier as the target interaction guidance process.
  • Optionally, the request carries the description information of the target interaction intent; when obtaining, from the interaction guidance processes under at least one interaction intent, the target interaction guidance process adapted to the target interaction intent, the processor 121 is specifically configured to: select, from the interaction guidance processes under the at least one interaction intent, the interaction guidance process under the interaction intent whose description information matches the description information carried in the request as the target interaction guidance process.
  • Optionally, the processor 121 is further configured to send the process identifier of the target interaction guidance process to the human-computer interaction device, so that the human-computer interaction device can request the interaction guidance information corresponding to the next round of interaction based on that process identifier.
  • Optionally, the processor 121 is further configured to: send, through the communication component 122, an object acquisition request to the human-computer interaction device to request the identifier of the object to be processed under the target interaction intention; receive, through the communication component 122, the identifier of the object to be processed sent by the human-computer interaction device; and establish a correspondence between the identifier of the object to be processed and the target interaction guidance process.
  • the power supply component 123 is configured to provide power to various components of the management platform 12.
  • the power components may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the management platform 12.
  • In this embodiment, the management platform 12 may determine the interaction guidance information corresponding to the current round of interaction according to the request of the human-computer interaction device 11 and send that interaction guidance information to the human-computer interaction device, so that the human-computer interaction device can obtain the interaction result of the current round of interaction according to the interaction guidance information corresponding to the current round of interaction. The interaction process does not need to be guided manually, which realizes automatic human-computer interaction, saves labor costs, and improves human-computer interaction efficiency.
  • Correspondingly, an embodiment of the present application further provides a computer-readable storage medium storing a computer program. When the computer program is executed, the steps that can be performed by the management platform 12 in the foregoing method embodiments can be implemented.
  • the embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
  • These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to operate in a specific manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device, and the instruction device implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to produce computer-implemented processing, and the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • a computing device includes one or more processors (CPUs), input / output interfaces, network interfaces, and memory.
  • Memory may include non-persistent memory, random access memory (RAM), and / or non-volatile memory in computer-readable media, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
  • Computer-readable media include persistent and non-persistent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic tape cassettes, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
  • Computer-readable media do not include transitory computer-readable media, such as modulated data signals and carrier waves.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide a human-computer interaction method, device, system, and storage medium. In these embodiments, a human-computer interaction device determines, according to a human-computer interaction requirement, the target interaction intention to which the current round of interaction belongs, and requests the interaction guidance information corresponding to the current round of interaction from a management platform according to the target interaction intention; the management platform maintains the interaction guidance process under each interaction intention and, based on the target interaction intention, determines the interaction guidance information corresponding to the current round of interaction and sends it to the human-computer interaction device; the human-computer interaction device obtains the interaction result of the current round of interaction according to the interaction guidance information corresponding to the current round of interaction. In this way, the interaction process does not need to be guided manually, which realizes automatic human-computer interaction, saves labor costs, and improves human-computer interaction efficiency.

Description

人机交互方法、设备、系统及存储介质
本申请要求2018年07月06日递交的申请号为201810735560.X、发明名称为“人机交互方法、设备、系统及存储介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及人工智能技术领域,尤其涉及一种人机交互方法、设备、系统及存储介质。
背景技术
随着电子商务技术的发展,以信息网络技术为手段,消费者的网上购物、商户之间的网上交易越来越广泛。网上购物、网上交易的过程主要是订单处理过程,即以订单为处理对象的信息处理过程。
现有订单处理方式,例如在针对订单进行维权处理时,一般是用户通过人机交互设备在线发起维权请求,在线客服转人工客服,由人工客服跟进用户的维权请求,引导用户完成维权操作,最终给出解决方案。
但是,现有方式需要大量的人工参与,人力成本较高,且效率较低。
发明内容
本申请实施例的多个方面提供一种人机交互方法、设备、系统及存储介质,用以降低人机交互过程中的人力成本,提高人机交互效率。
本申请实施例提供一种人机交互方法,适用于人机交互设备,包括:
根据人机交互需求,确定本轮交互所属的目标交互意图;
根据所述目标交互意图向管理平台请求本轮交互对应的交互引导信息,所述管理平台维护各交互意图下的交互引导流程;
接收所述管理平台发送的本轮交互对应的交互引导信息;
根据所述本轮交互对应的交互引导信息,获取本轮交互的交互结果。
本申请实施例还提供一种交互方法,适用于管理平台,包括:
接收人机交互设备发送的请求,所述请求携带所述人机交互设备本轮交互所所属的目标交互意图;
从至少一个交互意图下的交互引导流程中,获取与所述目标交互意图适配的目标交互引导流程,每个交互引导流程包括至少一个引导节点,每个引导节点对应一个交互引导信息;
根据所述目标交互引导流程当前所处的引导节点,确定下一引导节点对应的交互引导信息;
将所述下一引导节点对应的交互引导信息作为所述本轮交互对应的交互引导信息返回给所述人机交互设备。
本申请实施例还提供一种人机交互设备,包括:存储器、处理器以及通信组件;
所述存储器,用于存储一条或多条计算机指令;
所述处理器,用于执行一条或多条计算机指令,以用于:
根据人机交互需求,确定本轮交互所属的目标交互意图;
根据所述目标交互意图向管理平台请求本轮交互对应的交互引导信息,所述管理平台维护各交互意图下的交互引导流程;
通过所述通信组件接收所述管理平台发送的本轮交互对应的交互引导信息;
根据所述本轮交互对应的交互引导信息,获取本轮交互的交互结果。
本申请实施例还提供一种存储有计算机程序的计算机可读存储介质,所述计算机程序被执行时能够实现上述人机交互设备侧的人机交互方法中的步骤。
本申请实施例还提供一种管理平台,包括:存储器、处理器以及通信组件;
所述存储器,用于存储一条或多条计算机指令;
所述处理器,用于执行一条或多条计算机指令,以用于:
通过所述通信组件接收人机交互设备发送的请求,所述请求携带所述人机交互设备本轮交互所所属的目标交互意图;
从至少一个交互意图下的交互引导流程中,获取与所述目标交互意图适配的目标交互引导流程,每个交互引导流程包括至少一个引导节点,每个引导节点对应一个交互引导信息;
根据所述目标交互引导流程当前所处的引导节点,确定下一引导节点对应的交互引导信息;
通过所述通信组件将所述下一引导节点对应的交互引导信息作为所述本轮交互对应的交互引导信息返回给所述人机交互设备。
本申请实施例还提供一种存储有计算机程序的计算机可读存储介质,所述计算机程 序被执行时能够实现上述管理平台侧的人机交互方法中的步骤。
本申请实施例还提供一种人机交互系统,包括:人机交互设备和管理平台;
所述人机交互设备,用于根据人机交互需求,确定本轮交互所属的目标交互意图;根据所述目标交互意图向所述管理平台请求本轮交互对应的交互引导信息,接收所述管理平台发送的本轮交互对应的交互引导信息;根据所述本轮交互对应的交互引导信息,获取本轮交互的交互结果;
所述管理平台,用于根据所述人机交互设备的请求,从所维护的至少一个交互意图下的交互引导流程中,获取与所述目标交互意图适配的目标交互引导流程,根据所述目标交互引导流程当前所处的引导节点,确定下一引导节点对应的交互引导信息;将所述下一引导节点对应的交互引导信息作为所述本轮交互对应的交互引导信息返回给所述人机交互设备;其中,每个交互引导流程包括至少一个引导节点,每个引导节点对应一个交互引导信息。
在本申请实施例中,在人机交互的场景中,通过管理平台维护各交互意图下的交互引导流程,并使人机交互设备与管理平台配合,人机交互设备根据人机交互需求,确定本轮交互所属的目标交互意图,基于该目标交互意图向管理平台请求本轮交互对应的交互引导信息,进而根据管理平台发送的本轮交互对应的交互引导信息可获取本轮交互的交互结果,无需通过人工引导交互过程,实现了人机自动交互,节约了人力成本,提高了人机交互效率。
附图说明
此处所说明的附图用来提供对本申请的进一步理解,构成本申请的一部分,本申请的示意性实施例及其说明用于解释本申请,并不构成对本申请的不当限定。在附图中:
图1为本申请一示例性实施例提供的一种人机交互系统的结构示意图;
图2a为本申请另一示例性实施例提供的一种人机交互系统的结构示意图;
图2b为本申请又一示例性实施例提供的一种人机交互系统的结构示意图;
图3为本申请一示例性实施例提供的人机交互系统在订单维权场景下的交互示意图;
图4为本申请一示例性实施例提供的一种人机交互方法的方法流程图;
图5为本申请另一示例性实施例提供的一种人机交互方法的首轮交互方法流程图;
图6为本申请又一示例性实施例提供的一种人机交互方法的非首轮交互方法流程 图;
图7为本申请又一示例性实施例提供的一种人机交互方法的方法流程图;
图8为本申请又一示例性实施例提供的一种人机交互方法的首轮交互方法流程图;
图9为本申请又一示例性实施例提供的一种人机交互方法的非首轮交互方法流程图;
图10为本申请又一示例性实施例提供的一种人机交互设备的结构示意图;
图11为本申请又一示例性实施例提供的一种管理平台的结构示意图。
具体实施方式
为使本申请的目的、技术方案和优点更加清楚,下面将结合本申请具体实施例及相应的附图对本申请技术方案进行清楚、完整地描述。显然,所描述的实施例仅是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
在一些现有的人机交互场景中,需要人工对人机交互的流程进行引导,这种方式需要大量的人工参与,人力成本较高,人机交互效率较低。针对该技术问题,在本申请一些示例性实施例中,针对人机交互的场景,通过管理平台维护各交互意图下的交互引导流程,并使人机交互设备与管理平台配合,人机交互设备根据人机交互需求,确定本轮交互所属的目标交互意图,基于该目标交互意图向管理平台请求本轮交互对应的交互引导信息,进而根据管理平台发送的本轮交互对应的交互引导信息可获取本轮交互的交互结果;在这个过程中,无需通过人工引导人机交互过程,实现了人机自动交互,节约了人力成本,提高了人机交互效率。
图1为本申请一示例性实施例提供的一种人机交互系统的结构示意图。如图1所示,该人机交互系统10包括:人机交互设备11以及管理平台12。
在本实施例中,人机交互设备11是指能够与用户进行交互的设备。根据人机交互场景的不同,人机交互设备的实现形态也会有所不同。
例如,在在线购物等业务场景中,人机交互设备11可以是这些业务场景中的在线客服设备、机器人小蜜等。这些业务场景中的人机交互设备11可包括前端(面向用户的一端)和后端(用户不可见的一端)。前端的实现形态可以是网页、或者应用页面,或者窗口等,主要面向用户提供人机交互界面、语音输入接口等各种形式的人机交互接口,一般部署在用户侧的终端设备上。后端主要提供计算、数据处理、人机交互逻辑控制等 功能,可部署业务场景中的服务器或机器人等设备上。
又例如,在自动驾驶、无人机这些业务场景中,人机交互设备11可以是安装于车辆或无人机内的电子设备,这些电子设备可以向用户提供人机交互界面、语音输入接口等各种形式的人机交互接口。
在本实施例中,管理平台12维护有至少一个交互意图下的交互引导流程,可与人机交互设备11进行通信,并可基于所维护的各交互意图下的交互引导流程对人机交互设备11的人机交互过程进行引导。本实施例并不限定管理平台12的实现形态,可以是常规服务器、云服务器、云主机、虚拟中心等服务器设备。
其中,服务器设备的构成主要包括处理器、硬盘、内存、系统总线等,和通用的计算机架构类似。另外,管理平台12与人机交互设备11之间可以是以是无线或有线连接。在本实施例中,若人机交互设备11通过移动网络与管理平台12通信连接,该移动网络的网络制式可以为2G(GSM)、2.5G(GPRS)、3G(WCDMA、TD-SCDMA、CDMA2000、UTMS)、4G(LTE)、4G+(LTE+)、WiMax等中的任意一种。除此之外,人机交互设备11还可以采用WiFi、蓝牙、红外等通信方式与管理平台12建立通信连接。
其中,人机交互设备11可以在管理平台12的引导下自动完成人机交互过程,而无需人工引导。以下部分将详细介绍人机交互设备11如何在管理平台12的引导自动地实现人机交互。
在该人机交互系统10中,人机交互设备11可与用户进行人机交互。一个完整的人机交互过程可包含至少一轮交互,每一轮交互可视作该人机交互过程中的一个交互环节。每个完整的人机交互过程对应一个交互意图。在本实施例中,人机交互设备11可在管理平台12的配合下,自动完成一个交互意图下的人机交互过程。换句话说,人机交互设备11可在管理平台12的配合下,自动完成一个交互意图下的至少一轮交互。由于人机交互设备11在管理平台12的配合下,完成每轮交互的过程相类似,本实施例以人机交互设备11完成本轮交互的过程为例进行描述。本轮交互,指的是至少一轮交互中,当前需要执行或正在执行的交互。
人机交互设备11可根据人机交互需求,确定本轮交互所属的交互意图,为便于描述和分区,将本轮交互所属的交互意图记为目标交互意图。根据本轮交互的轮次,人机交互需求的情况会有所不同。若本轮交互是目标交互意图下的第一轮交互,则人机交互需求可以是用户通过人机交互设备11发起的;若本轮交互是目标交互意图下的非第一轮交互,例如可能是第二轮交互、第三轮交互等,则人机交互需求可以是上一轮交互的交互 结果引发的,本实施例不做限制。
基于管理平台12和人机交互设备11之间的通信,在确定本轮交互所属的目标交互意图后,人机交互设备11可根据该目标交互意图向管理平台12请求本轮交互对应的交互引导信息。其中,交互引导信息,指的是用于对人机交互过程进行引导的信息,该信息可指示人机交互过程当前或者下一时刻应该执行的操作;人机交互设备11可基于该交互引导信息,引导用户完成整个交互过程。
管理平台12可接收人机交互设备11发送的请求,并根据接收到的请求,从所维护的至少一个交互意图下的交互引导流程中,获取与目标交互意图适配的目标交互引导流程。在本实施例中,管理平台12维护至少一个交互意图下的交互引导流程,不同交互意图可以对应不同的人机交互场景。例如,在线维权场景对应的交互意图是在线维权。又例如,在线咨询场景对应的交互意图是在线咨询。再例如,在线问答场景对应的交互意图是在线问答。不论是哪个交互意图下的交互引导流程,每个交互引导流程包括至少一个引导节点,每个引导节点代表该交互引导流程中的一个引导环节,引导节点的数量代表了该交互引导流程中引导环节的数量,引导节点之间的上下游关系可体现交互引导流程中引导环节之间先后顺序,每个引导节点对应一个交互引导信息,用于指示如何在该引导节点所代表的引导环节对人机交互过程进行引导。
基于上述,管理平台12可根据目标交互引导流程当前所处的引导节点,确定下一引导节点对应的交互引导信息,并将该下一引导节点对应的交互引导信息作为本轮交互对应的交互引导信息返回给人机交互设备11。人机交互设备11接收管理平台12发送的本轮交互对应的交互引导信息,并可根据本轮交互对应的交互引导信息,完成本轮交互过程并获取本轮交互的交互结果。
在本实施例中,在人机交互的场景中,通过管理平台维护各交互意图下的交互引导流程,并使人机交互设备11与管理平台配合,人机交互设备11根据人机交互需求,确定本轮交互所属的目标交互意图,并基于该目标交互意图向管理平台12请求本轮交互对应的交互引导信息,进而根据管理平台12发送的本轮交互对应的交互引导信息可获取本轮交互的交互结果。相对于现有技术采用人工进行引导的方式,本实施例中,人机交互设备11向管理平台12请求到的交互引导信息可实现自动地对人机交互过程进行引导,无需通过人工引导交互过程,节约了人力成本,提高了人机交互效率。
在本申请实施例中,以人机交互设备11执行本轮交互为例,对人机交互设备11在管理平台12的配合下自动完成人机交互的过程进行说明。根据本轮交互所属交互轮次的 不同,人机交互设备11在管理平台12的配合下自动完成人机交互的详细实施过程会有所不同,下面将针对不同轮次展开说明。
在情况A下,本轮交互为目标交互意图下的首轮交互,该人机交互需求是用户通过人机交互设备11发起的。人机交互设备11可响应于用户发起的人机交互需求,对该人机交互需求进行意图识别,并根据识别结果确定首轮交互所属的目标交互意图。
根据人机交互设备11面向用户提供的人机交互接口的不同,用户发起人机交互需求的方式也会有所不同。例如,在一种可选实施方式中,人机交互设备11可展示交互界面,以供用户输入交互需求;用户可以在该交互界面上输入交互需求。用户可以采用手写方式在交互界面上输入交互需求,也可以通过鼠标、键盘、输入笔、录音笔、麦克风等外设在交互界面上输入交互需求。在另一种可选实施方式中,人机交互设备11还可接收用户提供的语音信息,该语音信息携带用户的交互需求,并针对该语音信息进行意图识别,以确定本轮交互所属的目标交互意图。其中,用户提供语音信息的方式可以是:由用户直接发出语音,或者由用户通过语音播放设备播放语音信息,例如,用户可通过手机、电脑、录音笔等设备播放已生成或者已收录的语音信息,本实施例不做限制。
在一些示例性实施例中,对语音信息进行意图识别的一种方式,包括:将接收到的语音信息与参考语音库中的参考语音信息进行比对,并根据比对结果从该参考语音库中确定与该语音信息对应的目标参考语音信息。其中,参考语音库中的参考语音信息与交互意图存在对应关系,例如,参考语音信息1对应的交互意图1,参考语音信息2对应交互意图2。基于此,在确定目标参考语音信息之后,可将该目标参考语音信息对应的交互意图作为本轮交互所属的交互意图。
在另一些示例性实施例中,对语音信息进行意图识别的一种方式,包括:对接收到的语音信息进行语音识别,以将语音信息转换为文本信息;接着,对转换得到的文本信息进行语义识别,根据语义识别的结果确定该语音信息对应的目标交互意图,此处不赘述。
根据人机交互场景的不同,用户输入的交互需求的内容也会有所不同。例如,在维权场景中,用户可以在交互界面上输入“订单维权”等字样。人机交互设备11可获取用户在交互界面上输入的交互需求,之后可对用户输入的交互需求进行意图识别,以确定首轮交互所属的目标交互意图。
可选的,若用户在交互界面输入的交互需求为文本字符,则人机交互设备11可根据该文本字符进行语义识别,以确定首轮交互所属的目标交互意图。可选的,若用户在交 互界面输入的交互需求为语音信号,则可先对该语音信号进行语音识别,基于语音识别结果进行语义识别以确定首轮交互所属的目标交互意图。
在确定首轮交互所属的目标交互意图之后,人机交互设备11可向管理平台12发送目标交互意图的描述信息。相应地,管理平台12维护至少一个交互意图下的交互引导流程的方式可以是:维护各交互意图的描述信息和各交互意图下的交互引导流程的映射关系。例如,描述信息为M的交互意图下的交互引导流程包括PM;描述信息为N的交互意图下的交互引导流程包括PN。管理平台12接收到携带目标交互意图的描述信息的请求时,可选择描述信息与人机交互设备11发送的请求中携带的描述信息相匹配的交互意图下的交互引导流程,作为与目标交互意图适配的目标交互引导流程。
可选的,在本轮交互为目标交互意图下的首轮交互的实施方式中,管理平台12在根据所述目标交互引导流程当前所处的引导节点,确定下一引导节点对应的交互引导信息之前,还可向人机交互设备11发送对象获取请求,以请求目标交互意图下待处理对象的标识。
其中,待处理对象,是指在目标交互意图下,人机交互所针对的对象。例如,目标交互意图为商品咨询意图时,待处理对象可以是商品;目标交互意图为维权意图时,待处理对象可以是订单;目标交互意图为游戏参数配置时,目标交互对象可以是游戏角色或者游戏道具。
人机交互设备11接收到管理平台12发送的对象获取请求之后,可获取目标交互意图下待处理对象的标识,并将待处理对象的标识发送至管理平台12。可选地,人机交互设备11可以向用户展示交互界面,并提示用户在交互界面上输入待处理对象的标识,进而可获取用户在交互界面上输入的待处理对象的标识,例如订单号、商品ID或游戏道具的名称等。用户在交互界面上输入待处理对象的标识的方式可以是手写输入、语音输入、键盘输入、利用输入笔输入等。管理平台12接收人机交互设备11发送的待处理对象的标识,并可建立待处理对象的标识与目标交互引导流程的对应关系。该对应关系可用于在下一次接收到与目标引导流程关联的交互需求时,根据该待处理对象的标识与目标交互引导流程的对应关系,确定下一次交互需求对应的待处理对象。
可选的,在上述实施方式中,管理平台12除了维护各交互意图与交互引导流程之间的映射关系之外,还会维护交互引导流程的流程标识,每个流程标识具有唯一性,可唯一标识一个交互引导流程,也可以唯一标识该交互引导流程所属的交互意图。基于此,管理平台12在确定目标交互引导流程之后,可将目标交互引导流程的流程标识发送至人 机交互设备11。人机交互设备11接收到该流程标识时,可保存在本地。
在一些可选实施例中,若本轮交互之后还存在下一轮交互,则人机交互设备11可基于该流程标识确定下一轮交互所属的交互意图,并可直接根据该流程标识向管理平台12请求下一轮交互对应的交互引导信息,此部分内容的详细实施过程将在后续针对本轮交互不属于首轮交互的实施例中描述,此处不赘述。
可选的,管理平台12可在向人机交互设备11发送本轮交互对应的交互引导信息时,一并将目标交互引导流程的流程标识发送至人机交互设备11,或者可在向人机交互设备11发送本轮交互对应的交互引导信息之后,将目标交互引导流程的流程标识发送至人机交互设备11;也可以在向人机交互设备11发送对象获取请求之前,将目标交互引导流程的流程标识发送至人机交互设备11,本实施例不做限制。
在情况B下,本轮交互为目标交互意图下的非首轮交互,例如可以是第二轮交互、第三轮交互等,相应地,人机交互需求可由上一轮交互结果引发。在该情况下,上一轮交互结果可指示目标交互意图下的交互过程未结束,可继续执行下一轮交互,则人机交互设备11可将上一轮交互结果视为人机交互需求的来源,确定本轮交互所属的目标交互意图。当然,通过上一轮交互结果指示目标交互意图下的交互过程未结束的方式可以有多种,在不同方式下,人机交互设备11确定本轮交互所属的目标交互意图的方式也会有所不同。
例如,上一轮交互结果可以与其所属交互意图对应的流程标识关联,基于此,人机交互设备11可确定与该流程标识对应的交互意图作为本轮交互所属的目标交互意图。在一可选实施方式中,在上一轮交互结果中可以携带对应的流程标识,基于此,人机交互设备11可以从上一轮交互结果中获取流程标识,根据该流程标识确定本轮交互所属的目标交互意图。在另一可选实施方式中,在获得上一轮交互结果时可以一并获得与该结果关联的流程标识,基于此,人机交互设备11可以根据所获得的流程标识确定本轮交互所属的目标交互意图。
接着,人机交互设备11可将该流程标识和目标交互意图下待处理对象的标识发送给管理平台12,以向管理平台12请求本轮交互对应的交互引导信息。其中,待处理对象的标识可在首轮交互过程中获取,具体可参考上述实施例;首轮交互过程中获取到待处理对象的标识后,可保存在本地,以供后续交互过程使用。管理平台12接收到人机交互设备11发送的流程标识后,可从所维护的至少一个交互意图下的交互引导流程中,选择该流程标识所标识的交互引导流程作为目标交互引导流程。
在上述情况A和B下,管理平台13采用不同的方式获取本轮交互对应的目标交互引导流程,之后,可采用相同的方式从目标交互引导流程中获取本轮交互对应的交互引导信息。其中,目标交互引导流程可包含至少一个引导节点,每个引导节点代表该交互引导流程中的一个引导环节,不同的引导环节中,待处理对象的处理状态不同。在一些实施例中,待处理对象的处理状态可用于表征针对待处理对象的人机交互的进度。例如,在一目标交互意图下,待处理对象在人机交互过程中,在三个不同的引导环节中需经历三种处理状态,例如处理状态1、2、3;这三种处理状态可分别对应目标交互引导流程中的引导节点1、2、3。基于此,管理平台12在确定目标交互引导流程之后,可根据待处理对象当前的处理状态,确定目标交互引导流程当前所处的引导节点以及下一引导节点。
可选的,目标交互引导流程中的每个引导节点可对应一个交互引导信息。交互引导信息可以多种形式进行展示,例如,交互引导信息可以卡片形式展示,或者以话术形式展示,或者以网页链接的形式展示,本实施例不做限制。
基于此,管理平台12在确定下一引导节点之后,可确定该下一引导节点对应的交互引导信息。例如,假设目标交互引导流程当前所处的引导节点为空节点,则下一引导节点为节点1,此时,可将引导节点1对应的交互引导信息作为本轮交互对应的引导信息。再例如,假设目标交互引导流程当前所处的引导节点为引导节点2,则下一引导节点为节点3,此时,可将引导节点3对应的交互引导信息作为本轮交互对应的引导信息。
接着,管理平台12可将本轮交互对应的交互引导信息返回给人机交互设备11。人机交互设备11基于获取到的交互引导信息,可获取本轮交互的交互结果。根据人机交互场景的不同,人机交互设备11基于本轮交互对应的交互引导信息获得本轮交互的交互结果的方式也会有所不同。在一些人机交互场景中,人机交互过程可能与相应场景中的业务服务器相关,本申请以下部分将结合图2a所示人机交互系统进行详细说明。
图2a为本申请另一示例性实施例提供的一种人机交互系统的结构示意图。如图2a所示,除上述实施例记载的人机交互设备11以及管理平台12之外,该人机交互系统10还包括:业务服务器13。
关于人机交互设备11和管理平台12的描述可参见上述实施例,在此不再赘述。在本实施例中,重点描述人机交互设备11获得本轮交互对应的交互引导信息之后,如何根据该交互引导信息从业务服务器13获取本轮交互的交互结果的过程。
在一种可选实施方式中:人机交互设备11可根据交互引导信息,展示待填写信息页, 以供用户输入待填写信息;其中,展示待填写信息页所需的数据可向业务服务器13或者其他可选的平台获取。人机交互设备11响应用户在待填写信息页上的输入操作,获取填写后的信息页并发送给业务服务器13。业务服务器13接收人机交互设备11发送的填写后的信息页之后,可根据填写后的信息页生成本轮交互的交互结果返回给人机交互设备11。
以订单维权为例,待填写信息页可能是确认用户诉求的一个信息页,该信息页可能包括退款原因、是否本人签收、联系方式等需要用户填写的信息。人机交互设备11将确认用户诉求的信息页展示给用户,用户填写有关信息之后提交;人机交互设备11将填写后的信息页发送给业务服务器13;业务服务器13根据人机交互设备11发送的填写后的信息页,确定是否同意用户维权,将是否同意用户维权的结果作为本轮交互的交互结果返回给人机交互设备11。
可选的,人机交互设备11在将填写后的信息页发送给业务服务器13时,可将填写后的信息页与流程标识关联后发送给业务服务器13,业务服务器13可接收并保存该流程标识。在一些实施例中,本轮交互结束后,仍旧存在下一轮交互方能实现目标交互意图。在这种情况下,可设置该流程标识作为交互过程未结束的指示信息,业务服务器13可将该流程标识和交互结果一并发送至人机交互设备11,以告知人机交互设备11启动下一轮交互。
在一可选的实施方式中,如图2b所示,除上述实施例记载的人机交互设备11、管理平台12以及业务服务器13之外,该人机交互系统10还包括:渲染平台14。
人机交互设备11获得本轮交互对应的交互引导信息之后,在根据该交互引导信息从业务服务器13获取本轮交互的交互结果的过程中,可能涉及到页面的渲染操作。例如,人机交互设备11从管理平台12请求到本轮的交互引导信息后,可以请求渲染平台14渲染该交互引导信息得到一个引导信息页,以向用户展示经渲染得到的引导信息页,该引导信息页包含交互引导信息。再例如,人机交互设备11根据交互引导信息,展示待填写信息页时,可以请求渲染平台14渲染出待填写信息页,然后向用户展示经渲染得到的待填写信息页。当然,实际交互过程包含的与页面渲染相关的其他操作也可以通过渲染平台14实现,此处不赘述。
在上述实施例中,基于管理平台12的交互引导作用,用户可与人机交互设备11进行单伦次或者多轮次交互,并基于单伦次或多轮次交互过程实现最终的交互意图。在这个过程中,不需要额外的人力参与即可实现简单或是复杂的交互意图,节约了人力成本, 提高了人机交互的效率。
可选的,在一些实施例中,人机交互设备11可包括交互前端设备和交互后端设备。其中,交互前端设备可以是智能手机、智能音箱、个人电脑、穿戴设备、平板电脑等直接与用户接触的设备,用户可通过交互前端设备发起交互请求、执行交互过程以及接收交互结果。交互后端设备可以是交互服务器或者安装在交互服务器上的应用程序,可处理交互前端接收的交互请求,与管理平台12进行通信,以基于管理平台12辅助交互前端执行交互过程以及接收交互结果。
可选的,在一些实施例中,管理平台12可以由SOP(Standard Operating Procedure,标准操作流程)平台实现。根据不同交互引导意图对应的交互操作内容,可在SOP平台上创建不同的交互引导流程,并生成不同交互引导意图与交互引导流程的对应关系;SOP平台可对已创建的交互引导流程及对应关系进行统一维护,便于后续调用。除此之外,SOP平台作为统一的流程管理平台便于对不同的交互引导流程进行管理,有利于交互引导流程与其他设备或服务器之间进行通信或者整合,提升了交互效率。
根据人机交互场景的不同,人机交互过程以及交互意图也会有所不同。在本申请下述实施例中,将以电子商务领域中用户针对订单发起维权的人机交互过程(交互意图为订单的维权意图)为例,对本申请实施例提供的人机交互方案进行详细说明。
在电子商务领域中,用户通过网络购物平台消费。当用户的消费权益受损时,可通过网络购物平台针对订单发起维权处理。现有技术中,用户可通过购物平台提供的维权工具在线发起维权请求,在线客服转人工客服,由人工客服跟进用户的维权请求,引导用户完成维权操作。但是,这种基于现有技术的维权方案需要大量的人工参与,人力成本较高。若基于本申请实施例提供的人机交互方法进行维权处理,则通过人机交互设备11与管理平台12之间的相互配合即可自动引导用户完成维权操作,无需人工客户的参与。以下部分将结合图3,以具体的维权场景为例对本发明实施提供的人机交互方案进行进一步说明。
当需要维权时,用户可通过人机交互设备11发起人机交互需求,该需求可具体表现为维权请求。例如,用户可在人机交互设备11提供的交互界面中输入“我要投诉卖家”、“我要申请售后”等文字或者语音内容,以请求进入首轮人机交互。人机交互设备11获取到用户发起的人机交互需求后,可对该人机交互需求进行语音识别和/或语义识别。基于识别的结果,确定首轮人机交互所属的目标交互意图。例如,可在对“我要申请售后”的人机交互需求进行语义识别,确定首轮人机交互所属的目标交互意图为“订单的维权 意图”。
接着,在确定首轮交互所属的目标交互意图为订单的维权意图与之后,人机交互设备11可向管理平台12发送与订单的维权意图相关的描述信息。管理平台12接收到携带与订单的维权意图相关的描述信息的请求时,可确定与订单的维权意图适配的目标交互引导流程。在这个场景下,该目标引导流程可以是:订单维权引导流程;假设该订单维权引导流程包含“确认买家诉求”、“引导买家上传举证”、“审核举证图片”以及“判断”四个维权引导节点,每个维权引导节点对应一个与订单维权操作相关的维权引导信息。以“确认买家诉求”对应的维权引导信息为例,可以包含“退款原因”、“是否本人签收”、“联系方式”等引导信息。以“引导买家上传举证”对应的维权引导信息为例,可以包含“上传商品图片”、“上传订单状态”等引导信息。
管理平台12在确定与订单的维权意图适配的订单维权引导流程之后,可向人机交互设备11发送对象获取请求,以请求订单的维权意图下待处理订单的标识。在订单维权的场景下,维权是针对订单发起的,待处理订单即为上述待处理对象。待处理订单的标识可以包括但不限于:订单编号、支付序列号、买家信息或者收货地址等。
人机交互设备11可在接收到管理平台12发送的订单获取请求时,获取订单维权意图下待处理订单的标识。可选的,人机交互设备11可以从本地预存的订单信息中获取待处理订单的标识。或者,人机交互设备11可以从保存有订单信息的数据库中获取待处理订单的标识。或者,人机交互设备11可以请求用户手动提供待处理订单的标识;在这种实施方式中,管理平台12可以向人机交互设备11发送订单选择器;人机交互设备11接收到订单选择器之后,可向用户展示该订单选择器,以供用户基于该订单选择器选择待处理订单,这样人机交互设备11就可以得到待处理订单的标识等信息。
在获得待处理订单的标识后,人机交互设备11可将用户选择的待处理订单的标识发送至管理平台12。管理平台12维护的订单维权引导流程包含四个维权引导节点,即四个维权环节,需要按照这四个维权环节的顺序引导用户与人机交互设备11完成维权。管理平台12接收到待处理订单的标识后,确定需要针对待处理订单启动订单维权引导流程,建立待处理订单的标识与订单维权引导流程的对应关系。
接着,管理平台12确定订单维权引导流程中首个维权引导节点,例如“确认买家诉求”对应的维权引导信息作为本轮交互对应的维权引导信息,并将该维权引导信息发送至人机交互设备11。其中,维权引导信息可以是用于引导维权流程的“信息卡片”的ID(identification,身份)信息,也可以是用于引导维权流程的话术信息,本实施例不做限 制。以“确认买家诉求”节点对应的维权引导信息为“信息卡片”的ID为例,人机交互设备11接收管理平台12发送的该“信息卡片”的ID后,可将该“信息卡片”的ID发送至渲染平台,以请求渲染平台渲染该“信息卡片”ID标识的信息卡片,并在获得渲染平台返回的信息卡片的渲染数据之后,展示该信息卡片。
可选的,信息卡片上可展示有可操作的按钮以及触发该按钮以执行何种操作的提示信息。用户可在阅读提示信息后,触发信息卡片上的按钮以执行信息卡片提示的操作。可选的,响应于用户的触发操作,人机交互设备11可向业务服务器13请求加载待填写信息页,并在业务服务器13返回待填写信息页的数据时展示待填写信息页。例如,在订单维权的场景下,该待填写信息页可以是待填写表单,该待填写表单包含若干待输入的内容,例如,订单中的商品存在质量问题的凭证上传区域、与卖家沟通的对话截图上传区域、退款金额输入区等,以供用户进行输入。
响应于用户在待填写信息页的输入操作,人机交互设备11可获取填写后的信息页并发送给业务服务器13。可选的,人机交互设备11在将填写后的信息页发送给业务服务器13时,可将填写后的信息页与流程标识关联后发送给业务服务器13。业务服务器13接收人机交互设备11发送的填写后的信息页之后,根据填写后的信息页生成本轮交互的交互结果,并将交互结果返回给人机交互设备11。
其中,业务服务器13可将上一轮交互对应的订单维权引导流程的流程标识一并返回给人机交互设备11,以指示人机交互设备11继续执行下一轮人机交互过程。当然,业务服务13也可以将交互结束标识一并返回给人机交互设备11,以指示人机交互设备11结束人机交互过程。
若人机交互设备11接收到的上一轮交互的交互结果包含流程标识,则人机交互设备11可判定目标交互意图下的交互过程未结束,可继续执行下一轮交互。人机交互设备11可基于该流程标识,确定本轮交互所属的目标交互意图为订单的维权意图。
接着,人机交互设备11可将该流程标识和订单的维权意图下待处理订单的标识发送给管理平台12。管理平台12接收到人机交互设备11发送的流程标识后,可从所维护的至少一个交互意图下的交互引导流程中,确定本轮交互所采用的流程为订单维权引导流程,并基于当前处理到的“确认买家诉求”节点,确定需要进入一下维权引导节点,例如“引导买家上传举证”,将“引导买家上传举证”节点对应的维权引导信息返回给人机交互设备11。若“引导买家上传举证”节点对应的维权引导信息为“信息卡片”ID,则人机交互设备11可采用前面描述的方式获取“引导买家上传举证”节点对应的交互结 果,并继续向管理平台12请求下一维权引导节点对应的维权引导信息,直至维权流程结束为止。
上述各实施例描述了本申请提供的人机交互系统的系统架构以及系统功能,以下部分将结合附图对本申请实施例提供的人机交互方法进行具体说明。
图4是本申请一示例性实施例提供的一种人机交互方法的方法流程图,该实施例可基于图1至图3所示的人机交互系统实现,主要是从人机交互设备的角度进行的描述。如图4所示,该方法包括:
步骤401、根据人机交互需求,确定本轮交互所属的目标交互意图。
步骤402、根据所述目标交互意图向管理平台请求本轮交互对应的交互引导信息,所述管理平台维护各交互意图下的交互引导流程。
步骤403、接收所述管理平台发送的本轮交互对应的交互引导信息。
步骤404、根据所述本轮交互对应的交互引导信息,获取本轮交互的交互结果。
在步骤401中,人机交互设备可与用户进行人机交互,根据人机交互需求,确定本轮交互所属的交互意图,为便于描述和分区,将本轮交互所属的交互意图记为目标交互意图。根据本轮交互的轮次,人机交互需求的情况会有所不同。若本轮交互是目标交互意图下的第一轮交互,则人机交互需求可以是用户通过人机交互设备发起的;若本轮交互是目标交互意图下的非第一轮交互,例如可能是第二轮交互、第三轮交互等,则人机交互需求可以是上一轮交互的交互结果引发的,本实施例不做限制。
在步骤402中,人机交互设备可与管理平台之间进行通信,在确定本轮交互所属的目标交互意图后,人机交互设备可根据该目标交互意图向管理平台请求本轮交互对应的交互引导信息,以使管理平台从所维护的至少一个交互意图下的交互引导流程中,获取与目标交互意图适配的目标交互引导流程。其中,交互引导信息,指的是用于对人机交互过程进行引导的信息,该信息可指示人机交互过程当前或者下一时刻应该执行的操作.
在步骤403中,人机交互设备获取到交互引导信息之后,可基于该交互引导信息,引导用户完成整个交互过程。
在本实施例中,在人机交互的场景中,人机交互设备根据人机交互需求,确定本轮交互所属的目标交互意图,基于该目标交互意图向管理平台请求本轮交互对应的交互引导信息,进而根据管理平台发送的本轮交互对应的交互引导信息可获取本轮交互的交互结果,无需通过人工引导交互过程,实现了人机自动交互,节约了人力成本,提高了人机交互效率。
在一些示例性实施例中,若本轮交互是目标交互意图下的第一轮交互,则人机交互需求可以是用户通过人机交互设备发起的;在另一些示例性实施例中,本轮交互是目标交互意图下的非第一轮交互,例如可能是第二轮交互、第三轮交互等,则人机交互需求可以是上一轮交互的交互结果引发的。以下部分将分别结合图5以及图6对上述两种情况下的人机交互方法进行详细说明。
图5是本申请另一示例性实施例提供的一种人机交互方法的首轮交互方法流程图,该实施例可基于图1至图3所示的人机交互系统实现,主要是从人机交互设备的角度进行的描述。如图5所示,该方法包括:
步骤501、展示交互界面,以供用户输入交互需求。
步骤502、对交互需求进行意图识别,以确定首轮交互所属的目标交互意图。
步骤503、向管理平台发送目标交互意图的描述信息,以供管理平台确定与目标交互意图适配的目标交互引导流程;管理平台维护各交互意图下的交互引导流程。
步骤504、接收管理平台发送的对象获取请求,获取目标交互意图下待处理对象的标识。
步骤505、将目标交互意图下待处理对象的标识发送至管理平台,以使管理平台建立待处理对象的标识与目标交互引导流程的对应关系。
步骤506、接收管理平台发送的首轮交互对应的交互引导信息。
步骤507、根据首轮交互对应的交互引导信息,获取首轮交互的交互结果。
在步骤501中,可选的,人机交互设备可展示交互界面,以供用户输入交互需求;用户可以在该交互界面上输入交互需求。
根据人机交互场景的不同,用户输入的交互需求的内容也会有所不同。例如,在维权场景中,用户可以输入“订单维权”等字样。
在步骤502中,人机交互设备获取用户在交互界面上输入的交互需求之后,可对用户输入的交互需求进行意图识别,以确定首轮交互所属的目标交互意图。可选的,若用户在交互界面输入的交互需求为文本字符,则人机交互设备可根据该文本字符进行语义识别,以确定首轮交互所属的目标交互意图。可选的,若用户在交互界面输入的交互需求为语音信号,则可先对该语音信号进行语音识别,基于语音识别结果进行语义识别以确定首轮交互所属的目标交互意图。
在步骤503中,人机交互设备确定首轮交互所属的目标交互意图后,向管理平台发送所述目标交互意图的描述信息,以供所述管理平台确定与所述目标交互意图适配的目 标交互引导流程。
在步骤504中,可选的,待处理对象,是指在目标交互意图下,人机交互所针对的对象。可选地,人机交互设备可以向用户展示交互界面,并提示用户在交互界面上输入待处理对象的标识,进而可获取用户在交互界面上输入的待处理对象的标识,例如订单号、商品ID或游戏道具的名称等。用户在交互界面上输入待处理对象的标识的方式可以是手写输入、语音输入、键盘输入、利用输入笔输入等。
在步骤505中,可选的,确定待处理对象的标识后,将该待处理对象的标识发送至管理平台。
在步骤506-步骤507中,接收到管理平台发送的首轮交互对应的交互引导信息之后,基于该交互引导信息,获取首轮交互的交互结果。
本实施例中,人机交互设备在管理平台的配合下,实现了人机自动交互,节约了人力成本,提高了人机交互效率。除此之外,人机交互设备展示交互界面获取用户输入的交互需求,并基于交互需求进行意图识别,进而能够准确地把握用户的交互意图,有利于准确地对用户的交互需求作出响应。
在一些示例性的实施例中,在步骤503之后,该方法还可包括如下的步骤:接收管理平台返回的所述目标交互引导流程的流程标识。
在一些可选实施例中,首轮交互之后还存在下一轮交互,例如第二轮交互、第三轮交互或者更多轮次的交互。在这种情况下,人机交互设备可基于上一轮交互中接收到的流程标识确定下一轮交互所属的交互意图,并可直接根据该流程标识向管理平台请求下一轮交互对应的交互引导信息。以下实施例将结合图6,对如何根据上一轮交互中接收到的流程标识向管理平台请求下一轮交互对应的交互引导信息的实施方式进行详细描述。
图6是本申请又一示例性实施例提供的一种人机交互方法的非首轮交互方法流程图,该实施例可基于图1至图3所示的人机交互系统实现,主要是从人机交互设备的角度进行的描述。如图6所示,该方法包括:
步骤601、根据上一轮交互结果关联的流程标识,确定流程标识对应的交互意图作为本轮交互所属的目标交互意图。
步骤602、将流程标识和目标交互意图下待处理对象的标识发送给管理平台,以供管理平台从流程标识所标识的目标交互引导流程中获取本轮交互对应的交互引导信息。
步骤603、接收管理平台发送的本轮交互对应的交互引导信息。
步骤604、根据本轮交互对应的交互引导信息,获取本轮交互的交互结果。
不同于上述实施例,本实施例中,人机交互设备可以根据上一轮交互所获得的流程标识确定本轮交互所属的目标交互意图,并将该流程标识和目标交互意图下待处理对象的标识发送给管理平台,以向管理平台请求本轮交互对应的交互引导信息。
其中,待处理对象的标识可在首轮交互过程中获取,具体可参考上述实施例;首轮交互过程中获取到待处理对象的标识后,可保存在本地,以供后续交互过程使用。
在本实施例中,针对人机交互场景下的非首轮交互,基于上一轮交互结果关联的流程标识请求本轮交互对应的交互引导信息,可以实现多轮次交互,并且多轮次交互同属于与同一目标交互意图适配的目标交互引导流程,进而,在无人工引导的情况下,可完成目标交互意图下包含多轮次交互的复杂交互过程,使得人机交互更加智能化。
在一些示例性实施例中,上述实施例中的步骤404、步骤507以及步骤604所记载的根据本轮交互对应的交互引导信息,获取本轮交互的交互结果的一种方式,可包括:
根据本轮交互对应的交互引导信息,展示待填写信息页;接着,响应用户在所述待填写信息页上的输入操作,获取填写后的信息页并发送给业务服务器;以及接收业务服务器根据所述填写后的信息页返回的本轮交互的交互结果。
在一些示例性实施例中,响应用户在待填写信息页上的输入操作,获取填写后的信息页并发送给业务服务器的一种方式,包括:将填写后的信息页与流程标识关联后发送给所述业务服务器。
在一些示例性实施例中,接收业务服务器根据填写后的信息页返回的本轮交互的交互结果的一种方式,包括:接收业务服务器返回的本轮交互的交互结果以及流程标识,本轮交互的交互结果和流程标识相关联。
图7是本申请又一示例性实施例提供的一种人机交互方法的方法流程图,该实施例可基于图1至图3所示的人机交互系统实现,主要是从管理平台的角度进行的描述。如图7所示,该方法包括:
步骤701、接收人机交互设备发送的请求,所述请求携带所述人机交互设备本轮交互所属的目标交互意图。
步骤702、从至少一个交互意图下的交互引导流程中,获取与所述目标交互意图适配的目标交互引导流程,每个交互引导流程包括至少一个引导节点,每个引导节点对应一个交互引导信息。
步骤703、根据所述目标交互引导流程当前所处的引导节点,确定下一引导节点对应的交互引导信息。
步骤704、将所述下一引导节点对应的交互引导信息作为所述本轮交互对应的交互引导信息返回给所述人机交互设备。
在本实施例中,管理平台维护至少一个交互意图下的交互引导流程,不同交互意图可以对应不同的人机交互场景。管理平台可接收人机交互设备发送的请求,并根据接收到的请求,从所维护的至少一个交互意图下的交互引导流程中,获取与目标交互意图适配的目标交互引导流程。
其中,每个交互引导流程包括至少一个引导节点,每个引导节点代表该交互引导流程中的一个引导环节,引导节点的数量代表了该交互引导流程中引导环节的数量,引导节点之间的上下游关系可体现交互引导流程中引导环节之间的先后顺序,每个引导节点对应一个交互引导信息,用于指示如何在该引导节点所代表的引导环节对人机交互过程进行引导。
在确定目标交互引导流程之后,管理平台可根据目标交互引导流程当前所处的引导节点,确定下一引导节点对应的交互引导信息,并将该下一引导节点对应的交互引导信息作为本轮交互对应的交互引导信息返回给人机交互设备。
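下面给出一段仅作示意的代码草图(GuidanceFlow、GuidanceNode 等类名与字段名均为假设),示意管理平台所维护的交互引导流程的一种可能数据结构:引导节点按先后顺序排列以体现上下游关系,每个引导节点对应一个交互引导信息,并可根据当前所处的引导节点确定下一引导节点对应的交互引导信息:

    # 示意性草图:交互引导流程的一种可能数据结构;类名与字段名均为假设

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class GuidanceNode:
        name: str               # 引导节点名称,例如“引导买家上传举证”
        guidance_info: str      # 该节点对应的交互引导信息,例如“信息卡片”ID

    @dataclass
    class GuidanceFlow:
        flow_id: str
        intent_description: str
        nodes: List[GuidanceNode]   # 按先后顺序排列的引导节点,体现上下游关系
        current_index: int = 0      # 当前所处的引导节点

        def next_guidance(self) -> Optional[str]:
            """根据当前所处的引导节点,确定并返回下一引导节点对应的交互引导信息。"""
            if self.current_index + 1 >= len(self.nodes):
                return None                         # 已到最后一个节点,流程结束
            self.current_index += 1
            return self.nodes[self.current_index].guidance_info

    if __name__ == "__main__":
        flow = GuidanceFlow(
            flow_id="flow-001",
            intent_description="订单维权",
            nodes=[GuidanceNode("确认买家诉求", "card-0"),
                   GuidanceNode("引导买家上传举证", "card-1"),
                   GuidanceNode("等待卖家响应", "card-2")],
        )
        print(flow.next_guidance())    # card-1
        print(flow.next_guidance())    # card-2
        print(flow.next_guidance())    # None,表示流程结束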
在本实施例中,在人机交互的场景中,管理平台接收到目标交互意图时,从所维护的至少一个交互意图下的交互引导流程中确定与目标交互意图适配的目标交互引导流程,并基于目标交互引导流程当前所处的引导节点,确定下一引导节点对应的交互引导信息,以使人机交互设备基于本轮交互对应的交互引导信息可获取本轮交互的交互结果,进而,使得人机交互过程无需通过人工进行引导,实现了人机自动交互,节约了人力成本,提高了人机交互效率。
在一些示例性实施例中,若本轮交互是目标交互意图下的第一轮交互,则管理平台接收到的人机交互设备发送的请求携带目标交互意图的描述信息;在另一些示例性实施例中,若本轮交互是目标交互意图下的非第一轮交互,例如可能是第二轮交互、第三轮交互等,则管理平台接收到的人机交互设备发送的请求携带流程标识。以下部分将分别结合图8以及图9对上述两种情况下的人机交互方法进行详细说明。
图8是本申请又一示例性实施例提供的一种人机交互方法的首轮交互方法流程图,该实施例可基于图1至图3所示的人机交互系统实现,主要是从管理平台的角度进行的描述。如图8所示,该方法包括:
步骤801、接收人机交互设备发送的请求,所述请求携带所述目标交互意图的描述信息。
步骤802、选择描述信息与所述请求中携带的描述信息相匹配的交互意图下的交互引导流程,作为所述目标交互引导流程,每个交互引导流程包括至少一个引导节点,每个引导节点对应一个交互引导信息。
步骤803、向所述人机交互设备发送对象获取请求,以请求所述目标交互意图下待处理对象的标识。
步骤804、接收所述人机交互设备发送的所述待处理对象的标识,建立所述待处理对象的标识与所述目标交互引导流程的对应关系。
步骤805、根据所述目标交互引导流程当前所处的引导节点,确定下一引导节点对应的交互引导信息。
步骤806、将所述下一引导节点对应的交互引导信息作为所述本轮交互对应的交互引导信息返回给所述人机交互设备。
在本实施例中,可选的,管理平台维护至少一个交互意图下的交互引导流程的方式可以是:维护各交互意图的描述信息和各交互意图下的交互引导流程的映射关系。例如,描述信息为M的交互意图下的交互引导流程包括PM;描述信息为N的交互意图下的交互引导流程包括PN。
管理平台接收到携带目标交互意图的描述信息的请求时,可选择描述信息与人机交互设备发送的请求中携带的描述信息相匹配的交互意图下的交互引导流程,作为与目标交互意图适配的目标交互引导流程。
可选的,本实施例中管理平台建立的待处理对象的标识与目标交互引导流程的对应关系,可用于在下一次接收到与目标引导流程关联的交互需求时,根据该待处理对象的标识与目标交互引导流程的对应关系,确定下一次交互需求对应的待处理对象。
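下面给出一段仅作示意的代码草图(INTENT_TO_FLOW、OBJECT_TO_FLOW 等名称均为假设),示意管理平台维护交互意图的描述信息与交互引导流程之间的映射关系,以及建立并使用待处理对象的标识与目标交互引导流程的对应关系的过程:

    # 示意性草图:描述信息与交互引导流程的映射,以及待处理对象与目标流程的对应关系

    INTENT_TO_FLOW = {
        "订单维权": "PM",      # 描述信息为M的交互意图下的交互引导流程PM
        "订单查询": "PN",      # 描述信息为N的交互意图下的交互引导流程PN
    }

    OBJECT_TO_FLOW = {}        # 待处理对象的标识 → 目标交互引导流程

    def select_target_flow(description: str) -> str:
        """选择描述信息与请求中携带的描述信息相匹配的交互引导流程作为目标交互引导流程。"""
        return INTENT_TO_FLOW[description]

    def bind_object(object_id: str, flow: str) -> None:
        """建立待处理对象的标识与目标交互引导流程的对应关系。"""
        OBJECT_TO_FLOW[object_id] = flow

    def flow_for_next_demand(object_id: str) -> str:
        """下一次接收到与目标引导流程关联的交互需求时,按对应关系确定所针对的流程。"""
        return OBJECT_TO_FLOW[object_id]

    if __name__ == "__main__":
        flow = select_target_flow("订单维权")
        bind_object("order-123", flow)
        print(flow_for_next_demand("order-123"))   # PM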
在一些示例性的实施例中,在步骤802之后,该方法还可包括如下的步骤:将所述目标交互引导流程的流程标识发送至所述人机交互设备,以供所述人机交互设备基于所述流程标识请求下一轮交互对应的交互引导信息。
在一些示例性的实施例中,首轮交互之后还存在下一轮交互,例如第二轮交互、第三轮交互或者更多轮次的交互。在这种情况下,人机交互设备可基于管理平台发送的流程标识确定下一轮交互所属的交互意图,并可直接根据该流程标识向管理平台请求下一轮交互对应的交互引导信息。以下实施例将结合图9,对管理平台如何根据接收到的流程标识确定本轮交互对应的交互引导信息的实施方式进行详细描述。
图9是本申请又一示例性实施例提供的一种人机交互方法的非首轮交互方法流程图,该实施例可基于图1至图3所示的人机交互系统实现,主要是从管理平台的角度进行的描述。如图9所示,该方法包括:
步骤901、接收人机交互设备发送的请求,请求携带流程标识。
步骤902、从至少一个交互意图下的交互引导流程中,获取流程标识所标识的交互引导流程作为目标交互引导流程,每个交互引导流程包括至少一个引导节点,每个引导节点对应一个交互引导信息。
步骤903、根据目标交互引导流程当前所处的引导节点,确定下一引导节点对应的交互引导信息。
步骤904、将下一引导节点对应的交互引导信息作为本轮交互对应的交互引导信息返回给人机交互设备。
不同于上述实施例,本实施例中,管理平台接收到的请求中携带流程标识,管理平台可从所维护的至少一个交互意图下的交互引导流程中,获取流程标识所标识的交互引导流程作为目标交互引导流程。进而,基于该流程标识,可以实现多轮次交互,并且多轮次交互同属于与同一目标交互意图适配的目标交互引导流程,在无人工引导的情况下,可完成目标交互意图下包含多轮次交互的复杂交互过程,使得人机交互更加智能化。
需要说明的是,在上述实施例及附图中描述的一些流程中,包含了按照特定顺序出现的多个操作,但是应该清楚了解,这些操作可以不按照其在本文中出现的顺序来执行或并行执行,操作的序号如901、902等,仅仅是用于区分开各个不同的操作,序号本身不代表任何的执行顺序。另外,这些流程可以包括更多或更少的操作,并且这些操作可以按顺序执行或并行执行。需要说明的是,本文中的“第一”、“第二”等描述,是用于区分不同的消息、设备、模块等,不代表先后顺序,也不限定“第一”和“第二”是不同的类型。
以上描述了人机交互方法适用于人机交互设备侧的可选实施例,如图10所示,实际中,人机交互设备11可包括:存储器110、处理器111以及通信组件112、电子显示屏113、音频组件114以及电源组件115。
存储器110可被配置为存储其它各种数据以支持在人机交互设备11上的操作。这些数据的示例包括用于在人机交互设备11上操作的任何应用程序或方法的指令,联系人数据,电话簿数据,消息,图片,视频等。存储器可以由任何类型的易失性或非易失性存储设备或者它们的组合实现,如静态随机存取存储器(SRAM),电可擦除可编程只读存储器(EEPROM),可擦除可编程只读存储器(EPROM),可编程只读存储器(PROM),只读存储器(ROM),磁存储器,快闪存储器,磁盘或光盘。
在本实施例中,存储器110用于存储一条或多条计算机指令。
处理器111,耦合至存储器110,用于执行存储器110中的一条或多条计算机指令,以用于:根据人机交互需求,确定本轮交互所属的目标交互意图;根据所述目标交互意图向管理平台请求本轮交互对应的交互引导信息,所述管理平台维护各交互意图下的交互引导流程;通过通信组件112接收所述管理平台发送的本轮交互对应的交互引导信息;根据所述本轮交互对应的交互引导信息,获取本轮交互的交互结果。
在一可选实施方式中,处理器111在根据人机交互需求,确定本轮交互所属的目标交互意图时,具体用于:展示交互界面,以供用户输入交互需求;对所述交互需求进行意图识别,以确定本轮交互所属的目标交互意图;或者接收用户提供的语音信息,所述语音信息携带所述用户的交互需求;针对所述语音信息进行意图识别,以确定本轮交互所属的目标交互意图。
在一可选实施方式中,处理器111在根据所述目标交互意图向管理平台请求本轮交互对应的交互引导信息时,具体用于:向所述管理平台发送所述目标交互意图的描述信息,以供所述管理平台确定与所述目标交互意图适配的目标交互引导流程。
在一可选实施方式中,处理器111在通过通信组件112接收所述管理平台发送的本轮交互对应的交互引导信息之前,还用于:通过通信组件112接收所述管理平台发送的对象获取请求,获取所述目标交互意图下待处理对象的标识;通过通信组件112将所述目标交互意图下待处理对象的标识发送至管理平台,以使所述管理平台建立所述待处理对象的标识与所述目标交互引导流程的对应关系。
在一可选实施方式中,处理器111还用于:通过通信组件112接收所述管理平台返回的所述目标交互引导流程的流程标识。
在一可选实施方式中,处理器111在根据人机交互需求,确定本轮交互所属的目标交互意图时,具体用于:根据上一轮交互结果关联的流程标识,确定所述流程标识对应的交互意图作为本轮交互所属的目标交互意图。
在一可选实施方式中,处理器111在根据所述目标交互意图向管理平台请求本轮交互对应的交互引导信息时,具体用于:通过通信组件112将所述流程标识和所述目标交互意图下待处理对象的标识发送给所述管理平台,以供所述管理平台从所述流程标识所标识的目标交互引导流程中获取本轮交互对应的交互引导信息。
在一可选实施方式中,处理器111在获得本轮交互的交互结果之后,还用于:当根据所述本轮交互的交互结果确定存在下一轮交互时,根据所述流程标识和所述目标交互意图下待处理对象的标识向所述管理平台请求下一轮交互对应的交互引导信息。
在一可选实施方式中,处理器111在根据所述本轮交互对应的交互引导信息,获取本轮交互的交互结果时,具体用于:根据所述本轮交互对应的交互引导信息,展示待填写信息页;响应所述用户在所述待填写信息页上的输入操作,获取填写后的信息页并发送给业务服务器;以及接收所述业务服务器根据所述填写后的信息页返回的所述本轮交互的交互结果。
在一可选实施方式中,处理器111在响应所述用户在所述待填写信息页上的输入操作,获取填写后的信息页并发送给业务服务器时,具体用于:通过通信组件112将所述填写后的信息页与所述流程标识关联后发送给所述业务服务器;在接收所述业务服务器根据所述填写后的信息页返回的所述本轮交互的交互结果时,具体用于:通过通信组件112接收所述业务服务器返回的所述本轮交互的交互结果以及所述流程标识,所述本轮交互的交互结果和所述流程标识相关联。
在一可选实施方式中,所述本轮交互所属的目标交互意图包括:针对订单的维权意图。
在一可选实施方式中,电子显示屏113,用于显示待填写信息页以及交互结果。其中,电子显示屏113包括液晶显示器(LCD)和触摸面板(TP)。如果电子显示屏113包括触摸面板,电子显示屏113可以被实现为触摸屏,以接收来自用户的输入信号。触摸面板包括一个或多个触摸传感器以感测触摸、滑动和触摸面板上的手势。触摸传感器可以不仅感测触摸或滑动动作的边界,而且还检测与触摸或滑动操作相关的持续时间和压力。
在一可选实施方式中,音频组件114被配置为输出和/或输入音频信号。例如,音频组件114包括一个麦克风(MIC),当音频组件114所在设备处于操作模式,如呼叫模式、记录模式和语音识别模式时,麦克风被配置为接收外部音频信号。所接收的音频信号可以被进一步存储在存储器110或经由通信组件112发送。例如,处理器111在根据人机交互需求,确定本轮交互所属的目标交互意图时,可通过音频组件114的麦克风接收用户输入的语音信号,对语音信号进行语音识别以及语义识别,以确定目标交互意图。在一些实施例中,音频组件114还包括一个扬声器,用于输出音频信号。
在一可选实施方式中,电源组件115用于为人机交互设备11的各种组件提供电力。电源组件可以包括电源管理系统,一个或多个电源,及其他与为人机交互设备11生成、管理和分配电力相关联的组件。
在本实施例中,在人机交互的场景中,人机交互设备11根据人机交互需求,确定本轮交互所属的目标交互意图,基于该目标交互意图向管理平台请求本轮交互对应的交互引导信息,进而根据管理平台发送的本轮交互对应的交互引导信息可获取本轮交互的交互结果,无需通过人工引导交互过程,实现了人机自动交互,节约了人力成本,提高了人机交互效率。
相应地,本申请实施例还提供一种存储有计算机程序的计算机可读存储介质,计算机程序被执行时能够实现上述方法实施例中可由人机交互设备执行的各步骤。
以上描述了人机交互方法适用于管理平台侧的可选实施例,如图11所示,实际中,管理平台12可包括:存储器120、处理器121、通信组件122以及电源组件123。
存储器120可被配置为存储其它各种数据以支持在管理平台12上的操作。这些数据的示例包括用于在管理平台12上操作的任何应用程序或方法的指令,联系人数据,电话簿数据,消息,图片,视频等。存储器可以由任何类型的易失性或非易失性存储设备或者它们的组合实现,如静态随机存取存储器(SRAM),电可擦除可编程只读存储器(EEPROM),可擦除可编程只读存储器(EPROM),可编程只读存储器(PROM),只读存储器(ROM),磁存储器,快闪存储器,磁盘或光盘。
在本实施例中,存储器120用于存储一条或多条计算机指令。
处理器121,耦合至存储器120,用于执行存储器120中的一条或多条计算机指令,以用于:通过通信组件122接收人机交互设备发送的请求,所述请求携带人机交互设备本轮交互所属的目标交互意图;从至少一个交互意图下的交互引导流程中,获取与所述目标交互意图适配的目标交互引导流程,每个交互引导流程包括至少一个引导节点,每个引导节点对应一个交互引导信息;根据所述目标交互引导流程当前所处的引导节点,确定下一引导节点对应的交互引导信息;通过通信组件122将所述下一引导节点对应的交互引导信息作为所述本轮交互对应的交互引导信息返回给人机交互设备。
在一可选实施方式中,所述请求携带流程标识;处理器121在从至少一个交互意图下的交互引导流程中,获取与所述目标交互意图适配的目标交互引导流程时,具体用于:从所述至少一个交互意图下的交互引导流程中,获取所述流程标识所标识的交互引导流程作为所述目标交互引导流程。
在一可选实施方式中,所述请求携带所述目标交互意图的描述信息;处理器121在从至少一个交互意图下的交互引导流程中,获取与所述目标交互意图适配的目标交互引导流程时,具体用于:选择描述信息与所述请求中携带的描述信息相匹配的交互意图下的交互引导流程,作为所述目标交互引导流程;在从至少一个交互意图下的交互引导流程中,获取与所述目标交互意图适配的目标交互引导流程之后,处理器121还用于:将所述目标交互引导流程的流程标识发送至所述人机交互设备,以供所述人机交互设备基于所述流程标识请求下一轮交互对应的交互引导信息。
在一可选实施方式中,处理器121在根据所述目标交互引导流程当前所处的引导节点,确定下一引导节点对应的交互引导信息之前,还用于:通过通信组件122向人机交互设备发送对象获取请求,以请求所述目标交互意图下待处理对象的标识;通过通信组件122接收人机交互设备发送的所述待处理对象的标识;建立所述待处理对象的标识与所述目标交互引导流程的对应关系。
在一可选实施方式中,电源组件123用于为管理平台12的各种组件提供电力。电源组件可以包括电源管理系统,一个或多个电源,及其他与为管理平台12生成、管理和分配电力相关联的组件。
在本实施例中,在人机交互的场景中,管理平台12可根据人机交互设备11的请求,确定本轮交互对应的交互引导信息,并将该交互引导信息发送至人机交互设备,以使人机交互设备根据本轮交互对应的交互引导信息获取本轮交互的交互结果,无需通过人工引导交互过程,实现了人机自动交互,节约了人力成本,提高了人机交互效率。
相应地,本申请实施例还提供一种存储有计算机程序的计算机可读存储介质,计算机程序被执行时能够实现上述方法实施例中可由管理平台12执行的各步骤。
本领域内的技术人员应明白,本发明的实施例可提供为方法、系统、或计算机程序产品。因此,本发明可采用完全硬件实施例、完全软件实施例、或结合软件和硬件方面的实施例的形式。而且,本发明可采用在一个或多个其中包含有计算机可用程序代码的计算机可用存储介质(包括但不限于磁盘存储器、CD-ROM、光学存储器等)上实施的计算机程序产品的形式。
本发明是参照根据本发明实施例的方法、设备(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计算机、嵌入式处理机或其他可编程数据处理设备的处理器以产生一个机器,使得通过计算机或其他可编程数据处理设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
这些计算机程序指令也可存储在能引导计算机或其他可编程数据处理设备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其他可编程数据处理设备上,使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
在一个典型的配置中,计算设备包括一个或多个处理器(CPU)、输入/输出接口、网络接口和内存。
内存可能包括计算机可读介质中的非永久性存储器,随机存取存储器(RAM)和/或非易失性内存等形式,如只读存储器(ROM)或闪存(flash RAM)。内存是计算机可读介质的示例。
计算机可读介质包括永久性和非永久性、可移动和非可移动媒体,可以由任何方法或技术来实现信息存储。信息可以是计算机可读指令、数据结构、程序的模块或其他数据。计算机的存储介质的例子包括,但不限于相变内存(PRAM)、静态随机存取存储器(SRAM)、动态随机存取存储器(DRAM)、其他类型的随机存取存储器(RAM)、只读存储器(ROM)、电可擦除可编程只读存储器(EEPROM)、快闪记忆体或其他内存技术、只读光盘只读存储器(CD-ROM)、数字多功能光盘(DVD)或其他光学存储、磁盒式磁带、磁带磁盘存储或其他磁性存储设备或任何其他非传输介质,可用于存储可以被计算设备访问的信息。按照本文中的界定,计算机可读介质不包括暂存电脑可读媒体(transitory media),如调制的数据信号和载波。
还需要说明的是,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、商品或者设备不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、商品或者设备所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括所述要素的过程、方法、商品或者设备中还存在另外的相同要素。
以上所述仅为本申请的实施例而已,并不用于限制本申请。对于本领域技术人员来说,本申请可以有各种更改和变化。凡在本申请的精神和原理之内所作的任何修改、等同替换、改进等,均应包含在本申请的权利要求范围之内。

Claims (21)

  1. 一种人机交互方法,适用于人机交互设备,其特征在于,包括:
    根据人机交互需求,确定本轮交互所属的目标交互意图;
    根据所述目标交互意图向管理平台请求本轮交互对应的交互引导信息,所述管理平台维护各交互意图下的交互引导流程;
    接收所述管理平台发送的本轮交互对应的交互引导信息;
    根据所述本轮交互对应的交互引导信息,获取本轮交互的交互结果。
  2. 根据权利要求1所述的方法,其特征在于,根据人机交互需求,确定本轮交互所属的目标交互意图,包括:
    展示交互界面,以供用户输入交互需求;对所述交互需求进行意图识别,以确定本轮交互所属的目标交互意图;或者
    接收用户提供的语音信息,所述语音信息携带所述用户的交互需求;针对所述语音信息进行意图识别,以确定本轮交互所属的目标交互意图。
  3. 根据权利要求2所述的方法,其特征在于,根据所述目标交互意图向管理平台请求本轮交互对应的交互引导信息,包括:
    向所述管理平台发送所述目标交互意图的描述信息,以供所述管理平台确定与所述目标交互意图适配的目标交互引导流程。
  4. 根据权利要求3所述的方法,其特征在于,在接收所述管理平台发送的本轮交互对应的交互引导信息之前,所述方法还包括:
    接收所述管理平台发送的对象获取请求,获取所述目标交互意图下待处理对象的标识;
    将所述目标交互意图下待处理对象的标识发送至管理平台,以使所述管理平台建立所述待处理对象的标识与所述目标交互引导流程的对应关系。
  5. 根据权利要求3所述的方法,其特征在于,还包括:
    接收所述管理平台返回的所述目标交互引导流程的流程标识。
  6. 根据权利要求1所述的方法,其特征在于,根据人机交互需求,确定本轮交互所属的目标交互意图,包括:
    根据上一轮交互结果关联的流程标识,确定所述流程标识对应的交互意图作为本轮交互所属的目标交互意图。
  7. 根据权利要求6所述的方法,其特征在于,根据所述目标交互意图向管理平台请求本轮交互对应的交互引导信息,包括:
    将所述流程标识和所述目标交互意图下待处理对象的标识发送给所述管理平台,以供所述管理平台从所述流程标识所标识的目标交互引导流程中获取本轮交互对应的交互引导信息。
  8. 根据权利要求5-7任一项所述的方法,其特征在于,在获得本轮交互的交互结果之后,还包括:
    当根据所述本轮交互的交互结果确定存在下一轮交互时,根据所述流程标识和所述目标交互意图下待处理对象的标识向所述管理平台请求下一轮交互对应的交互引导信息。
  9. 根据权利要求5-7任一项所述的方法,其特征在于,根据所述本轮交互对应的交互引导信息,获取本轮交互的交互结果,包括:
    根据所述本轮交互对应的交互引导信息,展示待填写信息页;
    响应用户在所述待填写信息页上的输入操作,获取填写后的信息页并发送给业务服务器;以及
    接收所述业务服务器根据所述填写后的信息页返回的所述本轮交互的交互结果。
  10. 根据权利要求9所述的方法,其特征在于,响应所述用户在所述待填写信息页上的输入操作,获取填写后的信息页并发送给业务服务器,包括:
    将所述填写后的信息页与所述流程标识关联后发送给所述业务服务器;
    所述接收所述业务服务器根据所述填写后的信息页返回的所述本轮交互的交互结果,包括:
    接收所述业务服务器返回的所述本轮交互的交互结果以及所述流程标识,所述本轮交互的交互结果和所述流程标识相关联。
  11. 根据权利要求1-7中任一项所述的方法,其特征在于,所述本轮交互所属的目标交互意图包括:针对订单的维权意图。
  12. 一种交互方法,适用于管理平台,其特征在于,包括:
    接收人机交互设备发送的请求,所述请求携带所述人机交互设备本轮交互所属的目标交互意图;
    从至少一个交互意图下的交互引导流程中,获取与所述目标交互意图适配的目标交互引导流程,每个交互引导流程包括至少一个引导节点,每个引导节点对应一个交互引导信息;
    根据所述目标交互引导流程当前所处的引导节点,确定下一引导节点对应的交互引导信息;
    将所述下一引导节点对应的交互引导信息作为所述本轮交互对应的交互引导信息返回给所述人机交互设备。
  13. 根据权利要求12所述的方法,其特征在于,所述请求携带流程标识;
    从至少一个交互意图下的交互引导流程中,获取与所述目标交互意图适配的目标交互引导流程,包括:
    从所述至少一个交互意图下的交互引导流程中,获取所述流程标识所标识的交互引导流程作为所述目标交互引导流程。
  14. 根据权利要求12所述的方法,其特征在于,所述请求携带所述目标交互意图的描述信息;
    从至少一个交互意图下的交互引导流程中,获取与所述目标交互意图适配的目标交互引导流程,包括:
    选择描述信息与所述请求中携带的描述信息相匹配的交互意图下的交互引导流程,作为所述目标交互引导流程;在从至少一个交互意图下的交互引导流程中,获取与所述目标交互意图适配的目标交互引导流程之后,还包括:
    将所述目标交互引导流程的流程标识发送至所述人机交互设备,以供所述人机交互设备基于所述流程标识请求下一轮交互对应的交互引导信息。
  15. 根据权利要求14所述的方法,其特征在于,根据所述目标交互引导流程当前所处的引导节点,确定下一引导节点对应的交互引导信息之前,还包括:
    向所述人机交互设备发送对象获取请求,以请求所述目标交互意图下待处理对象的标识;
    接收所述人机交互设备发送的所述待处理对象的标识;
    建立所述待处理对象的标识与所述目标交互引导流程的对应关系。
  16. 一种人机交互设备,其特征在于,包括:存储器、处理器以及通信组件;
    所述存储器,用于存储一条或多条计算机指令;
    所述处理器,用于执行一条或多条计算机指令,以用于:
    根据人机交互需求,确定本轮交互所属的目标交互意图;
    根据所述目标交互意图向管理平台请求本轮交互对应的交互引导信息,所述管理平台维护各交互意图下的交互引导流程;
    通过所述通信组件接收所述管理平台发送的本轮交互对应的交互引导信息;
    根据所述本轮交互对应的交互引导信息,获取本轮交互的交互结果。
  17. 一种存储有计算机程序的计算机可读存储介质,其特征在于,所述计算机程序被执行时能够实现权利要求1-11所述方法中的步骤。
  18. 一种管理平台,其特征在于,包括:存储器、处理器以及通信组件;
    所述存储器,用于存储一条或多条计算机指令;
    所述处理器,用于执行一条或多条计算机指令,以用于:
    通过所述通信组件接收人机交互设备发送的请求,所述请求携带所述人机交互设备本轮交互所属的目标交互意图;
    从至少一个交互意图下的交互引导流程中,获取与所述目标交互意图适配的目标交互引导流程,每个交互引导流程包括至少一个引导节点,每个引导节点对应一个交互引导信息;
    根据所述目标交互引导流程当前所处的引导节点,确定下一引导节点对应的交互引导信息;
    通过所述通信组件将所述下一引导节点对应的交互引导信息作为所述本轮交互对应的交互引导信息返回给所述人机交互设备。
  19. 一种存储有计算机程序的计算机可读存储介质,其特征在于,所述计算机程序被执行时能够实现权利要求12-15所述方法中的步骤。
  20. 一种人机交互系统,其特征在于,包括:人机交互设备和管理平台;
    所述人机交互设备,用于根据人机交互需求,确定本轮交互所属的目标交互意图;根据所述目标交互意图向所述管理平台请求本轮交互对应的交互引导信息,接收所述管理平台发送的本轮交互对应的交互引导信息;根据所述本轮交互对应的交互引导信息,获取本轮交互的交互结果;
    所述管理平台,用于根据所述人机交互设备的请求,从所维护的至少一个交互意图下的交互引导流程中,获取与所述目标交互意图适配的目标交互引导流程,根据所述目标交互引导流程当前所处的引导节点,确定下一引导节点对应的交互引导信息;将所述下一引导节点对应的交互引导信息作为所述本轮交互对应的交互引导信息返回给所述人机交互设备;其中,每个交互引导流程包括至少一个引导节点,每个引导节点对应一个交互引导信息。
  21. 根据权利要求20所述的系统,其特征在于,还包括:业务服务器;
    所述人机交互设备在获取本轮交互的交互结果时,具体用于:根据所述本轮交互对应的交互引导信息,展示待填写信息页;响应用户在所述待填写信息页上的输入操作,获取填写后的信息页并发送给所述业务服务器;以及接收所述业务服务器根据所述填写后的信息页返回的所述本轮交互的交互结果;
    所述业务服务器,用于接收所述人机交互设备发送的所述填写后的信息页,根据所述填写后的信息页生成所述本轮交互的交互结果并返回给所述人机交互设备。
PCT/CN2019/092679 2018-07-06 2019-06-25 人机交互方法、设备、系统及存储介质 WO2020007214A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810735560.XA CN110689393B (zh) 2018-07-06 2018-07-06 人机交互方法、设备、系统及存储介质
CN201810735560.X 2018-07-06

Publications (1)

Publication Number Publication Date
WO2020007214A1 true WO2020007214A1 (zh) 2020-01-09

Family

ID=69059197

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/092679 WO2020007214A1 (zh) 2018-07-06 2019-06-25 人机交互方法、设备、系统及存储介质

Country Status (2)

Country Link
CN (1) CN110689393B (zh)
WO (1) WO2020007214A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111651571A (zh) * 2020-05-19 2020-09-11 腾讯科技(深圳)有限公司 基于人机协同的会话实现方法、装置、设备及存储介质
CN113268336A (zh) * 2021-06-25 2021-08-17 中国平安人寿保险股份有限公司 一种服务的获取方法、装置、设备以及可读介质
CN113761183A (zh) * 2020-07-30 2021-12-07 北京汇钧科技有限公司 意图识别方法和意图识别装置

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111506698A (zh) * 2020-03-13 2020-08-07 浙江执御信息技术有限公司 一种基于sop智能流程处理方法
CN111933134A (zh) * 2020-07-23 2020-11-13 珠海大横琴科技发展有限公司 人机交互的方法及装置、电子设备、存储介质
CN113220272B (zh) * 2021-04-27 2022-11-29 支付宝(杭州)信息技术有限公司 一种业务平台的开放能力接入方法、装置以及设备

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107656996A (zh) * 2017-09-19 2018-02-02 北京百度网讯科技有限公司 基于人工智能的人机交互方法和装置
CN107831903A (zh) * 2017-11-24 2018-03-23 科大讯飞股份有限公司 多人参与的人机交互方法及装置

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105159977B (zh) * 2015-08-27 2019-01-25 百度在线网络技术(北京)有限公司 信息交互处理方法及装置
CN105068661B (zh) * 2015-09-07 2018-09-07 百度在线网络技术(北京)有限公司 基于人工智能的人机交互方法和系统
CN106776936B (zh) * 2016-12-01 2020-02-18 上海智臻智能网络科技股份有限公司 智能交互方法和系统
CN107193978A (zh) * 2017-05-26 2017-09-22 武汉泰迪智慧科技有限公司 一种基于深度学习的多轮自动聊天对话方法及系统
CN107273477A (zh) * 2017-06-09 2017-10-20 北京光年无限科技有限公司 一种用于机器人的人机交互方法及装置
CN108090177B (zh) * 2017-12-15 2020-05-05 上海智臻智能网络科技股份有限公司 多轮问答系统的生成方法、设备、介质及多轮问答系统
CN107977236B (zh) * 2017-12-21 2020-11-13 上海智臻智能网络科技股份有限公司 问答系统的生成方法、终端设备、存储介质及问答系统

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107656996A (zh) * 2017-09-19 2018-02-02 北京百度网讯科技有限公司 基于人工智能的人机交互方法和装置
CN107831903A (zh) * 2017-11-24 2018-03-23 科大讯飞股份有限公司 多人参与的人机交互方法及装置

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111651571A (zh) * 2020-05-19 2020-09-11 腾讯科技(深圳)有限公司 基于人机协同的会话实现方法、装置、设备及存储介质
CN111651571B (zh) * 2020-05-19 2023-10-17 腾讯科技(深圳)有限公司 基于人机协同的会话实现方法、装置、设备及存储介质
CN113761183A (zh) * 2020-07-30 2021-12-07 北京汇钧科技有限公司 意图识别方法和意图识别装置
CN113268336A (zh) * 2021-06-25 2021-08-17 中国平安人寿保险股份有限公司 一种服务的获取方法、装置、设备以及可读介质
CN113268336B (zh) * 2021-06-25 2023-09-19 中国平安人寿保险股份有限公司 一种服务的获取方法、装置、设备以及可读介质

Also Published As

Publication number Publication date
CN110689393A (zh) 2020-01-14
CN110689393B (zh) 2022-08-02

Similar Documents

Publication Publication Date Title
WO2020007214A1 (zh) 人机交互方法、设备、系统及存储介质
US9454779B2 (en) Assisted shopping
US12019703B2 (en) Systems and methods for providing a marketplace where data and algorithms can be chosen and interact via encryption
US8630851B1 (en) Assisted shopping
US20190370799A1 (en) Application for creating real time smart contracts
US11106420B2 (en) Method, device, system and storage medium for information transmission and data processing
CN108064373B (zh) 资源转移方法及装置
KR20180051556A (ko) 서비스 기능을 구현하는 방법 및 디바이스
JP2020502693A (ja) サービス処理方法および装置
TWI684149B (zh) 交互資訊的處理方法、裝置及系統
WO2016062203A1 (zh) 对离散数据进行集中处理的方法、客户端、服务器及系统
CN108073429A (zh) 一种支付方式配置方法、装置、设备及存储介质
WO2013097024A1 (en) Graphical interface and input method for allocating an invoice amongst a plurality of accounts
CN110490747A (zh) 一种信托对接方法、装置、服务器及存储介质
WO2019062270A1 (zh) 订单的快捷处理方法和装置
CN111507698A (zh) 用于转账的处理方法和装置、计算设备及介质
CN106302368A (zh) 事务处理方法及装置
CN114153362A (zh) 信息处理方法及装置
CN110415067A (zh) 下单方法、设备及存储介质
US10867068B2 (en) Personal computing devices with assisted form completion
WO2021036894A1 (zh) 电子名片处理方法、设备、系统及存储介质
CN112016993B (zh) 服务订单处理方法、系统、设备及存储介质
CN111626834B (zh) 智能税务处理方法、装置、终端和介质
KR102246611B1 (ko) 인앱 소프트웨어 구매에 대한 마켓에 의한 가격 차별화 기법
CN107515720A (zh) 一种消息处理方法、介质、装置和计算设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19830551

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19830551

Country of ref document: EP

Kind code of ref document: A1