CN118484337A - Workflow recovery method and device - Google Patents


Info

Publication number
CN118484337A
CN118484337A
Authority
CN
China
Prior art keywords
information
workflow
suspension
user
guide
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410741119.8A
Other languages
Chinese (zh)
Inventor
杨慕葵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jingdong City Beijing Digital Technology Co Ltd
Jingdong Technology Information Technology Co Ltd
Original Assignee
Jingdong City Beijing Digital Technology Co Ltd
Jingdong Technology Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jingdong City Beijing Digital Technology Co Ltd and Jingdong Technology Information Technology Co Ltd
Priority to CN202410741119.8A
Publication of CN118484337A


Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a workflow recovery method and device, relating to the field of computer technology. One embodiment of the workflow recovery method comprises the following steps: in response to obtaining suspension information of a workflow, generating a corresponding prompt word according to the suspension information; inputting the prompt word into a preset large model, so that the large model generates guidance information for the workflow according to the prompt word; and displaying the guidance information on a preset front-end page, so that the user processing the workflow returns, according to the guidance information, to a flow node included in the workflow and continues processing the workflow. In this embodiment, the suspension information of the workflow is analyzed by the large model to obtain corresponding guidance information, so that the user returns to a flow node of the workflow according to the guidance information and continues processing the workflow; this improves workflow processing efficiency, reduces the difficulty the user faces in processing the workflow, and improves the user experience.

Description

Workflow recovery method and device
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a method and apparatus for workflow recovery.
Background
In business systems, workflow scenarios are quite common, and users often need to process workflows. For example, before purchasing an insurance service, the workflow a user needs to process includes: entering basic information, consulting related questions, obtaining insurance company quotations, and applying for insurance. A workflow is generally composed of a plurality of pages; the user needs to submit information on the pages according to the page prompts, complete the necessary interactive operations, such as clicking, dragging, and ticking checkboxes, and process the workflow step corresponding to each page in turn until the whole workflow is completed. While processing a workflow, if the user has a question or a page abnormality occurs, the user can self-diagnose through page prompts or consult a traditional robot customer service to obtain processing advice.
In carrying out the present invention, the inventors have found that at least the following problems exist in the prior art:
page prompts and traditional robot customer service often use professional language, which imposes a certain threshold of use; they cannot give the user accurate help when the user has a question or a page abnormality occurs, so workflow processing efficiency is low.
Disclosure of Invention
In view of this, the embodiments of the present invention provide a method and an apparatus for recovering a workflow, which can improve the processing efficiency of the workflow, reduce the difficulty of processing the workflow by a user, and improve the user experience.
To achieve the above object, according to a first aspect of the embodiments of the present invention, there is provided a workflow recovery method, including: in response to obtaining suspension information of a workflow, generating a corresponding prompt word according to the suspension information; inputting the prompt word into a preset large model, so that the large model generates guidance information for the workflow according to the prompt word; and displaying the guidance information on a preset front-end page, so that the user processing the workflow returns, according to the guidance information, to a flow node included in the workflow and continues processing the workflow.
Optionally, generating a corresponding prompting word according to the suspension information, including: determining a suspension scene of the workflow according to the suspension information; determining a guide scene corresponding to the suspension scene according to preset service arrangement information; and updating the prompting word template corresponding to the guiding scene according to the suspension information, and taking the updated prompting word template as the prompting word.
Optionally, determining, according to preset service arrangement information, a guide scene corresponding to the suspension scene includes: determining business logic corresponding to the suspension scene according to the business arrangement information; executing the business logic and obtaining an execution result of the business logic; and determining a guide scene corresponding to the suspension scene according to the execution result.
Optionally, before the guidance information is displayed on the preset front-end page, intention information associated with the guidance information is determined, and whether the intention information meets a preset guiding condition is judged; in the case that the intention information meets the preset guiding condition, the following steps are executed in a loop to determine the guidance information displayed on the front-end page:
Taking the intention information as the intention information of the first round: determining the user-purpose information corresponding to the intention information of the current round, and generating a new prompt word for the current round according to the user-purpose information; determining the large model of the current round according to the large model, inputting the new prompt word of the current round into the large model of the current round, and obtaining the new intention information and new guidance information output by the large model of the current round; in the case that the new intention information of the current round does not meet the guiding condition, terminating the loop and taking the new guidance information of the current round as the guidance information displayed on the front-end page; and in the case that the new intention information of the current round meets the guiding condition, taking the new intention information of the current round as the intention information of the next round.
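The round-by-round loop above can be sketched in Python. Every name here (meets_condition, call_model, and the toy "ambiguity" score) is hypothetical; the claims do not specify an interface, so this is only an illustration of the control flow:

```python
def refine_guidance(intention, meets_condition, derive_purpose, build_prompt, call_model):
    """Run guidance rounds until the new intention no longer meets the guiding
    condition; the last round's guidance is what the front-end page displays.
    If the initial intention already fails the condition, the loop never runs."""
    guidance = None
    while meets_condition(intention):
        purpose = derive_purpose(intention)            # user-purpose info for this round
        prompt = build_prompt(purpose)                 # new prompt word for this round
        intention, guidance = call_model(prompt)       # model emits (intention, guidance)
    return guidance

# Toy stand-in: each round reduces an "ambiguity" score until the intention is clear.
def fake_model(prompt):
    level = int(prompt.split(":")[1]) - 1
    return {"ambiguity": level}, f"guidance at ambiguity {level}"

final = refine_guidance(
    intention={"ambiguity": 2},
    meets_condition=lambda i: i["ambiguity"] > 0,      # the guiding condition
    derive_purpose=lambda i: i["ambiguity"],
    build_prompt=lambda p: f"ambiguity:{p}",
    call_model=fake_model,
)
print(final)  # guidance at ambiguity 0
```

The design point is that the loop terminates on the model's own intention output, not on a fixed round count.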
Optionally, determining the large model of the current round according to the large model includes: in response to judging that the new prompt word of the current round matches the large model, taking the large model as the large model of the current round; in response to judging that the new prompt word does not match the large model, screening, from a plurality of preset large models, a target large model that matches the new prompt word of the current round, and taking the target large model as the large model of the current round.
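This per-round model selection can be sketched as a simple routing function. The matching rule, model records, and fallback behavior are all assumptions made for illustration; the claim only says a matching target model is screened from preset candidates:

```python
def select_round_model(prompt, default_model, candidate_models, matches):
    """Use the default large model if it matches the round's new prompt word;
    otherwise screen a matching target model from the preset candidates."""
    if matches(prompt, default_model):
        return default_model
    for model in candidate_models:
        if matches(prompt, model):
            return model
    return default_model  # assumed fallback; the claim does not cover a no-match case

# Toy matcher: a model "matches" when its domain keyword appears in the prompt.
matches = lambda prompt, model: model["domain"] in prompt
default = {"name": "general-llm", "domain": "general"}
candidates = [{"name": "insure-llm", "domain": "insurance"},
              {"name": "fin-llm", "domain": "finance"}]

picked = select_round_model("insurance quotation question", default, candidates, matches)
print(picked["name"])  # insure-llm
```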
Optionally, before the new guidance information and new intention information output by the large model of the current round are acquired, historical guidance information associated with the user is acquired; the historical guidance information is input into the large model of the current round, so that the large model of the current round outputs the new guidance information and new intention information of the current round according to the new prompt word of the current round and the historical guidance information.
Optionally, obtaining the suspension information of the workflow includes: in response to receiving a dialogue request, determining the user's dialogue information according to the dialogue request and taking the dialogue information as the suspension information of the workflow; or, in response to detecting the occurrence of a target event, judging whether the target event belongs to a preset suspension event set, and, in the case that the target event belongs to the suspension event set, determining the event information of the target event as the suspension information of the workflow.
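The two trigger paths in this claim can be sketched as follows; the event names in the suspension-event set and the returned dictionary shape are illustrative assumptions, not taken from the patent:

```python
# Hypothetical suspension-event set; the claim only says such a preset set exists.
SUSPENSION_EVENTS = {"page_error", "abnormal_exit", "session_timeout"}

def get_suspension_info(dialogue_request=None, target_event=None):
    """Two trigger paths: a user dialogue request, or a detected target event
    that belongs to the preset suspension-event set."""
    if dialogue_request is not None:
        return {"source": "dialogue", "info": dialogue_request["message"]}
    if target_event is not None and target_event["type"] in SUSPENSION_EVENTS:
        return {"source": "event", "info": target_event}
    return None  # the event is not in the set, so it is not treated as a suspension

from_dialogue = get_suspension_info(dialogue_request={"message": "Why did my form fail?"})
from_event = get_suspension_info(target_event={"type": "page_error", "page": "C"})
ignored = get_suspension_info(target_event={"type": "mouse_move"})
```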
According to a second aspect of an embodiment of the present invention, there is provided an apparatus for workflow restoration, including:
The first generation module is used for responding to the obtained suspension information of the workflow and generating corresponding prompt words according to the suspension information;
the second generation module is used for inputting the prompt word into a preset large model, so that the large model generates the guide information of the workflow according to the prompt word;
And the guiding module is used for displaying the guiding information to a preset front-end page, so that a user for processing the workflow returns to the flow node included in the workflow according to the guiding information, and the workflow is continuously processed.
Optionally, generating a corresponding prompting word according to the suspension information, including: determining a suspension scene of the workflow according to the suspension information; determining a guide scene corresponding to the suspension scene according to preset service arrangement information; and updating the prompting word template corresponding to the guiding scene according to the suspension information, and taking the updated prompting word template as the prompting word.
Optionally, determining, according to preset service arrangement information, a guide scene corresponding to the suspension scene includes: determining business logic corresponding to the suspension scene according to the business arrangement information; executing the business logic and obtaining an execution result of the business logic; and determining a guide scene corresponding to the suspension scene according to the execution result.
Optionally, the guiding module is further configured to: before the guidance information is displayed on the preset front-end page, determine intention information associated with the guidance information, and judge whether the intention information meets a preset guiding condition;
in the case that the intention information meets the preset guiding condition, execute the following steps in a loop to determine the guidance information displayed on the front-end page:
taking the intention information as the intention information of the first round: determining the user-purpose information corresponding to the intention information of the current round, and generating a new prompt word for the current round according to the user-purpose information; determining the large model of the current round according to the large model, inputting the new prompt word of the current round into the large model of the current round, and obtaining the new intention information and new guidance information output by the large model of the current round; in the case that the new intention information of the current round does not meet the guiding condition, terminating the loop and taking the new guidance information of the current round as the guidance information displayed on the front-end page; and in the case that the new intention information of the current round meets the guiding condition, taking the new intention information of the current round as the intention information of the next round.
Optionally, the guiding module is further configured to: in response to judging that the new prompt word of the current round matches the large model, take the large model as the large model of the current round; in response to judging that the new prompt word does not match the large model, screen, from a plurality of preset large models, a target large model that matches the new prompt word of the current round, and take the target large model as the large model of the current round.
Optionally, the guiding module is further configured to: acquire historical guidance information associated with the user, and input the historical guidance information into the large model of the current round, so that the large model of the current round outputs the new guidance information and new intention information of the current round according to the new prompt word of the current round and the historical guidance information.
Optionally, obtaining the suspension information of the workflow includes: in response to receiving a dialogue request, determining the user's dialogue information according to the dialogue request and taking the dialogue information as the suspension information of the workflow; or, in response to detecting the occurrence of a target event, judging whether the target event belongs to a preset suspension event set, and, in the case that the target event belongs to the suspension event set, determining the event information of the target event as the suspension information of the workflow.
According to a third aspect of an embodiment of the present invention, there is provided an electronic apparatus including:
One or more processors;
storage means for storing one or more programs,
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of the embodiments described above.
According to a fourth aspect of embodiments of the present invention, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements the method of any of the embodiments described above.
According to a fifth aspect of embodiments of the present invention, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the method of any of the embodiments described above.
One embodiment of the above invention has the following advantages or benefits: analyzing the suspension information of the workflow based on the large model yields corresponding guidance information, so that the user returns to a flow node of the workflow according to the guidance information and continues processing the workflow, which improves workflow processing efficiency, reduces the difficulty the user faces in processing the workflow, and improves the user experience; determining the guiding scene corresponding to the suspension scene according to the business arrangement information, and generating the prompt word according to the guiding scene, allows the prompt word to be generated flexibly and accurately; executing the business logic corresponding to the suspension scene and determining the guiding scene according to the execution result of that business logic allows the guiding scene to be determined flexibly and efficiently; judging, according to the intention information associated with the guidance information, whether new guidance information and intention information need to be generated again, and obtaining the guidance information finally returned to the user through multiple rounds of large-model output, allows the user's intention and purpose to be determined accurately, which makes it easier to provide the user with accurate guidance information and helps the user quickly return to a flow node of the workflow; screening a target large model from a plurality of large models when the prompt word does not match the default large model improves the flexibility of large-model output and satisfies different large-model analysis requirements; generating new guidance information based on the historical guidance information previously output to the user further improves the accuracy of the guidance information and improves workflow processing efficiency; and the method can be triggered when the user asks a question of, or requests a dialogue with, the execution body of the embodiment of the invention, or when it is detected that the workflow has been abnormally exited or interrupted by certain user operations, which reduces the difficulty of use and improves the efficiency with which the user processes the workflow.
Further effects of the above-described non-conventional alternatives are described below in connection with the embodiments.
Drawings
The drawings are included to provide a better understanding of the invention and are not to be construed as unduly limiting the invention. Wherein:
FIG. 1 is a schematic diagram of the main flow of a method of workflow restoration according to an embodiment of the invention;
FIG. 2 is a schematic diagram of an application scenario for workflow restoration according to one referenceable embodiment of the invention;
FIG. 3 is a schematic diagram of a large model dialog chain in accordance with a referenceable embodiment of the present invention;
FIG. 4 is a schematic diagram of a technical architecture for workflow restoration according to one referenceable embodiment of the invention;
FIG. 5 is a schematic diagram of the main flow of workflow restoration in multi-module collaboration in accordance with a referenceable embodiment of the invention;
FIG. 6 is a schematic diagram of the main flow of a method of workflow restoration according to one referenceable embodiment of the invention;
FIG. 7 is a schematic diagram of the major modules of an apparatus for workflow restoration according to an embodiment of the invention;
FIG. 8 is an exemplary system architecture diagram in which embodiments of the present invention may be applied;
fig. 9 is a schematic diagram of a computer system suitable for use in implementing an embodiment of the invention.
Detailed Description
Exemplary embodiments of the present invention will now be described with reference to the accompanying drawings, which include various details of the embodiments of the present invention to facilitate understanding; these details are to be considered merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the invention. Likewise, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
In the technical solution of the present invention, the collection, use, storage, sharing, transfer, and other processing of users' personal information comply with the provisions of relevant laws and regulations; users are informed and their consent or authorization is obtained, and, where applicable, de-identification and/or anonymization and/or encryption are applied to users' personal information.
In business systems, workflow scenarios are quite common, and users often need to process workflows. For example, before purchasing an insurance service, the workflow a user needs to process includes: entering basic information, consulting related questions, obtaining insurance company quotations, and applying for insurance. A workflow is generally composed of a plurality of pages; the user needs to submit information on the pages according to the page prompts, complete the necessary interactive operations, such as clicking, dragging, and ticking checkboxes, and process the workflow step corresponding to each page in turn until the whole workflow is completed. While processing a workflow, if the user has a question or a page abnormality occurs, the user can self-diagnose through page prompts or consult a traditional robot customer service to obtain processing advice.
Page prompts and traditional robot customer service often use professional language, which imposes a certain threshold of use; they cannot give the user accurate help when the user has a question or a page abnormality occurs, so workflow processing efficiency is low and the user experience is poor.
In view of this, according to a first aspect of embodiments of the present invention, a method of workflow restoration is provided.
FIG. 1 is a schematic diagram of the main flow of a method of workflow restoration according to an embodiment of the invention. As shown in fig. 1, the method for recovering a workflow according to an embodiment of the present invention mainly includes the following steps S101 to S103.
Step S101, in response to obtaining the suspension information of the workflow, generating a corresponding prompt word according to the suspension information.
The execution body of the embodiment of the present invention monitors the user's workflow processing, acquires information related to the workflow, such as user behavior data, form collection data, and page processing results generated while the user processes the workflow, and extracts the suspension information from this information.
Illustratively, the workflow processed by the user includes a plurality of pages, and the execution body of the embodiment of the present invention obtains a page processing result for each page, where the page processing result includes: whether processing succeeded, information entered by the user in the page form, basic information displayed on the page, and the like. If the page processing result is successful, the next page of the workflow is displayed to the user. If the page processing result is unsuccessful, it is judged that the user's workflow processing has been suspended; user behavior data such as the text and uploaded files the user entered on the page, the number of times the user clicked a button, and the number of times the user refreshed the page, together with the basic information displayed on the page, are then acquired, and the acquired information is taken as the suspension information.
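The branching described above can be sketched as follows; the field names (`entered_text`, `click_count`, and so on) are illustrative assumptions, not identifiers from the patent:

```python
def handle_page_result(result):
    """Advance to the next page on success; otherwise treat the workflow as
    suspended and collect the suspension information described above."""
    if result["success"]:
        return {"action": "next_page"}
    return {"action": "suspend",
            "suspension_info": {
                "entered_text": result.get("entered_text", ""),
                "click_count": result.get("click_count", 0),       # button clicks
                "refresh_count": result.get("refresh_count", 0),   # page refreshes
                "page_info": result.get("page_info", {}),          # basic page info
            }}

outcome = handle_page_result({"success": False, "entered_text": "User, 20",
                              "click_count": 5, "page_info": {"fields": ["name", "age"]}})
print(outcome["action"])  # suspend
```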
After the suspension information is acquired, it is filled into a preset prompt word template, and the filled template is used as the prompt word. For example, the preset prompt word template is: "You are a customer service agent. A user encountered a problem while processing a workflow, causing the workflow to be suspended. The related information includes: text entered by the user on the page: [A1]; behavior data of the user: [A2]; basic information displayed on the page: [A3]; workflow processing result: [A4]. Based on the above information, please provide a solution for the user and guide the user to finish processing the workflow." The placeholders A1, A2, A3, and A4 in the template are replaced with the corresponding suspension information, so the filled template becomes: "You are a customer service agent. A user encountered a problem while processing a workflow, causing the workflow to be suspended. The related information includes: text entered by the user on the page: User, 20, ...; behavior data of the user: clicked the "Confirm" button 5 times; basic information displayed on the page: name, age, ...; workflow processing result: part of the information was not processed successfully. Based on the above information, please provide a solution for the user and guide the user to finish processing the workflow."
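The placeholder substitution above can be sketched in a few lines; the template text is a shortened, hypothetical paraphrase of the example, not the patent's exact wording:

```python
import re

# Hypothetical template mirroring the [A1]..[A4] placeholders of the example above.
TEMPLATE = ("You are a customer service agent. A user encountered a problem while "
            "processing a workflow, causing it to be suspended. Text entered on the "
            "page: [A1]. Behavior data: [A2]. Page information: [A3]. Processing "
            "result: [A4]. Please provide a solution and guide the user to finish "
            "the workflow.")

def fill_prompt(template, suspension_info):
    """Replace every [Ax] placeholder with the matching piece of suspension info."""
    return re.sub(r"\[(A\d+)\]", lambda m: suspension_info[m.group(1)], template)

prompt = fill_prompt(TEMPLATE, {
    "A1": "User, 20",
    "A2": 'clicked "Confirm" 5 times',
    "A3": "name, age",
    "A4": "part of the information failed to process",
})
```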
It should be noted that the prompt word templates are configurable: the execution body of the embodiment of the present invention adds new prompt word templates, or modifies or deletes existing prompt word templates, according to received prompt word update requests.
Monitoring the workflow processing makes it possible to acquire the workflow's suspension information in a timely manner, and converting the suspension information into prompt words according to the prompt word template provides a data basis for the large model; this improves workflow processing efficiency and helps the user return to workflow processing promptly.
Step S102, inputting the prompt word into a preset large model, and enabling the large model to generate the guide information of the workflow according to the prompt word.
After the prompt word is generated, it is input into a preset large model, where the large model is a generative large model that possesses business knowledge of the workflow's business domain. Specifically, before the large model is used to output guidance information, it is pre-trained with this business knowledge, so that the large model is familiar with the various processes in the business domain and possesses its existing business knowledge. The prompt word is then input into the pre-trained large model.
The large model outputs corresponding guidance information according to the received prompt word. The guidance information is used to help the user continue processing the previously suspended workflow, and includes: a description of the current workflow processing scenario, the cause of the workflow suspension, and the operations the user can perform (i.e., how to return to the previously suspended workflow node).
Illustratively, the guidance information output by the large model is: the flow node at which the workflow was just suspended is node B1; its meaning is data B2; the information you need to provide is information B3; if you have no other questions, you can click page link B4 to continue from the node that was just suspended. Across different pieces of guidance information, node B1, data B2, information B3, and page link B4 take different values, generated from the received suspension information.
The guidance information can resolve the doubts that arise while the user processes the workflow, so that the user clearly understands the workflow processing, and through the page link the user can return in time to the previously suspended workflow; this helps improve workflow processing efficiency and accuracy and improves the user experience.
Step S103, the guiding information is displayed on a preset front-end page, so that a user processing the workflow returns to a flow node included in the workflow according to the guiding information, and the workflow is continuously processed.
After the guidance information is generated, it is presented on the front-end page. Specifically, the front-end page is a preset dialog box, and the execution body of the embodiment of the present invention converses with the user in the identity of a page intelligent robot, the dialogue content being the previously generated guidance information.
The user understands the meaning of the workflow from the guidance information, learns what information to fill in and what operations to perform, clicks the link in the guidance information, and sends a flow node access request to the execution body of the embodiment of the present invention. The execution body parses the received flow node access request, determines the target flow node the user wants to access, and displays the page corresponding to the target flow node to the user, which is equivalent to the user returning to the previously suspended workflow; on the page corresponding to the target flow node, the user continues processing the previously unfinished workflow.
It should be noted that after receiving the guidance information, the user may continue to talk with the page intelligent robot (i.e., the execution body of the embodiment of the present invention) in the dialog box, for example by entering text, images, or video. The execution body generates new guidance information according to the user's dialogue content and displays it in the dialog box; specifically, it converts the user's dialogue content into a new prompt word and inputs it into the large model to obtain the new guidance information. In the form of a dialogue between the page intelligent robot and the user, steps S101 to S103 repeat continuously: the user sends dialogue content to the page intelligent robot, the page intelligent robot takes the user's dialogue content as suspension information, generates corresponding guidance information based on the large model, and displays the guidance information as reply content in the dialog box, repeating until the user returns to the previously suspended workflow through the link in the guidance information.
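The repeated S101-S103 cycle can be sketched as a small dialogue loop; the callback names and the toy session below are hypothetical, chosen only to show the termination condition (the user following the link back to the workflow):

```python
def dialogue_loop(read_user_message, generate_guidance, show_reply, returned_to_flow):
    """Steps S101-S103 repeated as a dialogue: each user message is treated as new
    suspension information until the user follows the link back to the workflow."""
    while not returned_to_flow():
        suspension_info = read_user_message()          # dialogue content as suspension info
        guidance = generate_guidance(suspension_info)  # prompt word -> large model
        show_reply(guidance)                           # reply shown in the dialog box

# Toy session: the second message follows the link, ending the conversation.
messages = iter(["What does this field mean?", "thanks, clicking the link"])
state = {"returned": False}
replies = []

def read_user_message():
    msg = next(messages)
    if "link" in msg:
        state["returned"] = True
    return msg

dialogue_loop(read_user_message,
              lambda info: f"guidance for: {info}",
              replies.append,
              lambda: state["returned"])
print(len(replies))  # 2
```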
Fig. 2 is a schematic diagram of an application scenario of workflow restoration according to one referenceable embodiment of the invention. Illustratively, as shown in fig. 2, the dialogue between the user (i.e., User) and the execution body of the embodiment of the present invention (i.e., AI) is shown on dialogue page 201. On detecting that the workflow being processed by the user has been suspended, the execution body collects the suspension information, determines that the workflow was suspended at the "product usage description" step, generates the first piece of guidance information (i.e., the first sentence sent by the AI) based on the large model, and shows it on dialogue page 201 to converse with the user. The user describes their own question (i.e., the first sentence sent by the user) based on the first piece of guidance information, and the execution body generates a second piece of guidance information (i.e., the second sentence sent by the AI), again based on the large model, according to the question sent by the user; the second piece of guidance information includes a page link, which helps the user return to the previously suspended workflow. If the user wants to continue communicating, the user can enter content in the input box at the bottom of dialogue page 201 and send it to the execution body, which continues to output guidance information according to the content sent by the user, helping to answer the user's questions.
Analyzing the suspension information of the workflow based on the large model yields corresponding guidance information, so that the user returns to a flow node of the workflow according to the guidance information and continues processing the workflow; this improves workflow processing efficiency, reduces the difficulty the user faces in processing the workflow, and improves the user experience.
According to a referenceable embodiment of the invention, when generating the corresponding prompt word according to the suspension information, a suspension scene of the workflow is first determined according to the suspension information. Specifically, a suspension scene is a set of page states: the page states are determined from the suspension information and then combined into the suspension scene. For example, if the suspension information indicates that the user accessed page C, page C was then closed, and the workflow corresponding to page C was not successfully processed, the suspension information is converted into the suspension scene "page C is closed and no user input was obtained".
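The conversion from suspension information to a suspension scene can be sketched as follows. This is a minimal illustrative sketch: the field names (`page_id`, `page_closed`, `user_input`, `workflow_done`) are assumptions, not anything specified by the embodiment.

```python
# Hypothetical sketch: combining page states derived from the suspension
# information into a single suspension-scene label. Field names are assumptions.

def derive_suspension_scene(suspension_info):
    """Determine page states from the suspension information and combine
    them into the suspension scene."""
    states = []
    if suspension_info.get("page_closed"):
        states.append("page " + suspension_info["page_id"] + " closed")
    if not suspension_info.get("user_input"):
        states.append("no user input")
    if not suspension_info.get("workflow_done"):
        states.append("workflow unfinished")
    return " and ".join(states)

scene = derive_suspension_scene(
    {"page_id": "C", "page_closed": True, "user_input": "", "workflow_done": False}
)
# e.g. "page C closed and no user input and workflow unfinished"
```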
After determining the suspension scene, a guidance scene corresponding to it is determined according to preset business arrangement information. The business arrangement information records the association between suspension scenes and guidance scenes, each suspension scene corresponding to one guidance scene. For example, when the suspension scene is "page C1 is closed and no user input", the business arrangement information is queried and the corresponding guidance scene is determined to be "ask whether the user accessed page C2".
For each change on the front-end page that causes the workflow to be suspended (e.g., the user completed a page flow, the user posed a question, the user performed an abnormal operation), a corresponding suspension scene, and the guidance scene corresponding to it, are set in advance in the business arrangement information. It should be noted that the business arrangement information is configurable: the execution subject of the embodiment of the invention updates it according to a received update request, adding new suspension scenes and/or guidance scenes and modifying the associations between suspension scenes and guidance scenes.
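A minimal sketch of configurable business arrangement information, assuming it can be represented as an in-memory mapping from suspension scenes to guidance scenes with an update entry point (the class and method names are illustrative):

```python
# Hypothetical sketch of configurable business-arrangement information:
# a suspension-scene -> guidance-scene mapping that can be updated at runtime.

class BusinessArrangement:
    def __init__(self):
        # Preset association; each suspension scene corresponds to one guidance scene.
        self._scene_map = {
            "page C1 closed and no user input": "ask whether the user accessed page C2",
        }

    def guidance_scene_for(self, suspension_scene):
        return self._scene_map.get(suspension_scene)

    def update(self, suspension_scene, guidance_scene):
        # Handles an update request: add a new association or modify an existing one.
        self._scene_map[suspension_scene] = guidance_scene
```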
Each guidance scene corresponds to a prompt word template, which is an incomplete prompt word. After determining the guidance scene, the prompt word template corresponding to it is updated according to the suspension information; specifically, the suspension information is filled into the template, and the filled (i.e., updated) template is a complete prompt word. The updated template is then used as the prompt word and input into the preset large model to obtain the corresponding guidance information.
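The template-filling step might look like the following sketch; the template text and placeholder are assumptions, since the embodiment does not give a concrete template:

```python
# Hypothetical sketch: a guidance scene's prompt word template is an incomplete
# prompt; filling in the suspension information yields the complete prompt word.
# The template wording below is an illustrative assumption.

TEMPLATES = {
    "ask whether the user accessed page C2":
        "The user's workflow was suspended ({scene}). "
        "Generate a short message guiding the user back to the workflow.",
}

def build_prompt(guidance_scene, suspension_scene):
    # Filling the suspension information into the incomplete template
    # produces the complete prompt word for the large model.
    return TEMPLATES[guidance_scene].format(scene=suspension_scene)
```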
In this way, the suspension scene is determined from the suspension information, the guidance scene associated with it is determined from the business arrangement information, and the prompt word template is determined from the guidance scene, so the prompt word template can be chosen more flexibly and accurately, which in turn supports generating accurate guidance information. Because the business arrangement information is configurable, suspension scenes and guidance scenes can be associated more flexibly, meeting different business processing requirements and returning the user to the workflow efficiently and accurately.
According to another embodiment of the present invention, when determining the guidance scene corresponding to a suspension scene according to the preset business arrangement information, the business logic corresponding to the suspension scene is first determined from the business arrangement information. Specifically, the business arrangement information manages business logic under different suspension scenes and records the business logic corresponding to each suspension scene; the business logic serves to guide the user back to the workflow so that its processing can continue.
The business logic corresponding to the suspension scene is then executed and its execution result obtained. The data type of the execution result may be a character string (e.g., "yes", "no", "success", "unsuccessful"), a numerical value (e.g., "10", "20"), or a Boolean (e.g., "true", "false"). After the execution result is obtained, the guidance scene corresponding to the suspension scene is determined from it. For example, when the execution result is numerical, it is compared against several preset value ranges, each of which corresponds to one guidance scene; the range containing the execution result is taken as the target value range, and the guidance scene corresponding to that target range is taken as the guidance scene corresponding to the suspension scene.
For example, suppose the suspension scene is "page D is abnormal" and the business logic corresponding to it is the interface http://api/test. The business logic is invoked and its execution result obtained: if the result is "yes", the guidance scene of the suspension scene is determined to be guidance scene E1; if the result is "no", it is determined to be guidance scene E2.
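The result-to-guidance-scene mapping described above can be sketched as follows, assuming string results are looked up directly and numeric results are matched against preset value ranges (all mappings and range boundaries are illustrative assumptions):

```python
# Hypothetical sketch: mapping a business-logic execution result to a guidance
# scene. String results are looked up directly (as in the http://api/test
# example); numeric results are matched against preset value ranges.

STRING_RESULTS = {"yes": "guidance scene E1", "no": "guidance scene E2"}
VALUE_RANGES = [((0, 10), "guidance scene E1"), ((10, 100), "guidance scene E2")]

def guidance_scene_from_result(result):
    if isinstance(result, str):
        return STRING_RESULTS.get(result)
    for (low, high), scene in VALUE_RANGES:
        if low <= result < high:  # the range containing the result is the target range
            return scene
    return None
```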
It should be noted that the business logic corresponding to a suspension scene may be implemented by a business interface or a custom script. This business logic is configurable: the execution subject of the embodiment of the invention modifies the business interface or custom script corresponding to a suspension scene according to a received update request, thereby modifying the corresponding business logic.
By executing the business logic corresponding to the suspension scene and determining the guidance scene from its execution result, the guidance scene can be determined flexibly and accurately, improving the accuracy of the guidance information.
According to still another referenceable embodiment of the present invention, before presenting the guidance information on the preset front-end page, the intention information associated with the guidance information is determined and checked against preset guidance conditions; the guidance information is presented directly only if the intention information does not meet the guidance conditions. The large model outputs the corresponding intention information at the same time as it generates the guidance information from the prompt word. The intention information describes the user's intention, which may include: purchase, consult, subscribe, return to the previous page, and so on.
The guidance condition comprises a set of intention information. When the intention information output by the large model belongs to this set, the intention information is judged to meet the guidance condition, which indicates that the guidance information output by the large model is inaccurate and further processing is required to obtain more accurate guidance information.
Specifically, when the intention information meets the guidance condition, the following steps are performed in a loop to determine the guidance information presented on the front-end page. The initial intention information is taken as the intention information of the first round. In each round: the user purpose information corresponding to the current round's intention information is determined, and a new prompt word for the current round is generated from it; the large model of the current round is determined, the new prompt word is input into it, and the new intention information and new guidance information it outputs are obtained; if the new intention information does not meet the guidance condition, the loop terminates and the new guidance information of the current round is presented on the front-end page; if it does meet the guidance condition, the new intention information becomes the intention information of the next round. To determine the user purpose information, the preset business arrangement information is queried for the purpose information associated with the intention information; for example, when the intention information is "purchase", the corresponding purpose information is "get purchase-offer consultation". The corresponding prompt word template is filled according to the user purpose information to generate the new prompt word, which is input into the large model to obtain the new guidance information and new intention information, after which it is judged again whether the new intention information meets the guidance condition.
FIG. 3 is a schematic diagram of a large model dialogue chain in accordance with a referenceable embodiment of the present invention. As shown in fig. 3, a user initiates a dialogue on the front-end page, triggering the workflow suspension mechanism. The execution subject of the embodiment of the invention takes the text the user entered on the front-end page as the suspension information and generates prompt word 1 from it; prompt word 1 is input into the large model to generate guidance information 1 and intention information 1. It is then judged whether intention information 1 is a target intention: if not, guidance information 1 is shown on the front-end page and returned to the user; if so, prompt word 2 is generated from the suspension information and guidance information 1, input into the large model, and guidance information 2 and intention information 2 are generated. The judging step is repeated until the intention information is no longer a target intention, at which point the corresponding guidance information is returned to the user. The input for generating guidance information includes: the text the user entered on the front-end page, the user's behavior data, and the guidance information previously generated and returned to the user. Each generation of guidance information is one input-output pass, and the guidance information output by one pass is part of the input to the next.
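The dialogue chain of fig. 3 can be sketched as a loop; `call_large_model` is a stand-in for the real model invocation, and the target-intention set and purpose mapping are illustrative assumptions:

```python
# Hypothetical sketch of the large-model dialogue chain: the model is called
# repeatedly while its intention output is still a target intention (i.e. it
# meets the guidance condition); the previous round's guidance feeds the next
# prompt. All names and mappings are assumptions.

GUIDANCE_CONDITION = {"purchase", "consult"}  # illustrative target-intention set
PURPOSE_OF_INTENTION = {"purchase": "get purchase-offer consultation"}

def final_guidance(first_prompt, call_large_model, max_rounds=5):
    guidance, intention = call_large_model(first_prompt)
    for _ in range(max_rounds):
        if intention not in GUIDANCE_CONDITION:
            break  # guidance judged accurate: show it on the front-end page
        purpose = PURPOSE_OF_INTENTION.get(intention, intention)
        new_prompt = ("Previous guidance: " + guidance +
                      ". User purpose: " + purpose + ". Refine the guidance.")
        guidance, intention = call_large_model(new_prompt)
    return guidance
```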
According to the intention information associated with the guide information, whether new guide information and intention information need to be generated again or not is judged, the guide information finally returned to the user is obtained through multiple times of large-model output, the intention and the purpose of the user can be accurately determined, the accurate guide information is conveniently provided for the user, and the user is helped to quickly return to a flow node of a workflow.
According to a referenceable embodiment of the present invention, when determining the large model of the current round, it is first checked whether the new prompt word of the current round matches the large model. For example, the prompt word template corresponding to the new prompt word is determined, the preset business arrangement information is queried, and it is checked whether the large model associated with that template is the model to be input; if so, the new prompt word matches the model. As another example, the preset business arrangement information is queried for the prompt word conditions corresponding to the large model, which may include: the maximum length of the prompt word, keywords the prompt word should (or should not) include, the prompt word not being empty, and so on; if the new prompt word meets the model's prompt word conditions, it is judged to match the model. In response to determining that the new prompt word of the current round matches the large model, that model is taken as the large model of the current round and the new prompt word is input into it.
In response to judging that the new prompt word of the current round does not match the large model, a target large model matching it is screened from several preset large models. Specifically, the new prompt word is compared against the prompt word conditions of the remaining large models, and the model whose conditions the new prompt word meets is taken as the target large model; alternatively, the business arrangement information is queried, the prompt word template corresponding to the new prompt word is determined, and the large model associated with that template is taken as the target large model. The new prompt word is then input into the target large model to obtain the new guidance information and new intention information it outputs, and it is judged whether the new intention information meets the guidance condition.
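A minimal sketch of the prompt-word/model matching and target-model screening, assuming each model's prompt word conditions can be expressed as a maximum length plus a required keyword (the model names and conditions are illustrative assumptions):

```python
# Hypothetical sketch: each preset large model has prompt word conditions
# (maximum length, required keyword, non-empty). If the new prompt word does
# not match the current model, a target model is screened from the others.

MODEL_CONDITIONS = {
    "insurance-model": {"max_len": 200, "must_contain": "insurance"},
    "logistics-model": {"max_len": 200, "must_contain": "parcel"},
}

def matches(prompt, cond):
    # Non-empty, within the maximum length, and containing the required keyword.
    return bool(prompt) and len(prompt) <= cond["max_len"] and cond["must_contain"] in prompt

def select_model(prompt, current_model):
    if matches(prompt, MODEL_CONDITIONS[current_model]):
        return current_model  # the current round keeps its model
    for name, cond in MODEL_CONDITIONS.items():
        if matches(prompt, cond):  # screen the target model from the preset models
            return name
    return None
```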
It should be noted that corresponding workflows exist in multiple business fields such as insurance, logistics, e-commerce, and finance. Business knowledge from these fields is obtained, and the business knowledge of each field is used to pre-train a large model, yielding large models corresponding to the different business fields, each possessing the business knowledge of its own field.
Under the condition that the prompt words are not matched with the large models, the target large model is screened out from the large models, so that the flexibility of large model output can be improved, and different large model analysis requirements can be met.
According to another referenceable embodiment of the present invention, before the new guidance information and new intention information output by the large model of the current round are obtained, the historical guidance information associated with the user is obtained. Specifically, the historical guidance information is guidance information previously generated and returned to the user, which has already helped the user return to the workflow and continue processing it. The execution subject of the embodiment of the invention has a short-term memory function and stores previously generated guidance information. When new guidance information and intention information need to be generated, the execution subject retrieves the historical guidance information of the corresponding user, combines it with the suspension information, and inputs the combination into the large model, so that the model outputs the new guidance information and new intention information according to both the new prompt word and the historical guidance information.
For example, the last time user F processed a workflow, user F actively asked the execution subject of the embodiment of the invention how to purchase accident insurance; the workflow was suspended at that point, the suspension information consisting only of the text entered by the user. The execution subject used the large model to analyze guidance information and intention information from that suspension information, and stored the guidance information as historical guidance information. This time, when user F closes a popup window, the suspension information includes: the user closed the page, and the page the user was on before closing it. The execution subject then retrieves the historical guidance information for this user (last time, the user wanted to buy accident insurance, but the page was abnormally closed while the life-insurance workflow was being processed) and judges that the user has formed the idea of buying life insurance. Corresponding guidance information is therefore generated, guiding the user back to the life-insurance workflow.
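The short-term memory step can be sketched as a per-user store of previously returned guidance that is combined with the new prompt; the structure and wording below are assumptions:

```python
# Hypothetical sketch of the short-term memory function: guidance previously
# returned to a user is stored per user and combined with the new prompt
# before the large model is called again.

HISTORY = {}  # user id -> list of guidance previously returned to that user

def remember(user_id, guidance):
    HISTORY.setdefault(user_id, []).append(guidance)

def prompt_with_history(user_id, new_prompt):
    past = HISTORY.get(user_id, [])
    if not past:
        return new_prompt
    return "Earlier guidance: " + " | ".join(past) + "\n" + new_prompt
```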
Based on the historical guide information output to the user before, new guide information is generated, so that the accuracy of the guide information can be further improved, and the workflow processing efficiency is improved.
According to another embodiment of the present invention, several mechanisms for triggering workflow suspension are preset, and each corresponds to a method of acquiring the suspension information of the workflow. Specifically, the execution subject of the embodiment of the invention generates a page component (an iframe) and a dialog box in the front-end page and displays a sub-business page (i.e., a workflow page) in the iframe. The user processes the workflow in the sub-business page; when the user encounters a problem and poses it in the dialog box, the workflow suspension mechanism is triggered. The front-end page sends a dialogue request to the execution subject via postMessage, and the execution subject determines the user's dialogue information from the received request and takes it as the suspension information of the workflow.
Alternatively, the execution subject of the embodiment of the invention monitors events occurring on the front-end page. In response to detecting a target event, it judges whether the target event belongs to a preset suspension event set; if so, it determines the event information of the target event and takes the information it contains as the suspension information of the workflow. Events in the suspension event set include: the page outputting exception information, the page being closed without successful processing, page load failure, and so on.
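The event-monitoring trigger can be sketched as a membership test against the preset suspension event set (the event-type names are illustrative assumptions):

```python
# Hypothetical sketch: a detected target event becomes a suspension trigger
# only if its type belongs to the preset suspension event set; its event
# information is then taken as the suspension information of the workflow.

SUSPENSION_EVENTS = {
    "page_output_exception",
    "page_closed_unprocessed",
    "page_load_failure",
}

def suspension_info_for(event):
    if event.get("type") in SUSPENSION_EVENTS:
        return event.get("info", {})
    return None  # not a suspension trigger
```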
The trigger occasions thus include the user posing a question or requesting a dialogue with the execution subject of the embodiment of the invention, and the detection that the workflow has exited or been interrupted abnormally due to certain user operations, which reduces the difficulty of use and improves the efficiency with which users process workflows.
FIG. 4 is a schematic diagram of a technical architecture for workflow restoration according to one referenceable embodiment of the invention. Illustratively, as shown in fig. 4, the execution subject 401 of the embodiment of the present invention includes modules such as a dialog-box front end, a dialogue service back end, a large model dialogue back end, business orchestration management, and dialogue orchestration management. Specifically, a business component (an iframe) is generated in the dialog-box front end, and sub-business front-end pages are displayed in the iframe; each sub-business front-end page corresponds to one link in the workflow (i.e., one workflow page). The user executes the business flow in a sub-business front-end page and then poses a question in the dialog-box front-end page. The sub-business front end synchronizes business information to the dialog-box front end via postMessage, and the dialog-box front end calls the corresponding sub-business front end according to the user's request so that the user returns to the previously suspended workflow. The sub-business front end and the sub-business back end exchange business information internally, and the sub-business back end synchronizes business information with the dialogue service back end, for example exception information arising during sub-business execution and workflow execution results. The dialog-box front end reports the front-end business state to the dialogue service back end. The dialogue service back end determines the suspension scene; the large model dialogue back end determines the guidance scene and generates guidance information from the suspension information, which is returned to the user through the dialogue service back end and the dialog-box front end.
Business orchestration management manages the suspension scenes, the business logic corresponding to each suspension scene, and so on. Dialogue orchestration management manages the guidance scene corresponding to each suspension scene, generates the large model dialogue chain, and so on. Business orchestration management and dialogue orchestration management dynamically configure the dialogue service back end and the large model dialogue back end.
FIG. 5 is a schematic diagram of the main flow of workflow restoration under multi-module collaboration in accordance with one referenceable embodiment of the invention. Illustratively, as shown in fig. 5, the execution subject of the embodiment of the present invention includes modules such as a page dialogue receiving module, a page event monitoring module, a business result receiving module, a prompt word generating module, a dialogue business processing module, a dialogue assembling module, and a page dialogue display module. The page dialogue receiving module receives dialogue requests sent by users and triggers the workflow suspension mechanism; the page event monitoring module monitors page events and triggers the workflow suspension flow; the business result receiving module receives the execution results of the business logic corresponding to the workflow and triggers the workflow suspension flow when exception information appears. The prompt word generating module acquires the suspension information, converts the suspension scene into a guidance scene, determines the prompt word template, and generates the prompt word; the dialogue business processing module generates guidance information from the prompt word, the historical dialogue between the execution subject and the user, the historical guidance information, the suspension information, and so on; the dialogue assembling module assembles the historical dialogue; and the page dialogue display module displays the guidance information, conducts the dialogue with the user, answers the user's questions, and helps the user return to the workflow.
FIG. 6 is a schematic diagram of the main flow of a method of workflow restoration according to one referenceable embodiment of the invention. As shown in fig. 6, the method for workflow restoration may include:
Step S601, in response to receiving a dialogue request, determining the user's dialogue information according to the dialogue request, and taking the dialogue information as the suspension information of the workflow;
step S602, determining a suspension scene of the workflow according to the suspension information;
Step S603, determining service logic corresponding to the suspension scene according to preset service arrangement information;
step S604, executing business logic, obtaining an execution result of the business logic, and determining a guide scene corresponding to the suspension scene according to the execution result;
step S605, updating the prompt word template corresponding to the guidance scene according to the suspension information, and taking the updated prompt word template as the prompt word;
step S606, inputting the prompt word into a preset large model, so that the large model generates the guide information of the workflow according to the prompt word;
Step S607, the guiding information is displayed to a preset front end page, so that the user processing the workflow returns to the flow node included in the workflow according to the guiding information, and the workflow is continuously processed.
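Steps S601 to S607 can be tied together in a minimal end-to-end sketch; every helper here is a stand-in for components the embodiment describes only abstractly (business logic, scene mappings, and the large model itself):

```python
# Hypothetical end-to-end sketch of steps S601 to S607. All helpers and the
# E1/E2 mapping are illustrative assumptions, not the patent's implementation.

def recover_workflow(dialogue_request, run_business_logic, call_large_model):
    suspension_info = dialogue_request["message"]         # S601: dialogue info as suspension info
    scene = "user question: " + suspension_info           # S602: determine suspension scene
    result = run_business_logic(scene)                    # S603-S604: run business logic
    guide_scene = "E1" if result == "yes" else "E2"       #            map result to guidance scene
    prompt = "[" + guide_scene + "] " + suspension_info   # S605: fill prompt word template
    return call_large_model(prompt)                       # S606-S607: guidance shown on the page
```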
The above-mentioned specific implementation of a method for recovering a workflow according to an embodiment of the present invention has been described in detail in the above method for recovering a workflow, and thus the description thereof will not be repeated here.
According to a second aspect of an embodiment of the present invention, an apparatus for workflow restoration is provided.
Fig. 7 is a schematic diagram of main modules of an apparatus for workflow restoration according to an embodiment of the present invention, and as shown in fig. 7, an apparatus 700 for workflow restoration mainly includes:
A first generating module 701, configured to respond to obtaining the suspension information of the workflow, and generate a corresponding prompt word according to the suspension information;
A second generating module 702, configured to input the prompt word into a preset large model, so that the large model generates guiding information of the workflow according to the prompt word;
And the guiding module 703 is configured to display the guiding information to a preset front-end page, so that a user who processes the workflow returns to a flow node included in the workflow according to the guiding information, and continue to process the workflow.
According to a referenceable embodiment of the present invention, generating a corresponding prompt word according to the suspension information includes: determining a suspension scene of the workflow according to the suspension information; determining a guidance scene corresponding to the suspension scene according to preset business arrangement information; and updating the prompt word template corresponding to the guidance scene according to the suspension information, and taking the updated prompt word template as the prompt word.
According to another embodiment of the present invention, determining the guidance scene corresponding to the suspension scene according to the preset business arrangement information includes: determining the business logic corresponding to the suspension scene according to the business arrangement information; executing the business logic and obtaining its execution result; and determining the guidance scene corresponding to the suspension scene according to the execution result.
According to a further referenceable embodiment of the invention, the guidance module 703 is further configured to: before the guidance information is displayed on the preset front-end page, determine the intention information associated with the guidance information and check whether it meets the preset guidance conditions; and, in the case that the intention information meets the preset guidance conditions, perform the following steps in a loop to determine the guidance information displayed on the front-end page:
Take the intention information as the intention information of the first round; determine the user purpose information corresponding to the intention information of the current round and generate a new prompt word of the current round from it; determine the large model of the current round, input the new prompt word of the current round into it, and acquire the new intention information and new guidance information it outputs; terminate the loop when the new intention information of the current round does not meet the guidance condition, and take the new guidance information of the current round as the guidance information displayed on the front-end page; and take the new intention information of the current round as the intention information of the next round when it does meet the guidance condition.
According to a further referenceable embodiment of the invention, the guidance module 703 is further configured to: in response to judging that the new prompt word of the current round matches the large model, take the large model as the large model of the current round; and, in response to judging that the new prompt word does not match the large model, screen a target large model matching the new prompt word of the current round from several preset large models and take the target large model as the large model of the current round.
According to a further referenceable embodiment of the invention, the guidance module 703 is further configured to: acquire historical guidance information associated with the user; and input the historical guidance information into the large model of the current round, so that the large model of the current round outputs the new guidance information and new intention information of the current round according to the new prompt word of the current round and the historical guidance information.
According to a referenceable embodiment of the present invention, obtaining the suspension information of the workflow includes: in response to receiving a dialogue request, determining the user's dialogue information according to the dialogue request and taking it as the suspension information of the workflow; or, in response to detecting the occurrence of a target event, judging whether the target event belongs to a preset suspension event set, and, if it does, determining the event information of the target event as the suspension information of the workflow.
In the embodiment of the present invention, the specific implementation of the apparatus for recovering a workflow is described in detail in the above method for recovering a workflow, and thus the description is not repeated here.
According to the technical solution provided by the embodiment of the present invention, the suspension information of the workflow is analyzed by the large model to obtain corresponding guide information, so that the user returns to the flow node of the workflow according to the guide information and continues to process the workflow; this improves the processing efficiency of the workflow, reduces the difficulty for the user of processing the workflow, and improves the user experience. Determining the guide scene corresponding to the suspension scene according to the business arrangement information, and generating the prompt word according to the guide scene, allows the prompt word to be generated flexibly and accurately. Executing the business logic corresponding to the suspension scene and determining the guide scene from the execution result of that business logic allows the guide scene to be determined flexibly and efficiently. Judging, according to the intention information associated with the guide information, whether new guide information and new intention information need to be generated again, and obtaining the guide information finally returned to the user through multiple rounds of large-model output, makes it possible to accurately determine the user's intention and purpose, to provide accurate guide information, and to help the user quickly return to the flow node of the workflow. When the prompt word does not match the large model, a target large model is screened out from a plurality of large models, which improves the flexibility of large-model output and meets different large-model analysis requirements. Generating new guide information based on the historical guide information previously output to the user further improves the accuracy of the guide information and the processing efficiency of the workflow. The method can be triggered when the user asks a question of, or requests a dialogue with, the execution body of the embodiment of the present invention, or when it is detected that the workflow has been abnormally exited or abnormally interrupted by certain user operations; this reduces the difficulty of use and improves the efficiency with which the user processes the workflow.
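The three-step method summarized above (prompt word from suspension information, large model, guide information shown to the user) can be sketched as follows. All names (`build_prompt`, `FakeLargeModel`, `recover`) are hypothetical, and the model class is only a stand-in for a real LLM call, not the patent's system:

```python
# Illustrative sketch of the three-step recovery method (assumed names; the
# "large model" here is a stand-in for a real LLM call).

def build_prompt(suspension_info: dict) -> str:
    # Step 1: generate a prompt word from the workflow's suspension information.
    return (f"The workflow '{suspension_info['workflow']}' stopped at node "
            f"'{suspension_info['node']}' because: {suspension_info['reason']}. "
            "Explain how the user can return to that node and continue.")

class FakeLargeModel:
    # Stand-in for the preset large model.
    def generate(self, prompt: str) -> str:
        # Step 2: the model turns the prompt word into guide information.
        return "Guidance: reopen the approval form and resubmit the missing field."

def recover(suspension_info: dict, model) -> str:
    prompt = build_prompt(suspension_info)
    guidance = model.generate(prompt)
    # Step 3: the guide information would be rendered on the front-end page;
    # here it is simply returned.
    return guidance

info = {"workflow": "expense-approval", "node": "manager-review",
        "reason": "session expired before submission"}
print(recover(info, FakeLargeModel()))
```

In a real deployment the stand-in model would be replaced by a call to the preset large model, and step 3 would push the guide information to the front-end page.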
According to a third aspect of the embodiments of the present invention, there is provided an electronic device, including: one or more processors; and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method provided by the first aspect of the embodiments of the present invention.
According to a fourth aspect of the embodiments of the present invention, there is provided a computer-readable medium having a computer program stored thereon which, when executed by a processor, implements the method provided by the first aspect of the embodiments of the present invention.
Fig. 8 illustrates an exemplary system architecture 800 to which the workflow recovery method or apparatus of the embodiments of the present invention may be applied.
As shown in fig. 8, a system architecture 800 may include terminal devices 801, 802, 803, a network 804, and a server 805. The network 804 serves as a medium for providing communication links between the terminal devices 801, 802, 803 and the server 805. The network 804 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
A user may interact with the server 805 through the network 804 using the terminal devices 801, 802, 803 to receive or send messages and the like. Various communication client applications may be installed on the terminal devices 801, 802, 803, such as a workflow recovery application, a workflow query application, a search application, an instant messaging tool, a mailbox client, social platform software, and the like (by way of example only).
The terminal devices 801, 802, 803 may be a variety of electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablets, laptop and desktop computers, and the like.
The server 805 may be a server providing various services, such as a background management server (by way of example only) that supports workflow recovery requests sent by users using the terminal devices 801, 802, 803. The background management server may, in response to obtaining the suspension information of a workflow, generate a corresponding prompt word according to the suspension information; input the prompt word into a preset large model, so that the large model generates the guide information of the workflow according to the prompt word; display the guide information on a preset front-end page, so that the user processing the workflow returns, according to the guide information, to a flow node included in the workflow and continues to process it; and feed back the workflow recovery result (by way of example only) to the terminal device.
It should be noted that, the method for recovering a workflow provided by the embodiment of the present invention is generally executed by the server 805, and accordingly, the device for recovering a workflow is generally disposed in the server 805. The method for recovering the workflow provided by the embodiment of the invention can also be executed by the terminal devices 801, 802 and 803, and correspondingly, the devices for recovering the workflow can be arranged in the terminal devices 801, 802 and 803.
It should be understood that the number of terminal devices, networks and servers in fig. 8 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to FIG. 9, there is illustrated a schematic diagram of a computer system 900 suitable for use in implementing an embodiment of the present invention. The terminal device shown in fig. 9 is only an example, and should not impose any limitation on the functions and the scope of use of the embodiment of the present invention.
As shown in fig. 9, the computer system 900 includes a Central Processing Unit (CPU) 901, which can execute various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 902 or a program loaded from a storage section 908 into a Random Access Memory (RAM) 903. In the RAM 903, various programs and data necessary for the operation of the system 900 are also stored. The CPU 901, ROM 902, and RAM 903 are connected to each other through a bus 904. An input/output (I/O) interface 905 is also connected to the bus 904.
The following components are connected to the I/O interface 905: an input section 906 including a keyboard, a mouse, and the like; an output portion 907 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage portion 908 including a hard disk or the like; and a communication section 909 including a network interface card such as a LAN card, a modem, or the like. The communication section 909 performs communication processing via a network such as the internet. The drive 910 is also connected to the I/O interface 905 as needed. A removable medium 911 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is installed as needed on the drive 910 so that a computer program read out therefrom is installed into the storage section 908 as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program containing program code for performing the methods shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from the network via the communication section 909 and/or installed from the removable medium 911. When the computer program is executed by the central processing unit (CPU) 901, the above-described functions defined in the system of the embodiment of the present invention are performed.
It should be noted that the computer-readable medium shown in the embodiments of the present invention may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the embodiments of the present invention, a computer-readable storage medium may be any tangible medium that contains or stores a program for use by or in connection with an instruction execution system, apparatus, or device. A computer-readable signal medium, in turn, may include a data signal propagated in baseband or as part of a carrier wave, with computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electromagnetic, optical, or any suitable combination of the foregoing. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium that can send, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer programs according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules involved in the embodiments of the present invention may be implemented in software or in hardware. The described modules may also be provided in a processor, for example, described as: a processor including a first generation module, a second generation module, and a guide module, where the names of these modules do not, in some cases, constitute a limitation on the modules themselves; for example, the first generation module may also be described as "a module that generates a prompt word according to the suspension information".
As another aspect, the embodiment of the present invention also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments, or may exist separately without being assembled into that apparatus. The computer-readable medium carries one or more programs which, when executed by a device, implement the following method: in response to obtaining the suspension information of a workflow, generating a corresponding prompt word according to the suspension information; inputting the prompt word into a preset large model, so that the large model generates the guide information of the workflow according to the prompt word; and displaying the guide information on a preset front-end page, so that the user processing the workflow returns, according to the guide information, to a flow node included in the workflow and continues to process the workflow.
The above detailed description should not be construed as limiting the scope of the embodiments of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives can occur depending upon design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the embodiments of the present invention should be included in the scope of the embodiments of the present invention.

Claims (11)

1. A method of workflow restoration, comprising:
in response to obtaining suspension information of the workflow, generating a corresponding prompt word according to the suspension information;
inputting the prompt word into a preset large model, and enabling the large model to generate the guide information of the workflow according to the prompt word;
and displaying the guide information to a preset front-end page, so that a user processing the workflow returns to a flow node included in the workflow according to the guide information, and continues to process the workflow.
2. The method of claim 1, wherein generating the corresponding prompt word according to the suspension information comprises:
Determining a suspension scene of the workflow according to the suspension information;
determining a guide scene corresponding to the suspension scene according to preset service arrangement information;
and updating the prompt word template corresponding to the guide scene according to the suspension information, and taking the updated prompt word template as the prompt word.
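A minimal sketch of the flow in this claim, under assumed names: the suspension scene indexes hypothetical service arrangement information to find a guide scene, whose prompt-word template is then filled with the suspension information.

```python
# Hypothetical service arrangement information: suspension scene -> guide scene.
SERVICE_ARRANGEMENT = {"timeout": "resume_guide", "validation_error": "fix_field_guide"}

# Prompt-word templates keyed by guide scene (illustrative wording).
TEMPLATES = {
    "resume_guide": "The workflow paused at {node}. Tell the user how to resume.",
    "fix_field_guide": "Field {field} failed validation at {node}. Tell the user how to fix it.",
}

def make_prompt(suspension_info: dict) -> str:
    scene = suspension_info["scene"]            # suspension scene of the workflow
    guide_scene = SERVICE_ARRANGEMENT[scene]    # guide scene from the arrangement info
    template = TEMPLATES[guide_scene]           # template for that guide scene
    return template.format(**suspension_info)   # update the template with suspension info

print(make_prompt({"scene": "timeout", "node": "manager-review"}))
```

Note that `str.format` ignores unused keyword arguments, so each template can pick only the fields of the suspension information it needs.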
3. The method according to claim 2, wherein determining the guidance scenario corresponding to the suspension scenario according to the preset service arrangement information includes:
Determining service logic corresponding to the suspension scene according to the service arrangement information;
Executing the service logic and obtaining an execution result of the service logic;
and determining a guide scene corresponding to the suspension scene according to the execution result.
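The execution-result-driven choice in this claim can be sketched as follows; the business logic, its results, and the guide-scene names are all illustrative.

```python
# Hypothetical business logic per suspension scene, and a mapping from the
# execution result to a guide scene.
BUSINESS_LOGIC = {
    "timeout": lambda info: "draft_saved" if info.get("draft") else "draft_lost",
}
RESULT_TO_GUIDE = {"draft_saved": "resume_from_draft", "draft_lost": "restart_node"}

def guide_scene_for(scene: str, info: dict) -> str:
    result = BUSINESS_LOGIC[scene](info)   # execute the scene's business logic
    return RESULT_TO_GUIDE[result]         # guide scene chosen by the execution result

print(guide_scene_for("timeout", {"draft": True}))
```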
4. The method of claim 1, wherein prior to presenting the guidance information to the pre-set front-end page, the method further comprises:
determining intention information associated with the guide information, and judging that the intention information does not accord with preset guide conditions;
And under the condition that the intention information accords with a preset guiding condition, circularly executing the following steps to determine guiding information displayed to the front-end page:
The intention information is used as intention information of a first round, user destination information corresponding to the intention information of the current round is determined, and a new prompt word of the current round is generated according to the user destination information; determining a large model of the current round according to the large model, inputting a new prompt word of the current round into the large model of the current round, and acquiring new intention information and new guide information output by the large model of the current round; when the new intention information of the current round does not accord with the guide condition, the circulation is terminated, and the new guide information of the current round is used as the guide information displayed to the front page; and taking the new intention information of the current round as the intention information of the next round under the condition that the new intention information of the current round meets the guide condition.
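The multi-round loop above can be sketched as follows, assuming a placeholder guide condition and a canned stand-in for the per-round large model:

```python
# Minimal sketch of the multi-round loop: while the model's intention output
# still meets the (hypothetical) guide condition, start another round.

def meets_guide_condition(intent: str) -> bool:
    # Stand-in for the preset guide condition.
    return intent == "unclear"

def run_rounds(intent: str, model) -> str:
    while True:
        # New prompt word of the current round, built from the intention information.
        prompt = f"User intention: {intent}. Generate guidance for resuming the workflow."
        new_intent, new_guidance = model(prompt)
        if not meets_guide_condition(new_intent):
            return new_guidance     # terminate: this guidance goes to the front-end page
        intent = new_intent         # otherwise the next round uses the new intention

def make_fake_model(responses):
    # Stand-in for the large model of each round: replays canned answers.
    it = iter(responses)
    return lambda prompt: next(it)

model = make_fake_model([("unclear", "Ask which form was open."),
                         ("resume", "Reopen the expense form at manager-review.")])
print(run_rounds("unclear", model))
```

The claim additionally requires that the loop only be entered when the first-round intention already meets the guide condition; that pre-check is left to the caller in this sketch.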
5. The method of claim 4, wherein determining a large model of a current round from the large model comprises:
responding to the judgment that the new prompt word of the current round is matched with the large model, and taking the large model as the large model of the current round;
responding to the judgment that the new prompting words of the current round are not matched with the large models, and screening target large models matched with the new prompting words of the current round from a plurality of preset large models; and taking the target large model as a large model of the current round.
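The fallback in this claim can be sketched as follows; the substring "match" rule and all model names are placeholders for whatever matching test a real system would use.

```python
# Keep the current large model when it matches the new prompt word, otherwise
# screen a target large model from a preset pool (all names hypothetical).

def pick_model(prompt: str, current: dict, pool: list) -> dict:
    def matches(model: dict) -> bool:
        return model["domain"] in prompt     # placeholder matching test
    if matches(current):
        return current                       # prompt matches the current model
    return next(m for m in pool if matches(m))   # target model from the pool

pool = [{"name": "code-llm", "domain": "code"},
        {"name": "flow-llm", "domain": "workflow"}]
current = {"name": "chat-llm", "domain": "chitchat"}
print(pick_model("workflow stuck at review node", current, pool)["name"])
```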
6. The method of claim 5, wherein, prior to acquiring the new intention information and new guide information output by the large model of the current round, the method further comprises:
acquiring history guidance information associated with the user;
And inputting the history guide information into a large model of the current round, so that the large model of the current round outputs new guide information and new intention information of the current round according to the new prompt word of the current round and the history guide information.
7. The method of claim 1, wherein obtaining the workflow suspension information comprises:
In response to receiving a dialogue request, determining dialogue information of the user according to the dialogue request, and taking the dialogue information as the stopping information of the workflow;
Or in response to detecting that a target event occurs, judging whether the target event belongs to a preset suspension event set, and determining event information of the target event under the condition that the target event belongs to the suspension event set, wherein the event information is used as suspension information of the workflow.
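The two sources of suspension information in this claim can be sketched as follows; the event names and field names are illustrative only.

```python
# Suspension information comes either from a user's dialogue request or from a
# detected target event belonging to a preset suspension-event set.

SUSPENSION_EVENTS = {"session_timeout", "abnormal_exit"}

def suspension_info_from_dialogue(request: dict) -> dict:
    # The dialogue information itself serves as the suspension information.
    return {"source": "dialogue", "text": request["message"]}

def suspension_info_from_event(event: dict):
    if event["type"] in SUSPENSION_EVENTS:   # only events in the preset set count
        return {"source": "event", "detail": event}
    return None                              # not a suspension event: ignore it

print(suspension_info_from_event({"type": "session_timeout", "node": "review"}))
```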
8. An apparatus for workflow restoration, comprising:
The first generation module is used for responding to the obtained suspension information of the workflow and generating corresponding prompt words according to the suspension information;
The second generation module is used for inputting the prompt word into a preset large model, so that the large model generates the guide information of the workflow according to the prompt word;
and the guiding module is used for displaying the guiding information to a preset front-end page, so that a user processing the workflow returns to a flow node included in the workflow according to the guiding information and continues to process the workflow.
9. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A computer readable medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the method according to any of claims 1-7.
11. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any of claims 1-7.
CN202410741119.8A 2024-06-07 2024-06-07 Workflow recovery method and device Pending CN118484337A (en)

Publications (1)

Publication Number Publication Date
CN118484337A 2024-08-13

Family

ID=92197243



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination