CN116450789A - Flow execution method and device - Google Patents

Flow execution method and device

Info

Publication number
CN116450789A
Authority
CN
China
Prior art keywords
node
flow
path
judging
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310316325.XA
Other languages
Chinese (zh)
Inventor
杨双涛 (Yang Shuangtao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Nuodi Beijing Intelligent Technology Co ltd
Original Assignee
Lenovo Nuodi Beijing Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Nuodi Beijing Intelligent Technology Co ltd filed Critical Lenovo Nuodi Beijing Intelligent Technology Co ltd
Priority to CN202310316325.XA priority Critical patent/CN116450789A/en
Publication of CN116450789A publication Critical patent/CN116450789A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33 Querying
    • G06F16/332 Query formulation
    • G06F16/3329 Natural language query formulation or dialogue systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/23 Updating
    • G06F16/2379 Updates performed during online database operations; commit processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24 Querying
    • G06F16/245 Query processing
    • G06F16/2458 Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
    • G06F16/2474 Sequence data queries, e.g. querying versioned data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/31 Indexing; Data structures therefor; Storage structures
    • G06F16/316 Indexing structures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/01 Customer relationship services
    • G06Q30/015 Providing customer assistance, e.g. assisting a customer within a business location or via helpdesk

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Business, Economics & Management (AREA)
  • Computational Linguistics (AREA)
  • Fuzzy Systems (AREA)
  • Marketing (AREA)
  • Human Computer Interaction (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Finance (AREA)
  • Probability & Statistics with Applications (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The present disclosure provides a flow execution method and apparatus. The method includes: in response to the flow reaching a judging node, acquiring response data corresponding to the judging node and determining at least one optional path corresponding to the judging node; determining a target optional path among the at least one optional path based on the response conditions corresponding to the at least one optional path, the response data, and related historical data information; and continuing to execute the flow based on the execution node corresponding to the target optional path.

Description

Flow execution method and device
Technical Field
The present disclosure relates to the technical field of speech recognition, and in particular to a flow execution method and apparatus.
Background
Intelligent customer service is widely used across industries, but current systems typically require a predefined slot system for task-oriented dialogues (which need multiple rounds of interaction), with the dialogue driven by that slot system. Defining slots is cumbersome: a suitable slot system can only be designed with deep knowledge of the business, and whenever a new intention or a new flow is added, the slot system must be updated in step, which increases the update cost of the customer service system. Moreover, a slot system cannot be directly migrated to other domains and services.
Disclosure of Invention
The disclosure provides a flow execution method and a flow execution device.
According to a first aspect of the present disclosure, there is provided a flow execution method, including:
in response to the flow reaching a judging node, acquiring response data corresponding to the judging node, and determining at least one optional path corresponding to the judging node;
determining a target optional path among the at least one optional path based on the response conditions corresponding to the at least one optional path, the response data, and related historical data information; and
continuing to execute the flow based on the execution node corresponding to the target optional path.
According to a second aspect of the present disclosure, there is provided a flow execution apparatus, including:
an acquisition unit configured to, in response to the flow reaching a judging node, acquire response data corresponding to the judging node and determine at least one optional path corresponding to the judging node;
a determination unit configured to determine a target optional path among the at least one optional path based on the response conditions corresponding to the at least one optional path, the response data, and related historical data information; and
an execution unit configured to continue executing the flow based on the execution node corresponding to the target optional path.
In the flow execution method of the present disclosure, in response to the flow reaching a judging node, response data corresponding to the judging node is acquired and at least one optional path corresponding to the judging node is determined; a target optional path is determined among the at least one optional path based on the response conditions corresponding to the at least one optional path, the response data, and related historical data information; and the flow continues from the execution node corresponding to the target optional path. In this way, the nodes of the flow are divided into judging nodes, execution nodes, and optional paths, and this flow-modeling method, which is closer to natural language, reduces the communication cost and operation cost of flow modeling. During execution, the target optional path is determined from the collected response data, the response conditions of the optional paths, and the historical data information, so that the flow continues at the corresponding execution node. The traditional slot system is thereby abandoned, which facilitates the construction, maintenance, and migration of business flows at a later stage.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The above, as well as additional purposes, features, and advantages of exemplary embodiments of the present disclosure will become readily apparent from the following detailed description when read in conjunction with the accompanying drawings. Several embodiments of the present disclosure are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which:
in the drawings, the same or corresponding reference numerals indicate the same or corresponding parts.
FIG. 1 illustrates an alternative flowchart of a flow execution method provided by an embodiment of the present disclosure;
FIG. 2 illustrates another alternative flowchart of a flow execution method provided by an embodiment of the present disclosure;
FIG. 3 illustrates an alternative flowchart of training the flow processing model provided by an embodiment of the present disclosure;
FIG. 4 illustrates yet another alternative flowchart of a flow execution method provided by an embodiment of the present disclosure;
FIG. 5 illustrates an alternative schematic diagram of the graph structure provided by embodiments of the present disclosure;
FIG. 6 illustrates another alternative schematic diagram of training the flow processing model provided by an embodiment of the present disclosure;
FIG. 7 illustrates a schematic diagram of determining a target optional path based on the flow processing model provided by an embodiment of the present disclosure;
FIG. 8 illustrates another schematic diagram of determining a target optional path based on the flow processing model provided by an embodiment of the present disclosure;
FIG. 9 illustrates yet another schematic diagram of determining a target optional path based on the flow processing model provided by an embodiment of the present disclosure;
FIG. 10 illustrates a schematic diagram of an alternative composition structure of a flow execution apparatus provided by an embodiment of the present disclosure;
FIG. 11 illustrates a schematic diagram of the composition structure of an electronic device according to an embodiment of the present disclosure.
Detailed Description
To make the objects, features, and advantages of the present disclosure more comprehensible, the technical solutions in the embodiments of the present disclosure are described below with reference to the accompanying drawings. Clearly, the described embodiments are only some, not all, of the embodiments of the present disclosure. Based on the embodiments of this disclosure, all other embodiments obtained by a person skilled in the art without inventive effort fall within the scope of protection of this disclosure.
At present, intelligent customer service is widely used across industries, but current systems typically require a predefined slot system for task-oriented dialogues (which need multiple rounds of interaction), with the dialogue driven by that slot system; a slot is a key piece of information the system must collect from the user. This approach has the following problems in practice:
(1) Defining slots is cumbersome, and a suitable slot system can only be designed with deep knowledge of the business;
(2) For flow-based services, or service problems that require multiple rounds of question and answer, the intelligent customer service system usually defines the flow in terms of slots. Whenever a new intention or a new flow is added, the slot system must be updated in step, and the slot-extraction module usually must be updated as well, which increases the update cost of the customer service system;
(3) A slot-extraction module built for a specific slot architecture cannot be directly migrated to other domains and services.
To address these drawbacks of the related art, the present disclosure provides a flow execution method that solves at least some of the above technical problems.
Fig. 1 shows an alternative flowchart of the flow execution method provided by an embodiment of the present disclosure, which is described below step by step.
Step S101, in response to the flow reaching a judging node, acquiring response data corresponding to the judging node, and determining at least one optional path corresponding to the judging node.
In some embodiments, an execution carrier of the flow execution method (hereinafter, the carrier) determines, based on the service type corresponding to the flow, the graph structure corresponding to that service type, together with the execution nodes, judging nodes, and optional paths contained in the graph structure.
In some embodiments, a judging node is a node used to obtain object information. When a judging node is defined, the question information (or natural-language question rule) corresponding to it must be specified, and each optional execution path leaving the judging node is an optional path of that node, where different optional paths correspond to different response conditions. For example, if the judging node is used to obtain an object's household registration information, it may correspond to three optional paths: local household, non-local household, and uncertain. An execution node is a node that executes the intention corresponding to the object information and corresponds to an optional path, i.e., one execution node per optional path. In some alternative embodiments, nodes in the graph structure that output fixed-format information (e.g., welcome messages, pushed content, API calls) may also be regarded as execution nodes.
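As a minimal illustration (not part of the patent; the class and field names are assumed), the division into judging nodes, optional paths, and execution nodes might be represented as:

```python
from dataclasses import dataclass, field

@dataclass
class OptionalPath:
    """An optional execution path leaving a judging node."""
    condition: str       # natural-language response condition
    execution_node: str  # name of the execution node this path points to

@dataclass
class JudgingNode:
    """A node that collects object information via a natural-language question."""
    question: str
    paths: list = field(default_factory=list)

# Hypothetical instance mirroring the household-registration example above
node = JudgingNode(
    question="Are you a local household?",
    paths=[
        OptionalPath("is a local household", "handle_local"),
        OptionalPath("is not a local household", "handle_non_local"),
        OptionalPath("uncertain", "ask_follow_up"),
    ],
)
```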
The object information may include object intention information, i.e., all or part of the intention information of the service that the object wants to transact.
In some embodiments, the response data includes the object's answer to the question information of the judging node, or key information extracted from that answer.
Step S102, determining a target optional path among the at least one optional path based on the response conditions corresponding to the at least one optional path, the response data, and related historical data information.
In some embodiments, the carrier may analyze the response data and the historical data information using natural language understanding techniques, determine which response condition the response data satisfies, and determine the optional path corresponding to that response condition as the target optional path.
In a specific implementation, the carrier uses natural language understanding, assisted by the historical data information, to judge which response condition the response data satisfies, and determines the corresponding target optional path and its execution node on that basis.
In some embodiments, the carrier may build up the historical data information from the interaction information obtained while executing judging nodes and execution nodes; the historical data information is used to select the target optional path from the at least one optional path of a judging node. Optionally, during flow execution, interaction information generated earlier counts as historical data relative to the current node, and after the target optional path of a judging node has been determined and its execution node executed, the interaction information generated in that step is appended to the historical data information.
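The rolling-history bookkeeping described above might look like this (a sketch only; the turn format and function name are assumed, not from the patent):

```python
def execute_step(history, question, answer):
    """Append the interaction produced at a judging node (its question and the
    object's answer) to the rolling historical data information, so later
    judging nodes can consult it."""
    history.extend([("system", question), ("object", answer)])
    return history

# After a step completes, its turns become history for later judging nodes.
history = execute_step([], "Are you a local household?", "No, I am from another city.")
```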
In some embodiments, the response data and the historical data information may be text data or voice data; this disclosure does not specifically limit their form.
Step S103, continuing to execute the flow based on the execution node corresponding to the target optional path.
In some embodiments, the carrier continues executing the flow from the execution node corresponding to the target optional path until the flow has been executed in full.
Thus, in the flow execution method provided by this embodiment of the disclosure, the nodes of the flow are divided into judging nodes, execution nodes, and optional paths, and this flow-modeling method, which is closer to natural language, reduces the communication cost and operation cost of flow modeling. During execution, the target optional path is determined from the collected response data, the response conditions of the optional paths, and the historical data information, so that the flow continues at the corresponding execution node; the traditional slot system is abandoned, which facilitates the construction, maintenance, and migration of business flows at a later stage.
Fig. 2 shows another alternative flowchart of the flow execution method provided by an embodiment of the present disclosure, which is described below step by step.
Step S201, in response to the flow reaching a judging node, acquiring response data corresponding to the judging node, and determining at least one optional path corresponding to the judging node.
In some embodiments, in response to the flow reaching a judging node, the carrier acquires the interaction information currently associated with the judging node and derives the response data from that interaction information; alternatively, the carrier issues the question information corresponding to the judging node, obtains the corresponding response information, and derives the response data from that response information.
The interaction information may be historical data information, which may already contain the response data: if the object provided the response data while interacting with earlier nodes, then when the flow reaches the judging node, the response data can be determined from the existing interaction information without asking the question again. Conversely, if the response data was not provided during earlier interactions, the carrier asks the question corresponding to the judging node, obtains the response information, and derives the response data from it.
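The reuse of history can be sketched as a lookup that falls back to asking only when nothing in the history matches. Cue-word matching here is a crude stand-in for the natural language understanding the patent describes, and all names are illustrative:

```python
def response_from_history(history, cue_words):
    """Return the most recent historical turn that already answers the judging
    node's question (matched here by assumed cue words); None means the
    question still has to be asked."""
    for turn in reversed(history):
        if any(cue in turn for cue in cue_words):
            return turn
    return None

history = ["I want to reissue my identity card", "I am not a local household"]
# The household question need not be asked again:
answered = response_from_history(history, ["local household"])
```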
Step S202, using the response conditions corresponding to the at least one optional path, the response data, and related historical data information as input to a flow processing model, and determining the output of the flow processing model as the target optional path.
In some embodiments, the carrier may determine the target optional path based on a flow processing model.
In a specific implementation, the carrier feeds the response conditions corresponding to the at least one optional path, the response data, and the related historical data information into the flow processing model and takes its output as the target optional path. Optionally, the output of the flow processing model may be the target optional path directly, or a score for each optional path, in which case the carrier selects the highest-scoring optional path as the target optional path.
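The highest-score selection rule can be sketched as follows (function name and example scores are illustrative, not from the patent):

```python
def select_target_path(path_scores):
    """Given the flow processing model's score for each optional path,
    return the highest-scoring optional path as the target optional path."""
    return max(path_scores, key=path_scores.get)

target = select_target_path(
    {"is a local household": 0.12, "is not a local household": 0.81, "uncertain": 0.07}
)
```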
FIG. 3 shows an alternative flowchart of training the flow processing model provided by an embodiment of the present disclosure.
In some embodiments, before step S201 the method may further include training the flow processing model, which may include the following steps:
in step S2021, training data is acquired.
In some embodiments, the carrier determines, based on the service type corresponding to the flow, the graph structure corresponding to that service type, together with the execution nodes, judging nodes, and optional paths contained in the graph structure; training data is then derived from a natural language dialogue dataset based on the execution nodes, judging nodes, and optional paths. The training data includes annotated optional paths.
In a specific implementation, the carrier takes, from the natural language dialogue dataset, the question information of a judging node together with its associated answer information as the first training sub-data of that judging node; takes the response conditions of the different optional paths of the judging node, together with the response data associated with those conditions, as the second training sub-data of the optional paths; and takes the target optional path corresponding to the response data in the dataset as the annotated optional path. The first training sub-data, the second training sub-data, and the annotated optional path together constitute the training data.
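The assembly of one training example described above can be sketched as follows (a minimal sketch; the dictionary keys and helper name are illustrative, not from the patent):

```python
def build_training_example(question, answer, path_conditions, labeled_path):
    """Assemble one training example: first sub-data (question + associated
    answer), second sub-data (the response conditions of the candidate optional
    paths), and the annotated target optional path."""
    if labeled_path not in path_conditions:
        raise ValueError("annotated path must be one of the candidates")
    return {
        "first_sub_data": (question, answer),
        "second_sub_data": list(path_conditions),
        "label": labeled_path,
    }

example = build_training_example(
    "Are you a local household?",
    "No, my household registration is elsewhere.",
    ["is a local household", "is not a local household", "uncertain"],
    "is not a local household",
)
```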
Step S2022, training the flow processing model based on the training data.
In some embodiments, the carrier inputs the training data into the flow processing model and takes its output as a score for each optional path of the judging node in the natural language dialogue dataset; the parameters of the flow processing model are then adjusted based on these scores and the annotated optional path.
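One plausible way to relate the per-path scores to the annotated optional path during parameter adjustment is a cross-entropy loss over softmax-normalized scores; the patent does not specify the loss, so the following is an assumed sketch:

```python
import math

def path_cross_entropy(scores, label_index):
    """Cross-entropy between softmax-normalized path scores and the annotated
    optional path; this is the quantity driven down when the model's
    parameters are adjusted (assumed loss, not specified by the patent)."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return -math.log(exps[label_index] / total)

# A correct, confident score vector incurs lower loss than a wrong one:
good = path_cross_entropy([0.1, 3.0, 0.1], 1)
bad = path_cross_entropy([3.0, 0.1, 0.1], 1)
```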
Step S203, continuing to execute the flow based on the execution node corresponding to the target optional path.
In some embodiments, the carrier continues executing the flow from the execution node corresponding to the target optional path until the flow has been executed in full.
In some embodiments, after step S103 and step S203, the method may further include:
step S204, updating the graph structure.
In some embodiments, in response to a judging node being added to the graph structure corresponding to the flow, the question information corresponding to that judging node is added; or, in response to a judging node being deleted from the graph structure, the question information corresponding to that judging node is deleted.
When a judging node is added to or deleted from the graph structure, the optional paths of that judging node and the execution nodes corresponding to those optional paths can be added or deleted accordingly.
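The add/delete operations on the graph structure can be sketched as follows (a dictionary representation and all names are assumed for illustration):

```python
def add_judging_node(graph, name, question, paths):
    """Add a judging node along with its question information and its optional
    paths (a mapping from response condition to execution node)."""
    graph[name] = {"question": question, "paths": dict(paths)}

def delete_judging_node(graph, name):
    """Delete a judging node together with its question and optional paths."""
    graph.pop(name, None)

graph = {}
add_judging_node(
    graph,
    "household_check",
    "Are you a local household?",
    {"is a local household": "handle_local", "uncertain": "ask_follow_up"},
)
```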
In some alternative embodiments, after adding a judging node, the carrier may retrain the flow processing model; the training process is the same as in steps S2021 and S2022 and is not repeated here.
Thus, in the flow processing method provided by this embodiment of the disclosure: first, the business flow is modeled as a graph structure, without a predefined slot system; the nodes of the business flow are divided into execution nodes, judging nodes, and optional paths, and this modeling method, which is closer to natural language, greatly reduces the communication cost and operation cost of business-flow modeling. Second, a flow processing model replaces the slot-extraction module: the model analyzes the business flow, and when execution reaches a judging node it obtains the node's question information and the response conditions of the optional paths leaving that node, then, combined with the historical data information, selects the optional path whose response condition is satisfied as the target optional path and continues to the next step. Consequently, when an intention or a flow is added online, only the business flow itself needs to be added or updated; the flow processing model does not need to be updated in step and can still support normal execution of the newly added flow, which effectively reduces the operation cost. Finally, once trained, the flow processing model no longer depends on a specific domain definition and can be migrated to other domains.
In some embodiments, the flow execution method may be applied in many scenarios, such as civil-affairs scenarios (e.g., reissuing a lost identity card), after-sales scenarios (e.g., repairing furniture or home appliances), application scenarios (e.g., applying for bank cards or membership cards), and consultation scenarios; this disclosure does not specifically limit the scenario. The embodiments of the present disclosure are further described below using the scenario of reissuing an identity card.
Fig. 4 shows yet another alternative flowchart of the flow execution method provided by an embodiment of the present disclosure, described below step by step.
Step S301, determining a graph structure corresponding to the flow.
In some embodiments, the graph structure is determined from the service flow (i.e., the flow), and the nodes of the service flow are divided into execution nodes, judging nodes, and optional paths. A judging node represents a node that requires the object to provide information, and the natural-language question (i.e., the question information) used when the judging node asks the object for that information must be defined. An optional path represents an optional execution path leaving a judging node, and the natural-language condition (i.e., the response condition) that must be satisfied for the optional path to be taken must be defined.
Fig. 5 shows an alternative schematic of the graph structure provided by embodiments of the present disclosure.
As shown in fig. 5, taking "reissuing an identity card" as an example: according to the service requirement, it must first be determined whether the object holds a local household registration, i.e., the object must provide response information, so a judging node is added to the graph structure and its natural-language question (i.e., question information) is set: "Are you a local household?". Three optional paths leave this judging node, with the corresponding natural-language conditions "is a local household", "is not a local household", and "uncertain"; the three optional paths point to different execution nodes.
Step S302, training a flow processing model.
In some embodiments, the carrier takes, from a natural language dialogue dataset, the question information of a judging node together with its associated answer information as the first training sub-data of that judging node; takes the response conditions of the different optional paths of the judging node, together with the response data associated with those conditions, as the second training sub-data of the optional paths; and takes the target optional path corresponding to the response data in the dataset as the annotated optional path. The first training sub-data, the second training sub-data, and the annotated optional path together constitute the training data.
FIG. 6 illustrates another alternative schematic diagram of training the flow processing model provided by an embodiment of the present disclosure.
In some embodiments, as shown in fig. 6, the carrier inputs the training data into the flow processing model and takes its output as a score for each optional path of the judging node in the natural language dialogue dataset; the parameters of the flow processing model are then adjusted based on these scores and the annotated optional path.
Step S303, in response to the flow reaching a judging node, determining the target optional path corresponding to the judging node.
FIG. 7 illustrates a schematic diagram of determining a target alternative path based on a flow processing model provided by an embodiment of the present disclosure.
In some embodiments, the carrier reads the graph structure and executes the execution nodes in order until a judging node is reached. After the natural-language question of the judging node has been asked, the carrier obtains the historical data information and all optional paths leaving the judging node; as shown in fig. 7, the input of the flow processing model is constructed from the natural-language conditions of the different optional paths, and the target optional path of the judging node is determined from the model's output.
In some embodiments, the flow processing model selects, from the three optional paths, the target optional path with the highest probability according to the dialogue content (the response information and the historical data information) and the corresponding conditions, thereby determining the execution flow of the dialogue.
Fig. 8 illustrates another schematic diagram of determining a target alternative path based on a flow process model provided by an embodiment of the present disclosure, and fig. 9 illustrates yet another schematic diagram of determining a target alternative path based on a flow process model provided by an embodiment of the present disclosure.
As shown in fig. 8, in the dialogue, "What can I help you with?" is a judging node for obtaining object information; the response information includes "I need to replace my identification card", and "I am from out of town" serves as historical data information. Correspondingly, in the graph structure, the carrier confirms the object's intention based on the historical data information, and the flow processing model then confirms, based on the response information, that the target alternative path is the one corresponding to "not a local household registration".
As shown in fig. 9, "What can I help you with?" and "Are you a local household registration?" are judging nodes for obtaining object information; the response information includes "I need to replace my identification card" and "How many days does replacing an identification card take?". In the graph structure, the carrier confirms the object's intention based on the response information, and the flow processing model then confirms, based on the response information, that the target alternative path is the one corresponding to "unable to confirm".
Step S304: continue executing the flow based on the execution node corresponding to the target alternative path.
In some embodiments, the carrier continues executing the flow based on the execution node corresponding to the target alternative path until the flow is fully executed.
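The execution loop described above — run execution nodes in sequence and, at each judging node, ask the flow processing model which alternative path to take — can be sketched as a small interpreter over the graph structure. The node and edge field names (`type`, `action`, `next`, `paths`) are assumptions for illustration, and `choose_path` stands in for the flow processing model.

```python
def run_flow(graph, start, choose_path, history):
    # Walk the graph from `start` until there is no next node.
    node_id = start
    while node_id is not None:
        node = graph[node_id]
        if node["type"] == "execute":
            # Execution node: perform its operation, then advance.
            history.append(node["action"])
            node_id = node.get("next")
        else:
            # Judging node: let the flow processing model pick among the
            # alternative paths leading out of this node.
            target = choose_path(node, node["paths"], history)
            node_id = target["next"]
    return history
```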
In some embodiments, after step S304, the method may further include:
Step S305: update the graph structure.
In some embodiments, in response to a judging node being newly added in the graph structure corresponding to the flow, the question information corresponding to the judging node is added; or, in response to a judging node being deleted from the graph structure, the question information corresponding to that judging node is deleted.
When a judging node is added to or deleted from the graph structure, the alternative paths corresponding to the judging node and the execution nodes corresponding to those alternative paths can be added or deleted accordingly.
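A minimal sketch of such an update, assuming a mapping-based representation of the graph structure (the class and field names are illustrative): adding or deleting a judging node keeps its question information and alternative paths in sync.

```python
class FlowGraph:
    def __init__(self):
        self.nodes = {}      # node_id -> node definition
        self.questions = {}  # node_id -> question information

    def add_judging_node(self, node_id, question, paths):
        # Adding a judging node also registers its question information
        # and its outgoing alternative paths.
        self.nodes[node_id] = {"type": "judge", "paths": paths}
        self.questions[node_id] = question

    def delete_judging_node(self, node_id):
        # Deleting a judging node removes its question information and
        # its alternative paths together.
        self.nodes.pop(node_id, None)
        self.questions.pop(node_id, None)
```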
Thus, with the flow processing method provided by the embodiments of the present disclosure: first, the business flow is modeled on a graph structure, with no need for a predefined slot system; the nodes of the business flow are divided into execution nodes, judging nodes, and alternative paths, and this modeling approach, which stays closer to natural language, can greatly reduce the communication and operation costs of business flow modeling. Second, a flow processing model replaces the slot-extraction module: the model analyzes the business flow, and when execution reaches a judging node, the node's question information and the response conditions of the alternative paths leading out of the current judging node are obtained; combined with the historical data information, the alternative path whose response condition is satisfied is selected as the target alternative path and execution continues. When a new intention or flow is added online, only the business flow needs to be added or updated; the flow processing model does not need to be updated synchronously, the newly added flow can still execute normally online, and operation costs are effectively reduced. Finally, once the flow processing model is trained, it no longer depends on a specific domain definition and can be migrated to other domain scenarios.
Fig. 10 is a schematic diagram showing an alternative configuration of a flow execution device provided in an embodiment of the present disclosure, which will be described section by section.
In some embodiments, the flow execution apparatus 700 includes: an acquisition unit 701, a determination unit 702, and an execution unit 703.
The obtaining unit 701 is configured to, in response to the flow executing to a judging node, obtain the response data corresponding to the judging node and determine at least one alternative path corresponding to the judging node;
the determining unit 702 is configured to determine a target alternative path among the at least one alternative path based on the response conditions corresponding to the at least one alternative path, the response data, and the related historical data information;
the executing unit 703 is configured to continue executing the flow based on the execution node corresponding to the target alternative path.
In some embodiments, the determining unit 702 is implemented based on a flow processing model: the response conditions corresponding to the at least one alternative path, the response data, and the related historical data information are taken as the input of the flow processing model, and the output of the flow processing model is determined to be the target alternative path;
alternatively, the determining unit 702 analyzes the response data and the historical data information based on natural language understanding techniques and judges which response condition the response data satisfies, so as to determine the alternative path corresponding to that response condition as the target alternative path.
The obtaining unit 701 is configured to implement at least one of the following:
in response to the flow executing to a judging node, acquiring the current interaction information corresponding to the judging node, and obtaining the response data corresponding to the judging node based on that interaction information;
in response to the flow executing to a judging node, acquiring, based on the question information corresponding to the judging node, the response information corresponding to the question information, and obtaining the response data corresponding to the judging node based on the response information.
The obtaining unit 701 is further configured to confirm the historical data information based on the interaction information acquired while the judging nodes and execution nodes are executed; the historical data information is used to confirm the target alternative path among the at least one alternative path corresponding to the judging node.
In some embodiments, the flow execution apparatus 700 may further include an updating unit 704.
The updating unit 704 is configured to add the question information corresponding to a judging node in response to the judging node being newly added in the graph structure corresponding to the flow; or to delete the question information corresponding to a judging node in response to the judging node being deleted from the graph structure.
The determining unit 702 is further configured to, before determining a target alternative path among the at least one alternative path, confirm, based on the service type corresponding to the flow, the graph structure corresponding to that service type and the execution nodes, judging nodes, and alternative paths included in the graph structure; confirm training data from a natural language dialogue dataset based on the execution nodes, judging nodes, and alternative paths; and train the flow processing model based on the training data.
In some embodiments, a node for obtaining object information is a judging node, wherein the object information is used to determine the response data; the different response conditions corresponding to a judging node are its alternative paths; a node for executing the intention corresponding to the response data is an execution node, and each execution node corresponds to an alternative path.
The determining unit 702 is specifically configured to confirm the question information corresponding to the judging node in the natural language dialogue dataset, together with the answer information associated with the question information, as the first training sub-data corresponding to the judging node;
confirm, from the natural language dialogue dataset, the response conditions of the different alternative paths corresponding to the judging node and the response data associated with those response conditions as the second training sub-data corresponding to the alternative paths;
confirm the target alternative path corresponding to the response data in the natural language dialogue dataset as the labeled alternative path;
the first training sub-data, the second training sub-data, and the labeled alternative path constitute the training data.
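Under these definitions, assembling one training sample for a judging node might look like the following sketch. The dictionary field names are assumptions; the patent does not prescribe a storage format.

```python
def build_training_sample(question, answer, path_conditions, target_index):
    # First training sub-data: the judging node's question information
    # paired with the answer information associated with it.
    first_sub = {"question": question, "answer": answer}
    # Second training sub-data: each alternative path's response condition
    # paired with the response data associated with that condition.
    second_sub = [{"condition": c, "response": answer} for c in path_conditions]
    # Labeled alternative path: index of the target path in the dialogue.
    return {"first": first_sub, "second": second_sub, "label": target_index}
```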
The determining unit 702 is specifically configured to input the training data into the flow processing model and confirm the output of the flow processing model as the scores of the respective alternative paths corresponding to the judging node in the natural language dialogue dataset;
and to adjust the parameters of the flow processing model based on the scores of the alternative paths corresponding to the judging node and the labeled alternative path.
According to embodiments of the present disclosure, the present disclosure also provides an electronic device and a readable storage medium.
Fig. 11 illustrates a schematic block diagram of an example electronic device 800 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 11, the electronic device 800 includes a computing unit 801 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 802 or a computer program loaded from a storage unit 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data required for the operation of the electronic device 800 can also be stored. The computing unit 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
Various components in electronic device 800 are connected to I/O interface 805, including: an input unit 806 such as a keyboard, mouse, etc.; an output unit 807 such as various types of displays, speakers, and the like; a storage unit 808, such as a magnetic disk, optical disk, etc.; and a communication unit 809, such as a network card, modem, wireless communication transceiver, or the like. The communication unit 809 allows the electronic device 800 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The computing unit 801 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of computing unit 801 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 801 performs the respective methods and processes described above, for example, a flow execution method. For example, in some embodiments, the flow execution method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 808. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 800 via the ROM 802 and/or the communication unit 809. When a computer program is loaded into RAM 803 and executed by computing unit 801, one or more steps of the flow execution method described above may be performed. Alternatively, in other embodiments, the computing unit 801 may be configured to perform the flow execution method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps recited in the present disclosure may be performed in parallel or sequentially or in a different order, provided that the desired results of the technical solutions of the present disclosure are achieved, and are not limited herein.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present disclosure, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
The foregoing is merely specific embodiments of the present disclosure, but the protection scope of the present disclosure is not limited thereto; any change or substitution that a person skilled in the art can easily conceive of within the technical scope disclosed herein shall fall within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (10)

1. A flow execution method, the method comprising:
in response to the flow executing to a judging node, acquiring response data corresponding to the judging node, and determining at least one alternative path corresponding to the judging node;
determining a target alternative path among the at least one alternative path based on response conditions corresponding to the at least one alternative path, the response data, and related historical data information;
and continuing to execute the flow based on the execution node corresponding to the target alternative path.
2. The method of claim 1, wherein the determining a target alternative path among the at least one alternative path based on response conditions corresponding to the at least one alternative path, the response data, and related historical data information comprises at least one of:
executing the method based on a flow processing model, taking the response conditions corresponding to the at least one alternative path, the response data, and the related historical data information as the input of the flow processing model, and determining the output of the flow processing model to be the target alternative path;
analyzing the response data and the historical data information based on natural language understanding techniques, judging which response condition the response data satisfies, and determining the alternative path corresponding to that response condition as the target alternative path.
3. The method of claim 1, wherein the acquiring, in response to the flow executing to a judging node, the response data corresponding to the judging node comprises at least one of the following:
in response to the flow executing to a judging node, acquiring the current interaction information corresponding to the judging node, and obtaining the response data corresponding to the judging node based on that interaction information;
in response to the flow executing to a judging node, acquiring, based on the question information corresponding to the judging node, the response information corresponding to the question information, and obtaining the response data corresponding to the judging node based on the response information.
4. The method of claim 1, the method further comprising:
confirming historical data information based on the interaction information acquired in the process of executing the judging nodes and the execution nodes;
the historical data information is used for confirming a target optional path from at least one optional path corresponding to the judging node.
5. The method of claim 1, the method further comprising:
in response to a judging node being newly added in the graph structure corresponding to the flow, adding question information corresponding to the judging node;
or, in response to a judging node being deleted from the graph structure, deleting the question information corresponding to that judging node.
6. The method of claim 2, the method further comprising training a flow process model prior to the determining a target alternate path of the at least one alternate path, the training the flow process model comprising:
based on the service type corresponding to the flow, confirming a graph structure corresponding to the service type, and an executing node, a judging node and an optional path included in the graph structure;
confirming training data from a natural language dialogue dataset based on the execution node, the judging node, and the alternative path;
and training the flow processing model based on the training data.
7. The method according to claim 1 or 6, wherein,
a node for obtaining the object information is the judging node, wherein the object information is used to determine the response data;
the different response conditions corresponding to the judging node are the alternative paths;
a node for executing the intention corresponding to the response data is the execution node, and the execution node corresponds to an alternative path.
8. The method of claim 6, the validating training data from a natural language dialogue dataset based on the execution node, the decision node, and the alternative path, comprising:
confirming the question information corresponding to the judging node in the natural language dialogue dataset, together with the answer information associated with the question information, as the first training sub-data corresponding to the judging node;
confirming, from the natural language dialogue dataset, the response conditions of the different alternative paths corresponding to the judging node and the response data associated with those response conditions as the second training sub-data corresponding to the alternative paths;
confirming the target alternative path corresponding to the response data in the natural language dialogue dataset as the labeled alternative path;
wherein the first training sub-data, the second training sub-data, and the labeled alternative path constitute the training data.
9. The method of claim 8, the training a process model based on the training data, comprising:
inputting training data into the flow processing model, and confirming the output of the flow processing model to be the score of each optional path corresponding to the judgment node in the natural language dialogue data set;
and adjusting parameters of the flow processing model based on the scores of the optional paths corresponding to the judging nodes and the marked optional paths.
10. A flow processing apparatus, the apparatus comprising:
an acquisition unit, configured to, in response to the flow executing to a judging node, acquire response data corresponding to the judging node and determine at least one alternative path corresponding to the judging node;
a determining unit, configured to determine a target alternative path among the at least one alternative path based on response conditions corresponding to the at least one alternative path, the response data, and related historical data information;
and an execution unit, configured to continue executing the flow based on the execution node corresponding to the target alternative path.
CN202310316325.XA 2023-03-28 2023-03-28 Flow execution method and device Pending CN116450789A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310316325.XA CN116450789A (en) 2023-03-28 2023-03-28 Flow execution method and device


Publications (1)

Publication Number Publication Date
CN116450789A true CN116450789A (en) 2023-07-18

Family

ID=87124814

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310316325.XA Pending CN116450789A (en) 2023-03-28 2023-03-28 Flow execution method and device

Country Status (1)

Country Link
CN (1) CN116450789A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination