CN116719911A - Automatic flow generation method, device, equipment and storage medium - Google Patents


Info

Publication number
CN116719911A
CN116719911A
Authority
CN
China
Prior art keywords
automation, training, data, automatic, model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311000835.2A
Other languages
Chinese (zh)
Other versions
CN116719911B (en)
Inventor
吴怀谷
罗焱
孟宏阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Bufan Intelligent Technology Co ltd
Original Assignee
Chengdu Bufan Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Bufan Intelligent Technology Co ltd
Priority to CN202311000835.2A
Publication of CN116719911A
Application granted
Publication of CN116719911B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/30: Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F 16/33: Querying
    • G06F 16/3331: Query processing
    • G06F 16/334: Query execution
    • G06F 16/3344: Query execution using natural language analysis
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214: Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00: Handling natural language data
    • G06F 40/10: Text processing
    • G06F 40/166: Editing, e.g. inserting or deleting
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Databases & Information Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Machine Translation (AREA)

Abstract

The application discloses an automated flow generation method, device, equipment and storage medium. The method comprises the steps of: obtaining training data; pre-training a base large language model with a pre-training data set to obtain a pre-trained universal cognitive model; training the pre-trained universal cognitive model with an automation domain knowledge vector data set to obtain a process automation cognitive model; training the process automation cognitive model with a real-time data set to obtain a process automation generation model; performing intention recognition on input natural language to generate automation text or structured data; and performing automated flow editing on input flow-editing information to generate an automated flow. The application automatically converts natural-language input into an automated flow, establishes a mapping between automated operations and natural-language expressions, and provides a platform on which users can edit the automated flow, realizing a flow generation scheme that is both low in difficulty and highly flexible.

Description

Automatic flow generation method, device, equipment and storage medium
Technical Field
The present application relates to the field of automated process generation, and in particular, to an automated process generation method, apparatus, device, and storage medium.
Background
Automated processes are processes that use computer technology and software tools to simulate human execution of background tasks, such as data extraction, form filling, and file movement. Their aim is to improve business efficiency and productivity by integrating repetitive tasks across different systems and applications.
Current approaches to automated process generation are typically based on specific software or programming languages and require complex process-orchestration tools and programming skills. Users must learn a variety of tools, icons, panels, and menus, and perform cumbersome configuration and debugging, in order to design and run automated tasks.
Although automated processes bring efficiency gains, current techniques for generating them still suffer from several problems:
- Complex user interfaces: some flow-orchestration software has a complex user interface with a large number of options and functions, so users need a long time to learn and master the software.
- Workflow design difficulty: designing complex automated workflows requires handling multiple steps, conditions, loops, variables, and logic, which can be a challenge for non-technical users.
- Difficult exception handling and error debugging: when an automated task errs or fails, locating and resolving the problem can be relatively hard, requiring in-depth knowledge of how the robot task executes.
- Data processing and integration complexity: handling data in different formats, integrating with other systems, and similar tasks require users to understand and configure complex data mapping, conversion, and integration logic, which can be difficult for non-technical users.
Based on the above problems, the present application aims to provide an automated process generation system based on large model driving, so as to solve the complexity and difficulty existing in the prior art.
Disclosure of Invention
In order to achieve the above object, the present application provides an automated process generating method, the method comprising the steps of:
acquiring training data; wherein the training data comprises a pre-training data set, an automation domain knowledge vector data set and a real-time data set;
pre-training the basic large language model by utilizing the pre-training data set to obtain a pre-training universal cognitive model;
training the pre-trained universal cognitive model by using the automation domain knowledge vector data set to obtain a process automation cognitive model;
training the process automation cognitive model by utilizing the real-time data set to obtain a process automation generation model;
when natural language input of a user is received, using the flow automation generation model to carry out intention recognition on the input natural language, and generating automatic text or structure data;
when receiving the process editing information input of the user, carrying out automatic process editing by utilizing the automatic text or the structural data based on the process editing information to generate an automatic process.
Optionally, the method further comprises:
acquiring automation field data and constructing an automation field data set;
extracting domain knowledge of the automation domain data set, and constructing an automation domain knowledge set;
and based on the automation domain data set and the automation domain knowledge set, performing vectorization processing of the automation domain data based on knowledge driving to obtain an automation domain knowledge vector data set.
Optionally, the step of acquiring automation domain data specifically includes:
acquiring an automation field data source;
and carrying out data source cognition and acquisition matching on the data source in the automation field, and carrying out automation flow data acquisition according to the data source cognition and acquisition matching result to obtain automation flow data.
Optionally, after the step of extracting domain knowledge of the automation domain data set, the method further includes:
performing domain concept definition according to the extracted domain knowledge of the automation domain data set, and constructing an automation domain knowledge set according to the defined domain concept and the defined domain knowledge;
and controlling the automatic flow data acquisition according to the defined domain concept.
Optionally, after the step of obtaining the process automation generation model, the method further includes:
decomposing the process automation generation model into a plurality of cognitive models according to the function, role, or permission of the deployment unit, and, when each cognitive model receives natural language input by the user, outputting an intention recognition result corresponding to the automation text or structural data according to the output requirement in the natural language.
Optionally, the generating the automation text or the structure data specifically includes: one or more of automated workflow, data processing rules, condition judgment, or integrated configuration.
Optionally, based on the process editing information, performing automated process editing by using the automated text or the structural data, and generating an automated process step specifically includes:
based on the process editing information, one or more of an automated workflow, data processing rules, condition judgment or integrated configuration is combined, edited or adjusted to generate an automated process for each specific task;
and when the automatic flow is sent to an automatic execution engine, executing a specific task corresponding to the automatic flow.
In addition, in order to achieve the above object, the present application also provides an automated process generating apparatus, including:
the acquisition module is used for acquiring training data; wherein the training data comprises a pre-training data set, an automation domain knowledge vector data set and a real-time data set;
the first training module is used for pre-training the basic large language model by utilizing the pre-training data set to obtain a pre-training universal cognitive model;
the second training module is used for training the pre-trained universal cognitive model by using the automation domain knowledge vector data set to obtain a process automation cognitive model;
the third training module is used for training the process automation cognitive model by utilizing the real-time data set to obtain a process automation generating model;
the recognition module is used for carrying out intention recognition on the input natural language by utilizing the flow automation generation model when receiving the natural language input of the user, and generating automatic text or structural data;
the generation module is used for carrying out automatic process editing by utilizing the automatic text or the structural data based on the process editing information when receiving the process editing information input of the user, so as to generate an automatic process.
In addition, in order to achieve the above object, the present application also provides an automated process generating apparatus including: the system comprises a memory, a processor and an automatic flow generating program stored on the memory and capable of running on the processor, wherein the automatic flow generating program realizes the steps of the automatic flow generating method when being executed by the processor.
In addition, in order to achieve the above object, the present application also provides a storage medium having stored thereon an automated process generation program which, when executed by a processor, implements the steps of the above-described automated process generation method.
The beneficial effects of the application are as follows. An automated flow generation method, device, and storage medium are provided, wherein the method comprises: obtaining training data, the training data comprising a pre-training data set, an automation domain knowledge vector data set, and a real-time data set; pre-training the base large language model with the pre-training data set to obtain a pre-trained universal cognitive model; training the pre-trained universal cognitive model with the automation domain knowledge vector data set to obtain a process automation cognitive model; training the process automation cognitive model with the real-time data set to obtain a process automation generation model; when natural language input of a user is received, performing intention recognition on the input natural language with the process automation generation model to generate automation text or structured data; and when flow-editing information input by the user is received, performing automated flow editing with the automation text or structured data based on that information to generate an automated flow. The application automatically converts natural-language input into an automated flow, establishes a mapping between automated operations and natural-language expressions, and provides a platform on which users can edit the automated flow, realizing a flow generation scheme with low difficulty and high flexibility.
Drawings
FIG. 1 is a schematic diagram of a device structure of a hardware operating environment according to an embodiment of the present application;
FIG. 2 is a flow diagram of an embodiment of an automated flow generation method of the present application;
FIG. 3 is a schematic diagram of a framework of a system for automated process generation based on large model drives of the present application;
FIG. 4 is a schematic diagram of an automated process generation method according to the present application;
fig. 5 is a block diagram of an automated process generating apparatus according to an embodiment of the present application.
The achievement of the objects, functional features and advantages of the present application will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
Referring to fig. 1, fig. 1 is a schematic diagram of an apparatus structure of a hardware running environment according to an embodiment of the present application.
As shown in fig. 1, the apparatus may include: a processor 1001, such as a CPU, a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 enables connected communication between these components. The user interface 1003 may include a display and an input unit such as a keyboard, and may optionally also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a Wi-Fi interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory, such as disk storage. The memory 1005 may optionally also be a storage device separate from the processor 1001.
It will be appreciated by those skilled in the art that the arrangement of the apparatus shown in fig. 1 is not limiting and may include more or fewer components than shown, or certain components may be combined, or a different arrangement of components.
As shown in fig. 1, an operating system, a network communication module, a user interface module, and an automated process generation program may be included in a memory 1005, which is a type of computer storage medium.
In the terminal shown in fig. 1, the network interface 1004 is mainly used for connecting to a background server and performing data communication with the background server; the user interface 1003 is mainly used for connecting a client (user side) and performing data communication with the client; and the processor 1001 may be configured to call an automated flow generation program stored in the memory 1005 and perform the following operations:
acquiring training data; wherein the training data comprises a pre-training data set, an automation domain knowledge vector data set and a real-time data set;
pre-training the basic large language model by utilizing the pre-training data set to obtain a pre-training universal cognitive model;
training the pre-trained universal cognitive model by using the knowledge vector data set in the automation field to obtain a process automation cognitive model;
training the process automation cognitive model by utilizing the real-time data set to obtain a process automation generation model;
when natural language input of a user is received, using the flow automation generation model to carry out intention recognition on the input natural language, and generating automatic text or structure data;
when receiving the process editing information input of the user, carrying out automatic process editing by utilizing the automatic text or the structural data based on the process editing information to generate an automatic process.
The specific embodiments in which the present application is applied to the apparatus are substantially the same as the embodiments of the automated flow generation method described below, and are not repeated here.
An embodiment of the present application provides an automatic process generating method, and referring to fig. 2, fig. 2 is a schematic flow diagram of an embodiment of the automatic process generating method of the present application.
In this embodiment, the automatic flow generating method includes the following steps:
s100: acquiring training data; wherein the training data comprises a pre-training data set, an automation domain knowledge vector data set and a real-time data set;
s200: pre-training the basic large language model by utilizing the pre-training data set to obtain a pre-training universal cognitive model;
s300: training the pre-trained universal cognitive model by using the knowledge vector data set in the automation field to obtain a process automation cognitive model;
s400: training the process automation cognitive model by utilizing the real-time data set to obtain a process automation generation model;
s500: when natural language input of a user is received, using the flow automation generation model to carry out intention recognition on the input natural language, and generating automatic text or structure data;
s600: when receiving the process editing information input of the user, carrying out automatic process editing by utilizing the automatic text or the structural data based on the process editing information to generate an automatic process.
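The three-stage training chain of steps S200 to S400 can be pictured as a sequence of checkpoint transformations. This is a structural sketch only; the checkpoint and dataset names are invented for illustration, and the patent does not name a concrete training framework.

```python
from dataclasses import dataclass, field

@dataclass
class Checkpoint:
    name: str
    lineage: list = field(default_factory=list)  # datasets trained on, in order

def train_stage(model: Checkpoint, dataset: str, new_name: str) -> Checkpoint:
    # Each stage consumes one dataset and yields a renamed checkpoint
    # that preserves the lineage of the earlier stages.
    return Checkpoint(new_name, model.lineage + [dataset])

base = Checkpoint("base-llm")
general = train_stage(base, "pretraining_set", "universal-cognitive")            # S200
domain = train_stage(general, "automation_knowledge_vectors",
                     "process-automation-cognitive")                             # S300
final = train_stage(domain, "realtime_set", "process-automation-generation")     # S400

print(final.lineage)  # ['pretraining_set', 'automation_knowledge_vectors', 'realtime_set']
```

The point of the chain is that each later model specializes the previous one rather than being trained from scratch, matching the order of the claims.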
In a preferred embodiment, the method further comprises: acquiring automation field data and constructing an automation field data set; extracting domain knowledge of the automation domain data set, and constructing an automation domain knowledge set; and based on the automation domain data set and the automation domain knowledge set, performing vectorization processing of the automation domain data based on knowledge driving to obtain an automation domain knowledge vector data set.
The process automation cognitive model is trained with the real-time data set to obtain the process automation generation model. Specifically: real-time data is collected, and the data sources (websites, data sets, and the like) are processed according to the retrieved matching degree so that they can be learned by the language model; the language model is then periodically fine-tuned with the newly added knowledge, so that it learns to support and be compatible with more automated operations.
That is, in this embodiment, after data is collected in one unit time, the data sources are preprocessed according to the matching degree of the collected data. For example, when the proportion of collected data of a certain type that cannot be used directly exceeds a preset threshold, the corresponding data source is processed before collection in the next unit time, so that the data it outputs can be used directly once collected, improving model generation efficiency.
Meanwhile, in this embodiment, the language model may be further fine-tuned with newly added knowledge. For example, at every preset interval, the data types in the data transmitted by each data source over that period are counted. If the volume of data of a certain type exceeds a certain proportion within the period, data of that type is used as a new feature for model training, and the fine-tuned model is retrained and used for automated process recognition and generation. On this basis, the model best suited to the current data environment can be generated adaptively, in real time, as the data types in each data source change. This also reduces the workload of preprocessing the data sources: data types with a larger share are always kept as types that can be used directly for training, while types with a smaller share are kept as preprocessed types, which greatly improves both the overall efficiency of model generation and the accuracy of the model's recognition and generation.
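The proportion-based retraining trigger described above can be sketched as a simple counting step. The 30% threshold and the record shape are assumptions made for this sketch; the patent only says "exceeds a certain proportion".

```python
from collections import Counter

def types_to_promote(records, threshold=0.3):
    """Return the data types whose share of the current window exceeds
    `threshold`, i.e. candidates to be used directly as training features
    in the next fine-tuning round."""
    counts = Counter(r["type"] for r in records)
    total = sum(counts.values())
    return {t for t, c in counts.items() if c / total > threshold}

# A hypothetical window of collected records from several data sources.
window = [{"type": "table"}] * 6 + [{"type": "image"}] * 3 + [{"type": "log"}]
print(types_to_promote(window))  # {'table'}
```

Only strictly dominant types are promoted; types at or below the threshold remain on the preprocessing path, matching the division described in the paragraph above.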
In a preferred embodiment, the step of acquiring automation domain data specifically includes: acquiring an automation field data source; and carrying out data source cognition and acquisition matching on the data source in the automation field, and carrying out automation flow data acquisition according to the data source cognition and acquisition matching result to obtain automation flow data.
Specifically, when performing data-source cognition and acquisition matching on the automation-domain data sources, note that the acquisition process generally encounters data sources of multiple dimensions: each automation-domain data source may contain pictures, files, structured data, unstructured data, and so on, and some sources fit the current automated flow generation better than others. In the prior art, data from each automation-domain data source is collected randomly or in equal quantities, without starting from the flow itself, so the match to the data requirements is low and the generated model is not robust.
In this embodiment, the matching degree is evaluated from the data collected from each automation-domain data source in the current unit time, and whether to extend into the related search domain in the next unit time is decided from the evaluation result. For example, data from multiple sources is collected according to a first acquisition proportion in the current unit time; after collection, the data matching degree is evaluated, and the first proportion is adjusted to obtain a second acquisition proportion with the highest overall matching degree. The matching-degree evaluation can in turn be fed back from the recognition efficiency and accuracy of the generated model.
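One way to realise the first-to-second acquisition-proportion adjustment is a multiplicative update driven by per-source matching scores. The update rule, the learning rate, and the neutral score of 0.5 are all illustrative assumptions, not details from the patent.

```python
def adjust_proportions(proportions, match_scores, lr=0.2):
    """Shift acquisition proportions toward sources whose collected data
    scored a higher matching degree, then renormalise so the proportions
    still sum to 1. Scores are assumed to lie in [0, 1], 0.5 being neutral."""
    raw = {src: p * (1 + lr * (match_scores[src] - 0.5))
           for src, p in proportions.items()}
    total = sum(raw.values())
    return {src: v / total for src, v in raw.items()}

first = {"web": 0.5, "files": 0.5}          # first acquisition proportion
scores = {"web": 0.9, "files": 0.2}         # matching-degree evaluation
second = adjust_proportions(first, scores)  # second acquisition proportion
```

Iterating this update over unit times lets the proportion converge toward the sources that best match the current flow-generation needs.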
In a preferred embodiment, after the step of extracting domain knowledge of the automated domain data set, the method further comprises: performing domain concept definition according to the extracted domain knowledge of the automation domain data set, and constructing an automation domain knowledge set according to the defined domain concept and the defined domain knowledge; and controlling the automatic flow data acquisition according to the defined domain concept.
In this embodiment, the domain concept definition proceeds as follows. After domain knowledge is extracted from the automation domain data set, manual domain-concept definition is applied to it: concept-definition information entered by hand is received, and the automation domain knowledge set is built from that information together with the extracted domain knowledge. At the same time, the concept-definition information can be fed back to the automated flow data acquisition process, improving the efficiency and accuracy of collecting flow data from the automation-domain data sources. For example, suppose the extracted domain knowledge is "when XX is displayed in red, execute action YY". The colour of XX can then be defined as control logic according to the manually entered domain concept, and when the automation domain knowledge set is generated, "when XX is displayed in red, execute action YY" and "the colour of XX is control logic" jointly form the knowledge set. Meanwhile, the definition "the colour of XX is control logic" is fed back to the automated flow data acquisition process, instructing it to collect the colour of XX specifically. In this way, the concept-definition information on the one hand narrows the search target, so that more precise knowledge features can be extracted as training data, and on the other hand, through its feedback, lets the acquisition process collect data of higher validity and relevance, improving acquisition accuracy.
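The merge of extracted rules with manually entered concept definitions, and the feedback of those definitions to acquisition, might look like the following. The rule and definition shapes are hypothetical; the patent only describes the flow of information, not a data model.

```python
def build_knowledge_set(extracted_rules, concept_defs):
    """Attach each manually defined concept to the matching extracted rule,
    producing the combined automation-domain knowledge set."""
    return [{**rule, "concept": concept_defs.get(rule["subject"], "undefined")}
            for rule in extracted_rules]

def acquisition_targets(concept_defs):
    """Feedback path: subjects defined as control logic are flagged for
    specific collection by the flow-data acquisition process."""
    return [s for s, role in concept_defs.items() if role == "control logic"]

rules = [{"subject": "XX colour", "rule": "when XX is red, execute YY"}]
defs = {"XX colour": "control logic"}  # manually entered definition
knowledge_set = build_knowledge_set(rules, defs)
print(acquisition_targets(defs))  # ['XX colour']
```

The same definition dictionary thus serves both consumers described in the text: knowledge-set construction and acquisition control.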
In a preferred embodiment, after the step of obtaining the process automation generation model, the method further comprises: decomposing the process automation generation model into a plurality of cognitive models according to the function, role, or permission of the deployment unit, and, when each cognitive model receives natural language input by the user, outputting an intention recognition result corresponding to the automation text or structural data according to the output requirement in the natural language.
In a preferred embodiment, the generating automated text or structural data specifically includes: one or more of automated workflow, data processing rules, condition judgment, or integrated configuration.
In a preferred embodiment, based on the process editing information, performing automated process editing by using the automated text or structure data, and generating an automated process step specifically includes: based on the process editing information, one or more of an automated workflow, data processing rules, condition judgment or integrated configuration is combined, edited or adjusted to generate an automated process for each specific task; and when the automatic flow is sent to an automatic execution engine, executing a specific task corresponding to the automatic flow.
In this embodiment, an automatic process generation method is provided, which automatically converts natural language input into an automatic process, establishes a mapping relationship between an automatic operation means and a natural language expression, and provides a platform for a user to edit the automatic process, so as to implement an automatic process generation scheme with low difficulty and high flexibility.
In order to more clearly explain the present application, specific examples of the present application are provided below.
In this example, as shown in fig. 3, the large-model-driven automated process generation system that performs the automated process generation method consists of several important parts: basic model construction, domain knowledge construction, model retraining, and intent recognition.
When the system executes automatic flow generation, as shown in fig. 4, the system specifically includes the following steps:
(1) Data preprocessing
Text data for training the large model must be preprocessed before it can drive automated process generation. This includes collecting and organizing text data sets from the relevant fields, and cleaning, tokenizing, word-segmenting, and vectorizing the text data to facilitate training and application of the model.
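A minimal version of the cleaning, tokenisation, and vectorisation chain is sketched below using a bag-of-words stand-in. A production system would use the model's own tokenizer and embeddings; this merely illustrates the shape of the step.

```python
import re
from collections import Counter

def preprocess(texts):
    """Clean, tokenise, and count-vectorise a small corpus."""
    # Cleaning: collapse whitespace, lowercase.
    cleaned = [re.sub(r"\s+", " ", t).strip().lower() for t in texts]
    # Tokenisation / word segmentation (whitespace split as a stand-in).
    tokenized = [t.split() for t in cleaned]
    # Vectorisation: bag-of-words counts over a sorted vocabulary.
    vocab = sorted({w for doc in tokenized for w in doc})
    vectors = []
    for doc in tokenized:
        counts = Counter(doc)
        vectors.append([counts[w] for w in vocab])
    return vectors, vocab

vectors, vocab = preprocess(["Move the  file", "move the report daily"])
print(vocab)  # ['daily', 'file', 'move', 'report', 'the']
```

Each document becomes a fixed-length vector over the shared vocabulary, which is the basic property any downstream training step relies on.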
(2) Training and fine tuning of large models
The large model is trained on the preprocessed text data set. Through extensive pre-training and fine-tuning, the model acquires the ability to understand and generate natural language and can produce coherent text from context.
(3) Intent resolution and understanding of natural language input
When a user inputs text or speech in natural language, the system first pre-processes the input, including cleaning, word segmentation, vectorization, and the like. The input is then parsed and understood using the trained large model to determine the intent and requirements of the user.
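As an illustration only, the intent-determination step could be stubbed with keyword rules before a trained large model is in place. The intent labels and keywords below are invented for this sketch, and substring matching is a deliberate simplification.

```python
INTENT_RULES = {
    "workflow": ["schedule", "move", "copy", "every day"],
    "data_rule": ["extract", "filter", "transform"],
    "condition": ["if ", "when ", "unless "],
}

def recognise_intent(utterance):
    """Score each intent by keyword hits in the cleaned input;
    return 'unknown' when nothing matches."""
    text = utterance.lower()
    scores = {intent: sum(kw in text for kw in kws)
              for intent, kws in INTENT_RULES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(recognise_intent("schedule a job to copy the report every day"))  # workflow
```

A real deployment replaces the rule table with the trained model's classification head, but the interface, text in, intent label out, stays the same.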
(4) Disassembly of the model
After the large model is trained, the system disassembles it into cognitive models partitioned by the function, role, and permission of the deployment unit, so that each can produce output, such as text, images, or generated code, in response to the user's natural language input.
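Routing between the decomposed cognitive models can be expressed as a capability-and-permission lookup. The class, field, and role names below are assumptions for illustration; the patent does not define a routing API.

```python
class CognitiveModel:
    def __init__(self, name, capabilities, allowed_roles):
        self.name = name
        self.capabilities = set(capabilities)   # e.g. text, image, code generation
        self.allowed_roles = set(allowed_roles) # permission of the deployment unit

    def handles(self, capability, role):
        return capability in self.capabilities and role in self.allowed_roles

def route(models, capability, role):
    """Pick the first deployed sub-model that covers the requested
    capability and accepts the caller's permission role."""
    for m in models:
        if m.handles(capability, role):
            return m.name
    return "denied"

models = [
    CognitiveModel("text-gen", ["text"], ["analyst", "admin"]),
    CognitiveModel("code-gen", ["code"], ["admin"]),
]
print(route(models, "code", "admin"))    # code-gen
print(route(models, "code", "analyst"))  # denied
```

Splitting by permission as well as function means access control falls out of routing for free: a request that no authorised sub-model covers simply never reaches a model.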
(5) Automated task generation
Based on the user input, the system uses the model to generate text or structured data conforming to the automation knowledge domain, from which it creates robot workflows, data processing rules, condition judgments, integration configurations, and the like.
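The structured data of step (5) might take a shape like the following, bundling a workflow, data processing rules, a condition judgment, and an integration configuration into one object. Every field name and value here is illustrative; the patent does not specify this schema.

```python
def build_workflow(intent):
    """Return structured automation data for a recognized intent, or None.
    The single hard-coded template stands in for model-generated output."""
    templates = {
        "report_generation": {
            "workflow": ["fetch_data", "aggregate", "render_report"],
            "rules": {"deduplicate": True, "drop_nulls": True},
            "condition": {"if": "row_count > 0", "else": "abort"},
            "integration": {"engine": "rpa_executor", "retry": 2},
        },
    }
    return templates.get(intent)
```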
(6) Interface interactions and editing
The generated automated task flow is displayed to the user through a friendly interactive interface that allows the user to view, edit, and adjust the flow to meet specific needs and business logic.
(7) Code generation and execution
The automated flow edited by the user is converted into actual automation code or a configuration file for the robot to execute, and the corresponding automation execution engine or platform is invoked to run the generated automated task flow.
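Step (7) can be sketched as interpreting the edited structured data against a registry of executable step handlers. The registry idea is our illustrative stand-in for "calling a corresponding automation execution engine"; the patent does not name such an interface.

```python
def execute_workflow(workflow, registry):
    """Run each named step of an edited workflow in order by looking up its
    handler in the execution engine's registry; collect the results."""
    results = []
    for step in workflow["workflow"]:
        handler = registry.get(step)
        if handler is None:
            # Surface editing mistakes (unknown steps) before side effects occur
            raise KeyError(f"no handler registered for step {step!r}")
        results.append(handler())
    return results
```

In practice the handlers would wrap RPA actions or API calls rather than plain functions, but the lookup-then-execute loop is the same.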
Thus, the large-model-driven automated flow generation system converts natural language input directly into an automated flow and establishes a mapping between automated operation means and natural language expressions. It provides full-lifecycle capability for automated flow generation, spanning base large-model capability, data acquisition, data preprocessing, automation-domain data set construction, and business-domain model training and fine-tuning, together with knowledge construction for the automation professional domain and a natural-language user interface. This resolves the complexity and difficulty of automated flow generation in the prior art.
Referring to fig. 5, fig. 5 is a block diagram illustrating an embodiment of an automated process generating apparatus according to the present application.
As shown in fig. 5, an automatic flow generating apparatus according to an embodiment of the present application includes:
an acquisition module 10 for acquiring training data, wherein the training data comprises a pre-training data set, an automation domain knowledge vector data set, and a real-time data set;
a first training module 20, configured to pre-train the base large language model using the pre-training data set to obtain a pre-trained universal cognitive model;
a second training module 30, configured to train the pre-trained universal cognitive model using the automation domain knowledge vector data set to obtain a process automation cognitive model;
a third training module 40, configured to train the process automation cognitive model using the real-time data set to obtain a process automation generation model;
a recognition module 50, configured to, when natural language input of a user is received, perform intent recognition on the input natural language using the process automation generation model and generate automated text or structured data;
a generation module 60, configured to, when process editing information input by the user is received, perform automated process editing using the automated text or structured data based on the process editing information to generate an automated process.
Other embodiments or specific implementation manners of the automatic flow generating apparatus of the present application may refer to the above method embodiments, and are not described herein.
In addition, the present application also provides an automated process generating device, comprising: a memory, a processor, and an automated process generation program stored on the memory and executable on the processor, wherein the program, when executed by the processor, implements the steps of the automated process generation method described above.
The specific implementation manner of the automatic flow generating device of the present application is basically the same as that of each embodiment of the automatic flow generating method, and will not be described herein.
Furthermore, the present application also proposes a readable storage medium, namely a computer-readable storage medium on which an automated process generation program is stored. The readable storage medium may be the memory 1005 in the terminal of fig. 1, or at least one of a ROM (Read-Only Memory), a RAM (Random Access Memory), a magnetic disk, or an optical disk, and the readable storage medium includes instructions for causing an automated process generating device having a processor to perform the automated process generation method of the embodiments of the present application.
The specific implementation of the automatic flow generation program in the readable storage medium of the present application is substantially the same as the above embodiments of the automatic flow generation method, and will not be described herein.
It is appreciated that in the description herein, reference to the terms "one embodiment," "another embodiment," "other embodiments," or "first through nth embodiments," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by software on a necessary general-purpose hardware platform, or alternatively by hardware, although the former is the preferred implementation in many cases. Based on this understanding, the technical solution of the present application, in essence or in the part contributing over the prior art, may be embodied in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) as described above, comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to perform the methods of the embodiments of the present application.
The foregoing description is only of the preferred embodiments of the present application, and is not intended to limit the scope of the application, but rather is intended to cover any equivalents of the structures or equivalent processes disclosed herein or in the alternative, which may be employed directly or indirectly in other related arts.

Claims (10)

1. An automated process generation method, comprising the steps of:
acquiring training data, wherein the training data comprises a pre-training data set, an automation domain knowledge vector data set, and a real-time data set;
pre-training a base large language model using the pre-training data set to obtain a pre-trained universal cognitive model;
training the pre-trained universal cognitive model using the automation domain knowledge vector data set to obtain a process automation cognitive model;
training the process automation cognitive model using the real-time data set to obtain a process automation generation model;
when natural language input of a user is received, performing intent recognition on the input natural language using the process automation generation model, and generating automated text or structured data;
when process editing information input by the user is received, performing automated process editing using the automated text or structured data based on the process editing information, to generate an automated process.
2. The automated process generation method of claim 1, further comprising:
acquiring automation field data and constructing an automation field data set;
extracting domain knowledge of the automation domain data set, and constructing an automation domain knowledge set;
and based on the automation domain data set and the automation domain knowledge set, performing vectorization processing of the automation domain data based on knowledge driving to obtain an automation domain knowledge vector data set.
3. The automated process generating method of claim 2, wherein the step of obtaining automation domain data comprises:
acquiring an automation field data source;
and carrying out data source cognition and acquisition matching on the data source in the automation field, and carrying out automation flow data acquisition according to the data source cognition and acquisition matching result to obtain automation flow data.
4. The automated process generation method of claim 3, wherein after the step of extracting domain knowledge of the automated domain data set, the method further comprises:
performing domain concept definition according to the extracted domain knowledge of the automation domain data set, and constructing an automation domain knowledge set according to the defined domain concept and the defined domain knowledge;
and controlling the automatic flow data acquisition according to the defined domain concept.
5. The automated process generation method of claim 1, wherein after the step of obtaining the process automation generation model, the method further comprises:
decomposing the process automation generation model into a plurality of cognitive models according to the function, role, or permissions of the deployment unit, wherein each cognitive model, upon receiving the natural language input by the user, outputs an intent recognition result corresponding to the automated text or structured data according to the output requirement in the natural language.
6. The automated process generation method of claim 1, wherein the generated automated text or structured data specifically comprises one or more of: an automated workflow, data processing rules, condition judgments, or an integration configuration.
7. The automated process generation method of claim 6, wherein the step of performing automated process editing using the automated text or structured data based on the process editing information to generate an automated process specifically comprises:
based on the process editing information, combining, editing, or adjusting one or more of the automated workflow, data processing rules, condition judgments, or integration configuration to generate an automated process for each specific task;
and, when the automated process is sent to an automation execution engine, executing the specific task corresponding to the automated process.
8. An automated process generating apparatus, comprising:
an acquisition module, configured to acquire training data, wherein the training data comprises a pre-training data set, an automation domain knowledge vector data set, and a real-time data set;
a first training module, configured to pre-train a base large language model using the pre-training data set to obtain a pre-trained universal cognitive model;
a second training module, configured to train the pre-trained universal cognitive model using the automation domain knowledge vector data set to obtain a process automation cognitive model;
a third training module, configured to train the process automation cognitive model using the real-time data set to obtain a process automation generation model;
a recognition module, configured to, when natural language input of a user is received, perform intent recognition on the input natural language using the process automation generation model and generate automated text or structured data;
a generation module, configured to, when process editing information input by the user is received, perform automated process editing using the automated text or structured data based on the process editing information to generate an automated process.
9. An automated process generating apparatus, the automated process generating apparatus comprising: memory, a processor and an automated process generation program stored on the memory and executable on the processor, which when executed by the processor, implements the steps of the automated process generation method of any of claims 1 to 7.
10. A storage medium having stored thereon an automated process generation program which, when executed by a processor, implements the steps of the automated process generation method of any of claims 1 to 7.
CN202311000835.2A 2023-08-10 2023-08-10 Automatic flow generation method, device, equipment and storage medium Active CN116719911B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311000835.2A CN116719911B (en) 2023-08-10 2023-08-10 Automatic flow generation method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311000835.2A CN116719911B (en) 2023-08-10 2023-08-10 Automatic flow generation method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN116719911A true CN116719911A (en) 2023-09-08
CN116719911B CN116719911B (en) 2023-10-31

Family

ID=87875569

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311000835.2A Active CN116719911B (en) 2023-08-10 2023-08-10 Automatic flow generation method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116719911B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116975042A (en) * 2023-09-20 2023-10-31 北方健康医疗大数据科技有限公司 Intelligent data analysis system and method based on intelligent agent
CN117311697A (en) * 2023-11-28 2023-12-29 浙江城市数字技术有限公司 AIGC-based large language model self-feedback type flow creation method and system

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102970578A (en) * 2012-11-19 2013-03-13 北京十分科技有限公司 Multimedia information identifying and training method and device
CN111159559A (en) * 2019-12-31 2020-05-15 青梧桐有限责任公司 Method for constructing recommendation engine according to user requirements and user behaviors
CN111324440A (en) * 2020-02-17 2020-06-23 深圳前海微众银行股份有限公司 Method, device and equipment for executing automation process and readable storage medium
CN111968624A (en) * 2020-08-24 2020-11-20 平安科技(深圳)有限公司 Data construction method and device, electronic equipment and storage medium
CN112419539A (en) * 2019-08-23 2021-02-26 北京车和家信息技术有限公司 Vehicle data acquisition method and device, cloud server and vehicle
WO2022141878A1 (en) * 2020-12-28 2022-07-07 平安科技(深圳)有限公司 End-to-end language model pretraining method and system, and device and storage medium
CN115240075A (en) * 2022-09-22 2022-10-25 山东大学 Construction and training method of electric power vision multi-granularity pre-training large model
WO2023038654A1 (en) * 2021-09-07 2023-03-16 Google Llc Using large language model(s) in generating automated assistant response(s)
CN116343230A (en) * 2023-03-30 2023-06-27 上海艺赛旗软件股份有限公司 Text recognition model training method, system, electronic equipment and storage medium

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102970578A (en) * 2012-11-19 2013-03-13 北京十分科技有限公司 Multimedia information identifying and training method and device
CN112419539A (en) * 2019-08-23 2021-02-26 北京车和家信息技术有限公司 Vehicle data acquisition method and device, cloud server and vehicle
CN111159559A (en) * 2019-12-31 2020-05-15 青梧桐有限责任公司 Method for constructing recommendation engine according to user requirements and user behaviors
CN111324440A (en) * 2020-02-17 2020-06-23 深圳前海微众银行股份有限公司 Method, device and equipment for executing automation process and readable storage medium
CN111968624A (en) * 2020-08-24 2020-11-20 平安科技(深圳)有限公司 Data construction method and device, electronic equipment and storage medium
WO2022141878A1 (en) * 2020-12-28 2022-07-07 平安科技(深圳)有限公司 End-to-end language model pretraining method and system, and device and storage medium
WO2023038654A1 (en) * 2021-09-07 2023-03-16 Google Llc Using large language model(s) in generating automated assistant response(s)
CN115240075A (en) * 2022-09-22 2022-10-25 山东大学 Construction and training method of electric power vision multi-granularity pre-training large model
CN116343230A (en) * 2023-03-30 2023-06-27 上海艺赛旗软件股份有限公司 Text recognition model training method, system, electronic equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LI FEI: "Application and Future Development of Large Models and AIGA in the Marketing Field", pages 1 - 5, Retrieved from the Internet <URL:https://www.163.com/dy/article/I8VO5I0O0511D3QS.html> *
WANG QIUYUE et al.: "Deep Web Data Source Selection Algorithm Based on Topic Model", Journal of Computer Applications, vol. 35, no. 09, pages 2553 - 2559 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116975042A (en) * 2023-09-20 2023-10-31 北方健康医疗大数据科技有限公司 Intelligent data analysis system and method based on intelligent agent
CN117311697A (en) * 2023-11-28 2023-12-29 浙江城市数字技术有限公司 AIGC-based large language model self-feedback type flow creation method and system
CN117311697B (en) * 2023-11-28 2024-02-27 浙江城市数字技术有限公司 AIGC-based large language model self-feedback type flow creation method and system

Also Published As

Publication number Publication date
CN116719911B (en) 2023-10-31

Similar Documents

Publication Publication Date Title
CN116719911B (en) Automatic flow generation method, device, equipment and storage medium
US10725827B2 (en) Artificial intelligence based virtual automated assistance
DE69327318T2 (en) SUPPORT FOR SYSTEM DEVELOPMENT.
US20190251417A1 (en) Artificial Intelligence System for Inferring Grounded Intent
CN107563417A (en) A kind of deep learning artificial intelligence model method for building up and system
CN107733782B (en) Method, device and system for generating group according to task
EP3617896A1 (en) Method and apparatus for intelligent response
CN114547072A (en) Method, system, equipment and storage medium for converting natural language query into SQL
CN110059927B (en) Performance online assessment method, device, equipment and storage medium
CN111353754A (en) Method and device for generating flow template, electronic equipment and readable storage medium
Rai et al. Robotic process automation: the virtual workforce
JPH03111969A (en) Method for supporting plan formation
CN116860985A (en) Natural language interaction software framework based on knowledge graph and construction method thereof
CN118192865A (en) Method, device, medium and equipment for batch drawing based on stable diffusion model
CN112817560B (en) Computing task processing method, system and computer readable storage medium based on table function
CN117235527A (en) End-to-end containerized big data model construction method, device, equipment and medium
CN116823164A (en) Business approval method, device, equipment and storage medium
CN116645550A (en) Generalized image recognition method for airborne display system based on test case
CN113377962B (en) Intelligent process simulation method based on image recognition and natural language processing
US8752004B2 (en) System and a method for generating a domain-specific software solution
CN116664078B (en) RPA object identification method based on semantic feature vector
JP7186411B1 (en) Information processing system, information processing method and information processing program
CN117874211B (en) Intelligent question-answering method, system, medium and electronic equipment based on SAAS software
CN117217238B (en) Intelligent interaction system and method based on large language model
US20220101238A1 (en) Computer implemented method of capturing and processing data from user activities in business processes and a computer program to implement this method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant