CN116149631A - Method for generating Web intelligent form based on natural language - Google Patents
Method for generating Web intelligent form based on natural language
- Publication number
- CN116149631A CN116149631A CN202310011891.XA CN202310011891A CN116149631A CN 116149631 A CN116149631 A CN 116149631A CN 202310011891 A CN202310011891 A CN 202310011891A CN 116149631 A CN116149631 A CN 116149631A
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/33—Intelligent editors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/279—Recognition of textual entities
- G06F40/289—Phrasal analysis, e.g. finite state techniques or chunking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/34—Graphical or visual programming
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/38—Creation or generation of source code for implementing user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/40—Transformation of program code
- G06F8/41—Compilation
- G06F8/44—Encoding
- G06F8/447—Target code generation
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The invention discloses a method for generating a Web intelligent form based on natural language, which comprises the following steps: constructing a fine-tuned natural language understanding and domain-specific language generation module; inputting the natural-language description of the form's functions into the fine-tuned module; decoding the model output with a domain-specific language decoding module; and inputting the decoded JSON form configuration information into a low-code form editing module, which automatically generates the Web form. Deep learning is combined with a domain-specific language to generate Web forms efficiently and accurately, and the low-code drag-and-drop form editor lowers the difficulty of constructing training samples, which facilitates continuous training of the model. The model's output is also easy to fine-tune manually. The method therefore has good industrial value.
Description
Technical Field
The invention relates to the technical field of webpage code generation, in particular to a method for generating a Web intelligent form based on natural language.
Background
Traditional web page code generation requires many developers to participate, either to formulate detailed code-generation rules or to write templates; typically only data is dynamically embedded into the templates. This is still manual authoring in essence, and it merely reduces the amount of repeated code.
In recent years, end-to-end deep-learning techniques for generating web page code, such as the image-based pix2code and sketch2code and their improved variants, have been on the rise. With the advent of large-scale pre-trained language models, attempts to generate web page code from natural language have also been increasing.
Current attempts to generate web page code from natural language typically produce HTML source code end to end, but can only generate relatively simple, short code, such as a single button or a container holding a few simple elements, which is of little use in industrial production.
Disclosure of Invention
The invention provides a method for generating a Web intelligent form based on natural language, aiming to solve the above problems in the prior art.
The invention provides a method for generating a Web intelligent form based on natural language, which comprises the following steps:
S100, constructing a fine-tuned natural language understanding and domain-specific language generation module;
S200, inputting the natural-language description of the form's functions into the fine-tuned module;
S300, decoding the model output with a domain-specific language decoding module;
S400, inputting the decoded JSON form configuration information into a low-code form editing module, which automatically generates the Web form.
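The four steps S100 to S400 can be sketched as a small pipeline. The function names, the DSL string, and the JSON schema below are illustrative assumptions standing in for the fine-tuned model and the decoding module, not the patent's concrete implementation:

```python
# Illustrative pipeline for S100-S400; all names and data are assumptions.

def understand_and_generate_dsl(description: str) -> str:
    """S200: stand-in for the fine-tuned language model; returns a
    domain-specific-language string for the described form."""
    # A real system would call the fine-tuned GPT-3 model here.
    return "Form{children{FormItem{label name}, Button{text submit}}}"

def decode_dsl_to_json(dsl: str) -> dict:
    """S300: stand-in decoder from DSL to JSON form configuration."""
    # Hard-coded for the single DSL string above; a real decoder parses it.
    return {"type": "Form", "children": [
        {"type": "FormItem", "label": "name"},
        {"type": "Button", "text": "submit"}]}

def render_form(config: dict) -> str:
    """S400: a minimal renderer producing HTML from the configuration."""
    inner = "".join(
        f'<input name="{c["label"]}">' if c["type"] == "FormItem"
        else f'<button>{c["text"]}</button>'
        for c in config["children"])
    return f"<form>{inner}</form>"

description = "A form with a name field and a submit button"
config = decode_dsl_to_json(understand_and_generate_dsl(description))
html = render_form(config)
print(html)  # <form><input name="name"><button>submit</button></form>
```

In the real method the renderer is the low-code form editing module, so the generated form remains editable rather than being emitted as frozen HTML.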
Preferably, S100 includes:
S101, manually writing a natural language description and combining it with the text of the requirement documents of existing form applications as the input of the training model;
S102, constructing the corresponding form by manual drag-and-drop in a low-code form editor;
S103, encoding the JSON data of the corresponding form generated by the low-code form editor with the domain-specific language encoding module to obtain a domain-specific language one-hot vector;
S104, fine-tuning the language model with the domain-specific language one-hot vector as the prediction target, forming the fine-tuned natural language understanding and domain-specific language generation module; the language model includes a GPT-3 model.
Preferably, step S400 is further followed by:
S500, adjusting the generated Web form through the low-code form editing module, and taking the adjusted result as a training target for continuous training of the model.
Preferably, in S103, obtaining the domain-specific language one-hot vector by encoding with the domain-specific language encoding module includes:
S1031, converting the JSON form configuration information generated by the low-code form editing module into the domain-specific language;
S1032, converting the domain-specific language into one-hot vectors of the dimensions supported by the model.
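A minimal sketch of S1031 and S1032, under the assumption of a tiny hand-picked token vocabulary; the serialization follows the DSL grammar type{attr val, children{...}} described later in the document:

```python
# Sketch of S1031/S1032: JSON form configuration -> DSL string -> one-hot
# vectors. The example config and the vocabulary are assumptions.

def json_to_dsl(node: dict) -> str:
    """S1031: serialize a form-config node into the DSL syntax."""
    parts = []
    for attr, val in node.items():
        if attr == "type":
            continue
        if attr == "children":
            parts.append("children{" + ", ".join(json_to_dsl(c) for c in val) + "}")
        else:
            parts.append(f"{attr} {val}")
    return node["type"] + "{" + ", ".join(parts) + "}"

def dsl_to_one_hot(dsl: str, vocab: list[str]) -> list[list[int]]:
    """S1032: map each DSL token to a one-hot row of length len(vocab)."""
    tokens = dsl.replace("{", " { ").replace("}", " } ").replace(",", " ").split()
    return [[1 if tok == v else 0 for v in vocab] for tok in tokens]

config = {"type": "Form", "children": [{"type": "Button", "text": "submit"}]}
dsl = json_to_dsl(config)
vocab = ["Form", "Button", "children", "text", "submit", "{", "}"]
vectors = dsl_to_one_hot(dsl, vocab)
print(dsl)           # Form{children{Button{text submit}}}
print(len(vectors))  # one row per DSL token
```

The model's supported dimension corresponds to len(vocab); a production encoder would build the vocabulary from the whole DSL grammar rather than a hand-picked list.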
Preferably, S1031 includes:
S1031-1, training on natural language with a large-scale pre-trained natural language model to obtain analysis and understanding information of the natural language;
S1031-2, based on the analysis and understanding information of the natural language, taking the domain-specific language as the prediction target and predicting the corresponding domain-specific language from the input natural language.
Preferably, generating the JSON data of the corresponding form with the low-code form editor in S103 includes:
S1033, quickly constructing a zero-code Web form using a visual interface;
S1034, directly generating HTML code or JSON data from the constructed Web form; the generated HTML code or JSON data is combined with a renderer to dynamically generate Web forms.
Preferably, the domain-specific language includes a form domain-specific language;
the form domain-specific language is constructed as follows:
for the specific application scenario of Web forms, the form domain-specific language is adopted as the finetune training target of the model;
the basic syntax of the domain-specific language is: type{attr0 val0, attr1 val1, children{type1{attr2 val2}}}; where type refers to the type of the currently generated form element.
Preferably, attr and val are the attribute name and attribute value of the currently generated form element, respectively; when attr equals children, the value val corresponding to attr is the set of child elements recursively contained by the current element;
continuous val values are discretized, thereby reducing algorithm complexity.
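A minimal sketch of the discretization idea, assuming an illustrative width attribute and invented bucket boundaries; the real attribute set and buckets are not specified in the source:

```python
# Discretizing a continuous attribute value (here a hypothetical width in
# pixels) into a few buckets, so the DSL vocabulary stays finite.
# Bucket boundaries are illustrative assumptions.

def discretize_width(px: float) -> str:
    buckets = [(120, "narrow"), (360, "medium"), (720, "wide")]
    for upper, label in buckets:
        if px <= upper:
            return label
    return "full"

print(discretize_width(100))   # narrow
print(discretize_width(500))   # wide
print(discretize_width(1200))  # full
```

With continuous values bucketed this way, every val can be a token in the one-hot vocabulary instead of an unbounded number.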
Preferably, dynamically generating the Web form in S400 includes:
S401, parsing the JSON form configuration information, which comprises the header and the list part of the Web form;
S402, obtaining the detailed information of the Web form's elements, dynamically generating the form elements, adding them to the Web page, and dynamically generating the header;
S403, when generating the list of the Web form, the engine adds the data fetched from the database to the corresponding fields of the Table in the Web page, dynamically generating the list.
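Steps S401 to S403 can be sketched as follows; the configuration schema, field names, and the stand-in for database records are assumptions for illustration:

```python
# Sketch of S401-S403: parse JSON form configuration into a header and a
# list (table) part, then fill the table from fetched records.
import json

config_text = '''{
  "header": {"title": "Orders"},
  "list": {"fields": ["id", "item"]}
}'''

def generate_table(config: dict, rows: list[dict]) -> str:
    """Builds the header (S402) and the data list (S403) as HTML."""
    fields = config["list"]["fields"]
    head = "".join(f"<th>{f}</th>" for f in fields)
    body = "".join(
        "<tr>" + "".join(f"<td>{row[f]}</td>" for f in fields) + "</tr>"
        for row in rows)
    return (f'<h1>{config["header"]["title"]}</h1>'
            f"<table><tr>{head}</tr>{body}</table>")

rows = [{"id": 1, "item": "pen"}]  # stands in for database records
html = generate_table(json.loads(config_text), rows)
print(html)
```

The engine described in S403 would fetch `rows` from the database at render time; here the records are inlined to keep the sketch self-contained.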
Preferably, step S1031-2 includes:
S1031-2-1, performing predictive translation iteratively in a mask-predict manner;
S1031-2-2, to predict the target sentence length, adding a special token for length prediction to the encoder, and predicting the target sentence length from the hidden vector the encoder outputs for that token;
S1031-2-3, adding the negative log-likelihood loss of the length prediction to the total loss as the final loss function of the whole model;
S1031-2-4, at decoding time, decoding multiple candidate target lengths in parallel, and selecting the predicted translation with the highest probability among the iteratively predicted translations as the final result.
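A toy sketch of this mask-predict decoding with parallel candidate lengths; the stand-in model, its scores, and the re-masking schedule are invented for illustration and only mimic the control flow, not a real non-autoregressive translator:

```python
# Toy mask-predict decoding (S1031-2-1..4): decode several candidate
# target lengths, iteratively re-predict the least confident positions,
# and keep the highest-probability candidate.
import math

def toy_model(tokens):
    """Stand-in for the model: returns (token, prob) per position."""
    vocab = ["Form", "{", "}", "Button"]
    return [(vocab[i % len(vocab)], 0.9 - 0.05 * i) for i in range(len(tokens))]

def mask_predict(length, iterations=2):
    tokens = ["<mask>"] * length
    probs = [0.0] * length
    for t in range(iterations):
        preds = toy_model(tokens)
        # First pass predicts everything; later passes re-predict only the
        # lowest-confidence positions (a shrinking re-masking schedule).
        n_mask = max(1, length * (iterations - t) // iterations) if t else length
        for i in sorted(range(length), key=lambda i: probs[i])[:n_mask]:
            tokens[i], probs[i] = preds[i]
    # Length-normalized log-probability ranks the candidate lengths.
    score = sum(math.log(p) for p in probs) / length
    return tokens, score

candidates = {L: mask_predict(L) for L in (3, 4, 5)}
best_len = max(candidates, key=lambda L: candidates[L][1])
print(best_len, candidates[best_len][0])
```

In the actual scheme the candidate lengths come from the encoder's length-prediction token (S1031-2-2) rather than a fixed range.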
Compared with the prior art, the invention has the following advantages:
the invention provides a method for generating a Web intelligent form based on natural language, which comprises the following steps: constructing a fine-tuned natural language understanding and specific domain language generating module; inputting the form function description of the natural language into a fine-tuned natural language understanding and specific domain language generating module; decoding the model output using a domain-specific language decoding module; and inputting the decoded JSON form configuration information into a low-code form editing module, and automatically generating a Web form. The deep learning technology is combined with the field specific language to efficiently and accurately generate the Web form, and the low-code drag form editor is combined to reduce the construction difficulty of the training sample, so that continuous training of the model is facilitated. The model generation result is convenient to be finely adjusted manually. Has better industrial use value.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
The technical scheme of the invention is further described in detail through the drawings and the embodiments.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention. In the drawings:
FIG. 1 is a flow chart of a method for generating a Web intelligent form based on natural language in an embodiment of the invention;
FIG. 2 is a flowchart of a method for constructing the fine-tuned natural language understanding and domain-specific language generation module in an embodiment of the present invention;
FIG. 3 is a flowchart of the encoding method of the domain-specific language encoding module in an embodiment of the present invention.
Detailed Description
The preferred embodiments of the present invention will be described below with reference to the accompanying drawings, it being understood that the preferred embodiments described herein are for illustration and explanation of the present invention only, and are not intended to limit the present invention.
The embodiment of the invention provides a method for generating a Web intelligent form based on natural language; referring to FIG. 1, the method comprises the following steps:
S100, constructing a fine-tuned natural language understanding and domain-specific language generation module;
S200, inputting the natural-language description of the form's functions into the fine-tuned module;
S300, decoding the model output with the domain-specific language decoding module;
S400, inputting the decoded JSON form configuration information into the low-code form editing module, which automatically generates the Web form.
The working principle of this technical scheme is as follows: first, the fine-tuned natural language understanding and domain-specific language generation module is constructed; second, the natural-language description of the form's functions is input into the fine-tuned module; finally, the model output is decoded with the domain-specific language decoding module, and the decoded JSON form configuration information is input into the low-code form editing module, which automatically generates the Web form.
According to this embodiment, analysis and understanding of natural language are achieved through a pre-trained neural-network language model, and the corresponding Web intelligent form is predicted and generated. The method further includes a domain-specific language for Web forms; the scheme is carried out by a natural-language-understanding-based form domain-specific language generation module, a domain-specific language encoding module, a domain-specific language decoding module, and a low-code form editing module.
Form domain-specific language: for the specific application scenario of Web forms, a domain-specific language is proposed as the finetune training target of the model, which reduces the model's training search space, improves its accuracy, and lowers the training difficulty. The basic syntax of the language is: type{attr0 val0, attr1 val1, children{type1{attr2 val2}}}, where type is the type of the currently generated form element, drawn from a list such as:
Form      | form master container
FormItem  | form item
Button    | …
TextInput | …
…         | …
attr and val are the attribute name and attribute value, respectively, of the currently generated form element; when attr equals children, its value is the set of child elements recursively contained by the current element. Continuous val values are discretized, which reduces algorithm complexity.
The low-code form editing module: this module uses a visual interface to realize quick zero-code construction of Web forms; the constructed Web form can directly generate HTML code or JSON data, and dynamic generation of the form is realized in combination with a renderer. In this patent the module is used in two stages: in the training stage, forms are dragged together in zero-code fashion to generate training data; in the production stage, the form is reconstructed from the prediction produced by the model, and after reconstruction a human can seamlessly step in to make zero-code adjustments to the resulting form. This greatly improves the usability of the method in real production environments.
Natural language understanding and domain-specific language generation module: this module is based on a large-scale pre-trained natural language model (GPT-3) and takes the domain-specific language proposed herein as its prediction target, realizing the function of predicting the corresponding domain-specific language from the input natural language.
The domain-specific language encoding module converts the JSON form configuration generated by the low-code form editing module into the domain-specific language, and then into one-hot vectors of the dimensions supported by the model.
The domain-specific language decoding module decodes the domain-specific language produced by the natural-language-understanding-based form domain-specific language generation module into JSON form configuration information that the low-code form module can recognize.
Training stage: a manually written natural language description, combined with the text of the requirement documents of existing form applications, serves as the input X; the corresponding form is built by manual drag-and-drop in the low-code form editor; and the domain-specific language one-hot vector obtained by encoding the form's JSON data with the encoding module serves as the prediction target for fine-tuning the GPT-3 model.
Use stage: the natural-language description of the form's functions is input into the fine-tuned natural language understanding and domain-specific language generation module, the model output is decoded with the domain-specific language decoding module, and the decoded JSON is input into the low-code form editing module to automatically generate the Web form. Furthermore, the generated form can be adjusted through the low-code form editing module, and the adjusted result is used as a training target for continuous training of the model.
The beneficial effects of this technical scheme are as follows: deep learning is combined with a domain-specific language to generate Web forms efficiently and accurately, and the low-code drag-and-drop form editor lowers the difficulty of constructing training samples, which facilitates continuous training of the model. The model's output is also easy to fine-tune manually. The method therefore has good industrial value.
In another embodiment, referring to FIG. 2, step S100 includes:
S101, manually writing a natural language description and combining it with the text of the requirement documents of existing form applications as the input of the training model;
S102, constructing the corresponding form by manual drag-and-drop in a low-code form editor;
S103, encoding the JSON data of the corresponding form generated by the low-code form editor with the domain-specific language encoding module to obtain a domain-specific language one-hot vector;
S104, fine-tuning the language model with the domain-specific language one-hot vector as the prediction target, forming the fine-tuned natural language understanding and domain-specific language generation module; the language model includes a GPT-3 model.
The working principle of this technical scheme is as follows: first, a manually written natural language description, combined with the text of the requirement documents of existing form applications, serves as the input of the training model; second, the corresponding form is constructed by manual drag-and-drop in the low-code form editor, and the JSON data of that form is encoded by the domain-specific language encoding module into a domain-specific language one-hot vector; finally, the language model is fine-tuned with the domain-specific language one-hot vector as the prediction target, forming the fine-tuned natural language understanding and domain-specific language generation module; the language model includes a GPT-3 model.
The beneficial effects of this technical scheme are as follows: because the training inputs are written in natural language and the target forms are dragged together in the low-code editor, training samples can be constructed without hand-writing code, and the encoded domain-specific language one-hot vectors give the fine-tuned GPT-3 model a compact, well-defined prediction target.
In another embodiment, step S400 is further followed by:
S500, adjusting the generated Web form through the low-code form editing module, and taking the adjusted result as a training target for continuous training of the model.
The working principle of this technical scheme is as follows: the generated Web form is adjusted through the low-code form editing module, and the adjusted result is used as a training target for continuous training of the model.
The beneficial effects of this technical scheme are as follows: manual adjustments made to the generated Web form in the low-code form editing module flow back as training targets, enabling continuous training of the model in the course of normal use.
In another embodiment, referring to FIG. 3, obtaining the domain-specific language one-hot vector by encoding with the domain-specific language encoding module in S103 includes:
S1031, converting the JSON form configuration information generated by the low-code form editing module into the domain-specific language;
S1032, converting the domain-specific language into one-hot vectors of the dimensions supported by the model.
The working principle of this technical scheme is as follows: the domain-specific language one-hot vector is obtained by encoding with the domain-specific language encoding module, which includes converting the JSON form configuration information generated by the low-code form editing module into the domain-specific language, and then converting the domain-specific language into one-hot vectors of the dimensions supported by the model.
The beneficial effects of this technical scheme are as follows: in the natural-language domain, text can be segmented into words, and all words after segmentation form the total feature set, over which one-hot encoding is performed, so that each word becomes a one-hot vector. Each sentence of the text is likewise segmented, and after segmentation the words serve as the representation of the sentence; a sentence is thus a two-dimensional vector whose number of rows is the number of words it contains, and an article can further be represented as a composition of such one-hot vectors.
In another embodiment, S1031 includes:
S1031-1, training on natural language with a large-scale pre-trained natural language model to obtain analysis and understanding information of the natural language;
S1031-2, based on the analysis and understanding information of the natural language, taking the domain-specific language as the prediction target and predicting the corresponding domain-specific language from the input natural language.
The working principle of this technical scheme is as follows: a large-scale pre-trained natural language model is trained on natural language to obtain analysis and understanding information of the natural language; based on this information, the domain-specific language is taken as the prediction target, and the corresponding domain-specific language is predicted from the input natural language.
The beneficial effects of this technical scheme are as follows: training on natural language with a large-scale pre-trained model yields analysis and understanding information of the natural language, from which the corresponding domain-specific language can be predicted for any input natural language.
In another embodiment, and still referring to FIG. 3, generating the JSON data of the corresponding form with the low-code form editor in S103 includes:
S1033, quickly constructing a zero-code Web form using a visual interface;
S1034, directly generating HTML code or JSON data from the constructed Web form; the generated HTML code or JSON data is combined with a renderer to dynamically generate Web forms.
The working principle of this technical scheme is as follows: generating the JSON data of the corresponding form with the low-code form editor includes quickly constructing a zero-code Web form using a visual interface, directly generating HTML code or JSON data from the constructed Web form, and combining the generated HTML code or JSON data with a renderer to dynamically generate Web forms.
The beneficial effects of this technical scheme are as follows: the low-code form editing module serves two stages. In the training stage, forms are dragged together in zero-code fashion to generate training data; in the production stage, the form is reconstructed from the prediction produced by the model, and after reconstruction a human can seamlessly step in to make zero-code adjustments to the resulting form. This greatly improves the usability of the method in real production environments.
In another embodiment, the domain-specific language includes a form domain-specific language;
the form domain-specific language is constructed as follows:
for the specific application scenario of Web forms, the form domain-specific language is adopted as the finetune training target of the model;
the basic syntax of the domain-specific language is: type{attr0 val0, attr1 val1, children{type1{attr2 val2}}}; where type refers to the type of the currently generated form element.
The working principle of this technical scheme is as follows: the domain-specific language includes a form domain-specific language; for the specific application scenario of Web forms, it is adopted as the finetune training target of the model, and its basic syntax is type{attr0 val0, attr1 val1, children{type1{attr2 val2}}}, where type refers to the type of the currently generated form element.
The beneficial effects of this technical scheme are as follows: the form domain-specific language ties the model's output directly to form structure. Common form elements include: form, which defines a form for user input; fieldset, which defines a field, i.e. an input area with a text border; legend, which defines the title of the field, i.e. the text on the border; label, which defines a label for a control, such as the text before an input box, used to associate the user's selection; input, which defines an input field and is the most commonly used, with a type attribute that can be set to give it different functions; textarea, which defines a text area (a multi-line input control) that by default can be resized by mouse drag; button, which defines a button; select, which defines a selection list, i.e. a drop-down list; and option, which defines an option in the drop-down list.
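A minimal sketch mapping DSL element types to the HTML form elements enumerated above; the type names and tag choices are illustrative assumptions:

```python
# Illustrative mapping from DSL element types to common HTML form
# elements; both the type names and the tag choices are assumptions.
TAG_FOR_TYPE = {
    "Form": "form", "FormItem": "fieldset", "Label": "label",
    "TextInput": "input", "TextArea": "textarea",
    "Button": "button", "Select": "select", "Option": "option",
}

def element_html(el_type: str, text: str = "") -> str:
    tag = TAG_FOR_TYPE[el_type]
    if tag == "input":
        return '<input type="text">'  # input is a void element
    return f"<{tag}>{text}</{tag}>"

print(element_html("Button", "Submit"))  # <button>Submit</button>
print(element_html("TextInput"))         # <input type="text">
```

A full renderer would also emit the attr/val pairs of each DSL node as HTML attributes.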
In another embodiment, attr and val are the attribute and attribute value of the currently generated form element, respectively, and when attr is equal to child ren, the attribute value val corresponding to attr is a child element recursively contained by the current element;
and for the continuous val values, discretizing the continuous val values, so that the algorithm complexity is reduced.
The working principle of the technical scheme is as follows: the scheme adopted by the embodiment is that attr and val are respectively the attribute and attribute value of the currently generated form element, and when attr is equal to child ren, the attribute value val corresponding to attr is a sub-element recursively contained by the current element; and for the continuous val values, discretizing the continuous val values, so that the algorithm complexity is reduced.
The beneficial effects of the technical scheme are as follows: by adopting the scheme provided by the embodiment, attr and val are respectively the attribute and attribute value of the currently generated form element; when attr equals children, the attribute value val corresponding to attr is a child element recursively contained by the current element; and continuous val values are discretized, thereby reducing the algorithm complexity.
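The discretization of continuous val values described above can be sketched as equal-width bucketing into a small, fixed label vocabulary; the range, bucket count, and "binN" labels are illustrative assumptions:

```python
# Equal-width bucketing of a continuous attribute value (e.g. a width in
# percent) into a fixed label vocabulary, so the model predicts one of a
# small number of symbols instead of an arbitrary real number.
# The range [0, 100], the 10 buckets, and the "binN" labels are assumptions.

def discretize(value, lo=0.0, hi=100.0, bins=10):
    """Clamp `value` into [lo, hi] and map it onto one of `bins` labels."""
    value = max(lo, min(hi, float(value)))
    idx = min(int((value - lo) / (hi - lo) * bins), bins - 1)
    return f"bin{idx}"

print(discretize(55.0))  # → bin5
```

Shrinking the value space this way keeps the one-hot vocabulary of the DSL small, which is the complexity reduction the scheme refers to.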
In another embodiment, dynamically generating the Web form in S400 includes:
s401, analyzing JSON form configuration information, wherein the configuration information comprises a header and a list part of a Web form;
s402, obtaining detailed information of the form elements of the Web form, dynamically generating form elements, adding the form elements into the Web page, and dynamically generating a header;
s403, when the list of the Web form is generated, the engine adds the data acquired from the database to the corresponding fields in the Table of the Web page, and the list is dynamically generated.
The working principle of the technical scheme is as follows: in the scheme adopted by the embodiment, the dynamic generation of the Web form comprises the following steps: analyzing the JSON form configuration information, wherein the configuration information comprises the header and list parts of the Web form; obtaining detailed information of the form elements of the Web form, dynamically generating the form elements, adding them into the Web page, and dynamically generating the header; and, when the list of the Web form is generated, the engine adds the data acquired from the database to the corresponding fields in the Table of the Web page to dynamically generate the list.
The generation of the form and the presentation of the final page are both based on the configuration information in the configuration file, and the form is composed of form elements, so that the configuration file needs to be capable of accurately and completely describing the form elements in the Web form.
When the form configuration file is analyzed, each Web form corresponds to two XML-format configuration files, which describe the header and list parts of the form respectively; the form elements with a data acquisition function are bound to the fields of the corresponding Table in the database, and the columns in the Table are bound to the corresponding fields in the database.
The beneficial effects of the technical scheme are as follows: by adopting the scheme provided by the embodiment, repetitive and inefficient work can be reduced and automatic generation of the Web form is realized, avoiding the large amount of repetitive labor that changes of the data structure, and the resulting changes of the Web form, would otherwise impose on system development and maintenance personnel.
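The dynamic generation steps above (parse the JSON configuration, build the header, fill the list from database rows) can be sketched as follows; the configuration layout (a "header" list of {label, field} entries) and the render_form() helper are illustrative assumptions, not the exact engine of the scheme:

```python
import json

# Hypothetical form-generation engine for steps S401-S403.

def render_form(config_json, rows):
    cfg = json.loads(config_json)   # S401: parse the JSON form configuration
    # S402: dynamically generate the header from the configured form elements.
    header = "".join(f"<th>{col['label']}</th>" for col in cfg["header"])
    # S403: fill the list part with the rows acquired from the database.
    body = ""
    for row in rows:
        cells = "".join(f"<td>{row.get(col['field'], '')}</td>" for col in cfg["header"])
        body += f"<tr>{cells}</tr>"
    return f"<table><thead><tr>{header}</tr></thead><tbody>{body}</tbody></table>"

config = json.dumps({"header": [{"label": "Name", "field": "name"},
                                {"label": "Age", "field": "age"}]})
print(render_form(config, [{"name": "Alice", "age": 30}]))
```

Because the page is produced entirely from the configuration, a change in the data structure only requires regenerating the configuration, not hand-editing the page.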
In another embodiment, the S1031-2 includes:
s1031-2-1, performing predictive translation in an iterative mode by adopting a mask-predictive mode;
s1031-2-2, setting a predicted target sentence length, adding a special token for predicting the target sentence length to the encoder input, and predicting the target sentence length by using the hidden vector the encoder outputs for the special token;
s1031-2-3, adding the negative log-likelihood loss of the length prediction result to the total loss function as the final loss function of the whole model;
s1031-2-4, simultaneously decoding a plurality of candidate lengths of the predicted target length during decoding, and selecting, among the iterative predictive translations, the predictive translation result with the highest probability as the final result.
The working principle of the technical scheme is as follows: the scheme adopted by the embodiment performs predictive translation in an iterative mask-predict manner; a predicted target sentence length is set, a special token for predicting the target sentence length is added to the encoder input, and the target sentence length is predicted from the hidden vector the encoder outputs for this token; the negative log-likelihood loss of the length prediction result is added to the total loss function as the final loss function of the whole model; and, during decoding, a plurality of candidate target lengths are decoded simultaneously, and the predictive translation result with the highest probability is selected as the final result.
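The iterative mask-predict decoding described above can be sketched with a toy loop; predict_fn stands in for the real model, and the linear re-masking schedule is a common choice and an assumption of this sketch, not the scheme's exact schedule:

```python
MASK = "<mask>"

# Toy mask-predict loop: start from a fully masked target of the predicted
# length, predict every position in parallel, and at each iteration keep the
# most confident tokens while re-masking the rest.
# predict_fn(tokens) must return a (token, confidence) pair per position.

def mask_predict(predict_fn, length, iterations=4):
    tokens = [MASK] * length
    for t in range(iterations, 0, -1):
        preds = predict_fn(tokens)                 # predict all positions at once
        tokens = [tok for tok, _ in preds]
        n_mask = (length * (t - 1)) // iterations  # linearly fewer masks each round
        worst = sorted(range(length), key=lambda i: preds[i][1])[:n_mask]
        for i in worst:                            # re-mask the least confident tokens
            tokens[i] = MASK
    return tokens
```

In the full scheme, several candidate target lengths would each be decoded this way, and the candidate with the highest overall probability kept as the final translation.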
It should be noted that the general loss function formula is as follows:
wherein T represents the total loss function; x_r represents the result of randomly masking the source sentence; y_r represents the result of masking the prediction target; x_t^r represents the t-th word in the source sentence; y_t^r represents the t-th word in the target sentence; N represents the total number of words in the source sentence; M represents the number of masked words in the target sentence; C_y(g) represents the number of times the set g appears in the target sentence; C_{y_m}(g) represents the number of times the set g appears among the masked words y_m; α_1, the first weight hyperparameter controlling the loss function, can be set to 0.01; and α_2, the second weight hyperparameter controlling the loss function, can be set to 0.01.
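As a hedged numerical sketch of step S1031-2-3: the decomposition below, into a masked-token negative log-likelihood plus a weighted negative log-likelihood of the length prediction, is an assumption consistent with the description rather than the exact formula; alpha mirrors the 0.01 weight hyperparameters mentioned above:

```python
import math

# Hedged sketch of the final loss. token_probs are the model's probabilities
# of the gold tokens at the masked target positions; length_prob is the
# model's probability of the gold target length.

def total_loss(token_probs, length_prob, alpha=0.01):
    nll_tokens = -sum(math.log(p) for p in token_probs)  # masked-token NLL
    nll_length = -math.log(length_prob)                  # length-prediction NLL
    return nll_tokens + alpha * nll_length

print(total_loss([0.9, 0.8], 0.7))
```

The small alpha keeps the length term from dominating the token-level term during training.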
The beneficial effects of the technical scheme are as follows: by adopting the scheme provided by the embodiment, the traditional encoding and decoding technique is optimized; blanks at the masked positions can be filled in through the mask-predict technique, while the problem of poor translation quality caused by repeated translation is avoided.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.
Claims (10)
1. A method for generating a Web smart form based on natural language, comprising:
s100, constructing a fine-tuned natural language understanding and specific domain language generating module;
s200, inputting the form function description of the natural language into a fine-tuned natural language understanding and specific domain language generating module;
s300, decoding the model output by using a domain specific language decoding module;
s400, inputting the decoded JSON form configuration information into a low-code form editing module, and automatically generating a Web form.
2. The method for generating a Web smart form based on natural language as claimed in claim 1, wherein said S100 comprises:
s101, manually writing natural language descriptions, combined with the requirement-document text of existing form applications, as the input of the training model;
s102, constructing a corresponding form by manual dragging through a low-code form editor;
s103, encoding the JSON data of the corresponding form generated by the low-code form editor through a domain-specific language encoding module to obtain a domain-specific language one-hot vector;
s104, fine-tuning the language model with the domain-specific language one-hot vector as the prediction target to form the fine-tuned natural language understanding and specific domain language generating module; the language model includes a GPT3 model.
3. The method for generating a Web smart form based on natural language as claimed in claim 2, further comprising, after step S400:
s500, adjusting the generated Web form through a low-code form editing module; and taking the adjusted result as a training target for continuous training of the training model.
4. The method for generating a Web intelligent form based on natural language according to claim 2, wherein the step S103 of encoding by the domain-specific language encoding module to obtain a domain-specific language one-hot vector comprises:
s1031, converting the JSON form configuration information generated by the low-code form editing module into the domain-specific language;
s1032, converting the domain-specific language into one-hot vectors of model support dimensions.
5. The method for generating a Web smart form based on natural language as claimed in claim 4, wherein said S1031 comprises:
s1031-1, training natural language by using a large-scale natural language pre-training model to obtain analysis and understanding information of the natural language;
s1031-2, based on analysis and understanding information of the natural language, using the domain-specific language as a prediction target, and predicting the corresponding domain-specific language according to the input natural language.
6. The method for generating Web intelligent forms based on natural language according to claim 2, wherein the JSON data of the corresponding form generated using the low-code form editor in S103 comprises:
s1033, quickly constructing a zero-code Web form by using the visual interface;
s1034, directly generating Html code or JSON data from the constructed Web form; the generated Html code or JSON data is used in conjunction with a renderer to dynamically generate the Web form.
7. The method of generating Web intelligent forms based on natural language of claim 2, wherein the domain-specific language comprises a form domain-specific language;
the form field specific language constitution mode comprises the following steps:
aiming at a specific application scene of a Web form, adopting a domain specific language of the form as a finetune training target of a model;
the basic syntax of the domain-specific language is: type{ attr0 val0, attr1 val1, children{ type1{ attr2 val2 } } }; where type refers to the type of the currently generated form element.
8. The method for generating a Web intelligent form based on natural language according to claim 7, wherein attr and val are respectively the attribute and attribute value of the currently generated form element, and when attr equals children, the attribute value val corresponding to attr is a child element recursively contained by the current element;
and continuous val values are discretized, thereby reducing the algorithm complexity.
9. The method for generating a Web intelligent form based on natural language according to claim 1, wherein the dynamically generating a Web form in S400 comprises:
s401, analyzing JSON form configuration information, wherein the configuration information comprises a header and a list part of a Web form;
s402, obtaining detailed information of the form elements of the Web form, dynamically generating form elements, adding the form elements into the Web page, and dynamically generating a header;
s403, when the list of the Web form is generated, the engine adds the data acquired in the database to the corresponding field in the Web page Table, and the list is dynamically generated.
10. The method for generating a Web smart form based on natural language as recited in claim 5, wherein S1031-2 comprises:
s1031-2-1, performing predictive translation in an iterative mode by adopting a mask-predictive mode;
s1031-2-2, setting a predicted target sentence length, adding a special token for predicting the target sentence length to the encoder input, and predicting the target sentence length by using the hidden vector the encoder outputs for the special token;
s1031-2-3, adding the negative log-likelihood loss of the length prediction result to the total loss function as the final loss function of the whole model;
s1031-2-4, simultaneously decoding a plurality of candidate lengths of the predicted target length during decoding, and selecting, among the iterative predictive translations, the predictive translation result with the highest probability as the final result.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310011891.XA CN116149631B (en) | 2023-01-05 | 2023-01-05 | Method for generating Web intelligent form based on natural language |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116149631A true CN116149631A (en) | 2023-05-23 |
CN116149631B CN116149631B (en) | 2023-10-03 |
Family
ID=86352043
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310011891.XA Active CN116149631B (en) | 2023-01-05 | 2023-01-05 | Method for generating Web intelligent form based on natural language |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116149631B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116931911A (en) * | 2023-06-15 | 2023-10-24 | 明物数智科技研究院(南京)有限公司 | Intelligent low-code application development platform and development method based on AIGC |
CN117873450A (en) * | 2024-01-16 | 2024-04-12 | 美林数据技术股份有限公司 | Front-end code design device and method based on large model |
CN118092908A (en) * | 2024-04-17 | 2024-05-28 | 浪潮软件股份有限公司 | Application program generation method and device based on large language model |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110032561A (en) * | 2019-01-28 | 2019-07-19 | 阿里巴巴集团控股有限公司 | Semantic-based list construction method and system |
CN112771530A (en) * | 2018-09-27 | 2021-05-07 | 谷歌有限责任公司 | Automatic navigation of interactive WEB documents |
CN113052262A (en) * | 2021-04-23 | 2021-06-29 | 深圳壹账通智能科技有限公司 | Form generation method and device, computer equipment and storage medium |
CN113791791A (en) * | 2021-09-01 | 2021-12-14 | 中国船舶重工集团公司第七一六研究所 | Business logic code-free development method based on natural language understanding and conversion |
CN114237610A (en) * | 2021-11-25 | 2022-03-25 | 武汉天恒信息技术有限公司 | Method, system and storage medium for generating webpage configuration information by low-code platform |
CN115202640A (en) * | 2022-07-26 | 2022-10-18 | 上海交通大学 | Code generation method and system based on natural semantic understanding |
Also Published As
Publication number | Publication date |
---|---|
CN116149631B (en) | 2023-10-03 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||