CN118132143B - Training method, mapping method and equipment for low-code component recognition model - Google Patents
Training method, mapping method and equipment for low-code component recognition model
- Publication number
- CN118132143B CN118132143B CN202410562006.1A CN202410562006A CN118132143B CN 118132143 B CN118132143 B CN 118132143B CN 202410562006 A CN202410562006 A CN 202410562006A CN 118132143 B CN118132143 B CN 118132143B
- Authority
- CN
- China
- Prior art keywords
- component
- sample
- low
- training
- design drawing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/70—Software maintenance or management
- G06F8/73—Program documentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/34—Graphical or visual programming
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/36—Software reuse
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/38—Creation or generation of source code for implementing user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/70—Software maintenance or management
- G06F8/71—Version control; Configuration management
Abstract
The invention provides a training method, a mapping method and equipment for a low-code component recognition model. The training method comprises: obtaining a UI design drawing sample data set and annotating each UI design drawing sample; obtaining the original data of a plurality of component samples from a component library and converting them to form a component library component sample data set, the conversion comprising: obtaining the original data of a component sample and all variant information corresponding to the component sample from the component library; storing the original data of the component sample and constructing a mapping relationship between the original data and the plurality of variant information items; rendering the original data of the component sample based on the constructed mapping relationship to obtain rendering preview images of all variants of the component sample, and splicing the rendering preview images to form a component sample image; and annotating the component sample image to form component library component sample data. The constructed neural network model is then trained on the UI design drawing sample data set and the component library component sample data set to obtain the low-code component recognition model.
Description
Technical Field
The application relates to the technical field of software development, in particular to a training method, a mapping method and equipment for a low-code component recognition model.
Background
A low-code platform converts work that previously required developers versed in code development into work a product operator can accomplish through a visual, interactive mode: pages and applications are built by dragging and dropping, skipping the underlying framework and the technical details it would otherwise entail, so that work closely tied to the business requirements can begin directly and value is delivered faster and more reliably. A User Interface (UI) is the medium for interaction and information exchange between a system and its users.
Currently, when a low-code platform is used for page development, front-end designers must manually annotate each component of a UI design drawing in the low-code editor and then replace each annotated component with a component from the component library to produce code. A single UI design drawing typically carries many components, and each component has multiple variants, so this manual annotation-and-replacement workflow consumes a great deal of a front-end designer's time, leading to high labor cost, low page development efficiency, and poor component matching precision.
Disclosure of Invention
To overcome these defects of the prior art, the invention provides a neural-network-based training method, mapping method and equipment for a low-code component recognition model.
In order to achieve the above object, the present invention provides a training method of a low-code component recognition model, comprising:
Obtaining a UI design drawing sample data set, wherein each UI design drawing sample is annotated with a bounding box at the target position of each component and with the category to which the bounding box belongs;
Obtaining original data of a plurality of component samples from a component library and converting the original data to form a component library component sample data set; the conversion of the sample data for each component library component includes: obtaining the original data of a component sample and all variant information corresponding to the component sample from a component library; storing the original data of the component sample, and constructing a mapping relation between the original data of the component sample and a plurality of variant information; rendering the original data of the component sample based on the constructed mapping relation to obtain rendering preview images of all variants under the component sample; splicing rendering preview graphs of all variants to form a component sample graph; marking all variant bounding boxes and the category to which the bounding boxes belong in the component sample graph to form component library component sample data;
Training the constructed neural network model according to the UI design drawing sample data set and the component library component sample data set to obtain a low-code component recognition model.
According to an embodiment of the invention, the information of all variants includes preview images of all variants and configuration items of all variants; constructing a mapping relationship between the raw data of the component sample and the plurality of variant information includes:
Establishing a one-to-many first mapping relationship between the original data of the component samples and the preview images of the plurality of variants, and establishing a second mapping relationship between the configuration items of all variants of each component sample and the preview images of all variants; and then rendering the original data of the corresponding component sample based on the constructed first mapping relation and second mapping relation.
According to one embodiment of the invention, after all the variant configuration items are obtained, the configuration items are stored in the file in which the original data of the component sample is located, and the second mapping relationship is then constructed.
According to one embodiment of the invention, the original data of the component sample is stored in the form of a JSON file; during rendering, a headless browser renders the original data on the server side, generating and exporting the rendering preview images of all variants of the component sample.
According to one embodiment of the invention, the UI design drawing sample data set is divided into a training set and a test set, and the component library component sample data set is used as a validation set. When training the constructed neural network model, iterative training is performed with the training set, model verification and adjustment are then performed with the validation set to prevent overfitting, and model evaluation is finally performed with the test set.
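As a sketch, the split described above might look like the following in Python; the 0.2 test ratio and the fixed seed are illustrative assumptions, not values from the source:

```python
import random

def split_ui_samples(samples, test_ratio=0.2, seed=42):
    """Shuffle the UI design drawing samples and split them into a training
    set and a test set; the component-library sample data set is used
    separately as the validation set."""
    rng = random.Random(seed)       # fixed seed makes the split reproducible
    shuffled = list(samples)
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_ratio)
    return shuffled[n_test:], shuffled[:n_test]  # (train, test)
```

The component-library set serves as the validation set directly and therefore needs no split of its own.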
According to an embodiment of the invention, the step of obtaining the UI plan sample dataset comprises:
Acquiring a plurality of UI design drawing samples;
Identifying the target position of each component in the UI design drawing sample, marking the target position with a bounding box, and assigning a category to the bounding box based on the type of the component at the target position, wherein each category has a unique flag;
preprocessing the annotated UI design drawing samples, based on the input format of the constructed neural network model, to form the UI design drawing sample data, wherein the preprocessing comprises one or more of image resizing, image graying, image binarization, image denoising and image cropping.
In another aspect, the present invention further provides a low code component mapping method, which includes:
acquiring a UI design diagram to be detected;
identifying the UI design drawing to be detected with a low-code component recognition model obtained by the training method above, marking a plurality of candidate components in the UI design drawing with bounding boxes, and, based on the types of the candidate components, marking the category of each bounding box and the confidence of the candidate component within it;
Determining a target component among the plurality of candidate components based on the confidence and the location and width-height information of each candidate component;
mapping each target component, according to the category of its bounding box, to the component of the same category in the low-code component library to implement coded component mapping.
According to an embodiment of the present invention, the step of determining the target component among the plurality of candidate components includes:
Based on a preset confidence threshold, reserving candidate components with confidence degrees greater than or equal to the confidence threshold to form initially selected candidate components;
The preliminary candidate components whose bounding boxes are contained or which overlap with other bounding boxes are retained based on the bounding box position and width-height of each preliminary candidate component to form a target component.
In another aspect, the present invention also provides a computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of any of the methods described above when executing the computer program.
In another aspect, the invention also provides a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of the method of any of the above.
In summary, the low-code component mapping method provided by the invention uses the trained low-code component recognition model to recognize an input UI design drawing, resolves a plurality of candidate components on the drawing, and marks the type of each candidate component; a target component is determined among the candidate components and mapped to the component of the same type in the component library, so that component mapping between the UI design drawing and the front-end code is completed automatically. This mapping approach greatly simplifies the original process of manually replacing components, reduces developer effort, and markedly improves page design efficiency; resolving candidate components with the recognition model also greatly improves the accuracy of component matching during mapping. To further improve matching precision, the model is trained not only on the UI design drawing sample data set but also on a component sample data set formed by obtaining the original data of a plurality of component samples and the variants of each sample from the component library and converting them. Training the constructed neural network model on both data sets avoids overfitting, greatly improves the accuracy of the resulting low-code component recognition model, and thereby enables high-precision matching during component mapping.
The foregoing and other objects, features and advantages of the invention will be apparent from the following more particular description of preferred embodiments, as illustrated in the accompanying drawings.
Drawings
Fig. 1 is a flowchart illustrating a low code component mapping method according to an embodiment of the invention.
Fig. 2 is a schematic flow chart of step S300 in fig. 1.
FIG. 3 is a flow chart of a method for training the low-code component recognition model of FIG. 1.
Fig. 4 is a schematic flow chart of step S10 in fig. 3.
Fig. 5 is a schematic flow chart of step S22 in fig. 3.
Fig. 6 is a schematic flow chart of step S30 in fig. 3.
Fig. 7 is a diagram illustrating an internal structure of a computer device according to an embodiment of the present invention.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
This embodiment provides a low-code component mapping method, as shown in Fig. 1, comprising: acquiring a UI design drawing to be detected (step S100); identifying the UI design drawing with a low-code component recognition model obtained by the training method described in this embodiment, marking a plurality of candidate components in the drawing with bounding boxes, and, based on the types of the candidate components, marking the category of each bounding box and the confidence of the candidate component within it (step S200); determining a target component among the plurality of candidate components based on the confidence and the position and width-height information of each candidate component (step S300); and, according to the category of each target component's bounding box, mapping the target component to the component of the same category in the low-code component library to implement coded component mapping (step S400).
The low-code component mapping method of this embodiment uses a trained low-code component recognition model to detect and recognize the UI design drawing supplied at the front end, determines the candidate components on the drawing that may be involved in interaction, marks the position of each candidate component with a bounding box, and assigns the bounding box a category based on the candidate component's type. Candidate components may be basic components such as icons, text, buttons, pictures, tables, mask layers, and pop-up layers; navigation components such as grids, navigation bars, tab bars, index bars, and pagers; input components such as radio buttons, check boxes, input boxes, forms, and selectors; or labels, list-style presentation components, feedback components, and the like. For example, if a detected candidate component is a table, the area where the table sits in the UI design drawing is selected with a bounding box and the bounding box is marked with the category flag "UI_TABLE_1"; when multiple table-class candidates are detected, each subsequent bounding box is marked with its own flag, such as "UI_TABLE_2", "UI_TABLE_3", and so on.
In the low-code component mapping method of this embodiment, step S200 outputs, for each candidate component, its location on the UI design drawing (i.e. its bounding box), the category flag of the bounding box characterizing the candidate component's type, and the confidence that the content of the bounding box is a UI component. Based on the confidence and the position and width-height information of each candidate component, step S300 screens the candidate components, rejecting those with low confidence or those violating UI component design rules, and thereby determines the target components. Specifically, as shown in Fig. 2, this comprises the following steps:
Step S301: based on a preset confidence threshold Cf0, retain the candidate components whose confidence is greater than or equal to Cf0 to form the initially selected candidates. Specifically, let Bf1, Bf2, …, Bfn denote the bounding boxes of the n candidate components determined on the UI design drawing by the detection and recognition of step S200. A rectangular coordinate system with its origin at the lower-left corner of the UI design drawing is constructed to fix the coordinates of each bounding box; the coordinates, width, height and confidence of Bf1 are (Xf1, Yf1, Wf1, Hf1, Cf1), those of Bf2 are (Xf2, Yf2, Wf2, Hf2, Cf2), and so on up to Bfn with (Xfn, Yfn, Wfn, Hfn, Cfn). Step S301 filters the candidates by comparing each bounding box's confidence against the preset threshold Cf0: if the i-th bounding box Bfi has confidence Cfi < Cf0 (1 ≤ i ≤ n), the candidate component within Bfi is eliminated. The candidates retained after this screening become the initially selected candidates.
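The confidence screening of step S301 can be sketched as follows; the dictionary-based representation of a candidate box (keys "conf" and "label") is an assumption for illustration:

```python
def filter_by_confidence(candidates, conf_threshold):
    """Step S301: keep only the candidate components whose confidence Cfi
    meets the preset threshold Cf0; the survivors become the initially
    selected candidates."""
    return [c for c in candidates if c["conf"] >= conf_threshold]
```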
Thereafter, step S302 is executed: according to the bounding-box position, width and height of each initially selected candidate, retain the candidates whose bounding boxes are contained by another box or overlap other boxes, to form the target components. Specifically, the bounding boxes of the initially selected candidates are compared pairwise to determine their mutual containment or overlap relationships. Taking the comparison of Bf1 and Bf2 as an example:
If Xf1 + Wf1 < Xf2, or Yf1 + Hf1 < Yf2, or Xf2 + Wf2 < Xf1, or Yf2 + Hf2 < Yf1, then Bf1 and Bf2 do not intersect; the two boxes can be assumed to hold different components or common resources, and the initially selected candidates within both Bf1 and Bf2 are retained.
If Xf1 < Xf2 and Xf1 + Wf1 > Xf2 + Wf2 and Yf1 < Yf2 and Yf1 + Hf1 > Yf2 + Hf2, then Bf1 contains Bf2. Since components are typically placed inside, and superimposed over, decoration graphics in UI design, the design rule deems the candidate within the containing box Bf1 a common resource rather than a UI design component, so the candidate within Bf1 is culled. Whether the candidate within Bf2 is a target component still cannot be decided at this point, because that depends on comparing Bf2 against the remaining bounding boxes, so the candidate within Bf2 is retained.
For any other positional relationship between Bf1 and Bf2, the two boxes overlap; it can be determined that the two boxes hold different components or common resources, and the initially selected candidates within Bf1 and Bf2 are retained. Although this embodiment judges the containment relationship of two bounding boxes from their positions on the UI design drawing, the invention is in no way limited thereto.
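The pairwise geometric rules above can be sketched as follows, assuming the same dictionary-based box representation (keys "x", "y", "w", "h"); per the stated design rule, only a box that strictly contains another is culled:

```python
def is_disjoint(a, b):
    # No intersection: the two boxes hold different components or common
    # resources, so both are kept.
    return (a["x"] + a["w"] < b["x"] or a["y"] + a["h"] < b["y"]
            or b["x"] + b["w"] < a["x"] or b["y"] + b["h"] < a["y"])

def contains(a, b):
    # a strictly contains b on both axes (the containment test of step S302).
    return (a["x"] < b["x"] and a["x"] + a["w"] > b["x"] + b["w"]
            and a["y"] < b["y"] and a["y"] + a["h"] > b["y"] + b["h"])

def prune_decoration_boxes(candidates):
    """Step S302: a box that fully contains another is treated as a
    decorative/background region per the design rule and culled; disjoint,
    overlapping, and contained boxes are all retained."""
    return [a for a in candidates
            if not any(a is not b and contains(a, b) for b in candidates)]
```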
After these pairwise containment judgments, the initially selected candidates that remain become the target components. Step S400 is then executed: according to the category of each target component's bounding box, the target component is mapped to the component of the same category in the low-code component library to implement coded component mapping. For example, suppose the initially selected candidate within bounding box Bf1 is a target component and the category flag of Bf1 is "UI_TABLE_1". The components in the component library are then indexed by the table category flag to retrieve the table component, mapping the target component within Bf1 to the table component in the component library. The code of the mapped table component is then fetched from the component library, realizing the component mapping between the UI design drawing and the front-end code.
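A minimal sketch of the category-indexed lookup of step S400; the flag naming scheme ("UI_TABLE_1" → library key "TABLE") and the dictionary-shaped component library are assumptions for illustration:

```python
def category_key(flag):
    """Strip the hypothetical 'UI_' prefix and instance suffix:
    'UI_TABLE_1' -> 'TABLE', 'UI_CHART_BAR_2' -> 'CHART_BAR'."""
    parts = flag.split("_")
    return "_".join(parts[1:-1])

def map_targets_to_library(targets, component_library):
    """Index the component library by category to fetch the code of the
    matching library component for each target component (step S400)."""
    return {t["flag"]: component_library[category_key(t["flag"])]
            for t in targets}
```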
This low-code component mapping method overcomes the drawback of existing low-code systems, in which components on the UI design drawing must be manually replaced with component-library components in the low-code editor during page design. Automatic component mapping between the UI design drawing and the front-end code greatly improves component matching precision and page output efficiency, so projects go online faster; it also reduces developer effort and simplifies the original manual-replacement process.
In the low-code component mapping method provided in this embodiment, the detection and identification of the input UI design drawing by the low-code component identification model in step S200 are performed to obtain candidate components, and this step provides a basis for the subsequent component mapping. For this purpose, the present embodiment provides a training method for a low-code component recognition model. As shown in fig. 3, the training method includes:
Step S10: obtain a UI design drawing sample data set, in which each UI design drawing sample is annotated with a bounding box at the target position of each component and with the category to which the bounding box belongs;
Step S20, obtaining original data of a plurality of component samples from a component library, and converting the original data to form a component sample data set of the component library; the conversion of the sample data for each component library component includes: obtaining the original data of the component sample and all variant information corresponding to the component sample from the component library (step S21); storing the original data of the component sample and constructing a mapping relationship between the original data of the component sample and the plurality of variant information (step S22); rendering the original data of the component sample based on the constructed mapping relationship to obtain rendering preview images of all variants under the component sample (step S23); concatenating the rendered preview images of all variants to form a component sample image (step S24); all variants' bounding boxes and the class to which the bounding boxes belong are labeled within the component sample graph to form component library component sample data (step S25).
And step S30, training the constructed neural network model according to the UI design drawing sample data set and the component library component sample data set to obtain a low-code component recognition model.
Although this embodiment is described with step S10 executed before step S20, the invention is in no way limited thereto. In other embodiments, steps S10 and S20 may be performed simultaneously, or step S20 may be performed before step S10.
As shown in fig. 4, the present embodiment provides a method for implementing step S10, which includes:
Step S11: obtain a plurality of UI design drawing samples. Specifically, designers' previous design drawings can be collected as UI design drawing samples, or design drawings can be crawled from major design websites as samples. Further, when the UI design drawing samples are acquired, the variants of each component in the samples under different application scenarios are acquired as well.
Step S12: identify the target position of each component in a UI design drawing sample, mark the target position with a bounding box, and assign a category to the bounding box based on the type of the component at that position, each category having a unique flag. Specifically, the position of each component is marked in the sample with bounding boxes using an annotation tool such as LabelImg or VGG Image Annotator (VIA). Each bounding box is assigned a category based on the type of component it holds, and each category receives a unique flag, such as S1UI_CHART_BAR_1 or S1UI_TABLE_1.
Step S13: preprocess the annotated UI design drawing samples, according to the input format of the constructed neural network model, to form the UI design drawing sample data. Preprocessing includes one or more of image resizing, image graying, image binarization, image denoising, and image cropping, to ensure a uniform input image format. Collected UI design drawing samples are generally large, so the input sample images are unified by resizing; this embodiment resizes every input UI design drawing sample to one fixed size, though the invention is in no way limited thereto. In other embodiments the input samples may be resized to other dimensions, and other preprocessing may also be applied to further facilitate the recognition training of the neural network model.
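Two of the preprocessing steps, graying and binarization, can be sketched in pure Python over nested pixel lists; the BT.601 luma weights and the 128 threshold are conventional choices, not values from the source:

```python
def to_grayscale(rgb_image):
    """Convert an RGB image (nested lists of (r, g, b) tuples) to grayscale
    using the ITU-R BT.601 luma weights."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in rgb_image]

def binarize(gray_image, threshold=128):
    """Binarize a grayscale image: pixels at or above the threshold map to
    255 (white), the rest to 0 (black)."""
    return [[255 if px >= threshold else 0 for px in row] for row in gray_image]
```

In practice an image library (e.g. Pillow or OpenCV) would perform these operations, along with resizing, over real image files.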
For step S20, step S21 is executed first: the original data of each component sample, together with all variant information corresponding to the component sample, is crawled from the component library using crawler technology.
Thereafter, step S22 is executed to organize and uniquely store the raw data of each component sample. Specifically, as shown in Fig. 5, step S221 is executed first: the original data of each component sample obtained in step S21 is organized into a JSON file, one JSON file per component sample, to implement unique storage of each sample's original data. For example, the original data of the first component sample goes into a file named after the component's UUID1, i.e. uuid1.json; the original data of the second component sample goes into uuid2.json; and so on. Storing the data in JSON provides the conditions for later rendering the component samples with a headless browser, though the invention is in no way limited thereto; in other embodiments, the raw data of each component sample may be uniquely stored in other formats. A mapping relationship is then constructed between the raw data of each component sample and its variant information. In this embodiment, the variant information of a component sample comprises the preview images of all variants and the configuration items of all variants. Step S222 separately organizes and stores all variant preview images corresponding to each component sample and establishes a one-to-many first mapping relationship between the raw data of the component sample and the preview images of its variants, i.e. the correspondence between variant preview images and the component sample. Step S223 organizes all variant configuration items of each component sample into the JSON file corresponding to that sample (e.g. all variant configuration items of the first component sample are organized into uuid1.json) and establishes a second mapping relationship between the configuration items of all variants of each component sample and the preview images of all variants, i.e. the correspondence between variant configuration items and variant preview images. The invention does not limit the storage location of the variant configuration items; in other embodiments they may be stored separately, with the second mapping relationship then established between the configuration items and the preview images of all variants.
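The per-sample JSON storage and the mapping construction of steps S221 to S223 might be sketched as follows; the record layout and the preview-path convention are assumptions for illustration:

```python
import json
import pathlib

def store_component_sample(raw_data, variants, out_dir):
    """Persist one component sample as <uuid>.json, folding every variant's
    configuration items into the same file and recording the mapping
    relations: raw data -> variant preview image paths (first mapping), and
    variant configuration -> preview image (second mapping)."""
    record = {
        "raw": raw_data,
        "variants": [
            {"config": v["config"],
             # one preview per variant under the sample's own directory;
             # this path convention is an assumption for illustration
             "preview": f"{raw_data['uuid']}/variant_{i}.png"}
            for i, v in enumerate(variants)
        ],
    }
    path = pathlib.Path(out_dir) / f"{raw_data['uuid']}.json"
    path.write_text(json.dumps(record, ensure_ascii=False))
    return path
```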
Then, step S23 is executed to render the raw data of the corresponding component sample based on the constructed first mapping relationship and second mapping relationship, so as to obtain rendered preview images of all variants under the component sample. Specifically, the raw data of the component sample is rendered on the server side through a headless browser such as Puppeteer or Selenium WebDriver, in combination with the Puppeteer API, with the width and height of the rendering window set to preset values; after rendering, the rendered preview images of all variants under the component sample are obtained. The rendered preview images of all variants are exported under the same directory, and each component sample corresponds to one directory. For example, the first component sample corresponds to the directory UUID1, so the rendered preview images of all variants obtained after rendering the raw data of the first component sample are stored under the directory UUID1; similarly, the rendered preview images of all variants obtained after rendering the raw data of the i-th component sample are stored under the directory UUIDi.
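A minimal sketch of this rendering step is shown below, using Playwright's Python API as a stand-in for the Puppeteer/Selenium WebDriver setup named in the text. The per-UUID directory layout mirrors the description; the viewport size, file names, and the `previews` root are illustrative assumptions rather than the patent's implementation.

```python
from pathlib import Path

def variant_output_paths(sample_uuid: str, n_variants: int,
                         root: str = "previews") -> list[Path]:
    """Each component sample gets its own directory (e.g. previews/UUID1/)
    holding one screenshot per variant, mirroring the per-UUID layout."""
    out_dir = Path(root) / sample_uuid
    return [out_dir / f"variant_{i}.png" for i in range(n_variants)]

def render_sample(sample_uuid: str, variant_htmls: list[str],
                  root: str = "previews") -> list[Path]:
    """Render each variant's markup headlessly and save one screenshot per
    variant. Requires `pip install playwright` and `playwright install chromium`."""
    # Imported lazily so the path helper above stays usable without a browser.
    from playwright.sync_api import sync_playwright
    paths = variant_output_paths(sample_uuid, len(variant_htmls), root)
    paths[0].parent.mkdir(parents=True, exist_ok=True)
    with sync_playwright() as p:
        browser = p.chromium.launch()
        # Fixed rendering-window width and height, as the description requires.
        page = browser.new_page(viewport={"width": 800, "height": 600})
        for html, path in zip(variant_htmls, paths):
            page.set_content(html)       # load the variant's rendered markup
            page.screenshot(path=str(path))
        browser.close()
    return paths
```

Reusing one page with a fixed viewport keeps every variant preview the same nominal size, which simplifies the splicing step that follows.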
Step S24 splices the rendered preview images of all variants under the same directory to form the corresponding component sample image. For example, the JSON file corresponding to the first component sample is UUID1.json, and the rendered-preview storage directory corresponding to the first component sample is UUID1; the ten variant rendered preview images under the UUID1 directory are spliced into the same image to form the component sample image corresponding to the first component sample, which is named UUID1.jpg. Similarly, the rendered preview images of all variants under each directory are spliced in turn to form the component sample image corresponding to each component sample.
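The splicing in step S24 can be sketched as follows, assuming Pillow as the imaging library and a simple horizontal strip layout; a grid layout would work equally well, and nothing here is prescribed by the patent.

```python
from PIL import Image

def stitch_previews(previews: list[Image.Image]) -> Image.Image:
    """Paste the variant previews side by side on one canvas whose height is
    that of the tallest preview and whose width is the sum of all widths."""
    width = sum(im.width for im in previews)
    height = max(im.height for im in previews)
    canvas = Image.new("RGB", (width, height), "white")
    x = 0
    for im in previews:
        canvas.paste(im, (x, 0))  # each variant keeps its own pixel extent
        x += im.width
    return canvas
```

Because each variant keeps its original extent on the canvas, the variant bounding boxes annotated in the next step can be computed directly from the running x offsets.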
Thereafter, step S25 is performed to label the position of each component in the component sample image in the form of bounding boxes, using an annotation tool such as LabelImg or VGG Image Annotator (VIA). Each bounding box is assigned a category based on the type of the component within it, and each category has a unique category label, so as to form component library component sample data. As previously described, a component may be a basic component such as an icon, text, button, picture, form, mask layer, or pop-up layer; a navigation component such as a grid, navigation bar, tab bar, index bar, or pager; an input component such as a radio button, check box, input box, form, or selector; or another label, list-type display component, feedback component, or the like. The category labels of the bounding boxes may be, for example, s2ui_chart_bar_1, s2ui_table_1, s2ui_text_1, etc. Steps S21 to S25 are executed repeatedly to convert the raw data and variant information of each component sample acquired from the component library into component library component sample data, finally forming the component library component sample data set.
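Tools such as LabelImg can export annotations in the YOLO text format that YOLOv3-style trainers consume. As an illustrative sketch, the helper below converts one pixel-space bounding box and its category label into such a line, using the standard YOLO normalization; the category list is an example drawn from the labels above, not the patent's full taxonomy.

```python
# Example category labels; index position doubles as the YOLO class id.
CATEGORIES = ["s2ui_chart_bar_1", "s2ui_table_1", "s2ui_text_1"]

def to_yolo_line(category: str, box: tuple[int, int, int, int],
                 img_w: int, img_h: int) -> str:
    """box is (x_min, y_min, x_max, y_max) in pixels; the output is the
    normalized 'class x_center y_center width height' annotation line."""
    x_min, y_min, x_max, y_max = box
    xc = (x_min + x_max) / 2 / img_w
    yc = (y_min + y_max) / 2 / img_h
    w = (x_max - x_min) / img_w
    h = (y_max - y_min) / img_h
    return f"{CATEGORIES.index(category)} {xc:.6f} {yc:.6f} {w:.6f} {h:.6f}"
```

One such line per labeled box, collected into a .txt file alongside each component sample image, yields training data in a format the detector can read directly.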
Step S30 trains the neural network model based on the acquired UI design drawing sample data set and the component library component sample data set to obtain a low-code component recognition model. In this embodiment, the UI design drawing sample data set is divided into a training set and a test set, and the component library component sample data set is used as a verification set. When training the constructed neural network model, the training set is used for iterative training, the verification set is used for model verification and adjustment during training to prevent overfitting, and the test set is finally used for model evaluation. Specifically, the convolutional neural network is YOLOv3 (You Only Look Once v3); the model is configured before training, and the training data path, verification data path, weight file path, and the like are specified in a configuration file. As shown in fig. 6, the specific training steps of step S30 include:
Step S31: in each training iteration, gradient descent optimization is performed using the training set;
Step S32: after every several iterations, the performance of the model is evaluated using the verification set, and the hyperparameters of the model are adjusted according to the verification-set performance;
Step S33: after training is completed, the performance of the model is finally evaluated using the test set;
Step S34: training is completed to obtain the low-code component recognition model.
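The S31-S34 loop can be sketched as follows: gradient descent on the training set, periodic validation checks that keep the best weights seen so far (a simple guard against overfitting), and one final evaluation on the held-out test set. A toy least-squares model stands in for YOLOv3 here; all hyperparameter values are arbitrary illustrations.

```python
import numpy as np

def train(X_tr, y_tr, X_val, y_val, lr=0.05, iters=300, check_every=20):
    rng = np.random.default_rng(0)
    w = rng.normal(size=X_tr.shape[1])
    best_w, best_val = w.copy(), np.inf
    for i in range(1, iters + 1):
        # S31: one gradient-descent step on the training set.
        grad = 2 * X_tr.T @ (X_tr @ w - y_tr) / len(y_tr)
        w -= lr * grad
        if i % check_every == 0:
            # S32: evaluate on the verification set; keep the best weights.
            val_loss = float(np.mean((X_val @ w - y_val) ** 2))
            if val_loss < best_val:
                best_val, best_w = val_loss, w.copy()
    return best_w  # S34: the model retained after training

def evaluate(w, X_te, y_te):
    # S33: final evaluation metric on the test set (mean squared error here).
    return float(np.mean((X_te @ w - y_te) ** 2))
```

In the patent's setting the validation data comes from a different distribution (rendered component samples) than the training data (UI design drawings), which is precisely what makes the validation check a meaningful overfitting guard.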
In the low-code component recognition model training method provided by this embodiment, the standardized component sample data set formed by rendering component code from the component library is used as the verification set, so that overfitting of the trained model is avoided while the performance of the model is greatly improved, providing the conditions for the subsequent low-code component mapping.
FIG. 7 illustrates an internal block diagram of a computer device in one embodiment. The computer device may in particular be a server. As shown in fig. 7, the computer device includes a processor, a memory, and a network interface connected by a system bus. The memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium of the computer device stores an operating system and may also store a computer program that, when executed by the processor, causes the processor to implement the low-code component recognition model training method. The internal memory may also store a computer program that, when executed by the processor, causes the processor to perform the low-code component recognition model training method or the low-code component mapping method.
It will be appreciated by those skilled in the art that the structure shown in FIG. 7 is merely a block diagram of some of the structures associated with the present inventive arrangements and is not limiting of the computer device to which the present inventive arrangements may be applied, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
In one embodiment, a computer device is provided that includes a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of the low-code component recognition model training method described above. The steps of the low-code component recognition model training method herein may be the steps of the low-code component recognition model training method of the various embodiments described above.
In one embodiment, a computer readable storage medium is provided, storing a computer program which, when executed by a processor, causes the processor to perform the steps of the low code component recognition model training method described above. The steps of the low-code component recognition model training method herein may be the steps in the low-code component recognition model training method of the various embodiments described above.
In one embodiment, a computer device is provided that includes a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of the low code component mapping method described above. The steps of the low code component mapping method herein may be the steps in the low code component mapping method of the various embodiments described above.
In one embodiment, a computer readable storage medium is provided, storing a computer program which, when executed by a processor, causes the processor to perform the steps of the low code component mapping method described above. The steps of the low code component mapping method herein may be the steps in the low code component mapping method of the various embodiments described above.
In summary, in the low-code component mapping method provided by the invention, the trained low-code component recognition model is used to recognize the input UI design drawing, resolve a plurality of candidate components on the UI design drawing, and mark the category of each candidate component; the target components are then determined among the plurality of candidate components and mapped to components of the same category within the component library, thereby automatically completing the component mapping between the UI design drawing and the front-end code. This mapping approach greatly simplifies the original process of manually replacing components, reduces the investment of developers, and greatly improves the efficiency of page design; in addition, resolving candidate components with the low-code component recognition model also greatly improves the accuracy of component matching during mapping. To further improve the matching accuracy, when training the low-code component recognition model, a UI design drawing sample data set is used as the basis, and the raw data of a plurality of component samples and the variants corresponding to each component sample are additionally obtained from the component library and converted to form a component sample data set. The constructed neural network model is trained with both the UI design drawing sample data set and the component sample data set to avoid overfitting during training, so that the accuracy of the trained low-code component recognition model is greatly improved, providing the conditions for high-accuracy matching in component mapping.
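The candidate-filtering stage summarized above (and detailed in claim 7) can be sketched as two passes: discard detections below a confidence threshold, then retain only candidates whose bounding box is contained in, or overlaps with, another shortlisted box. The threshold value, the (x_min, y_min, x_max, y_max) box format, and the dict keys are illustrative assumptions.

```python
def overlaps(a: tuple, b: tuple) -> bool:
    """True when axis-aligned boxes a and b intersect (containment included)."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def select_targets(candidates: list[dict], conf_threshold: float = 0.5) -> list[dict]:
    """candidates: dicts with 'box', 'category', 'confidence'.
    Pass 1 keeps confident detections; pass 2 keeps those whose box is
    contained in or overlaps another shortlisted box."""
    shortlisted = [c for c in candidates if c["confidence"] >= conf_threshold]
    return [c for c in shortlisted
            if any(overlaps(c["box"], o["box"]) for o in shortlisted if o is not c)]
```

Each surviving target's category label then indexes directly into the low-code component library to complete the mapping.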
Although the invention has been described with reference to the preferred embodiments, it should be understood that the invention is not limited thereto, but rather may be modified and varied by those skilled in the art without departing from the spirit and scope of the invention.
Claims (9)
1. A method of training a low-code component recognition model, comprising:
Obtaining a UI design drawing sample data set, wherein each UI design drawing sample is marked with a boundary box of the target position where a component is located and the category to which the boundary box belongs;
Obtaining original data of a plurality of component samples from a component library and converting the original data to form a component library component sample data set; the conversion of the sample data for each component library component includes: obtaining the original data of a component sample and all variant information corresponding to the component sample from a component library; storing the original data of the component sample, and constructing a mapping relation between the original data of the component sample and a plurality of variant information; rendering the original data of the component sample based on the constructed mapping relation to obtain rendering preview images of all variants under the component sample; splicing rendering preview graphs of all variants to form a component sample graph; marking all variant bounding boxes and the category to which the bounding boxes belong in the component sample graph to form component library component sample data;
Training the constructed neural network model according to the UI design drawing sample data set and the component library component sample data set to obtain a low-code component recognition model;
wherein the information of all variants includes preview images of all variants and configuration items of all variants; constructing a mapping relationship between the raw data of the component sample and the plurality of variant information includes:
Establishing a one-to-many first mapping relationship between the original data of the component samples and the preview images of the plurality of variants, and establishing a second mapping relationship between the configuration items of all variants of each component sample and the preview images of all variants; and then rendering the original data of the corresponding component sample based on the constructed first mapping relation and second mapping relation.
2. The training method of a low-code component recognition model according to claim 1, wherein after obtaining configuration items of all variants, the configuration items are stored in a file in which original data of the component sample is located; and then, constructing a second mapping relation.
3. The training method of a low-code component recognition model according to claim 1, wherein the raw data of the component sample is stored in the form of JSON file; and when in rendering, rendering the original data of the component sample at the server by adopting a headless browser, and generating and exporting rendering preview images of all variants under the component sample.
4. The training method of a low-code component recognition model according to claim 1, wherein the UI design drawing sample data set is divided into a training set and a test set, and the component library component sample data set is used as a verification set; when training the constructed neural network model, the training set is used for iterative training, the verification set is used for model verification and adjustment during training to prevent overfitting, and the test set is finally used for model evaluation.
5. The method of training a low-code component recognition model of claim 1, wherein the step of obtaining a UI design drawing sample dataset comprises:
Acquiring a plurality of UI design drawing samples;
Identifying the target position of the component in the UI design drawing sample, marking the target position in the form of a boundary box, and distributing the class to the boundary box based on the type of the component in the target position, wherein each class is provided with a unique mark;
Preprocessing the annotated UI design drawing sample to form UI design drawing sample data based on an input format of the constructed neural network model, the preprocessing including one or more steps of image resizing, image graying, image binarization, image denoising, image cropping.
6. A method of low code component mapping, comprising:
acquiring a UI design diagram to be detected;
Identifying the UI design diagram to be detected by using the low-code component identification model obtained by training the low-code component identification model by the training method of any one of claims 1-5, marking a plurality of candidate components in the UI design diagram in a boundary box mode, and marking the belonging category of each boundary box and the confidence of the candidate components in each boundary box based on the types of the candidate components;
Determining a target component among the plurality of candidate components based on the confidence and the location and width-height information of each candidate component;
According to the category of the boundary box of each target component, the target component is mapped to a component of the same category in the low-code component library to realize low-code component mapping.
7. The low code component mapping method of claim 6, wherein determining a target component among a plurality of candidate components comprises:
Based on a preset confidence threshold, reserving candidate components with confidence degrees greater than or equal to the confidence threshold to form initially selected candidate components;
The preliminary candidate components whose bounding boxes are contained or which overlap with other bounding boxes are retained based on the bounding box position and width-height of each preliminary candidate component to form a target component.
8. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1 to 7 when the computer program is executed.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410562006.1A CN118132143B (en) | 2024-05-08 | 2024-05-08 | Training method, mapping method and equipment for low-code component recognition model |
Publications (2)
Publication Number | Publication Date |
---|---|
CN118132143A | 2024-06-04
CN118132143B | 2024-08-02
Family
ID=91240778
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410562006.1A Active CN118132143B (en) | 2024-05-08 | 2024-05-08 | Training method, mapping method and equipment for low-code component recognition model |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118132143B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114185531A (en) * | 2021-12-17 | 2022-03-15 | 北京字节跳动网络技术有限公司 | Interface code generation method and device, computer equipment and storage medium |
CN117234505A (en) * | 2023-09-06 | 2023-12-15 | 中国平安财产保险股份有限公司 | Interactive page generation method, device, equipment and storage medium thereof |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111190600B (en) * | 2019-12-31 | 2023-09-19 | 中国银行股份有限公司 | Method and system for automatically generating front-end codes based on GRU attention model |
CN111475160A (en) * | 2020-03-13 | 2020-07-31 | 深圳壹账通智能科技有限公司 | Method and device for generating product page and computer equipment |
EP3916636A1 (en) * | 2020-05-27 | 2021-12-01 | Siemens Aktiengesellschaft | Method and systems for providing synthetic labelled training data sets and use of same |
CN114723988A (en) * | 2022-03-23 | 2022-07-08 | 深圳市东汇精密机电有限公司 | Image recognition training method and device, computer equipment and storage medium |
CN115775386A (en) * | 2022-11-30 | 2023-03-10 | 上海浦东发展银行股份有限公司 | User interface component identification method and device, computer equipment and storage medium |
CN117746117A (en) * | 2023-12-14 | 2024-03-22 | 阿里巴巴(中国)有限公司 | Variant image recognition and model training method thereof and electronic equipment |
Legal Events
Date | Code | Title | Description
---|---|---|---
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||