CN113778905B - UI design acceptance method, device, equipment and storage medium - Google Patents


Info

Publication number
CN113778905B
CN113778905B (application CN202111325486.2A)
Authority
CN
China
Prior art keywords
layer
processed
image
pixel point
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111325486.2A
Other languages
Chinese (zh)
Other versions
CN113778905A (en)
Inventor
苟亚明 (Gou Yaming)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202111325486.2A
Publication of CN113778905A
Application granted
Publication of CN113778905B

Classifications

    • G06F11/3684 — Software testing; test management for test design, e.g. generating new test cases
    • G06F11/3692 — Software testing; test management for test results analysis
    • G06T7/0002 — Image analysis; inspection of images, e.g. flaw detection
    • G06T7/60 — Image analysis; analysis of geometric attributes
    • G06T2207/10004 — Image acquisition modality; still image, photographic image


Abstract

The application relates to the field of computers and provides a UI design acceptance method, apparatus, device, and storage medium, applicable to scenarios such as cloud technology, artificial intelligence, intelligent traffic, and the Internet of Vehicles. The method comprises the following steps: based on the pixel value of each layer to be processed in a UI image to be processed, respectively obtaining an actual inter-layer distance set between each layer to be processed and the other layers that conform to a set membership relationship; and comparing the actual inter-layer distance set corresponding to each layer to be processed with the target inter-layer distance set corresponding to the target layer in a target UI image, to generate an acceptance result for the UI image to be processed. The style size of each UI style does not need to be measured separately: whether the UI styles and inter-layer spacings of the layers to be processed are designed correctly can be judged by comparing the actual inter-layer distance set of each layer to be processed with the target inter-layer distance set of the corresponding target layer, which improves both acceptance efficiency and the accuracy of the acceptance result.

Description

UI design acceptance method, device, equipment and storage medium
Technical Field
The application relates to the field of computers and provides a UI design acceptance method, apparatus, device, and storage medium.
Background
With the development of science and technology, more and more platforms launch autonomously developed application programs. To ensure that a developed User Interface (UI) restores the UI design scheme given by the design draft as faithfully as possible, UI design acceptance has become a key link in the application development process.
In the related art, UI design acceptance requires tools such as Sketch and Photoshop to measure the style size of the UI style displayed on each layer of a UI image, as well as the inter-layer spacing between the child layer where a UI style is located and its corresponding parent layer; the measured style sizes and inter-layer spacings are then compared with the design draft, the layers whose style size or inter-layer spacing is wrong are marked, and a corresponding acceptance result is generated.
This acceptance method depends heavily on manual measurement, so acceptance efficiency is low, especially when UI design acceptance is performed on UI images with a complex design. Moreover, it is difficult to manually measure the style sizes of all UI styles and all inter-layer spacings on a UI image, and generating the final acceptance result from partial measurements reduces its accuracy.
Disclosure of Invention
The embodiment of the application provides a UI design acceptance method, a device, equipment and a storage medium, and aims to solve the problem of low accuracy of acceptance results.
In a first aspect, an embodiment of the present application provides a UI design acceptance method, including:
acquiring a UI image to be processed;
respectively obtaining an actual inter-layer distance set between each layer to be processed and the other layers that conform to a set membership relationship, based on the pixel value of each layer to be processed contained in the UI image to be processed;
comparing the actual inter-layer distance set corresponding to each layer to be processed in the UI image to be processed with the target inter-layer distance set corresponding to the target layer in the target UI image, to generate an acceptance result for the UI image to be processed;
and the UI image to be processed and the target UI image are generated by respectively performing rendering processing based on the same UI data.
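The comparison step above can be sketched as follows; the dictionary-of-spacings representation, the layer names, and the helper function are illustrative assumptions for exposition, not part of the claimed method:

```python
def accept_ui(actual: dict, target: dict) -> dict:
    """Compare actual inter-layer distance sets against target sets.

    `actual` and `target` map a layer id to a dict of
    {edge_direction: distance_in_pixels}. Returns, per layer, the list
    of directions whose actual distance deviates from the target.
    """
    result = {}
    for layer_id, target_spacings in target.items():
        actual_spacings = actual.get(layer_id, {})
        result[layer_id] = [
            direction
            for direction, expected in target_spacings.items()
            if actual_spacings.get(direction) != expected
        ]
    return result

# A layer whose top spacing drifted by 2 px is flagged; others pass.
target = {"title": {"top": 16, "left": 24}, "icon": {"top": 8}}
actual = {"title": {"top": 18, "left": 24}, "icon": {"top": 8}}
print(accept_ui(actual, target))  # {'title': ['top'], 'icon': []}
```

A layer with an empty mismatch list passes acceptance; any non-empty list identifies exactly which edge directions would be marked on the UI image.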
In a second aspect, an embodiment of the present application further provides a UI design acceptance apparatus, including:
the acquisition unit is used for acquiring a UI image to be processed;
the processing unit is configured to respectively obtain an actual inter-layer distance set between each layer to be processed and the other layers that conform to a set membership relationship, based on the pixel value of each layer to be processed contained in the UI image to be processed;
the acceptance unit is configured to compare the actual inter-layer distance set corresponding to each layer to be processed in the UI image to be processed with the target inter-layer distance set corresponding to the target layer in the target UI image, to generate an acceptance result for the UI image to be processed;
and the UI image to be processed and the target UI image are generated by respectively performing rendering processing based on the same UI data.
Optionally, the processing unit obtains a pixel value of a layer to be processed in the UI image to be processed by performing the following operations:
acquiring, from the interface structure tree of the UI image to be processed, a child node representing the layer to be processed and a parent node representing the other layer that conforms to the set membership relationship;
reading the image texture information of the child node from the texture unit of the parent node;
and obtaining the pixel value of the layer to be processed based on the identification information of the child node and the image texture information of the child node.
Optionally, the set membership relationship indicates that the node corresponding to the layer to be processed in the interface structure tree is a child node of the node corresponding to the other layer in the interface structure tree.
Optionally, the acceptance unit is configured to:
comparing the actual inter-layer distance set corresponding to each layer to be processed displayed in the UI image to be processed with the target inter-layer distance set corresponding to the target layer displayed in the target UI image and, when at least one actual inter-layer distance in the set of a layer to be processed differs from the corresponding target inter-layer distance, marking the corresponding target inter-layer distance on that layer to be processed;
and taking the marked UI image to be processed as a corresponding acceptance result.
In a third aspect, an embodiment of the present application further provides a computer device, including a processor and a memory, where the memory stores program code which, when executed by the processor, causes the processor to perform the steps of any one of the UI design acceptance methods described above.
In a fourth aspect, the present application further provides a computer-readable storage medium including program code which, when run on a computer device, causes the computer device to perform the steps of any one of the UI design acceptance methods described above.
The beneficial effects of this application are as follows:
The embodiment of the application provides a UI design acceptance method, apparatus, device, and storage medium, the method comprising: based on the pixel value of each layer to be processed contained in the UI image to be processed, respectively obtaining the actual inter-layer distance set between each layer to be processed and the other layers that conform to the set membership relationship; and comparing the actual inter-layer distance set corresponding to each layer to be processed in the UI image to be processed with the target inter-layer distance set corresponding to the target layer in the target UI image, to generate an acceptance result for the UI image to be processed; the UI image to be processed and the target UI image are generated by respectively rendering the same UI data.
The actual inter-layer distance set between each layer to be processed and the other layers that conform to the set membership relationship is determined from the pixel value of each layer to be processed, and the actual inter-layer distance set of each layer to be processed displayed in the UI image to be processed is compared with the target inter-layer distance set of the corresponding target layer displayed in the target UI image. In this way, it can be judged both whether the UI style displayed by each layer to be processed is designed correctly and whether its actual inter-layer distances are designed correctly. Compared with the UI design acceptance method provided in the related art, the size of each UI style does not need to be measured separately, which simplifies the acceptance steps and improves acceptance efficiency. In particular, when UI design acceptance is performed on a UI image to be processed with a complex design, determining the actual inter-layer distance set of each layer to be processed from its pixel value allows all inter-layer distances to be measured accurately, improving the accuracy of the acceptance result.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1a is an alternative schematic diagram of an application scenario in an embodiment of the present application;
Fig. 1b is a schematic flowchart of a UI design acceptance method according to an embodiment of the present application;
fig. 1c is a schematic diagram of a UI image to be processed according to an embodiment of the present application;
FIG. 1d is a schematic diagram of an interface structure tree provided in an embodiment of the present application;
fig. 1e is a schematic flowchart of a process for obtaining an actual inter-layer distance set of a layer x to be processed according to an embodiment of the present application;
fig. 1f is a schematic diagram formed by overlapping a layer to be processed and other layers according to an embodiment of the present application;
fig. 2a is a schematic diagram of a target UI image given by a design draft provided in an embodiment of the present application;
fig. 2b is a schematic diagram of a to-be-processed UI image rendered by a client according to the embodiment of the present application;
fig. 2c is a schematic diagram of an acceptance result of a UI image to be processed according to an embodiment of the present application;
FIG. 3 is a schematic flow chart illustrating UI design acceptance in an embodiment according to the present application;
fig. 4 is a schematic structural diagram of a UI design acceptance apparatus provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of a component structure of a computer device provided in an embodiment of the present application;
FIG. 6 is a block diagram of a computing device according to an embodiment of the present application.
Detailed Description
In order to make the objectives, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions of the present application are described below clearly and completely with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the technical solutions of the present application. All other embodiments obtained by a person skilled in the art without inventive effort, based on the embodiments described in the present application, fall within the protection scope of the present application.
Some terms in the embodiments of the present application are explained below to facilitate understanding by those skilled in the art.
1. Cloud Computing (Cloud Computing):
narrow-definition cloud computing refers to a mode of delivery and use of Internet Technology (IT) infrastructure, which is to obtain required resources through a network in an on-demand, easily extensible manner; and the generalized cloud computing refers to a delivery and use mode of a service, and a required service is obtained in an on-demand and easily-extensible manner through a network. The service may be a service related to IT, software, internet, or other service.
Cloud computing is a product of the development and fusion of traditional computer and network technologies, such as grid computing, distributed computing, parallel computing, utility computing, network storage, virtualization, and load balancing.
With the diversification of the Internet, real-time data streams, and connected devices, and driven by demands such as search services, social networks, mobile commerce, and open collaboration, cloud computing has developed rapidly. Unlike traditional parallel distributed computing, cloud computing is conceptually a revolutionary change that will transform the entire Internet model and enterprise management model.
2. Open Graphics Library (Open Graphics Library, OpenGL):
OpenGL is a cross-language, cross-platform Application Programming Interface (API) for rendering two-dimensional (2D) and three-dimensional (3D) vector graphics. It consists of nearly 350 different functions, which can be used to draw anything from simple graphics primitives to complex three-dimensional scenes.
OpenGL is commonly used in application scenarios such as Computer-Aided Design (CAD), Virtual Reality (VR), scientific visualization, and video game development; the interface interacts with a graphics processing unit (GPU) to achieve hardware acceleration.
3. Tree data structure:
in computer science, a tree is an abstract data type or a data structure of such abstract data types. The tree data structure is a set with a hierarchical relationship composed of n (n > 0) finite nodes and is used for representing one-to-many relationship between data table elements, wherein a tree and a binary tree are two most common tree data structures.
This data structure is called a "tree" because it resembles an inverted tree: its root faces up and its leaves face down. A tree data structure has the following features:
(1) each node has zero or more child nodes;
(2) the node without a parent node is called the root node;
(3) each non-root node has exactly one parent node;
(4) apart from the root node, the child nodes of each node can be divided into a plurality of disjoint subtrees.
Tree data structures are an important type of non-linear data structure and are widely used in the computer field. For example, in a compiler, a tree represents the syntax structure of a source program; in a database system, tree data structures are one of the important ways of organizing information; and multilevel directory structures for file management are also built with trees.
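As an illustration of the properties listed above, a minimal tree can be sketched as follows; the node class and the layer-like names are our own illustrative choices, not taken from the application:

```python
class TreeNode:
    """A node with zero or more children; the root is the only node
    reached through no parent."""
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

# Toy hierarchy mirroring a UI layer tree: a background root whose
# subtrees are disjoint, as feature (4) above requires.
root = TreeNode("background", [
    TreeNode("panel", [TreeNode("title"), TreeNode("icon")]),
    TreeNode("footer"),
])

def depth_first(node):
    """Yield names top-down; each non-root node has exactly one parent,
    so every name appears exactly once."""
    yield node.name
    for child in node.children:
        yield from depth_first(child)

print(list(depth_first(root)))
# ['background', 'panel', 'title', 'icon', 'footer']
```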
4、YUV:
YUV is a color encoding method commonly used in video processing components. The Y component represents brightness (Luminance or Luma), also referred to as the gray value; the U and V components represent chrominance (Chroma), which describes the color and saturation of a video picture. Thus, the color of a pixel can be specified by its U and V components.
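The Y/U/V split described above can be sketched with the full-range BT.601 conversion coefficients (an assumption for illustration — the application does not fix a particular YUV variant):

```python
def rgb_to_yuv(r, g, b):
    """Full-range BT.601 RGB -> YUV conversion (illustrative choice of
    coefficients). Y carries brightness; U and V carry chrominance."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.14713 * r - 0.28886 * g + 0.436 * b
    v = 0.615 * r - 0.51499 * g - 0.10001 * b
    return y, u, v

# A pure grey pixel has (almost) zero chroma: U and V vanish and only
# the luma Y remains, which is why U/V alone identify a layer's colour.
y, u, v = rgb_to_yuv(128, 128, 128)
print(round(y, 3), round(u, 3), round(v, 3))  # Y ≈ 128, U ≈ 0, V ≈ 0
```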
The following briefly introduces the design concept of the embodiments of the present application:
With the development of science and technology, more and more platforms launch autonomously developed application programs. To ensure that the developed UI restores the UI design scheme given by the design draft as faithfully as possible, UI design acceptance has become a key link in the application development process.
In the related art, UI design acceptance requires tools such as Sketch and Photoshop to measure the style size of the UI style displayed on each layer of a UI image, as well as the inter-layer spacing between the child layer where a UI style is located and its corresponding parent layer; the measured style sizes and inter-layer spacings are then compared with the design draft, the layers whose style size or inter-layer spacing is wrong are marked, and a corresponding acceptance result is generated.
This acceptance method depends heavily on manual measurement, so acceptance efficiency is low, especially when UI design acceptance is performed on UI images with a complex design. Moreover, it is difficult to manually measure the style sizes of all UI styles and all inter-layer spacings on a UI image, and generating the final acceptance result from partial measurements reduces its accuracy.
In order to solve the problem of low accuracy of an acceptance result, the embodiment of the application provides a UI design acceptance method. The method comprises the following steps:
acquiring a UI image to be processed, and respectively acquiring actual layer interval sets between each layer to be processed and other layers which accord with a set membership based on the respective pixel value of each layer to be processed contained in the UI image to be processed; comparing the actual inter-layer distance set corresponding to each layer to be processed in the UI image to be processed with the target inter-layer distance set corresponding to the target layer in the target UI image to generate an acceptance result of the UI image to be processed; the UI image to be processed and the target UI image are generated by performing rendering processing based on the same UI data respectively.
The preferred embodiments of the present application will be described below with reference to the accompanying drawings of the specification, it should be understood that the preferred embodiments described herein are merely for illustrating and explaining the present application, and are not intended to limit the present application, and that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
The embodiment of the application can be applied to various scenarios such as cloud technology and artificial intelligence. Fig. 1a shows an alternative schematic diagram of an application scenario, which includes two physical terminal devices 110 running clients and a server 130 running a metadata processing (Meta Data Handle) module; the two physical terminal devices 110 each communicate with the server 130 via a wired or wireless network.
When the client loads a page, the client sends a UI identifier to the server 130 through the physical terminal device 110 to obtain UI data corresponding to the UI identifier sent by the server 130; the client performs rendering processing based on the UI data, displays a to-be-processed UI image on the display screen 120, and uploads the to-be-processed UI image to the metadata processing module of the server 130 by using a screen capture method of the physical terminal device 110.
The metadata processing module of the server 130 acquires the UI image to be processed and, based on the pixel values of the layers to be processed contained in it, obtains the actual inter-layer distance set between each layer to be processed and the other layers that conform to the set membership relationship; the actual inter-layer distance set corresponding to each layer to be processed in the UI image to be processed is then compared with the target inter-layer distance set corresponding to the target layer in the target UI image, to generate an acceptance result for the UI image to be processed; the target UI image is also generated by rendering the same UI data.
In this embodiment, the physical terminal device 110 is an electronic device used by a user, and the electronic device may be an electronic device with a page rendering capability, such as a personal computer, a mobile phone, a tablet computer, a notebook computer, an electronic book reader, an intelligent voice interaction device, an intelligent household appliance, and a vehicle-mounted terminal.
The server 130 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a Network service, cloud communication, a middleware service, a domain name service, a security service, a Content Delivery Network (CDN), big data, and an artificial intelligence platform, and the application is not limited herein.
Referring to the flow chart diagram shown in fig. 1b, a UI design acceptance method proposed in the embodiment of the present application is described.
S101: and the server acquires the UI image to be processed.
Before executing step 101, a client running on the physical terminal device sends a UI identifier to the server through the physical terminal device to obtain UI data corresponding to the UI identifier sent by the server; and the client performs rendering processing based on the UI data, displays the rendered UI image to be processed on a display screen of the physical terminal equipment, and uploads the UI image to be processed to a metadata processing module of the server by calling a screen capturing method of the physical terminal equipment.
S102: and the server respectively obtains each layer to be processed and an actual layer interval set between each layer to be processed and other layers which accord with the set membership based on the respective pixel value of each layer to be processed contained in the UI image to be processed.
Referring to the schematic diagram of the UI image to be processed shown in fig. 1c, the image includes a plurality of layers to be processed; each layer to be processed is assigned a layer color, and two layers that conform to the set membership relationship have different layer colors. Taking a layer x to be processed as an example, the following operations are performed to obtain its pixel value:
acquiring, from the interface structure tree of the UI image to be processed, the child node representing the layer x to be processed and the parent node representing the other layer that conforms to the set membership relationship, where the set membership relationship indicates that the node corresponding to the layer x to be processed in the interface structure tree is a child node of the node corresponding to the other layer; then reading the image texture information of the child node from the texture unit of the parent node, and obtaining the pixel value of the layer x to be processed based on the identification information and the image texture information of the child node.
Specifically, the metadata processing module in the server parses the UI image to be processed to obtain an interface structure tree as shown in fig. 1d. The interface structure tree resembles an inverted tree: its root node is the background layer at the bottom of the UI image to be processed, and its leaf nodes are the layers to be processed at the top.
To facilitate calculating the actual inter-layer distance set between each layer to be processed and the other layers that conform to the set membership relationship, the metadata processing module traverses the nodes of the interface structure tree from bottom to top. Each time it reaches a child node representing a layer to be processed, it obtains that child node and the parent node representing the other layer that conforms to the set membership relationship, reads the image texture information of the child node (i.e., the layer color of the layer to be processed) from the texture unit of the parent node through OpenGL, creates fragment shader 1 based on the identification information and the image texture information of the child node, calls the texture2D function through fragment shader 1 to calculate the U and V components of the layer to be processed, and finally obtains the pixel value of the layer to be processed from those U and V components.
In addition to the above operation of obtaining the pixel value of a layer to be processed, each time the metadata processing module traverses a node it also reads the image texture information of the parent node (i.e., the layer color of the other layer) from the texture unit of the parent node through OpenGL, creates fragment shader 2 based on the identification information and the image texture information of the parent node, calls the texture2D function through fragment shader 2 to calculate the U and V components of the other layer, and finally obtains the pixel value of the other layer from those U and V components.
In the embodiment of the application, the actual inter-layer distance set of each layer to be processed is obtained based on the pixel values of the layers to be processed contained in the UI image to be processed; by performing the above operations, the metadata processing module can accurately obtain the pixel value of each layer to be processed in the UI image to be processed, which improves the accuracy of the subsequent measurement and acceptance results. Because each layer to be processed consists of a plurality of pixel points, the pixel value of every pixel point on the same layer to be processed equals the pixel value of that layer.
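Because every pixel of a flat-colored layer carries the same value, sampling any one of its pixels yields the layer's pixel value. A CPU-side stand-in for the fragment-shader sampling described above can be sketched as follows; the toy grid and color table are illustrative assumptions, not the patented shader pipeline:

```python
# Toy composited image: 0 marks the parent layer, 1 the child layer.
GRID = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
# Layer id -> flat layer color (stand-in for the texture unit's data).
COLOURS = {0: (90, 90, 90), 1: (200, 40, 40)}

def layer_pixel_value(grid, colours, layer_id):
    """Return the single pixel value shared by every pixel of a layer,
    found by sampling the first pixel belonging to that layer."""
    for row in grid:
        for cell in row:
            if cell == layer_id:
                return colours[cell]
    raise KeyError(layer_id)

print(layer_pixel_value(GRID, COLOURS, 1))  # (200, 40, 40)
```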
Next, for each layer to be processed, the flow shown in fig. 1e is executed in a loop until the corresponding actual inter-layer distance set is obtained; fig. 1e shows the process of obtaining the actual inter-layer distance set of one layer x to be processed.
S1021: in the layer x to be processed, each time a first pixel point s is traversed, if the previously traversed first pixel point is determined to be a first edge pixel point based on the pixel value of s and the pixel value of the previously traversed first pixel point, then the second edge pixel point in the other layer corresponding to the layer x to be processed is obtained from the first edge pixel point, and the actual inter-layer distance of the layer x to be processed in one edge direction is determined from the pixel distance between the first edge pixel point and the second edge pixel point.
A layer to be processed consists of a plurality of pixel points whose pixel value equals the pixel value of the layer, and two layers that conform to the set membership relationship are assigned different layer colors. Therefore, the edge pixel points of the two layers can be determined by checking whether the pixel value of the currently traversed pixel point changes abruptly, and the actual inter-layer distance between the layer to be processed and the other layer that conforms to the set membership relationship can be calculated from the two corresponding edge pixel points. Compared with the UI design acceptance method provided in the related art, this improves the measurement accuracy of each actual inter-layer distance and helps optimize the accuracy of the acceptance result.
Specifically, when it is determined that the pixel value of the first pixel point s is different from the pixel value of the first pixel point traversed last time, and that the pixel value of the first pixel point s is the same as the pixel value of the other layer, the first pixel point traversed last time is determined to be a first edge pixel point.
After the first edge pixel point is determined, a plurality of second pixel points located in one edge direction on the other layer corresponding to the layer x to be processed are traversed, taking the first edge pixel point as the starting point. Each time a second pixel point s' is traversed, whether the second pixel point traversed last time is the second edge pixel point is judged based on the pixel value of the second pixel point s' and the pixel value of the second pixel point traversed last time.
The specific judgment process is as follows: when the pixel value of the second pixel point s' is different from the pixel value of the second pixel point traversed last time, and the pixel value of the second pixel point s' is different from the pixel values of both layers, the second pixel point traversed last time is determined to be a second edge pixel point.
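The patent gives no reference implementation; the following minimal Python sketch illustrates the S1021 traversal under simplifying assumptions: each layer is a solid-color region, the traversal runs along one column of pixel values in the upward edge direction, and the function name, variable names, and toy values are illustrative only.

```python
def find_edges(column, start, child_val, parent_val):
    """Walk `column` from index `start` toward index 0 (the 'upward' edge
    direction) and return (first_edge, second_edge) per S1021:
    - first edge pixel: the last index still holding `child_val` before a
      pixel holding `parent_val` is traversed;
    - second edge pixel: the last index still holding `parent_val` before a
      pixel holding neither layer's value is traversed.
    """
    first_edge = None
    prev = start
    i = start
    while i >= 0:
        v = column[i]
        if first_edge is None:
            # pixel value mutates and now matches the other layer's value
            if v != child_val and v == parent_val:
                first_edge = prev
        # pixel value mutates and differs from both layers' values
        elif v != child_val and v != parent_val:
            return first_edge, prev
        prev = i
        i -= 1
    return first_edge, None

# Toy column: background (0), other layer (2), layer to be processed (1).
column = [0, 2, 2, 2, 1, 1, 1]
first, second = find_edges(column, start=5, child_val=1, parent_val=2)
spacing_px = first - second  # pixel distance in the upward edge direction
```

Converting the resulting pixel distance into pt values such as the 18.5pt of fig. 1f would additionally require the device scale factor, which the patent does not specify.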
S1022: the obtained actual layer intervals of the plurality of edge directions are output as the actual layer interval set of the layer x to be processed.
For example, in the schematic diagram shown in fig. 1f, the layer containing the number 3 is a layer to be processed, and the layer containing the black background is another layer.
Taking the first pixel point located at the center of the layer to be processed as the starting point, a plurality of first pixel points located in the upward edge direction on the layer to be processed are traversed. Each time a first pixel point s is traversed, whether the first pixel point traversed last time is a first edge pixel point is judged based on the pixel value of the first pixel point s and the pixel value of the first pixel point traversed last time.
After the first edge pixel point is determined, a plurality of second pixel points located in the upward edge direction on the other layer are traversed, taking the first edge pixel point as the starting point, and whether the second pixel point traversed last time is the second edge pixel point is judged based on the pixel value of the second pixel point s' and the pixel value of the second pixel point traversed last time.
Finally, the actual layer interval of the layer to be processed in the upward edge direction is determined to be 18.5pt based on the pixel distance between the first edge pixel point and the second edge pixel point. Using the same method, the actual layer interval of the layer to be processed in the left edge direction is 29pt, in the right edge direction is 28.5pt, and in the downward edge direction is 18.5pt.
S103: the server compares the actual inter-layer distance set corresponding to each layer to be processed in the UI image to be processed with the target inter-layer distance set corresponding to the target layer in the target UI image to generate an acceptance result of the UI image to be processed; the UI image to be processed and the target UI image are obtained by performing rendering processing based on the same UI data respectively.
The actual layer interval set corresponding to each layer to be processed displayed in the UI image to be processed is compared with the target layer interval set corresponding to the target layer displayed in the target UI image. When at least one actual layer interval in the actual layer interval set of a layer to be processed is different from the corresponding target layer interval, the corresponding target layer interval is marked on that layer to be processed, and the marked UI image to be processed is taken as the corresponding acceptance result. Because the correct target layer interval is marked on the UI image to be processed, developers can conveniently adjust the UI design code based on the acceptance result, so that the UI image to be processed rendered from the adjusted UI design code restores the design scheme given by the design draft as closely as possible.
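As a sketch of the comparison step only (the redrawing and marking of the image itself is omitted), each interval set can be modeled as a mapping from edge direction to interval value; the function name and the target values below are illustrative assumptions, not taken from the patent.

```python
def mismatched_intervals(actual, target):
    """Return {direction: correct target interval} for every edge direction
    whose actual layer interval differs from the target layer interval;
    an empty result means the layer passes acceptance for spacing."""
    return {d: target[d] for d in actual if actual[d] != target[d]}

# Measured intervals of one layer to be processed (cf. fig. 1f) vs. a
# hypothetical target interval set taken from the design draft.
actual = {"up": 18.5, "left": 29.0, "right": 28.5, "down": 18.5}
target = {"up": 18.5, "left": 29.0, "right": 30.0, "down": 20.0}
to_mark = mismatched_intervals(actual, target)
# to_mark holds the correct target intervals that would be annotated
# on the layer in the acceptance result.
```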
While executing the above operations on the UI image to be processed, the metadata processing module also executes an operation similar to that shown in step S102 on the target UI image in an asynchronous manner, obtaining the target layer interval set of each target layer. Since the specific operation of step S102 has been described in detail above, the operation performed on the target UI image is not described here again.
For example, fig. 2a is a target UI image and fig. 2b is a UI image to be processed. The following operations are performed on the two images respectively: obtain the target layer interval set of each target layer in the target UI image and the actual layer interval set of each layer to be processed in the UI image to be processed; compare the actual layer interval set corresponding to each layer to be processed displayed in the UI image to be processed with the target layer interval set corresponding to the target layer displayed in the target UI image to obtain a corresponding comparison result; and redraw the UI image to be processed based on the comparison result to obtain the acceptance result shown in fig. 2c. Because some layer intervals are too small for the corresponding correct interval values to be shown, the layers with design problems in the UI image to be processed are marked only with straight lines.
As can be seen from the schematic diagram shown in fig. 2a, each target layer displayed in the target UI image is a rectangular block, and the length and width of each rectangular block give both the layer size of the corresponding layer and the style size of the UI style displayed on the layer. Once the style size of the UI style displayed by each target layer is determined, the target layer interval set of each target layer is determined.
Therefore, in the embodiment of the application, the actual layer interval set between each layer to be processed and the other layer that meets the set membership relationship is determined based on the pixel value of each layer to be processed. By comparing the actual layer interval set corresponding to each layer to be processed displayed in the UI image to be processed with the target layer interval set corresponding to the target layer displayed in the target UI image, it is possible to determine both whether the design of the UI style displayed on each layer to be processed is reasonable and whether the actual layer interval set of each layer to be processed is reasonable.
Compared with the UI design acceptance method provided by the related technology, the method and the device do not need to measure the size of each UI style separately, which simplifies the acceptance steps and improves the acceptance efficiency. In particular, when UI design acceptance is performed on a UI image to be processed with a complex design, determining the actual layer interval set of each layer to be processed based on its pixel values makes it possible to accurately measure all layer intervals, improving the accuracy of the acceptance result.
Referring to the flowchart shown in fig. 3, a process of checking and accepting a UI design for a UI image to be processed will be described with reference to a specific embodiment.
S301: the metadata processing module of the server analyzes the target UI image uploaded by the designer to obtain the corresponding UI data;
S302: the metadata processing module packages the UI data into UI data in the xml data format, and stores the UI data in the xml data format and the UI identification of the target UI image as a mapping in a database of the server;
S303: a client running on a physical terminal device sends the UI identification to the server through the physical terminal device, so as to acquire the UI data corresponding to the UI identification sent back by the server;
S304: the client performs rendering processing based on the UI data, displays the rendered UI image to be processed on the display screen of the physical terminal device, and uploads the UI image to be processed to the metadata processing module by calling the screen capturing method of the physical terminal device;
S305: the metadata processing module analyzes the UI image to be processed to obtain the corresponding interface structure tree;
S306: the metadata processing module traverses each node of the interface structure tree from bottom to top in sequence; each time a node is traversed, it obtains the pixel value of the layer to be processed and the pixel value of the other layer that meets the set membership relationship through OpenGL, and obtains the actual layer interval set of the layer to be processed based on these pixel values;
S307: the metadata processing module compares the actual layer interval set corresponding to each layer to be processed in the UI image to be processed with the target layer interval set corresponding to the target layer in the target UI image to generate the acceptance result of the UI image to be processed.
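Step S306 visits child layers before the layers that contain them. A post-order traversal over an illustrative interface structure tree (the node type and layer names here are invented for the sketch) yields exactly the (layer to be processed, other layer) pairs that the set membership relationship describes:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One node of the interface structure tree; each node stands for a layer."""
    name: str
    children: list = field(default_factory=list)

def bottom_up(node):
    """Yield (child, parent) name pairs, children before parents (post-order),
    mirroring the S306 bottom-to-top traversal: each visited node is a layer
    to be processed and its parent node is the other layer in the set
    membership relationship."""
    for child in node.children:
        yield from bottom_up(child)
        yield (child.name, node.name)

# Hypothetical interface structure tree of a screenshot.
root = Node("screen", [Node("card", [Node("title"), Node("icon")])])
pairs = list(bottom_up(root))
```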
Based on the same inventive concept as the foregoing method embodiments, an embodiment of the application provides a UI design acceptance apparatus, whose schematic structural diagram is shown in fig. 4. The apparatus 400 may include:
an acquiring unit 401 configured to acquire a UI image to be processed;
a processing unit 402, configured to obtain, based on respective pixel values of each layer to be processed included in the UI image to be processed, an actual layer interval set between each layer to be processed and another layer that meets the set dependency relationship;
an acceptance unit 403, configured to compare an actual inter-layer distance set corresponding to each to-be-processed layer in the to-be-processed UI image with a target inter-layer distance set corresponding to a target layer in the target UI image, and generate an acceptance result of the to-be-processed UI image;
the UI image to be processed and the target UI image are generated by performing rendering processing based on the same UI data respectively.
Optionally, the processing unit 402 is configured to:
and respectively executing the following operations in a circulating mode aiming at each layer to be processed until a corresponding actual layer interval set is obtained:
in a layer to be processed, when a first pixel point is traversed and the last traversed first pixel point is determined to be a first edge pixel point based on the pixel value of the first pixel point and the pixel value of the last traversed first pixel point, second edge pixel points in other layers corresponding to the layer to be processed are obtained based on the first edge pixel point, and the actual layer interval of the layer to be processed in an edge direction is determined based on the pixel distance between the first edge pixel point and the second edge pixel point;
and outputting the obtained actual layer intervals of the plurality of edge directions as an actual layer interval set of the layer to be processed.
Optionally, the processing unit 402 is configured to:
and when the pixel value of a first pixel point is determined to be different from the pixel value of the first pixel point traversed last time, and the pixel value of the first pixel point is determined to be the same as the pixel values of other image layers, determining the first pixel point traversed last time as a first edge pixel point.
Optionally, the processing unit 402 is configured to:
and traversing a plurality of second pixel points located in an edge direction on other layers corresponding to one layer to be processed by taking the first edge pixel point as a starting point, and judging whether the last traversed second pixel point is the second edge pixel point or not based on the pixel value of one second pixel point and the pixel value of the last traversed second pixel point when one second pixel point is traversed.
Optionally, the processing unit 402 is configured to:
and when the pixel value of one second pixel point is determined to be different from the pixel value of the second pixel point traversed last time and the pixel value of one second pixel point is determined to be different from the pixel values of the two image layers, determining the second pixel point traversed last time as a second edge pixel point.
Optionally, the processing unit 402 obtains a pixel value of a layer to be processed in the UI image to be processed by performing the following operations:
acquiring a child node representing one layer to be processed and a father node representing other layers which accord with a set membership in an interface structure tree of a UI image to be processed;
reading image texture information of a child node from a texture unit of a father node;
and obtaining a pixel value of the layer to be processed based on the identification information of the child node and the image texture information of the child node.
Optionally, the set membership relationship indicates that the node corresponding to the one layer to be processed in the interface structure tree is a child node of the node corresponding to the other layer in the interface structure tree.
Optionally, the acceptance unit 403 is configured to:
comparing the actual layer interval set corresponding to each layer to be processed displayed in the UI image to be processed with the target layer interval set corresponding to the target layer displayed in the target UI image, wherein, when at least one actual layer interval in the actual layer interval set of one layer to be processed is different from the corresponding target layer interval, the corresponding target layer interval is marked on the one layer to be processed;
and taking the marked UI image to be processed as a corresponding acceptance result.
For convenience of description, the above parts are separately described as modules (or units) according to functional division. Of course, the functionality of the various modules (or units) may be implemented in the same one or more pieces of software or hardware when implementing the present application.
Having described the UI design acceptance method and apparatus of the exemplary embodiments of the present application, a computer device according to another exemplary embodiment of the present application is next described.
As will be appreciated by one skilled in the art, aspects of the present application may be embodied as a system, method, or program product. Accordingly, various aspects of the present application may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "system."
Based on the same inventive concept as the method embodiment described above, in the embodiment of the present application, a computer device is also provided, and referring to fig. 5, the computer device 500 may at least include a processor 501 and a memory 502. The memory 502 stores therein program code, which when executed by the processor 501, causes the processor 501 to perform the steps of any one of the UI design acceptance methods described above.
In some possible implementations, a computing device according to the present application may include at least one processor, and at least one memory. Wherein the memory stores program code that, when executed by the processor, causes the processor to perform the steps of the UI design acceptance method according to various exemplary embodiments of the present application described herein. For example, the processor may perform the steps as shown in FIG. 1 b.
A computing device 600 according to this embodiment of the present application is described below with reference to fig. 6. The computing device 600 of fig. 6 is only one example and should not be used to limit the scope of use and functionality of embodiments of the present application.
As shown in fig. 6, computing device 600 is embodied in the form of a general purpose computing device. Components of computing device 600 may include, but are not limited to: the at least one processing unit 601, the at least one memory unit 602, and a bus 603 that connects the various system components (including the memory unit 602 and the processing unit 601).
Bus 603 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a processor, or a local bus using any of a variety of bus architectures.
The storage unit 602 may include readable media in the form of volatile memory, such as Random Access Memory (RAM) 6021 and/or cache memory unit 6022, and may further include Read Only Memory (ROM) 6023.
The memory unit 602 may also include a program/utility 6025 having a set (at least one) of program modules 6024, such program modules 6024 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The computing device 600 may also communicate with one or more external devices 604 (e.g., a keyboard, a pointing device, etc.), with one or more devices that enable a user to interact with the computing device 600, and/or with any device (e.g., a router, a modem, etc.) that enables the computing device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 605. Moreover, the computing device 600 may also communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network, such as the Internet) via the network adapter 606. As shown, the network adapter 606 communicates with the other modules of the computing device 600 over the bus 603. It should be understood that, although not shown in the figures, other hardware and/or software modules may be used in conjunction with the computing device 600, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
Based on the same inventive concept as the above-described method embodiments, the various aspects of the UI design acceptance method provided by the present application may also be implemented in the form of a program product comprising program code for causing a computer device to perform the steps in the UI design acceptance method according to various exemplary embodiments of the present application described in the present specification when the program product is run on the computer device, e.g. the electronic device may perform the steps as shown in fig. 1 b.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (13)

1. A UI design acceptance method is characterized by comprising the following steps:
acquiring a UI image to be processed;
and for each layer to be processed contained in the UI image to be processed, executing the following operations in a circulating mode respectively until a corresponding actual layer interval set is obtained: in one layer to be processed, when a first pixel point is traversed and the last traversed first pixel point is determined to be a first edge pixel point based on the pixel value of the first pixel point and the pixel value of the last traversed first pixel point, second edge pixel points in other layers which accord with a set membership relationship with the layer to be processed are obtained based on the first edge pixel points, the actual layer interval of the layer to be processed in one edge direction is determined based on the pixel distance between the first edge pixel point and the second edge pixel point, and the obtained actual layer intervals of a plurality of edge directions are used as the actual layer interval set of the layer to be processed to be output;
comparing the actual layer interval set corresponding to each layer to be processed in the UI image to be processed with the target layer interval set corresponding to the target layer in the target UI image to generate an acceptance result of the UI image to be processed;
and the UI image to be processed and the target UI image are generated by respectively performing rendering processing based on the same UI data.
2. The method of claim 1, wherein determining the last traversed first pixel point as a first edge pixel point based on the pixel value of the one first pixel point and the pixel value of the last traversed first pixel point comprises:
and when the pixel value of the first pixel point is determined to be different from the pixel value of the first pixel point traversed last time and the pixel value of the first pixel point is determined to be the same as the pixel values of the other image layers, determining the first pixel point traversed last time as the first edge pixel point.
3. The method according to claim 1, wherein said obtaining second edge pixel points in other layers corresponding to said one layer to be processed based on said first edge pixel points comprises:
and traversing a plurality of second pixel points located in the edge direction on other layers corresponding to the layer to be processed by taking the first edge pixel point as a starting point, and judging whether the last traversed second pixel point is the second edge pixel point or not based on the pixel value of the second pixel point and the pixel value of the last traversed second pixel point when one second pixel point is traversed.
4. The method of claim 3, wherein said determining whether the last traversed second pixel is the second edge pixel based on the pixel value of the one second pixel and the pixel value of the last traversed second pixel comprises:
and when the pixel value of the second pixel point is determined to be different from the pixel value of the last traversed second pixel point and the pixel value of the second pixel point is determined to be different from the pixel values of the two image layers, determining the last traversed second pixel point as the second edge pixel point.
5. The method according to claim 1, wherein the pixel value of one layer to be processed in the UI image to be processed is obtained by performing the following operations:
acquiring a child node representing the layer to be processed and a father node representing other layers according with the set membership in the interface structure tree of the UI image to be processed;
reading the image texture information of the child node from the texture unit of the father node;
and obtaining the pixel value of the layer to be processed based on the identification information of the child node and the image texture information of the child node.
6. The method according to claim 5, wherein the set dependency representation indicates that the node corresponding to the layer to be processed in the interface structure tree is a child node of the node corresponding to the other layers in the interface structure tree.
7. The method according to any one of claims 1 to 6, wherein the comparing the actual inter-layer distance set corresponding to each of the layers to be processed in the UI image to be processed with the target inter-layer distance set corresponding to the target layer in the target UI image to generate the acceptance result of the UI image to be processed includes:
comparing the actual inter-layer distance set corresponding to each layer to be processed displayed in the UI image to be processed with the target inter-layer distance set corresponding to the target layer displayed in the target UI image, wherein, when the actual inter-layer distance set of one layer to be processed is compared, if at least one actual inter-layer distance is different from the corresponding target inter-layer distance, the corresponding target inter-layer distance is marked on the one layer to be processed;
and taking the marked UI image to be processed as a corresponding acceptance result.
8. A UI design acceptance apparatus, comprising:
the acquisition unit is used for acquiring a UI image to be processed;
a processing unit, configured to, for each layer to be processed included in the UI image to be processed, respectively perform the following operations in a cyclic manner until a corresponding actual inter-layer distance set is obtained: in one layer to be processed, when a first pixel point is traversed and the last traversed first pixel point is determined to be a first edge pixel point based on the pixel value of the first pixel point and the pixel value of the last traversed first pixel point, second edge pixel points in other layers which accord with a set membership relationship with the layer to be processed are obtained based on the first edge pixel points, the actual layer interval of the layer to be processed in one edge direction is determined based on the pixel distance between the first edge pixel point and the second edge pixel point, and the obtained actual layer intervals of a plurality of edge directions are used as the actual layer interval set of the layer to be processed to be output;
the acceptance unit is used for comparing the actual inter-layer distance set corresponding to each to-be-processed layer in the to-be-processed UI image with the target inter-layer distance set corresponding to the target layer in the target UI image to generate an acceptance result of the to-be-processed UI image;
and the UI image to be processed and the target UI image are generated by respectively performing rendering processing based on the same UI data.
9. The apparatus as defined in claim 8, wherein the processing unit is to:
and when the pixel value of the first pixel point is determined to be different from the pixel value of the first pixel point traversed last time and the pixel value of the first pixel point is determined to be the same as the pixel values of the other image layers, determining the first pixel point traversed last time as the first edge pixel point.
10. The apparatus as defined in claim 8, wherein the processing unit is to:
and traversing a plurality of second pixel points located in the edge direction on other layers corresponding to the layer to be processed by taking the first edge pixel point as a starting point, and judging whether the last traversed second pixel point is the second edge pixel point or not based on the pixel value of the second pixel point and the pixel value of the last traversed second pixel point when one second pixel point is traversed.
11. The apparatus as defined in claim 10, wherein the processing unit is to:
and when the pixel value of the second pixel point is determined to be different from the pixel value of the last traversed second pixel point and the pixel value of the second pixel point is determined to be different from the pixel values of the two image layers, determining the last traversed second pixel point as the second edge pixel point.
12. A computer device comprising a processor and a memory, wherein the memory stores program code which, when executed by the processor, causes the processor to perform the steps of the method of any of claims 1 to 7.
13. A computer-readable storage medium, characterized in that it comprises program code for causing a computer device to perform the steps of the method according to any one of claims 1 to 7, when said program code is run on said computer device.
CN202111325486.2A 2021-11-10 2021-11-10 UI design acceptance method, device, equipment and storage medium Active CN113778905B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111325486.2A CN113778905B (en) 2021-11-10 2021-11-10 UI design acceptance method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113778905A CN113778905A (en) 2021-12-10
CN113778905B true CN113778905B (en) 2022-02-08

Family

ID=78873675

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111325486.2A Active CN113778905B (en) 2021-11-10 2021-11-10 UI design acceptance method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113778905B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103310037A (en) * 2013-04-28 2013-09-18 上海华力微电子有限公司 Layout design rule check file verification graphics library and establishment method thereof
CN110554957A (en) * 2019-07-31 2019-12-10 北京三快在线科技有限公司 method and device for testing user interface, electronic equipment and readable storage medium
CN111857703A (en) * 2020-07-31 2020-10-30 北京爱奇艺科技有限公司 Matching method and device for layers in interface and electronic equipment
CN111881050A (en) * 2020-07-31 2020-11-03 北京爱奇艺科技有限公司 Method and device for clipping text layer and electronic equipment
CN111881049A (en) * 2020-07-31 2020-11-03 北京爱奇艺科技有限公司 Acceptance method and device for application program interface and electronic equipment
WO2021184725A1 (en) * 2020-03-16 2021-09-23 平安科技(深圳)有限公司 User interface test method and apparatus, storage medium, and computer device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10944767B2 (en) * 2018-02-01 2021-03-09 International Business Machines Corporation Identifying artificial artifacts in input data to detect adversarial attacks
US11257275B2 (en) * 2018-10-31 2022-02-22 Facebook Technologies, LLC Dual distance field color palette
US11194669B2 (en) * 2019-06-01 2021-12-07 Rubrik, Inc. Adaptable multi-layered storage for generating search indexes
CN112711526B (en) * 2019-10-25 2024-05-14 腾讯科技(深圳)有限公司 UI test method, device, equipment and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103310037A (en) * 2013-04-28 2013-09-18 上海华力微电子有限公司 Layout design rule check file verification graphics library and establishment method thereof
CN110554957A (en) * 2019-07-31 2019-12-10 北京三快在线科技有限公司 method and device for testing user interface, electronic equipment and readable storage medium
WO2021184725A1 (en) * 2020-03-16 2021-09-23 平安科技(深圳)有限公司 User interface test method and apparatus, storage medium, and computer device
CN111857703A (en) * 2020-07-31 2020-10-30 北京爱奇艺科技有限公司 Matching method and device for layers in interface and electronic equipment
CN111881050A (en) * 2020-07-31 2020-11-03 北京爱奇艺科技有限公司 Method and device for clipping text layer and electronic equipment
CN111881049A (en) * 2020-07-31 2020-11-03 北京爱奇艺科技有限公司 Acceptance method and device for application program interface and electronic equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Urgent!!! Machine vision based on OpenCV: implementing image edge detection with OpenCV and computing the distance between two lines; 牛牛vs驴驴; 《https://zhidao.baidu.com/question/646791050732570685.html》; 2014-01-15; pp. 1-2 *

Also Published As

Publication number Publication date
CN113778905A (en) 2021-12-10

Similar Documents

Publication Publication Date Title
CN114187633B (en) Image processing method and device, and training method and device for image generation model
CN113704531A (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN111161390A (en) Rendering method and device based on BIM (building information modeling) model and WebGL (Web graphics library)
CN111222571B (en) Image special effect processing method and device, electronic equipment and storage medium
CN109683858B (en) Data processing method and device
CN112053440A (en) Method for determining individualized model and communication device
CN111142863A (en) Page generation method and device
US9898873B2 (en) Methods and systems for processing 3D graphic objects at a content processor
CN113778905B (en) UI design acceptance method, device, equipment and storage medium
CN112734900A (en) Baking method, baking device, baking equipment and computer-readable storage medium of shadow map
CN111898544A (en) Character and image matching method, device and equipment and computer storage medium
CN116756796A (en) BIM data light weight method and system
CN113468626B (en) Drawing generation method and device, electronic equipment and storage medium
CN115794400A (en) Memory management method, device and equipment of deep learning model and storage medium
CN115859431A (en) Linkage method, device and equipment of three-dimensional building model and two-dimensional drawing
CN113419806B (en) Image processing method, device, computer equipment and storage medium
CN115311399A (en) Image rendering method and device, electronic equipment and storage medium
CN113010627A (en) City information model service method
CN113313809A (en) Rendering method and device
CN113590219B (en) Data processing method and device, electronic equipment and storage medium
CN117032617B (en) Multi-screen-based grid pickup method, device, equipment and medium
WO2023197729A1 (en) Object rendering method and apparatus, electronic device, and storage medium
CN115994203B (en) AI-based data annotation processing method, system and AI center
CN117132716B (en) Power environment monitoring method and device, electronic equipment and storage medium
CN114820908B (en) Virtual image generation method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant