CN115098364A - Single-interface robustness automatic test system and method thereof - Google Patents


Info

Publication number
CN115098364A
CN115098364A (application CN202210695618.9A)
Authority
CN
China
Prior art keywords
interface, engine, rule, test, module
Prior art date
Legal status (an assumption, not a legal conclusion)
Pending
Application number
CN202210695618.9A
Other languages
Chinese (zh)
Inventor
黄博涛
花京武
秦钢
宋杨
Current Assignee (listed assignees may be inaccurate)
Hangzhou Diji Intelligent Technology Co ltd
Original Assignee
Hangzhou Diji Intelligent Technology Co ltd
Application filed by Hangzhou Diji Intelligent Technology Co ltd filed Critical Hangzhou Diji Intelligent Technology Co ltd
Priority: CN202210695618.9A
Publication: CN115098364A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3664 Environments for testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06F 8/00 Arrangements for software engineering
    • G06F 8/30 Creation or generation of source code
    • G06F 8/36 Software reuse

Abstract

The invention discloses a single-interface robustness automated test system comprising: a front-end UI, containing an HTML module, a CSS module, a jQuery module and a picture module; a display layer, communicatively connected to the front-end UI, containing a template-engine rendering module and an Ajax interaction module; a business layer, communicatively connected to the display layer, containing an interface data-source management module, an interface management module, a check-rule management module and a report display module; an underlying engine, communicatively connected to the business layer, containing an interface loading engine, a use-case rule engine and a verification engine; and a running environment, communicatively connected to the underlying engine, containing an independent server. Through the arrangement of the front-end UI and the display layer, this single-interface robustness automated test system effectively provides a visualization tool for the testing process.

Description

Single-interface robustness automatic test system and method thereof
Technical Field
The invention relates to a test system and a test method, in particular to a single-interface robustness automatic test system and a single-interface robustness automatic test method.
Background
Interface testing is based on interface information, which is stored on various platforms or in files of various formats. Testers use the interface information and the business logic to write test cases; each interface yields cases for the normal business flow and cases for abnormal flows, and in test methodology the writing of abnormal cases follows certain rules. By collecting interface information, processing these rules and checking the subsequent responses, the problem of testing interface robustness can be solved for the tester.
However, when the whole interface-automation process is carried out manually, the manpower consumed scales with the number of interfaces. Given the development of the industry, that number is now on the order of thousands to tens of thousands, and the manpower investment must be planned on a half-year basis.
At present, the existing test flow mainly consists of obtaining interface information, then designing cases, then writing scripts, then executing the tests, and finally verifying the results. This flow has the following defects:
1. Case design and script writing place high demands on testers' testing ability and coding ability;
2. The work is highly repetitive and the workload is large, yet every interface differs, so the work cannot be trimmed down by simple reuse;
3. The numbers of cases and scripts are too large: a single round of executing, analyzing results and debugging scripts involves an especially heavy workload, the scripts cannot be organized along dimensions such as business, module and interface, and the larger the project, the harder it is to maintain;
4. Maintenance work after interfaces are added or modified cannot be reused, so manpower must be invested continuously;
5. There is no visualization tool.
Disclosure of Invention
In view of the shortcomings of the prior art, the present invention is directed to a single-interface robustness automated test system and method that can effectively avoid one or more of the above-mentioned shortcomings.
To this end, the invention provides the following technical solution: a single-interface robustness automated test system comprising:
a front-end UI, containing an HTML module, a CSS module, a jQuery module and a picture module, used for human-machine interaction with the user;
a display layer, communicatively connected to the front-end UI, containing a template-engine rendering module and an Ajax interaction module, used to render data before displaying it to the user and to send POST and GET requests for data interaction;
a business layer, communicatively connected to the display layer, containing an interface data-source management module, an interface management module, a check-rule management module and a report display module, used for automated interface-robustness testing: the interface data-source management module configures data sources and uploads swagger-format interface information; the interface management module loads interface information from the data-source configuration, generates test cases from that information, executes them, and displays the interface-information cases; the check-rule management module configures check rules by dimension and debugs the check engine online;
an underlying engine, communicatively connected to the business layer, containing an interface loading engine, a use-case rule engine and a verification engine, providing the execution basis for interface loading, use-case rules and verification behavior;
and a running environment, communicatively connected to the underlying engine, containing an independent server that provides the environment in which the underlying engine runs.
As a further improvement of the invention, the accompanying test method comprises the following steps:
step one, managing the interfaces with the interface data-source management module: configuring data sources and uploading swagger-format interface information;
step two, managing the use cases with the interface management module: loading interface information through the data-source configuration of step one, generating test cases from the interface information, and then executing the tests with the interface management module;
step three, after the tests of step two finish, displaying the test report through the report display module and then verifying the test-execution process through the verification engine.
As a further improvement of the present invention, the specific steps of managing the interfaces in step one are as follows:
step 1-1, opening an interface management page or an interface data-source management page;
step 1-2, after the interface management page is opened, judging whether a configuration exists for each dimension of the interface: if so, displaying the hierarchical relationship of the interface's dimensions; if not, jumping to step 1-1 to open the interface data-source management page. Then judging whether the interface information in the interface hierarchy has been loaded: if so, displaying it directly; if not, reading the interface information from each platform through the data-source configuration;
step 1-3, after the interface data-source management page is opened, configuring a data source; swagger-format interface information may be uploaded directly and loaded into the engine, or, once the data source is configured, the interface information is read from each platform through the data-source configuration and then loaded into the engine;
step 1-4, storing the interface information held by the engine into a database.
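The interface-loading flow above (read swagger-format interface information, load it into the engine, persist it to a database) can be sketched as follows. This is a minimal illustration under assumptions, not the patented implementation: the function names, the swagger fields read, and the sqlite schema are all hypothetical.

```python
import json
import sqlite3

def load_swagger(path):
    """Read a swagger/OpenAPI 2.0 document and extract per-interface info
    (a stand-in for the interface loading engine)."""
    with open(path) as f:
        spec = json.load(f)
    interfaces = []
    for url, methods in spec.get("paths", {}).items():
        for method, op in methods.items():
            interfaces.append({
                "url": url,
                "method": method.upper(),
                "params": json.dumps(op.get("parameters", [])),
            })
    return interfaces

def store_interfaces(db, interfaces):
    """Step 1-4: persist the interface information held by the engine."""
    db.execute("CREATE TABLE IF NOT EXISTS interface"
               " (url TEXT, method TEXT, params TEXT)")
    db.executemany("INSERT INTO interface VALUES (:url, :method, :params)",
                   interfaces)
    db.commit()
```

An in-memory sqlite database is enough for the sketch; the patent only requires that the loaded interface information end up in a database.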
As a further improvement of the present invention, the specific steps of use-case management in step two are as follows:
step 2-1, entering the interface list, where interface information is displayed for each dimension through a tree structure;
step 2-2, right-clicking a dimension to generate test cases: the case-generation rule engine generates the cases from the interface information and stores them into the database;
step 2-3, clicking an interface to view all test cases under that interface.
As a further improvement of the present invention, the test cases in step two are generated as follows:
step 1, generating a multi-dimensional parameter combination from the interface parameters and the rule list;
step 2, generating the test-case set from the multi-dimensional parameter combination of step 1 using an orthogonal experimental design method or a combinatorial analysis method.
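Step 2's reduction of the multi-dimensional parameter combination into a test-case set can be sketched as follows. The patent names orthogonal experimental design or combinatorial analysis; as a simplified stand-in, this sketch uses the each-choice strategy (vary one parameter at a time while the others keep their correct value), which still covers every individual parameter value with far fewer cases than the full cross product.

```python
def case_set_each_choice(dims):
    """Reduce a multi-dimensional parameter combination to a test-case set.
    dims maps each parameter name to its value list, with the "correct"
    value first; full factorial expansion grows multiplicatively, so we
    mutate one parameter per case instead (each-choice coverage)."""
    base = {p: vals[0] for p, vals in dims.items()}  # all-correct baseline
    cases = [dict(base)]
    for p, vals in dims.items():
        for v in vals[1:]:          # each abnormal value gets its own case
            case = dict(base)
            case[p] = v
            cases.append(case)
    return cases
```

For three parameters with four values each this yields 10 cases instead of the 64 a full product would produce.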
As a further improvement of the present invention, in step three the verification engine is also managed before it verifies the test process: an expected rule is set for all interfaces or use cases through the verification engine. An expected rule specifically includes:
name: the rule's name;
classification: functional verification or exception verification;
grade: one use case may hit multiple rules; the grade ranks the rule among them;
hit rule: grouping, project, interface or custom; determines from the executed case whether the rule is hit;
admission condition: after a rule is confirmed hit, decides whether that rule's verification is actually executed;
rule content: the specific content of the check rule, written the same way as the admission condition.
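The six rule fields above can be modeled as a small data structure. The dataclass, the callable predicates and the `verify` helper below are illustrative assumptions about how a check engine might hold and apply such rules, not the patent's implementation.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ExpectedRule:
    """One expected rule; fields mirror the patent's name / classification /
    grade / hit rule / admission condition / rule content."""
    name: str
    classification: str               # "functional" or "exception"
    grade: int                        # rank when one case hits several rules
    hit: Callable[[dict], bool]       # is this rule hit by the executed case?
    admission: Callable[[dict], bool] # after a hit, run the check at all?
    content: Callable[[dict], bool]   # the check itself, same style as admission

def verify(case_result, rules):
    """Apply every hit-and-admitted rule, highest grade first,
    returning each rule's pass/fail verdict."""
    hits = sorted((r for r in rules if r.hit(case_result)),
                  key=lambda r: r.grade, reverse=True)
    return {r.name: r.content(case_result)
            for r in hits if r.admission(case_result)}
```

A rule that only admits cases with a response, and then checks the status code, shows the hit/admission/content split.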
As a further improvement of the present invention, the specific steps by which the verification engine verifies the test-execution process in step three are as follows:
step 3-1, entering the interface list, right-clicking any dimension to execute its test cases, whereupon the system reports that the task was created successfully;
step 3-2, since test cases can be executed by right-click at any dimension, collecting all the test cases under it;
step 3-3, sending the requests in parallel, and passing each request together with the received response to the check engine for rule verification;
step 3-4, obtaining the final success/failure results and storing them in the database.
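Steps 3-2 to 3-4 (collect all cases, send their requests in parallel, hand each request/response pair to the check engine, and gather the final success/failure results for persistence) can be sketched as below; `send` and `check` are hypothetical stand-ins for the HTTP client and the check engine.

```python
from concurrent.futures import ThreadPoolExecutor

def run_task(cases, send, check):
    """Execute all collected test cases in parallel and return a map of
    case id to final pass/fail, ready to be stored in the database."""
    def one(case):
        response = send(case)        # step 3-3: send the request...
        ok = check(case, response)   # ...and verify request + response
        return case["id"], ok
    with ThreadPoolExecutor(max_workers=8) as pool:
        return dict(pool.map(one, cases))  # step 3-4: success/failure results
```

Thread-based parallelism fits here because each case is I/O-bound (an HTTP round trip) rather than CPU-bound.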
As a further improvement of the present invention, the report displayed in step three is divided into the following dimensions:
task dimension: shows the dimension at which the task was created, the time, the running environment, the creator and the state of the use cases;
interface dimension: within each task, counts the number of test cases under each interface, the pass rate, and basic information about the interface itself;
use-case dimension: shows the rule list hit by the current interface, the information of each use case, and the specific details of its execution.
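The interface dimension of the report, counting test cases and the pass rate per interface within a task, reduces to a simple aggregation; the input row shape `(interface, passed)` is an assumption made for illustration.

```python
from collections import defaultdict

def interface_report(case_results):
    """Aggregate per-interface statistics for the report's interface
    dimension: number of test cases and pass rate within one task."""
    stats = defaultdict(lambda: {"cases": 0, "passed": 0})
    for interface, passed in case_results:
        stats[interface]["cases"] += 1
        stats[interface]["passed"] += bool(passed)
    return {i: {"cases": s["cases"],
                "pass_rate": s["passed"] / s["cases"]}
            for i, s in stats.items()}
```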
As a further improvement of the present invention, after the report is generated in the third step, different verification rules under the project dimension, the interface dimension and the case dimension are debugged according to each dimension in the report, so as to meet specific service test requirements.
The invention has the advantage that, whereas full interface automation requires manpower for case design, script writing, debugging and maintenance, it provides a visualization platform that greatly reduces that investment: rules can be set according to business requirements, after which every step completes automatically according to those rules. The technical bar is lowered, the time to see returns is shortened, and the overall manpower investment is greatly reduced. The method further has the following advantages:
low technical requirements: users need no knowledge of testing, the business, coding or automation;
easy to use: a visualization platform is provided, with simple and clear operation;
low manpower investment: compared with full interface automation, the whole process can be completed within five minutes, and no extra manpower is needed to design cases or develop scripts.
Drawings
FIG. 1 is a block diagram of a single-interface robustness automated testing system according to the present embodiment;
FIG. 2 is a flow chart of interface management;
FIG. 3 is a flow diagram of use case management;
FIG. 4 is a flow chart of test execution.
Detailed Description
The invention will be further described in detail with reference to the following examples, which are given in the accompanying drawings.
Referring to figs. 1 to 4, the single-interface robustness automated test system of the present embodiment includes: a front-end UI, containing an HTML module, a CSS module, a jQuery module and a picture module, used for human-machine interaction with the user;
a display layer, communicatively connected to the front-end UI, containing a template-engine rendering module and an Ajax interaction module, used to render data before displaying it to the user and to send POST and GET requests for data interaction;
a business layer, communicatively connected to the display layer, containing an interface data-source management module, an interface management module, a check-rule management module and a report display module, used for automated interface-robustness testing: the interface data-source management module configures data sources and uploads swagger-format interface information; the interface management module loads interface information from the data-source configuration, generates test cases from that information, executes them, and displays the interface-information cases; the check-rule management module configures check rules by dimension and debugs the check engine online;
an underlying engine, communicatively connected to the business layer, containing an interface loading engine, a use-case rule engine and a verification engine, providing the execution basis for interface loading, use-case rules and verification behavior;
and a running environment, communicatively connected to the underlying engine, containing an independent server that provides the environment in which the underlying engine runs. Through the arrangement of the front-end UI, display layer, business layer, underlying engine and running environment, single-interface robustness can be tested automatically with greatly reduced human participation: drawing on familiarity with the full automated-testing workflow, various technical means make the whole process automated, one-click and foolproof.
The test method carried by the system comprises the following steps:
step one, managing the interfaces with the interface data-source management module: configuring data sources and uploading swagger-format interface information;
step two, managing the use cases with the interface management module: loading interface information through the data-source configuration of step one, generating test cases from the interface information, and then executing the tests with the interface management module;
step three, after the tests of step two finish, displaying the test report through the report display module and then verifying the test-execution process through the verification engine. Through these three steps, automated testing is achieved simply and effectively.
As a specific embodiment of the improvement of the above test method, the specific steps of managing the interfaces in step one are as follows:
step 1-1, opening an interface management page or an interface data-source management page;
step 1-2, after the interface management page is opened, judging whether a configuration exists for each dimension of the interface: if so, displaying the hierarchical relationship of the interface's dimensions; if not, jumping to step 1-1 to open the interface data-source management page. Then judging whether the interface information in the interface hierarchy has been loaded: if so, displaying it directly; if not, reading the interface information from each platform through the data-source configuration;
step 1-3, after the interface data-source management page is opened, configuring a data source; swagger-format interface information may be uploaded directly and loaded into the engine, or, once the data source is configured, the interface information is read from each platform through the data-source configuration and then loaded into the engine;
step 1-4, storing the interface information held by the engine into a database. These steps effectively realize the display of interface information; when no dimension configuration exists for the interface, the data-source configuration is produced by opening the interface data-source management page directly and entering it there.
As a specific embodiment of the improvement of the above test method, the specific steps of use-case management in step two are as follows:
step 2-1, entering the interface list, where interface information is displayed for each dimension through a tree structure;
step 2-2, right-clicking a dimension to generate test cases: the case-generation rule engine generates the cases from the interface information and stores them into the database;
step 2-3, clicking an interface to view all test cases under that interface. Through these three steps the use cases are managed effectively.
As a specific embodiment of the improvement of the foregoing test method, the test cases in step two are generated as follows:
step 1, generating a multi-dimensional parameter combination from the interface parameters and the rule list;
step 2, generating the test-case set from the multi-dimensional parameter combination of step 1 using an orthogonal experimental design method or a combinatorial analysis method. These steps effectively produce the test cases. For example:
interface A has three parameters, size, num and searchString, and combinations of the three parameters can be generated according to the rule list:
['size correct value: 1', 'size type mutation: true', 'size field deleted', 'size greater than maximum: 10086']
['num correct value: 1', 'num type mutation: true', 'num field deleted', 'num greater than maximum: 1001']
['searchString correct value: application name', 'searchString type mutation: {"code": 123}', 'searchString field deleted', 'searchString format exception: 2022051315:23:12']
This step also distinguishes two situations: when the interface parameter has a default value, the correct value takes the default; when it has no default, the correct value is generated randomly according to the parameter type.
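The per-parameter variation lists in the example above (correct value, type mutation, field deletion, out-of-range value) can be generated mechanically. The sketch below is an assumed rule list, including the default-value handling described in the last sentence; it is an illustration, not the patent's rule engine.

```python
import random

def mutations(name, ptype, correct=None, maximum=None):
    """Build one parameter's variation list: correct value, type mutation,
    field deletion, and (when a maximum is known) an out-of-range value.
    If the interface parameter has a default, it is passed as `correct`;
    otherwise a correct value is generated randomly from the type."""
    if correct is None:  # no default value: random correct value by type
        correct = random.randint(1, 9) if ptype == "integer" else "application name"
    variations = [
        (f"{name} correct value", correct),
        (f"{name} type mutation", True if ptype == "integer" else {"code": 123}),
        (f"{name} field deleted", None),
    ]
    if maximum is not None:
        variations.append((f"{name} greater than maximum", maximum + 1))
    return variations
```

Applied to interface A, `mutations("size", "integer", correct=1, maximum=10085)` reproduces the size list shown above.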
As a specific embodiment of the improvement of the above test method, in step three the verification engine is also managed before it verifies the test process: an expected rule is set for all interfaces or use cases through the verification engine. An expected rule specifically includes:
name: the rule's name;
classification: functional verification or exception verification;
grade: one use case may hit multiple rules; the grade ranks the rule among them;
hit rule: grouping, project, interface or custom; determines from the executed case whether the rule is hit;
admission condition: after a rule is confirmed hit, decides whether that rule's verification is actually executed;
rule content: the specific content of the check rule, written the same way as the admission condition. Setting rules this way effectively realizes the presetting of check rules.
As a specific embodiment of the improvement of the above test method, the specific steps by which the verification engine verifies the test-execution process in step three are as follows:
step 3-1, entering the interface list, right-clicking any dimension to execute its test cases, whereupon the system reports that the task was created successfully;
step 3-2, since test cases can be executed by right-click at any dimension, collecting all the test cases under it;
step 3-3, sending the requests in parallel, and passing each request together with the received response to the check engine for rule verification;
step 3-4, obtaining the final success/failure results and storing them in the database. These steps effectively verify the test-execution process.
As a specific embodiment of the improvement of the above test method, the report displayed in step three is divided into the following dimensions:
task dimension: shows the dimension at which the task was created, the time, the running environment, the creator and the state of the use cases;
interface dimension: within each task, counts the number of test cases under each interface, the pass rate, and basic information about the interface itself;
use-case dimension: shows the rule list hit by the current interface, the information of each use case, and the specific details of its execution. Presenting these dimensions displays every facet of the report effectively and shows the test content comprehensively.
As a specific implementation of the improvement of the test method, after the report of step three is generated, the different check rules under the project, interface and use-case dimensions are debugged according to each dimension in the report to meet specific business test requirements, which greatly broadens the business-test scenarios covered and the range of application.
In summary, the test system and method of the present embodiment have low technical requirements: users need no knowledge of testing, the business, coding or automation. They are easy to use: a visualization platform is provided, with simple and clear operation. And the manpower investment is low: compared with full interface automation, the whole process can be completed within five minutes, and no extra manpower is needed to design cases or develop scripts.
The above description is only a preferred embodiment of the present invention, and the scope of the present invention is not limited to the above embodiments, and all technical solutions that belong to the idea of the present invention belong to the scope of the present invention. It should be noted that modifications and embellishments within the scope of the invention may occur to those skilled in the art without departing from the principle of the invention, and are considered to be within the scope of the invention.

Claims (9)

1. A single-interface robustness automated test system, characterized in that it comprises:
a front-end UI, containing an HTML module, a CSS module, a jQuery module and a picture module, used for human-machine interaction with the user;
a display layer, communicatively connected to the front-end UI, containing a template-engine rendering module and an Ajax interaction module, used to render data before displaying it to the user and to send POST and GET requests for data interaction;
a business layer, communicatively connected to the display layer, containing an interface data-source management module, an interface management module, a check-rule management module and a report display module, used for automated interface-robustness testing: the interface data-source management module configures data sources and uploads swagger-format interface information; the interface management module loads interface information from the data-source configuration, generates test cases from that information, executes them, and displays the interface-information cases; the check-rule management module configures check rules by dimension and debugs the check engine online;
an underlying engine, communicatively connected to the business layer, containing an interface loading engine, a use-case rule engine and a verification engine, providing the execution basis for interface loading, use-case rules and verification behavior;
and a running environment, communicatively connected to the underlying engine, containing an independent server that provides the environment in which the underlying engine runs.
2. A single-interface robustness automated test method applied to the system of claim 1, characterized in that it comprises the following steps:
step one, managing the interfaces with the interface data-source management module: configuring data sources and uploading swagger-format interface information;
step two, managing the use cases with the interface management module: loading interface information through the data-source configuration of step one, generating test cases from the interface information, and then executing the tests with the interface management module;
step three, after the tests of step two finish, displaying the test report through a report display module and then verifying the test-execution process through a verification engine.
3. The single-interface robustness automated test method of claim 2, characterized in that the specific steps of managing the interfaces in step one are as follows:
step 1-1, opening an interface management page or an interface data-source management page;
step 1-2, after the interface management page is opened, judging whether a configuration exists for each dimension of the interface: if so, displaying the hierarchical relationship of the interface's dimensions; if not, jumping to step 1-1 to open the interface data-source management page. Then judging whether the interface information in the interface hierarchy has been loaded: if so, displaying it directly; if not, reading the interface information from each platform through the data-source configuration;
step 1-3, after the interface data-source management page is opened, configuring a data source; swagger-format interface information may be uploaded directly and loaded into the engine, or, once the data source is configured, the interface information is read from each platform through the data-source configuration and then loaded into the engine;
step 1-4, storing the interface information held by the engine into a database.
4. The single-interface robustness automated test method of claim 2 or 3, characterized in that the specific steps of use-case management in step two are as follows:
step 2-1, entering the interface list, where interface information is displayed for each dimension through a tree structure;
step 2-2, right-clicking a dimension to generate test cases: the case-generation rule engine generates the cases from the interface information and stores them into the database;
step 2-3, clicking an interface to view all test cases under that interface.
5. The single-interface robustness automated test method of claim 4, characterized in that the test cases in step two are generated as follows:
step 1, generating a multi-dimensional parameter combination from the interface parameters and the rule list;
step 2, generating the test-case set from the multi-dimensional parameter combination of step 1 using an orthogonal experimental design method or a combinatorial analysis method.
6. The single interface robustness automated testing method of claim 2 or 3, wherein: in the third step, the verification engine is managed before verifying the test process, and an expected rule is set for all interfaces or cases through the verification engine, wherein the expected rule specifically includes:
name: the rule name;
classification: functional check or exception check;
level: one case may hit multiple rules, and the level differentiates priority among them;
hit rule: grouping, project, interface, or custom; determines from the executed case whether the rule is hit;
admission condition: once a rule is confirmed as hit, determines whether the rule's check is actually executed;
rule content: the specific content of the check rule, written in the same way as the admission condition.
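The six fields above suggest a natural data shape. A sketch of how a check engine might represent and evaluate such rules — the class, field names, and example rule are all illustrative assumptions, not the patent's implementation:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class CheckRule:
    """Hypothetical shape of one expected rule in the check engine."""
    name: str
    classification: str                 # "functional" or "exception"
    level: int                          # priority when one case hits several rules
    hit: Callable[[dict], bool]         # grouping/project/interface/custom match
    admission: Callable[[dict], bool]   # decide whether to run the check at all
    content: Callable[[dict], bool]     # the check itself, same style as admission

def verify(case: dict, rules: list[CheckRule]) -> list[tuple[str, bool]]:
    """For each rule (in level order): if hit and admitted, evaluate content."""
    results = []
    for rule in sorted(rules, key=lambda r: r.level):
        if rule.hit(case) and rule.admission(case):
            results.append((rule.name, rule.content(case)))
    return results

# Example rule: functional check that /api/* responses return HTTP 200
status_ok = CheckRule(
    name="http-200", classification="functional", level=1,
    hit=lambda c: c["interface"].startswith("/api/"),
    admission=lambda c: c["response"] is not None,
    content=lambda c: c["response"]["status"] == 200,
)
case = {"interface": "/api/login", "response": {"status": 200}}
print(verify(case, [status_ok]))
```

In practice the hit rule and admission condition would be stored as expressions in the database and compiled by the engine; callables are used here only to keep the sketch self-contained.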
7. The single-interface robustness automated testing method of claim 6, wherein the check engine verifying the test execution process in step three comprises the following specific steps:
step 3.1, entering the interface list, right-clicking to execute the test cases at any dimension, after which the task is reported as successfully created;
step 3.2, since test cases can be executed by right-click at any dimension, collecting all the test cases under that dimension;
step 3.3, sending the requests in parallel, and forwarding each request together with the received response to the check engine for rule verification;
step 3.4, obtaining the final pass/fail results and storing them in the database.
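The parallel send-and-verify step can be sketched with a thread pool. The HTTP call and the rule verification are stubbed out here, since the patent leaves both unspecified; all function names are hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor

def send_request(case):
    """Stand-in for the real HTTP call made by the execution engine."""
    return {"status": 200 if case["param"] is not None else 400}

def check_engine(request, response):
    """Stand-in rule verification: exception-class inputs expect a 4xx."""
    expected_error = request["param"] is None
    return (400 <= response["status"] < 500) == expected_error

def run_task(cases):
    """Execute all collected cases in parallel, forward each request and
    response to the check engine, and gather the final pass/fail results."""
    def one(case):
        response = send_request(case)
        return {"case": case, "passed": check_engine(case, response)}
    with ThreadPoolExecutor(max_workers=8) as pool:
        return list(pool.map(one, cases))

cases = [{"param": "x"}, {"param": None}, {"param": 0}]
results = run_task(cases)
print(sum(r["passed"] for r in results), "of", len(results), "passed")
```

The results list is what step 3.4 would persist to the database, keyed by task.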
8. The single-interface robustness automated testing method of claim 2 or 3, wherein the report displayed in step three is divided into the following dimensions:
task dimension: displays the case dimension, time, running environment, creator, and status at task creation;
interface dimension: within each task, counts the number of test cases under each interface and their pass rate, together with basic information about the interface itself;
use case dimension: displays the rule list hit by the current interface, the information of each case, and the execution details of each case.
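The interface dimension of the report is a straightforward aggregation of the stored execution results. A sketch, assuming each result record carries an interface path and a pass flag (field names hypothetical):

```python
from collections import defaultdict

def interface_report(results):
    """Interface dimension: case count and pass rate per interface."""
    buckets = defaultdict(lambda: {"total": 0, "passed": 0})
    for r in results:
        b = buckets[r["interface"]]
        b["total"] += 1
        b["passed"] += r["passed"]
    return {
        iface: {**b, "pass_rate": b["passed"] / b["total"]}
        for iface, b in buckets.items()
    }

# Illustrative execution results for one task
results = [
    {"interface": "/login", "case": "c1", "passed": True},
    {"interface": "/login", "case": "c2", "passed": False},
    {"interface": "/logout", "case": "c3", "passed": True},
]
report = interface_report(results)
print(report["/login"]["pass_rate"], report["/logout"]["total"])
```

The task and use case dimensions would be built the same way, grouping by task id or by individual case instead of by interface.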
9. The single-interface robustness automated testing method of claim 8, wherein after the report in step three is generated, the check rules under the project dimension, the interface dimension, and the case dimension are tuned against all the dimensions shown in the report, so as to meet specific service testing requirements.
CN202210695618.9A 2022-06-20 2022-06-20 Single-interface robustness automatic test system and method thereof Pending CN115098364A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210695618.9A CN115098364A (en) 2022-06-20 2022-06-20 Single-interface robustness automatic test system and method thereof


Publications (1)

Publication Number Publication Date
CN115098364A true CN115098364A (en) 2022-09-23

Family

ID=83290763




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination