CN110928796A - Automated test platform - Google Patents
Automated test platform
- Publication number
- CN110928796A (application CN201911195928.9A)
- Authority
- CN
- China
- Prior art keywords
- case
- test
- data
- execution
- parameterization
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3684—Test management for test design, e.g. generating new test cases
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Quality & Reliability (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Debugging And Monitoring (AREA)
Abstract
The invention provides an automated test platform comprising a case preprocessing module, a case execution module, and a case response processing module. The platform simulates manual testing behavior to execute batches of test cases step by step. It fits the automated regression testing of a company's payment and product lines well and provides user-defined test task sets, data analysis, test result analysis, data report generation and result notification, integrated interface performance testing, interface log display and analysis, data comparison, diversified parameterization, and other code-free, user-friendly interface operations. With low maintenance cost and ease of use, it can serve the joint debugging and testing needs of a company's existing interfaces, greatly reducing test cost and improving the effectiveness of automation.
Description
Technical Field
The invention relates to the technical field of computers, in particular to an automated test platform.
Background
With the continued rapid growth of business, higher requirements are placed on interface regression testing: it must be low-cost, efficient, high-coverage, and timely, so as to free testers from repetitive and tedious manual testing.
In traditional manual interface testing, the test team writes test cases and test code, manually compares the test results against pre-designed expected results to verify whether the program behaves as expected, and finally records the test process.
The automated platform simulates manual testing behavior to execute batches of test cases step by step. It fits the automated regression testing of a company's payment and product lines well and provides user-defined test task sets, data analysis, test result analysis, data report generation and result notification, integrated interface performance testing, interface log display and analysis, data comparison, diversified parameterization, and other code-free, user-friendly interface operations. With low maintenance cost and ease of use, it can serve the joint debugging and testing needs of a company's existing interfaces, greatly reducing test cost and improving the effectiveness of automation.
Disclosure of Invention
The invention aims to solve the following problems of traditional manual testing:
first, executing tests manually consumes enormous manpower and time and seriously affects project progress;
second, manual testing is not very precise: whether a test result matches the expected result is judged by the tester's experience, so test quality depends entirely on human factors, and missed or erroneous tests easily occur;
third, frequently updated releases lead to large amounts of repeated testing; because test resources are limited, much regression testing is neglected, and only verifying whether the current functionality meets expectations is treated as the priority task.
In order to solve the above technical problems, the invention adopts the following technical solution:
The invention provides an automated test platform comprising three modules:
a case preprocessing module: generates test data for the environment according to the case's execution conditions and adds test admission conditions;
a case execution module: obtains the case preprocessing data, constructs and sends messages, and dynamically determines the message type so as to send messages in different protocols and formats;
a case response processing module: evaluates the case's post-execution action, response assertion, and data assertion.
Preferably, a user can configure test case information on the "personal operation platform", which provides test case management operations such as executing, editing, deleting, copying, classifying, and initializing cases, as shown in FIG. 2. A pre-case can be configured for a single case, and parameterization can be applied to an individual parameter. In the automated test platform, the case preprocessing module comprises:
a data initialization unit: generates test data for the environment according to the case's execution conditions to meet the data requirements of the test;
a dynamic pre-action unit: adds test admission conditions in advance to meet the testing requirements of various composite interfaces;
a parameterization unit: supports five parameterization types, namely fixed, custom, initialization, pre-case, and set parameterization, as shown in FIG. 4;
a parameter replacement unit: maps the initialized and parameterized preprocessed data one-to-one to message fields and performs data replacement;
an environment check unit: checks in advance whether the test environment meets the test conditions; if the check passes, execution proceeds to the next step; otherwise, case execution failure information is returned.
Preferably, the dynamic pre-action unit performs the following steps:
step 1, before execution, check whether each of the multiple pre-cases meets its execution condition; if the check fails, terminate the transaction; if it passes, proceed to the next step;
step 2, execute the pre-cases: run each pre-case according to the unified case execution flow; if the pre-case passes, execute the current case; if it fails, terminate the transaction and mark the case as failed;
step 3, the response result of a pre-case execution can be stored as a dynamic parameter, associated with a parameter of the current case, and used for parameter replacement.
Preferably, in the case preprocessing module, the parameterization unit includes:
fixed parameterization: a constant parameter;
custom parameterization: a freely chosen set of parameters, as shown in FIG. 3;
initialization parameterization: the system builds initial parameters according to a given rule to cover most parameter requirements; special parameters still need to be configured via the other parameter types;
pre-case parameterization: the result obtained after a pre-case is executed can serve as a parameter and is assigned to the current parameter according to the pre-case's execution result;
set parameterization: a parameter set is a fixed data set for multiple test cases and can drive the automatic execution of multiple test cases.
Preferably, in the automated test platform, the case execution module performs the following steps:
step 1, execute regression cases according to the standardized cases of a service line or product line;
step 2, execute cases unattended via user-defined task sets; the case execution status of a version task can be viewed, case errors can be inspected from the logs, and a statistical task report is sent;
step 3, support diversified message formats and protocols;
step 4, print logs of the periodic case execution status and display the various logs on a page, so that testers can review case execution more conveniently.
Preferably, in the automated test platform, the case response processing module comprises:
a dynamic post-action unit: determines whether the case needs to perform a post-operation;
a response assertion unit: matches the pre-configured response result tree against the actually received message; if they are consistent, proceed to the next step; if not, terminate case execution;
a data parameterization unit: dynamically locates and looks up data to support the subsequent data assertion;
a data assertion unit: checks whether the stored data and the response data are correct; if so, proceed to the next step; if not, terminate case execution.
Preferably, in the case response processing module, the post-action includes message conversion and message encryption/decryption processing.
Preferably, the automated test platform performs the following automated testing steps:
step 1, in the case preprocessing stage, generate test data for the environment according to the case's execution conditions and add test admission conditions;
step 2, in the case execution stage, obtain the case preprocessing data, construct and send messages, and dynamically determine the message type so as to send messages in different protocols and formats;
step 3, in the case response processing stage, evaluate the case's post-execution action, response assertion, and data assertion.
The invention also discloses a computer-readable storage medium storing a computer program which, when executed by a processor, implements the automated test platform according to claim 1.
The invention also discloses a device/terminal comprising a memory, a processor, and a computer program stored in the memory and executable on the processor; the processor implements the automated test platform according to claim 1 when executing the computer program.
The automated platform simulates manual testing behavior to execute batches of test cases step by step. It fits the automated regression testing of a company's payment and product lines well and provides user-defined test task sets, data analysis, test result analysis, data report generation and result notification, integrated interface performance testing, interface log display and analysis, data comparison, diversified parameterization, and other code-free, user-friendly interface operations. With low maintenance cost and ease of use, it can serve the joint debugging and testing needs of a company's existing interfaces, greatly reducing test cost and improving the effectiveness of automation.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
FIG. 1 is a flow diagram of an automation platform;
FIG. 2 is a personal operations platform main interface;
FIG. 3 is a custom parameters page;
FIG. 4 is a parameterization configuration page.
Detailed Description
Example 1
The application provides an automated test platform; its system flow is shown in FIG. 1.
The invention provides an automated test platform comprising three modules:
a case preprocessing module: generates test data for the environment according to the case's execution conditions and adds test admission conditions;
a case execution module: obtains the case preprocessing data, constructs and sends messages, and dynamically determines the message type so as to send messages in different protocols and formats;
a case response processing module: evaluates the case's post-execution action, response assertion, and data assertion.
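The three-module flow above can be sketched as a simple pipeline. This is an illustrative sketch only; all class and field names (`TestCase`, `CasePreprocessor`, and so on) are assumptions for illustration, not the patent's actual implementation.

```python
# Hypothetical sketch of the three-module pipeline; names are illustrative.
from dataclasses import dataclass, field


@dataclass
class TestCase:
    name: str
    params: dict = field(default_factory=dict)
    expected: dict = field(default_factory=dict)


class CasePreprocessor:
    """Generates environment test data and checks admission conditions."""

    def run(self, case: TestCase) -> dict:
        # Generate test data from the case's execution conditions.
        return {"case": case.name, "data": dict(case.params)}


class CaseExecutor:
    """Builds and sends a message from the preprocessed data."""

    def run(self, pre: dict) -> dict:
        # A real executor would pick a protocol/format dynamically;
        # here we just echo the data back as a simulated response.
        return {"case": pre["case"], "body": pre["data"]}


class ResponseProcessor:
    """Applies response and data assertions to the reply."""

    def run(self, case: TestCase, response: dict) -> bool:
        return all(response["body"].get(k) == v for k, v in case.expected.items())


def execute(case: TestCase) -> bool:
    pre = CasePreprocessor().run(case)
    response = CaseExecutor().run(pre)
    return ResponseProcessor().run(case, response)


case = TestCase("login_ok", params={"user": "u1"}, expected={"user": "u1"})
print(execute(case))  # True
```

Each stage only consumes the previous stage's output, which mirrors the preprocessing, execution, and response-processing separation described above.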
Further, in the automated test platform, the case preprocessing module comprises:
a data initialization unit: generates test data for the environment according to the case's execution conditions to meet the data requirements of the test;
a dynamic pre-action unit: adds test admission conditions in advance to meet the testing requirements of various composite interfaces;
a parameterization unit: supports five parameterization types, namely fixed, custom, initialization, pre-case, and set parameterization;
a parameter replacement unit: maps the initialized and parameterized preprocessed data one-to-one to message fields and performs data replacement;
an environment check unit: checks in advance whether the test environment meets the test conditions; if the check passes, execution proceeds to the next step; otherwise, case execution failure information is returned.
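The parameter replacement unit's one-to-one mapping of parameter data onto message fields could look like the following minimal sketch. The `${name}` placeholder syntax and the `replace_params` helper are assumptions made for illustration, not the patent's format.

```python
# Illustrative sketch of the parameter replacement unit: preprocessed
# parameter values are substituted into matching message fields.
def replace_params(message_template: dict, params: dict) -> dict:
    """Replace ${name} placeholders in message fields with parameter values."""
    message = {}
    for name, value in message_template.items():
        if isinstance(value, str) and value.startswith("${") and value.endswith("}"):
            key = value[2:-1]
            if key not in params:
                # Mirrors the environment check: missing data fails the case early.
                raise KeyError(f"no parameter for field '{name}': {key}")
            message[name] = params[key]
        else:
            message[name] = value
    return message


template = {"userId": "${uid}", "channel": "web"}
print(replace_params(template, {"uid": "10086"}))
# {'userId': '10086', 'channel': 'web'}
```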
Further, the dynamic pre-action unit performs the following steps:
step 1, before execution, check whether each of the multiple pre-cases meets its execution condition; if the check fails, terminate the transaction; if it passes, proceed to the next step;
step 2, execute the pre-cases: run each pre-case according to the unified case execution flow; if the pre-case passes, execute the current case; if it fails, terminate the transaction and mark the case as failed;
step 3, the response result of a pre-case execution can be stored as a dynamic parameter, associated with a parameter of the current case, and used for parameter replacement.
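The three pre-case steps can be sketched as follows; the dictionary shape of a pre-case and the `run_with_pre_cases` helper are illustrative assumptions, not the patent's API.

```python
# Minimal sketch of the pre-case flow: check each pre-case, execute it,
# and store its response as a dynamic parameter for the current case.
def run_with_pre_cases(pre_cases, current_case, dynamic_params=None):
    """Run pre-cases first; abort on failure, else feed their results in."""
    dynamic_params = dict(dynamic_params or {})
    for pre in pre_cases:
        if not pre.get("condition_met", True):      # step 1: pre-check
            return {"status": "terminated", "failed": pre["name"]}
        result = pre["run"]()                        # step 2: execute pre-case
        if result is None:
            return {"status": "terminated", "failed": pre["name"]}
        dynamic_params[pre["name"]] = result         # step 3: store as parameter
    return {"status": "passed", "result": current_case(dynamic_params)}


pre = [{"name": "get_token", "run": lambda: "tok-123"}]
out = run_with_pre_cases(pre, lambda p: f"auth={p['get_token']}")
print(out)  # {'status': 'passed', 'result': 'auth=tok-123'}
```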
Further, in the case preprocessing module, the parameterization unit comprises:
fixed parameterization: a constant parameter;
custom parameterization: a freely chosen set of parameters;
initialization parameterization: the system builds initial parameters according to a given rule to cover most parameter requirements; special parameters still need to be configured via the other parameter types;
pre-case parameterization: the result obtained after a pre-case is executed can serve as a parameter and is assigned to the current parameter according to the pre-case's execution result;
set parameterization: a parameter set is a fixed data set for multiple test cases and can drive the automatic execution of multiple test cases.
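The five parameterization types above could be resolved by a single dispatcher such as the sketch below; the type names and `spec` shapes are assumptions for illustration only.

```python
# Illustrative resolver for the five parameterization types.
def resolve(param_type, spec, context=None):
    context = context or {}
    if param_type == "fixed":        # constant value
        return spec
    if param_type == "custom":       # freely chosen set: pick by index
        return spec["values"][spec.get("index", 0)]
    if param_type == "init":         # system-generated by a rule
        return spec["rule"]()
    if param_type == "pre_case":     # taken from a pre-case's stored result
        return context[spec["from"]]
    if param_type == "set":          # data set driving multiple cases
        return list(spec["values"])
    raise ValueError(f"unknown parameterization type: {param_type}")


print(resolve("fixed", "CNY"))                                          # CNY
print(resolve("pre_case", {"from": "get_token"}, {"get_token": "t1"}))  # t1
```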
Further, in the automated test platform, the case execution module performs the following steps:
step 1, execute regression cases according to the standardized cases of a service line or product line;
step 2, execute cases unattended via user-defined task sets; the case execution status of a version task can be viewed, case errors can be inspected from the logs, and a statistical task report is sent;
step 3, support diversified message formats and protocols;
step 4, print logs of the periodic case execution status and display the various logs on a page, so that testers can review case execution more conveniently.
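Supporting diversified message formats and protocols (step 3) could be handled with a small dispatch, as in this sketch; the protocol names, formats, and `build_message` helper are assumptions for illustration, not the patent's implementation.

```python
# Sketch of dynamic dispatch by message protocol and format.
import json


def build_message(protocol: str, fmt: str, payload: dict) -> str:
    if fmt == "json":
        body = json.dumps(payload, sort_keys=True)
    elif fmt == "kv":
        body = "&".join(f"{k}={v}" for k, v in sorted(payload.items()))
    else:
        raise ValueError(f"unsupported format: {fmt}")
    if protocol not in {"http", "tcp"}:
        raise ValueError(f"unsupported protocol: {protocol}")
    return f"{protocol}:{body}"


print(build_message("http", "kv", {"a": "1", "b": "2"}))  # http:a=1&b=2
```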
Further, in the automated test platform, the case response processing module comprises:
a dynamic post-action unit: determines whether the case needs to perform a post-operation;
a response assertion unit: matches the pre-configured response result tree against the actually received message; if they are consistent, proceed to the next step; if not, terminate case execution;
a data parameterization unit: dynamically locates and looks up data to support the subsequent data assertion;
a data assertion unit: checks whether the stored data and the response data are correct; if so, proceed to the next step; if not, terminate case execution.
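The response assertion unit's matching of a pre-configured result tree against the actual message could work like the recursive sketch below; `assert_response` and the tree shape are assumptions for illustration.

```python
# Sketch of the response assertion unit: every node of the expected
# result tree must match the actual response (extra actual keys are ignored).
def assert_response(expected, actual):
    """Return True if every node of `expected` matches `actual`."""
    if isinstance(expected, dict):
        return isinstance(actual, dict) and all(
            k in actual and assert_response(v, actual[k])
            for k, v in expected.items()
        )
    return expected == actual


expected_tree = {"code": "0000", "data": {"status": "SUCCESS"}}
actual = {"code": "0000", "data": {"status": "SUCCESS", "ts": 1}}
print(assert_response(expected_tree, actual))  # True
```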
Further, in the case response processing module, the post-action includes message conversion and message encryption/decryption processing.
Further, the automated test platform performs the following automated testing steps:
step 1, in the case preprocessing stage, generate test data for the environment according to the case's execution conditions and add test admission conditions;
step 2, in the case execution stage, obtain the case preprocessing data, construct and send messages, and dynamically determine the message type so as to send messages in different protocols and formats;
step 3, in the case response processing stage, evaluate the case's post-execution action, response assertion, and data assertion.
The invention also discloses a computer-readable storage medium storing a computer program which, when executed by a processor, implements the automated test platform according to claim 1.
The invention also discloses a device/terminal comprising a memory, a processor, and a computer program stored in the memory and executable on the processor; the processor implements the automated test platform according to claim 1 when executing the computer program.
The embodiments of the present invention have been described in detail, but they are merely examples; the present invention is not limited to the embodiments described above. Equivalent modifications and substitutions apparent to those skilled in the art also fall within the scope of the present invention, and equivalent changes made without departing from the spirit and scope of the present invention are covered by it.
Claims (10)
1. An automated test platform, comprising:
a case preprocessing module: generating test data for the environment according to the case's execution conditions and adding test admission conditions;
a case execution module: obtaining the case preprocessing data, constructing and sending messages, and dynamically determining the message type so as to send messages in different protocols and formats;
a case response processing module: evaluating the case's post-execution action, response assertion, and data assertion.
2. The automated test platform of claim 1, wherein the case preprocessing module comprises:
a data initialization unit: generating test data for the environment according to the case's execution conditions to meet the data requirements of the test;
a dynamic pre-action unit: adding test admission conditions in advance to meet the testing requirements of various composite interfaces;
a parameterization unit: supporting five parameterization types, namely fixed, custom, initialization, pre-case, and set parameterization;
a parameter replacement unit: mapping the initialized and parameterized preprocessed data one-to-one to message fields and performing data replacement;
an environment check unit: checking in advance whether the test environment meets the test conditions; if the check passes, proceeding to the next step; otherwise, returning case execution failure information.
3. The case preprocessing module of claim 2, wherein the dynamic pre-action unit performs the following steps:
step 1, before execution, checking whether each of the multiple pre-cases meets its execution condition; if the check fails, terminating the transaction; if it passes, proceeding to the next step;
step 2, executing the pre-cases: running each pre-case according to the unified case execution flow; if the pre-case passes, executing the current case; if it fails, terminating the transaction and marking the case as failed;
step 3, storing the response result of a pre-case execution as a dynamic parameter, which can be associated with a parameter of the current case and used for parameter replacement.
4. The case preprocessing module of claim 2, wherein the parameterization unit comprises:
fixed parameterization: a constant parameter;
custom parameterization: a freely chosen set of parameters;
initialization parameterization: the system builds initial parameters according to a given rule to cover most parameter requirements; special parameters still need to be configured via the other parameter types;
pre-case parameterization: the result obtained after a pre-case is executed can serve as a parameter and is assigned to the current parameter according to the pre-case's execution result;
set parameterization: a parameter set is a fixed data set for multiple test cases and can drive the automatic execution of multiple test cases.
5. The automated test platform of claim 1, wherein the case execution module performs the following steps:
step 1, executing regression cases according to the standardized cases of a service line or product line;
step 2, executing cases unattended via user-defined task sets, wherein the case execution status of a version task can be viewed, case errors can be inspected from the logs, and a statistical task report is sent;
step 3, supporting diversified message formats and protocols;
step 4, printing logs of the periodic case execution status and displaying the various logs on a page, so that testers can review case execution more conveniently.
6. The automated test platform of claim 1, wherein the case response processing module comprises:
a dynamic post-action unit: determining whether the case needs to perform a post-operation;
a response assertion unit: matching the pre-configured response result tree against the actually received message; if they are consistent, proceeding to the next step; if not, terminating case execution;
a data parameterization unit: dynamically locating and looking up data to support the subsequent data assertion;
a data assertion unit: checking whether the stored data and the response data are correct; if so, proceeding to the next step; if not, terminating case execution.
7. The case response processing module of claim 6, wherein the post-action comprises: message conversion, and message encryption/decryption processing.
8. The automated test platform of claim 1, wherein the automated testing steps comprise:
step 1, in the case preprocessing stage, generating test data for the environment according to the case's execution conditions and adding test admission conditions;
step 2, in the case execution stage, obtaining the case preprocessing data, constructing and sending messages, and dynamically determining the message type so as to send messages in different protocols and formats;
step 3, in the case response processing stage, evaluating the case's post-execution action, response assertion, and data assertion.
9. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the automated test platform of claim 1.
10. An apparatus/terminal device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the automated test platform of claim 1 when executing the computer program.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911195928.9A CN110928796B (en) | 2019-11-29 | 2019-11-29 | Automatic test platform |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911195928.9A CN110928796B (en) | 2019-11-29 | 2019-11-29 | Automatic test platform |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110928796A (publication) | 2020-03-27 |
CN110928796B (grant) | 2023-05-30 |
Family
ID=69846931
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911195928.9A Active CN110928796B (en) | 2019-11-29 | 2019-11-29 | Automatic test platform |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110928796B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111624462A (en) * | 2020-04-23 | 2020-09-04 | 上海机电工程研究所 | Weapon system PCB detection method, system, medium and equipment based on big data |
CN112181845A (en) * | 2020-10-13 | 2021-01-05 | 湖南快乐阳光互动娱乐传媒有限公司 | Interface testing method and device |
CN112286796A (en) * | 2020-09-29 | 2021-01-29 | 长沙市到家悠享网络科技有限公司 | Software testing method, device and storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6966052B1 (en) * | 2001-03-06 | 2005-11-15 | Hewlett-Packard Development Company, L.P. | Method and apparatus for top-down testing based on end user documentation |
US20140101640A1 (en) * | 2012-10-05 | 2014-04-10 | Software Ag | White-box testing systems and/or methods for use in connection with graphical user interfaces |
US20150356001A1 (en) * | 2014-06-06 | 2015-12-10 | Ebay Inc. | Unit test automation for business rules and applications |
CN105373469A (en) * | 2014-08-25 | 2016-03-02 | 广东金赋信息科技有限公司 | Interface based software automation test method |
CN107203473A (en) * | 2017-05-26 | 2017-09-26 | 四川长虹电器股份有限公司 | The automatization test system and method for automatic expansion interface test case |
CN109101415A (en) * | 2018-06-25 | 2018-12-28 | 平安科技(深圳)有限公司 | Interface test method, system, equipment and the storage medium compared based on database |
CN110232024A (en) * | 2019-05-26 | 2019-09-13 | 必成汇(成都)科技有限公司 | Software automated testing frame and test method |
- 2019-11-29: application CN201911195928.9A filed; granted as CN110928796B (status: Active)
Non-Patent Citations (2)
Title |
---|
刘国庆; 汪兴轩: "Design and Implementation of an Automated HTTP Interface Testing Framework Based on Charles Session Recording" *
张莉娜: "Research on Writing Standards for Automated Test Cases in the Power Industry" *
Also Published As
Publication number | Publication date |
---|---|
CN110928796B (en) | 2023-05-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108845940B (en) | Enterprise-level information system automatic function testing method and system | |
US8117598B2 (en) | Method and apparatus to automatically identify specific code changes to probabilistically exclude from regression | |
CN110928796A (en) | Automated test platform | |
CN106909510A (en) | A kind of method and server for obtaining test case | |
CN103164328A (en) | Method and device and system for regression testing of service function | |
CN112241360A (en) | Test case generation method, device, equipment and storage medium | |
CN111309343B (en) | Development deployment method and device | |
CN109144525A (en) | A kind of software installation method and system of network self-adapting | |
CN110851471A (en) | Distributed log data processing method, device and system | |
CN114741283A (en) | Automatic interface testing method and device based on python design | |
CN104899134A (en) | Automatic domain name registration server testing system and method | |
CN113342560A (en) | Fault processing method, system, electronic equipment and storage medium | |
CN113032256B (en) | Automated testing method, apparatus, computer system, and readable storage medium | |
CN113672502A (en) | Program multi-system testing method and corresponding device, equipment and medium | |
CN112527312A (en) | Test method and test device for embedded system | |
CN116467188A (en) | Universal local reproduction system and method under multi-environment scene | |
CN115438026A (en) | Database scanning method, device, equipment, storage medium and product | |
CN113986263A (en) | Code automation test method, device, electronic equipment and storage medium | |
CN112256554B (en) | Method and equipment for testing based on scene test cases | |
CN113535560A (en) | Test execution method and device, storage medium and computing equipment | |
CN113434382A (en) | Database performance monitoring method and device, electronic equipment and computer readable medium | |
CN112069202A (en) | SQL performance analysis method, system, device and medium based on tracking technology | |
CN111813659A (en) | UI and interface based automatic test method, device, equipment and readable medium | |
CN113238940A (en) | Interface test result comparison method, device, equipment and storage medium | |
CN112328473A (en) | Code automation integration test method and device and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |