CN110928796B - Automatic test platform - Google Patents


Info

Publication number
CN110928796B
CN110928796B (application CN201911195928.9A)
Authority
CN
China
Prior art keywords
use case
test
data
parameterization
execution
Prior art date
Legal status (assumed; not a legal conclusion)
Active
Application number
CN201911195928.9A
Other languages
Chinese (zh)
Other versions
CN110928796A (en)
Inventor
朱斌
张震
肖玮军
Current Assignee (listed assignee may be inaccurate)
Baofu Network Technology Shanghai Co ltd
Original Assignee
Baofu Network Technology Shanghai Co ltd
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Baofu Network Technology Shanghai Co ltd
Priority to CN201911195928.9A
Publication of CN110928796A
Application granted
Publication of CN110928796B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention provides an automated test platform comprising a use case preprocessing module, a use case execution module, and a use case response processing module. The platform simulates manual testing behavior to execute batches of test cases step by step. It fits well with the automated regression testing of the company's payment lines and product lines, and offers custom test task sets and data analysis, test result analysis, data report generation and result notification, integrated interface performance testing, interface log display and analysis, data comparison, diversified parameterization, and the like. With code-free, user-friendly interface operation, low maintenance cost, and ease of use, it can serve the company's routine interface joint-debugging and testing needs, greatly reducing testing cost and improving the effect of automation.

Description

Automatic test platform
Technical Field
The invention relates to the field of computer technology, and in particular to an automated test platform.
Background
With the continued rapid growth of the business, higher requirements are placed on interface regression testing tools: low cost, high efficiency, high coverage, and timeliness, so that testers can be freed from repetitive and tedious manual testing.
In traditional manual interface testing, the test team writes test cases and test code, executes the tests, manually compares the test results with the pre-designed expected results to check whether the program meets expectations, and finally records the testing process.
The automated platform simulates manual testing behavior to execute batches of test cases step by step. It fits well with the automated regression testing of the company's payment lines and product lines, and offers custom test task sets and data analysis, test result analysis, data report generation and result notification, integrated interface performance testing, interface log display and analysis, data comparison, diversified parameterization, and the like. With code-free, user-friendly interface operation, low maintenance cost, and ease of use, it can serve the company's routine interface joint-debugging and testing needs, greatly reducing testing cost and improving the effect of automation.
Disclosure of Invention
The invention aims to solve the following problems of traditional manual testing:
1. executing tests by manual means consumes enormous manpower and time and seriously affects project progress;
2. manual testing is imprecise: the pass/fail judgment relies on the tester's experience, so whether a test result meets expectations is reduced entirely to human factors, which easily leads to testers' false detections and missed detections;
3. frequent release updates entail a large amount of repeated testing; because test resources are limited, much regression testing is skipped, and only the current functionality and the higher-priority test tasks can be verified.
In order to solve the technical problems, the invention adopts the following technical scheme:
the invention provides an automatic test platform which is characterized by comprising three modules:
the use case preprocessing module: generating test data of the environment according to conditions executed by the use cases, and adding test admittance conditions;
the use case execution module: obtaining example preprocessing data, constructing a message and sending the message, and supporting dynamic judgment of the type of the message to send messages with different protocols and different formats;
the use case response processing module: the judgment case post-executes actions, responds to assertions and data assertions.
Preferably, the user can configure test case information in the "personal operation platform", which provides test case management operations such as execution, editing, deletion, duplication, case classification, and case initialization, as shown in fig. 2. A pre-use case may be configured for a single use case, or parameterization may be performed for a single parameter. The use case preprocessing module comprises:
a data initialization unit: generates test data for the environment according to the conditions of use case execution so as to meet the data requirements of the test;
a dynamic pre-action unit: adds test admission conditions in advance to meet the test requirements of various composite interfaces;
a parameterization unit: supports five parameterization types, namely fixed parameterization, custom parameterization, initialization parameterization, pre-use-case parameterization, and set parameterization, as shown in fig. 4;
a parameter replacement unit: preprocesses the initialized and parameterized data and performs data replacement by mapping it one-to-one onto the fields of the message;
an environment inspection unit: detects in advance whether the test environment meets the test conditions; if the check passes, the next step is executed; if it fails, use case execution failure information is returned.
Preferably, the dynamic pre-action unit executes the following steps:
step 1, checking in advance whether each pre-use case meets its execution conditions; if the check fails, the transaction is terminated; if it passes, the next step is executed;
step 2, executing the pre-use case: the pre-use case is executed according to the unified use case execution flow; if its execution is judged to pass, the current use case is then executed; if the pre-use case fails, the transaction is terminated, the use case state is set to failed, and the use case fails;
step 3, the response result obtained by executing the pre-use case is stored in the database as dynamic parameterization, and can be associated with the input parameters of the current use case as one type of parameter replacement.
Preferably, in the use case preprocessing module, the parameterization unit comprises:
fixed parameterization: a constant parameter;
custom parameterization: a freely selected parameter set, as shown in fig. 3;
initialization parameterization: initial parameters generated by the system according to preset rules so as to meet most parameter requirements; specific parameters need to be configured through the other parameterization types;
pre-use-case parameterization: the parameter result obtained after a pre-use case finishes executing is used as parameterization and is assigned to the current parameter according to the pre-use case execution result;
set parameterization: a parameter set is a fixed data set spanning multiple test cases and enables automatic execution of multiple test cases.
Preferably, the automated test platform comprises a use case execution module whose execution steps are as follows:
step 1, executing regression use cases according to the standardized use cases of the business lines or product lines;
step 2, executing use cases in an unattended, custom task-set mode, checking the execution status of the use cases of a version task, checking use case error information from the logs, and sending a statistical task report;
step 3, supporting diversified message formats and protocols;
step 4, printing logs of each stage of use case execution and displaying the various logs on the page, so that testers can check the use case execution status more conveniently.
Preferably, in the automated test platform, the use case response processing module comprises:
a dynamic post unit: judges whether the use case needs to execute post-operations;
a response assertion unit: matches the preset response result tree against the actually received message information; if they are consistent, the next step proceeds; if they are inconsistent, execution of the use case is terminated;
a data parameterization unit: dynamically locates and retrieves data to facilitate the subsequent data assertion service;
a data assertion unit: judges whether the data in the data warehouse and the response data are correct; if so, the next step proceeds; if not, execution of the use case is terminated.
Preferably, in the use case response processing module, the post-processing actions include: message conversion, mock data, and message encryption/decryption processing.
Preferably, the automated testing comprises the following steps:
step 1, in the use case preprocessing stage, generating test data for the environment according to the conditions of use case execution, and adding test admission conditions;
step 2, in the use case execution stage, obtaining the use case preprocessing data, constructing and sending the message, and supporting dynamic judgment of the message type so as to send messages of different protocols and formats;
step 3, in the use case response processing stage, judging the use case post-execution actions, response assertions, and data assertions.
The invention also discloses a computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, implements the automated test platform according to claim 1.
The invention also discloses a device/terminal equipment, characterized by comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the automated test platform according to claim 1 when executing the computer program.
The automated platform simulates manual testing behavior to execute batches of test cases step by step. It fits well with the automated regression testing of the company's payment lines and product lines, and offers custom test task sets and data analysis, test result analysis, data report generation and result notification, integrated interface performance testing, interface log display and analysis, data comparison, diversified parameterization, and the like. With user-friendly interface operation, low maintenance cost, and ease of use, it can serve the company's routine interface joint-debugging and testing needs, greatly reducing testing cost and improving the effect of automation.
The technical scheme of the invention is further described in detail below with reference to the drawings and embodiments.
Drawings
FIG. 1 is a flow chart of an automation platform;
FIG. 2 is a personal operations platform main interface;
FIG. 3 is a custom parameters page;
fig. 4 is a parameterized configuration page.
Detailed Description
Example 1
The present application provides an automated test platform; the system flow is shown in fig. 1.
The invention provides an automated test platform comprising three modules:
the use case preprocessing module: generates test data for the environment according to the conditions of use case execution, and adds test admission conditions;
the use case execution module: obtains the use case preprocessing data, constructs and sends the message, and supports dynamic judgment of the message type so as to send messages of different protocols and formats;
the use case response processing module: judges the use case post-execution actions, response assertions, and data assertions.
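By way of illustration only, the three modules can be pictured as the minimal Python skeleton below. The patent specifies no implementation language or API; every class, method, and field name here is hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class Case:
        name: str
        protocol: str                                  # e.g. "http" or "tcp"
        template: dict = field(default_factory=dict)   # message template
        expected: dict = field(default_factory=dict)   # preset response result tree

    class CasePreprocessor:
        """Generates environment test data and adds test admission conditions."""
        def prepare(self, case: Case) -> dict:
            return {"order_id": "INIT-0001"}           # data initialization (stubbed)

    class CaseExecutor:
        """Builds the message from preprocessed data and sends it by protocol."""
        def execute(self, case: Case, data: dict) -> dict:
            message = {**case.template, **data}        # after parameter replacement
            return {"status": "SUCCESS", **message}    # actual sending is stubbed out

    class ResponseProcessor:
        """Runs post-actions, response assertions, and data assertions."""
        def verify(self, case: Case, response: dict) -> bool:
            return all(response.get(k) == v for k, v in case.expected.items())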
Further, in the automated test platform, the use case preprocessing module comprises:
a data initialization unit: generates test data for the environment according to the conditions of use case execution so as to meet the data requirements of the test;
a dynamic pre-action unit: adds test admission conditions in advance to meet the test requirements of various composite interfaces;
a parameterization unit: supports five parameterization types, namely fixed parameterization, custom parameterization, initialization parameterization, pre-use-case parameterization, and set parameterization;
a parameter replacement unit: preprocesses the initialized and parameterized data and performs data replacement by mapping it one-to-one onto the fields of the message;
an environment inspection unit: detects in advance whether the test environment meets the test conditions; if the check passes, the next step is executed; if it fails, use case execution failure information is returned.
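As a hedged sketch of the parameter replacement and environment inspection units: the fragment below maps preprocessed values one-to-one onto the fields of a message template and gates execution on the environment check. The "${name}" placeholder syntax is an assumption; the patent does not define a template format.

    import re

    def replace_parameters(template: dict, values: dict) -> dict:
        """Substitute ${name} placeholders in every string field of the message."""
        def fill(text: str) -> str:
            return re.sub(r"\$\{(\w+)\}",
                          lambda m: str(values.get(m.group(1), m.group(0))), text)
        return {k: fill(v) if isinstance(v, str) else v for k, v in template.items()}

    def check_environment(conditions: dict, environment: dict) -> bool:
        """Environment inspection: every test condition must hold before execution."""
        return all(environment.get(key) == want for key, want in conditions.items())

    msg = replace_parameters({"orderId": "${order_id}", "amount": "100"},
                             {"order_id": "INIT-0001"})
    assert msg == {"orderId": "INIT-0001", "amount": "100"}
    assert check_environment({"db": "up"}, {"db": "up", "mq": "up"})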
Further, the dynamic pre-action unit performs the following steps:
step 1, checking in advance whether each pre-use case meets its execution conditions; if the check fails, the transaction is terminated; if it passes, the next step is executed;
step 2, executing the pre-use case: the pre-use case is executed according to the unified use case execution flow; if its execution is judged to pass, the current use case is then executed; if the pre-use case fails, the transaction is terminated, the use case state is set to failed, and the use case fails;
step 3, the response result obtained by executing the pre-use case is stored in the database as dynamic parameterization, and can be associated with the input parameters of the current use case as one type of parameter replacement.
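These three steps can be read as the control-flow sketch below (all names hypothetical): the pre-use case is pre-checked, run through the same unified flow as any other case, and its response is banked as dynamic parameters for later replacement.

    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class PreCase:
        name: str
        conditions_met: Callable[[], bool]

    def run_pre_case(pre_case: PreCase,
                     execute_case: Callable[[PreCase], Optional[dict]],
                     dynamic_params: dict) -> bool:
        # Step 1: pre-check the execution conditions; failure ends the transaction.
        if not pre_case.conditions_met():
            return False
        # Step 2: execute the pre-use case through the unified execution flow.
        response = execute_case(pre_case)
        if response is None:
            return False              # pre-use case failed, current case fails too
        # Step 3: store the response as dynamic parameterization for replacement.
        dynamic_params[pre_case.name] = response
        return True

    params: dict = {}
    ok = run_pre_case(PreCase("create_order", lambda: True),
                      lambda case: {"orderId": "O-42"}, params)
    assert ok and params["create_order"]["orderId"] == "O-42"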
Further, in the use case preprocessing module, the parameterization unit comprises:
fixed parameterization: a constant parameter;
custom parameterization: a freely selected parameter set;
initialization parameterization: initial parameters generated by the system according to preset rules so as to meet most parameter requirements; specific parameters need to be configured through the other parameterization types;
pre-use-case parameterization: the parameter result obtained after a pre-use case finishes executing is used as parameterization and is assigned to the current parameter according to the pre-use case execution result;
set parameterization: a parameter set is a fixed data set spanning multiple test cases and enables automatic execution of multiple test cases.
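One way to model the five parameterization types is a tagged union resolved at replacement time, as in the sketch below; the "kind" tags and field names are assumptions, not the platform's actual configuration schema.

    import random
    import string

    def resolve_parameter(spec: dict, pre_case_results: dict):
        kind = spec["kind"]
        if kind == "fixed":                  # constant parameter
            return spec["value"]
        if kind == "custom":                 # freely selected parameter set
            return random.choice(spec["choices"])
        if kind == "init":                   # system-generated initial value
            return "AUTO-" + "".join(random.choices(string.digits, k=8))
        if kind == "pre_case":               # taken from a pre-use case result
            return pre_case_results[spec["case"]][spec["field"]]
        if kind == "set":                    # fixed data set spanning many cases
            return spec["rows"]
        raise ValueError(f"unknown parameterization kind: {kind}")

    results = {"create_order": {"orderId": "O-42"}}
    assert resolve_parameter({"kind": "fixed", "value": 100}, results) == 100
    assert resolve_parameter({"kind": "pre_case", "case": "create_order",
                              "field": "orderId"}, results) == "O-42"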
Further, in the automated test platform, the use case execution module performs the following steps:
step 1, executing regression use cases according to the standardized use cases of the business lines or product lines;
step 2, executing use cases in an unattended, custom task-set mode, checking the execution status of the use cases of a version task, checking use case error information from the logs, and sending a statistical task report;
step 3, supporting diversified message formats and protocols;
step 4, printing logs of each stage of use case execution and displaying the various logs on the page, so that testers can check the use case execution status more conveniently.
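Supporting diversified message formats and protocols (step 3) suggests a dispatch table keyed on the message type the case declares. The registry below is an illustrative sketch rather than the patent's mechanism; only JSON and a toy XML serializer are shown.

    import json
    import xml.etree.ElementTree as ET

    SERIALIZERS = {
        "json": lambda payload: json.dumps(payload),
        "xml": lambda payload: ET.tostring(
            ET.Element("request", {k: str(v) for k, v in payload.items()}),
            encoding="unicode"),
    }

    def build_message(fmt: str, payload: dict) -> str:
        """Serialize the payload in the format the use case declares."""
        try:
            return SERIALIZERS[fmt](payload)
        except KeyError:
            raise ValueError(f"unsupported message format: {fmt}") from None

    print(build_message("json", {"orderId": "O-42", "amount": 100}))
    print(build_message("xml", {"orderId": "O-42", "amount": 100}))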
Further, in the automated test platform, the use case response processing module comprises:
a dynamic post unit: judges whether the use case needs to execute post-operations;
a response assertion unit: matches the preset response result tree against the actually received message information; if they are consistent, the next step proceeds; if they are inconsistent, execution of the use case is terminated;
a data parameterization unit: dynamically locates and retrieves data to facilitate the subsequent data assertion service;
a data assertion unit: judges whether the data in the data warehouse and the response data are correct; if so, the next step proceeds; if not, execution of the use case is terminated.
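The response assertion unit's matching of a preset response result tree against the actual message can be sketched as a recursive subset comparison: every node of the expected tree must appear, with an equal value, in the actual one. Reading "matching" as subset comparison is our assumption.

    def match_tree(expected, actual) -> bool:
        """True if every node of the expected tree appears, equal, in the actual."""
        if isinstance(expected, dict):
            return isinstance(actual, dict) and all(
                k in actual and match_tree(v, actual[k])
                for k, v in expected.items())
        if isinstance(expected, list):
            return (isinstance(actual, list) and len(expected) == len(actual)
                    and all(match_tree(e, a) for e, a in zip(expected, actual)))
        return expected == actual

    expected = {"head": {"code": "0000"}, "body": {"state": "PAID"}}
    actual = {"head": {"code": "0000", "ts": 1}, "body": {"state": "PAID", "fee": 2}}
    assert match_tree(expected, actual)                        # consistent: continue
    assert not match_tree({"head": {"code": "9999"}}, actual)  # mismatch: terminate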
Further, in the use case response processing module, the post-processing actions comprise: message conversion, mock data, and message encryption/decryption processing.
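The three post-processing actions compose naturally as a pipeline. In the sketch below, a dict merge stands in for mock-data injection and base64 stands in for the (unspecified) encryption scheme; both are placeholders, not the patent's method.

    import base64
    import json

    def mock_data(msg: dict, mocks: dict) -> dict:   # mock-data injection (stand-in)
        return {**msg, **mocks}

    def convert_message(msg: dict) -> str:           # message conversion (to JSON)
        return json.dumps(msg, sort_keys=True)

    def encrypt(text: str) -> str:                   # stand-in for encrypt/decrypt
        return base64.b64encode(text.encode()).decode()

    message = {"orderId": "O-42"}
    for action in (lambda m: mock_data(m, {"channel": "MOCK"}),
                   convert_message, encrypt):
        message = action(message)
    print(message)   # base64 of the converted, mocked message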
Further, the automated test platform performs the following automated testing steps:
step 1, in the use case preprocessing stage, generating test data for the environment according to the conditions of use case execution, and adding test admission conditions;
step 2, in the use case execution stage, obtaining the use case preprocessing data, constructing and sending the message, and supporting dynamic judgment of the message type so as to send messages of different protocols and formats;
step 3, in the use case response processing stage, judging the use case post-execution actions, response assertions, and data assertions.
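Putting the three stages together, the end-to-end control flow of steps 1 to 3 might look like the minimal orchestration sketch below; all callables are trivial stand-ins for the real preprocessing, sending, and assertion logic.

    from typing import Callable, Optional

    def run_case(preprocess: Callable[[dict], Optional[dict]],
                 execute: Callable[[dict, dict], dict],
                 verify: Callable[[dict, dict], bool],
                 case: dict) -> str:
        data = preprocess(case)                 # step 1: use case preprocessing
        if data is None:
            return "FAILED: admission conditions not met"
        response = execute(case, data)          # step 2: build and send the message
        if not verify(case, response):          # step 3: response/data assertions
            return "FAILED: assertion mismatch"
        return "PASSED"

    result = run_case(
        preprocess=lambda c: {"order_id": "INIT-0001"},
        execute=lambda c, d: {"status": "SUCCESS", **d},
        verify=lambda c, r: r["status"] == "SUCCESS",
        case={"name": "demo"},
    )
    print(result)   # PASSED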
The invention also discloses a computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, implements the automated test platform according to claim 1.
The invention also discloses a device/terminal equipment, characterized by comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the automated test platform according to claim 1 when executing the computer program.
The above specific embodiments of the present invention are given by way of example only, and the present invention is not limited to them. Equivalent modifications and substitutions that occur to those skilled in the art are also within the scope of the present invention; accordingly, equivalent changes and modifications made without departing from the spirit and scope of the invention are intended to be included within its scope.

Claims (6)

1. An automated test platform, comprising:
the use case preprocessing module: generating test data for the environment according to the conditions of use case execution, and adding test admission conditions; the use case execution module: obtaining the use case preprocessing data, constructing and sending the message, and supporting dynamic judgment of the message type so as to send messages of different protocols and formats;
the use case response processing module: judging the use case post-execution actions, response assertions, and data assertions;
wherein the use case preprocessing module comprises:
a data initialization unit: generating test data for the environment according to the conditions of use case execution so as to meet the data requirements of the test;
a dynamic pre-action unit: adding test admission conditions in advance to meet the test requirements of various composite interfaces; the dynamic pre-action unit comprises the following execution steps:
step 1, checking in advance whether each pre-use case meets its execution conditions; if the check fails, the transaction is terminated; if it passes, the next step is executed;
step 2, executing the pre-use case: the pre-use case is executed according to the unified use case execution flow; if its execution is judged to pass, the current use case is then executed; if the pre-use case fails, the transaction is terminated, the use case state is set to failed, and the use case fails;
step 3, the response result obtained by executing the pre-use case is stored in the database as dynamic parameterization, and can be associated with the input parameters of the current use case as one type of parameter replacement;
a parameterization unit: supporting five parameterization types, namely fixed parameterization, custom parameterization, initialization parameterization, pre-use-case parameterization, and set parameterization; wherein the fixed parameterization is a constant parameter; the custom parameterization is a freely selected parameter set; the initialization parameterization is an initial parameter generated by the system according to preset rules; the pre-use-case parameterization uses the parameter result obtained after a pre-use case finishes executing as parameterization and assigns it to the current parameter according to the pre-use case execution result; the set parameterization is a parameter set, i.e. a fixed data set spanning multiple test cases, enabling automatic execution of multiple test cases;
a parameter replacement unit: preprocessing the initialized and parameterized data and performing data replacement by mapping it one-to-one onto the fields of the message;
an environment inspection unit: detecting in advance whether the test environment meets the test conditions; if the check passes, the next step is executed; if it fails, the step fails and use case execution failure information is returned;
the use case response processing module comprises:
a dynamic post unit: judging whether the use case needs to execute post-operations;
a response assertion unit: matching the preset response result tree against the actually received message information; if they are consistent, the next step proceeds; if they are inconsistent, execution of the use case is terminated;
a data parameterization unit: dynamically locating and retrieving data to facilitate the subsequent data assertion service;
a data assertion unit: judging whether the data in the data warehouse and the response data are correct; if so, the next step proceeds; if not, execution of the use case is terminated.
2. The automated test platform of claim 1, wherein the use case execution module performs the steps of:
step 1, executing regression use cases according to the standardized use cases of the business lines or product lines;
step 2, executing use cases in an unattended, custom task-set mode, checking the execution status of the use cases of a version task, checking use case error information from the logs, and sending a statistical task report;
step 3, supporting diversified message formats and protocols;
step 4, printing logs of each stage of use case execution and displaying the various logs on the page.
3. The automated test platform of claim 1, wherein the post-processing actions comprise: message conversion, mock data, and message encryption/decryption processing.
4. The automated test platform according to claim 1, wherein the automated testing steps comprise:
step 1, in the use case preprocessing stage, generating test data for the environment according to the conditions of use case execution, and adding test admission conditions;
step 2, in the use case execution stage, obtaining the use case preprocessing data, constructing and sending the message, and supporting dynamic judgment of the message type so as to send messages of different protocols and formats;
step 3, in the use case response processing stage, judging the use case post-execution actions, response assertions, and data assertions.
5. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program which, when executed by a processor, implements the automated test platform of claim 1.
6. A terminal device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the automated test platform of claim 1 when executing the computer program.
CN201911195928.9A 2019-11-29 2019-11-29 Automatic test platform Active CN110928796B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911195928.9A CN110928796B (en) 2019-11-29 2019-11-29 Automatic test platform

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911195928.9A CN110928796B (en) 2019-11-29 2019-11-29 Automatic test platform

Publications (2)

Publication Number Publication Date
CN110928796A CN110928796A (en) 2020-03-27
CN110928796B (en) 2023-05-30

Family

ID=69846931

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911195928.9A Active CN110928796B (en) 2019-11-29 2019-11-29 Automatic test platform

Country Status (1)

Country Link
CN (1) CN110928796B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111624462A (en) * 2020-04-23 2020-09-04 上海机电工程研究所 Weapon system PCB detection method, system, medium and equipment based on big data
CN112286796A (en) * 2020-09-29 2021-01-29 长沙市到家悠享网络科技有限公司 Software testing method, device and storage medium
CN112181845A (en) * 2020-10-13 2021-01-05 湖南快乐阳光互动娱乐传媒有限公司 Interface testing method and device


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9047413B2 (en) * 2012-10-05 2015-06-02 Software Ag White-box testing systems and/or methods for use in connection with graphical user interfaces
US9606903B2 (en) * 2014-06-06 2017-03-28 Paypal, Inc. Unit test automation for business rules and applications

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6966052B1 (en) * 2001-03-06 2005-11-15 Hewlett-Packard Development Company, L.P. Method and apparatus for top-down testing based on end user documentation
CN105373469A (en) * 2014-08-25 2016-03-02 广东金赋信息科技有限公司 Interface based software automation test method
CN107203473A (en) * 2017-05-26 2017-09-26 四川长虹电器股份有限公司 The automatization test system and method for automatic expansion interface test case
CN109101415A (en) * 2018-06-25 2018-12-28 平安科技(深圳)有限公司 Interface test method, system, equipment and the storage medium compared based on database
CN110232024A (en) * 2019-05-26 2019-09-13 必成汇(成都)科技有限公司 Software automated testing frame and test method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
刘国庆; 汪兴轩. Design and Implementation of an HTTP Interface Automated Test Framework Based on Charles Session Recording. Computer Applications and Software, 2019, (06), full text. *
张莉娜. Research on Writing Standards for Automated Test Cases in the Electric Power Industry. Information & Communications, 2017, (07), full text. *

Also Published As

Publication number Publication date
CN110928796A (en) 2020-03-27

Similar Documents

Publication Publication Date Title
CN110928796B (en) Automatic test platform
CN101241467B (en) Automatized white box test system and method facing to WEB application
CN106909510A (en) A kind of method and server for obtaining test case
CN108628748B (en) Automatic test management method and automatic test management system
CN108111364B (en) Service system testing method and device
CN103902458A (en) Universal storage software test design method
CN110262979B (en) Simulated third-party data source testing method based on MOCK platform
CN114741283A (en) Automatic interface testing method and device based on python design
CN112463580A (en) Test system and method based on virtualization device
CN111209166A (en) Automatic inspection system for B/S architecture business system
US11704186B2 (en) Analysis of deep-level cause of fault of storage management
CN106354629A (en) Construction method of iOS system mobile application automatic test system based on multiple stages of servers
CN112241360A (en) Test case generation method, device, equipment and storage medium
CN112631937A (en) Automatic CAN operation testing method and device for T-Box
CN104899134A (en) Automatic domain name registration server testing system and method
US9612944B2 (en) Method and system for verifying scenario based test selection, execution and reporting
CN116545891A (en) Automatic distribution network testing method based on intelligent equipment
CN116860608A (en) Interface testing method and device, computing equipment and storage medium
CN113238940B (en) Interface test result comparison method, device, equipment and storage medium
CN113238901B (en) Multi-device automatic testing method and device, storage medium and computer device
CN115438026A (en) Database scanning method, device, equipment, storage medium and product
CN110543424B (en) Test method and device for electronic tax platform
CN115016321A (en) Hardware-in-loop automatic testing method, device and system
CN114124750A (en) Test method, system, equipment and storage medium of communication module
CN106909511A (en) A kind of automated testing method based on RedwoodHQ

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant