CN113742215B - Method and system for automatically configuring and calling test tool to perform test analysis - Google Patents
- Publication number
- CN113742215B (application CN202110787250.4A)
- Authority
- CN
- China
- Prior art keywords: test, testing, control center server, tool
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06F11/3688 — Test management for test execution, e.g. scheduling of test suites (under G06F11/36, preventing errors by testing or debugging software)
- G06F11/3692 — Test management for test results analysis
- Y02P90/02 — Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
The invention provides a method and a system for automatically configuring and calling a test tool to perform test analysis. The method comprises the following steps: the control center server configures the information of the project under test, and develops and activates the test tool; the test machines download and install the test tool and configure the test environment variables; the control center server distributes test projects to all test machines; each test machine writes a test message queue in a scripting language, calls an environment-variable configuration file to configure the environment variables, constructs a test item based on a plurality of executable test modules, and tests and analyzes the test item; and the control center server compiles the test analysis results into a test report. By automatically configuring and calling the test tool and constructing test items from executable test modules bound to test cases, the invention realizes automated testing in a distributed environment.
Description
Technical Field
The invention relates to the field of automatic testing, in particular to a method and a system for automatically configuring and calling a testing tool to conduct testing analysis.
Background
Software is applied to ever more application systems, and as system software grows more powerful its functions also grow more complex, so code quality must be tested to ensure that the software functions correctly.
Software testing arose together with software itself. In early software development, programs were small and simple, the development process was unstructured with no concept of testing, and developers treated testing and debugging as the same activity, whose purpose was to correct known faults in the software; developers could usually complete this work themselves. Investment in testing was minimal and testing intervened late, often only after the code had been written and the product was substantially complete. As software grew larger and more complex, software quality became increasingly important; the basic theory and practical techniques of software testing began to take shape, and various processes and management methods for software development were designed. Software development moved toward structured and object-oriented approaches, characterized by structured analysis and design, structured reviews, structured programming, and structured testing, and the concept of quality was integrated into development. The definition of software testing changed: testing was no longer simply a process of finding errors but became the main function of software quality assurance (SQA), encompassing the evaluation of software quality. As Bill Hetzel writes in his book on software testing: "A test is any activity aimed at evaluating an attribute of a program or system. Testing is a measure of software quality." Software developers and testers began to sit down together to discuss software engineering and testing problems.
Common test methods include static testing and dynamic testing; by test level there are unit tests, integration tests, system tests, and acceptance tests. Whatever the method, test tools are now an indispensable part of testing.
Test tools differ in emphasis by type and function: functional test tools, performance test tools, unit test tools, security test tools, and so on, but they can only be used under license control. For projects that require static analysis, if the project under test and the test tool are not on the same computer, the static analysis cannot be performed, so the test tool cannot be used effectively and software quality is not much improved. Therefore, in a distributed system, we need to distribute projects to the test computers and invoke the tools automatically to achieve test automation.
Disclosure of Invention
Given that no such analysis tool currently exists on the market, the invention aims to provide a method and a system that automatically distribute projects, automatically configure project information, and automatically call a test tool to perform test analysis.
Specifically, the invention provides a method for automatically configuring and calling a test tool to perform test analysis, applied to an automatic test system comprising a control center server and test machines connected to it through a communication network, characterized by comprising the following steps:
s1) a control center server configures tested item information:
different project information parameters are configured according to different test projects, wherein the project information parameters comprise project types, development environments, development languages, rules to be used, environment variables to be called, test report formats and header file configuration paths;
s2) the control center server activates the test tool:
creating a dynamic link library (DLL) file when developing the test tool; when loading the test tool, writing the path of the test tool's DLL file into the registration database of the control center server; and the control center server providing the test tool to the test machines via the network;
s3) configuring a test environment by the tester:
starting the test machine and downloading the test tool from the control center server through the communication network; installing the test tool and configuring the test environment variables, including the header file directory, library file directory, and bin directory; the header file directory and library file directory are set by executing batch files;
s4) the control center server distributes items:
adding the IP addresses and test tools of all test machines in the control center server, and sending the project information parameters from the control center server to all test machines;
s5) starting an automatic test procedure:
after receiving the project information parameters sent by the control center server, the test machine combines the project information parameters and writes a test message queue in a scripting language; calls the environment-variable configuration file to configure the environment variables; sets up a plurality of executable test modules for controlling the units under test; and associates the test message queue with the executable test modules;
constructing a test item based on the executable test modules and compiling it; if compilation succeeds, the test item is tested and analyzed directly, and if compilation fails, error information is returned;
s6) the control center server collects the test analysis results from the test machines and displays them on a page; testers or developers review and process the problems; after confirmation is complete, the problem-processing results are stored in a database, and a test report is compiled and generated from the test state information stored by the analysis monitor together with the problem-processing results.
The invention also provides a system for automatically configuring and calling a test tool to perform test analysis, the system comprising a control center server and test machines connected to it through a communication network, characterized in that:
the control center server is configured to:
creating a dynamic link library (DLL) file when developing the test tool;
when loading the test tool, writing the path of the test tool's DLL file into the registration database of the control center server;
providing the test tool to the test machines via the network;
adding the IP addresses and test tools of all test machines in the control center server, and sending the project information parameters to all test machines;
collecting the test analysis results from the test machines and displaying them on a page for testers or developers to review and process the problems; storing the problem-processing results in a database after confirmation; and compiling a test report from the test state information stored by the analysis monitor together with the problem-processing results;
the test machine is configured to:
download the test tool from the control center server via the communication network;
install the test tool and configure the test environment variables, including the header file directory, library file directory, and bin directory, where the header file directory and library file directory are set by executing batch files;
after receiving the project information parameters sent by the control center server, combine the project information parameters and write a test message queue in a scripting language; call the environment-variable configuration file to configure the environment variables; set up a plurality of executable test modules for controlling the units under test; and associate the test message queue with the executable test modules;
and construct a test item based on the executable test modules and compile it; if compilation succeeds, the test item is tested and analyzed directly, and if compilation fails, error information is returned.
Drawings
FIG. 1 is a flow chart of a method of automatically configuring and invoking a test tool for test analysis in accordance with the present invention.
FIG. 2 is a schematic diagram of an automatic test flow of the tester.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to specific embodiments of the present invention and corresponding drawings. It will be apparent that the described embodiments are only some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Fig. 1 shows a flow chart of the method of the invention, which is applied to an automatic test system comprising a control center server and test machines connected to it through a communication network; the method comprises the following steps:
s1) a control center server configures tested item information:
different project information parameters are configured according to different test projects, including project types, development environments, development languages, rules used, environment variables required to be called, test report formats and header file configuration paths.
The information required by the project under test is configured on the control center server according to the test project, and generally comprises the project type, development environment, development language, rules to be used, environment variables to be called, report format, and so on. For common projects, once a project is loaded its type can be analyzed automatically and matched to a development environment, development language, default rules, and the like; but for projects of complex types the default match may be incorrect and must be adjusted.
Take as an example testing a DSP6713 chip in the embedded development environment CCS3.3 under the Windows operating system. To configure the CCS3.3 environment variables, there is a dosrun.bat batch file under the CCS3.3 installation directory; executing this file sets the header file and library file directories for the supported chip models, including the DSP6713. Before the test tool analyzes the project, the project under test must be compiled so that it can be built correctly in the CCS3.3 development environment. The project information therefore includes: project type, CCS3.3 .pjt project; development environment, CCS3.3 with DSP6713; development language, C; rule set, MISRA C 2004; environment variable script to call, dosrun.bat; report format, html; header file configuration paths; and so on.
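The project-information parameters from this example can be sketched as a simple configuration record. The patent does not specify a schema, so every field name below is illustrative, as is the validation helper:

```python
# Hypothetical sketch of the project-information parameters for the
# CCS3.3/DSP6713 example. Field names are illustrative only.
project_info = {
    "project_type": "CCS3.3 pjt",
    "dev_environment": "CCS3.3 DSP6713",
    "dev_language": "C",
    "rules": ["MISRA C 2004"],
    "env_var_script": "dosrun.bat",  # batch file that sets header/library paths
    "report_format": "html",
    "header_paths": [],              # filled in per project
}

def validate_project_info(info):
    """Return the names of required parameters that are missing or empty."""
    required = ("project_type", "dev_environment", "dev_language",
                "rules", "env_var_script", "report_format")
    return [k for k in required if not info.get(k)]

assert validate_project_info(project_info) == []
```

A control center would run such a check before distributing the parameters, so that a test machine never receives an incomplete configuration.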
S2) the control center server activates the test tool:
creating a dynamic link library (DLL) file when developing the test tool; when loading the test tool, writing the path of the test tool's DLL file into the registration database of the control center server; the control center server provides the test tool to the test machines via the network.
The DLL file contains a description of the test process and the layout of the test tool. When the test tool is loaded, the path of the DLL file must be written into the registration database of the control center server. After loading, if the test tool is renamed or upgraded, the control center server provides a management interface that can check the tool's update status and offer operating guidance.
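The registration step above can be illustrated with a minimal stand-in for the registration database. A real control center would use an actual database and the Windows registry APIs; the dictionary and function names here are purely hypothetical:

```python
# Illustrative stand-in for the control center's registration database.
registry = {}

def register_tool(name, dll_path):
    """Record the DLL path of a newly loaded test tool."""
    registry[name] = {"dll": dll_path, "version": 1}

def upgrade_tool(name, dll_path):
    """On rename or upgrade, the management interface updates the stored path."""
    entry = registry[name]
    entry["dll"] = dll_path
    entry["version"] += 1

register_tool("static_analyzer", r"C:\tools\analyzer\analyzer.dll")
upgrade_tool("static_analyzer", r"C:\tools\analyzer\analyzer_v2.dll")
```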
S3) configuring a test environment by the tester:
starting the test machine and downloading the test tool from the control center server through the communication network; installing the test tool and configuring the test environment variables, including the header file directory, library file directory, and bin directory; the header file directory and library file directory are set by executing batch files.
When the test tool analyzes a project, the project under test must be compiled so that it can be tested and analyzed correctly; different projects require different build environments, so the environment variables must be configured correctly before analysis. The header file directory, library file directory, and bin directory must also be configured; some special development environments are configured according to their operating manuals, and embedded development environments additionally require information such as the chip model. To ensure that the various environment variables on a computer do not conflict with one another, the parameters of each development environment are packaged into a file, which is then given executable rights.
Under the embedded development environment of the Windows operating system, the test machine configures the CCS3.3 environment variables by executing the dosrun.bat batch file, which sets the header file and library file directories matching the chip model. Under the embedded development environment of the Linux operating system, the test machine configures the Qt Creator environment variables by writing the bin directory and library file directory of Qt Creator into the batch file qt.sh. When a project is tested, the environment variables are set by calling the file; after the test completes, the variables disappear and therefore cannot affect the next test.
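The key property described here — variables set "in the form of calling files" disappear after the test — follows from running the environment script in a child process. A minimal sketch (the script body and variable name are illustrative stand-ins for dosrun.bat or qt.sh):

```python
import os
import subprocess
import tempfile

def run_with_env_file(env_script_body, command):
    """Source an environment-setup file and run `command` in a child shell.

    The variables exist only in the child process, so they vanish when the
    test finishes and cannot affect the next test or the parent environment.
    """
    with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
        f.write(env_script_body)
        path = f.name
    try:
        out = subprocess.run(
            ["sh", "-c", ". {} && {}".format(path, command)],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip()
    finally:
        os.unlink(path)

# Illustrative stand-in for dosrun.bat / qt.sh: exports a header directory.
result = run_with_env_file("export HDR_DIR=/opt/dsp/include\n",
                           "echo $HDR_DIR")
assert result == "/opt/dsp/include"
assert "HDR_DIR" not in os.environ  # the variable did not leak out
```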
S4) the control center server distributes items:
and adding the IP addresses and the test tools of all the testers in the control center server, and sending the project information parameters to all the testers by the control center server.
S5) starting an automatic test process, wherein a test flow chart is shown in FIG. 2:
after receiving the project information parameters sent by the control center server, the test machine combines the project information parameters and writes a test message queue in a scripting language; calls the environment-variable configuration file to configure the environment variables; sets up a plurality of executable test modules for controlling the units under test; and associates the test message queue with the executable test modules;
a test item is then constructed from the executable test modules and compiled; if compilation succeeds, the test item is tested and analyzed directly, and if compilation fails, error information is returned.
Each executable test module is bound to a test case, and the test message queue contains commands that reference the executable test modules.
During testing and analysis, the commands in the test item's test message queues are read, the corresponding executable test modules are executed, and the scripting language in the test message queues is parsed and verified.
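The queue-driven dispatch described above can be sketched as follows. The module names and the dispatch table are illustrative; the patent does not name concrete modules:

```python
import queue

# Hypothetical executable test modules, each bound to one test case.
def module_compile(ctx):
    ctx.append("compiled")
    return True

def module_analyze(ctx):
    ctx.append("analyzed")
    return True

MODULES = {"compile": module_compile, "analyze": module_analyze}

def run_queue(commands):
    """Read commands from a test message queue and execute the bound modules."""
    ctx = []
    q = queue.Queue()
    for c in commands:
        q.put(c)
    while not q.empty():
        cmd = q.get()
        ok = MODULES[cmd](ctx)   # execute the referenced executable test module
        if not ok:
            return ctx, "error"  # a failure returns error information
    return ctx, "ok"

state, status = run_queue(["compile", "analyze"])
```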
Each test machine also includes an analysis monitor:
The analysis monitor monitors and stores test state information and periodically retrieves information from the test machine to determine whether it has failed; it identifies the capabilities of the test machine, determines whether the test machine provides the test resources required by the test project, and supplies the control center server with constraints on the tests to be executed. The analysis monitor also records the run-time information of the test scripts and generates a test execution log file.
A test item includes a plurality of versions, each version being one or more scenarios associated with test code that exercises a different aspect of the item; each version contains two file libraries, one for tracking compiled code and one for tracking the uncompiled files required by the tests.
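This version layout can be sketched as a small data structure. The class and field names, and the sample files, are illustrative assumptions:

```python
# Hypothetical sketch of a test item's version layout: each version carries
# scenarios plus two file libraries (compiled code, uncompiled test files).
class Version:
    def __init__(self, scenarios):
        self.scenarios = scenarios  # test code for different aspects of the item
        self.compiled_lib = {}      # tracks compiled code
        self.uncompiled_lib = {}    # tracks uncompiled files needed by tests

class TestItem:
    def __init__(self):
        self.versions = []

item = TestItem()
v1 = Version(["boundary-check"])
v1.compiled_lib["main.obj"] = b"\x7fELF"     # illustrative compiled artifact
v1.uncompiled_lib["cases.json"] = "{}"       # illustrative support file
item.versions.append(v1)
```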
S6) The control center server collects the test analysis results from the test machines and displays them on a page; testers or developers review and process the problems; after confirmation is complete, the problem-processing results are stored in a database, and a test report is compiled and generated from the test state information stored by the analysis monitor together with the problem-processing results.
The test result may comprise one or more result sets, which are presented to the tester or developer in XML format for review and processing. The tester or developer can inspect each problem, confirm it or mark it as not an issue, and edit or delete it; after confirmation is complete, the result sets are stored back into the database, and a test report can be generated at the same time.
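The review of an XML result set can be sketched as follows. The patent says only that result sets are in XML format; the element layout, attribute names, and sample findings below are entirely hypothetical:

```python
import xml.etree.ElementTree as ET

# Illustrative XML result set; the real schema is not specified by the patent.
RESULT_XML = """
<results project="demo">
  <issue id="1" rule="MISRA C 2004 10.1" file="main.c" line="42"/>
  <issue id="2" rule="MISRA C 2004 14.4" file="io.c" line="7"/>
</results>
"""

def triage(xml_text, false_positive_ids):
    """Keep only the findings the reviewer confirmed as real problems."""
    root = ET.fromstring(xml_text)
    confirmed = []
    for issue in root.findall("issue"):
        if issue.get("id") not in false_positive_ids:
            confirmed.append((issue.get("file"), int(issue.get("line"))))
    return confirmed

# Reviewer marks finding "2" as not an issue; only finding "1" is stored.
assert triage(RESULT_XML, {"2"}) == [("main.c", 42)]
```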
The invention also provides a system for automatically configuring and calling the testing tool to carry out testing analysis, which comprises a control center server and a testing machine connected with the control center server through a communication network;
the control center server is configured to:
creating a dynamic link library (DLL) file when developing the test tool;
when loading the test tool, writing the path of the test tool's DLL file into the registration database of the control center server;
providing the test tool to the test machines via the network;
adding the IP addresses and test tools of all test machines in the control center server, and sending the project information parameters to all test machines;
collecting the test analysis results from the test machines and displaying them on a page for testers or developers to review and process the problems; storing the problem-processing results in a database after confirmation; and compiling a test report from the test state information stored by the analysis monitor together with the problem-processing results;
the test machine is configured to:
download the test tool from the control center server via the communication network;
install the test tool and configure the test environment variables, including the header file directory, library file directory, and bin directory, where the header file directory and library file directory are set by executing batch files;
after receiving the project information parameters sent by the control center server, combine the project information parameters and write a test message queue in a scripting language; call the environment-variable configuration file to configure the environment variables; set up a plurality of executable test modules for controlling the units under test; and associate the test message queue with the executable test modules;
and construct a test item based on the executable test modules and compile it; if compilation succeeds, the test item is tested and analyzed directly, and if compilation fails, error information is returned.
It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other.
In the several embodiments provided in the present invention, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the elements is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units implemented in the form of software functional units described above may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a physical server, or a network cloud server, etc., running a Windows or Windows Server operating system) to execute part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
The above description is only of the preferred embodiments of the present invention, and is not intended to limit the present invention in any way, but any simple modification, equivalent variation and modification made to the above embodiments according to the technical substance of the present invention still fall within the scope of the technical solution of the present invention.
Claims (2)
1. A method for automatically configuring and calling a test tool to perform test analysis, applied to an automatic test system comprising a control center server and test machines connected to it through a communication network, characterized by comprising the following steps:
s1) a control center server configures tested item information:
different project information parameters are configured according to different test projects, wherein the project information parameters comprise project types, development environments, development languages, rules to be used, environment variables to be called, test report formats and header file configuration paths;
s2) the control center server activates the test tool:
creating a dynamic link library (DLL) file when developing the test tool; when loading the test tool, writing the path of the test tool's DLL file into the registration database of the control center server; and the control center server providing the test tool to the test machines via the network;
s3) configuring a test environment by the tester:
starting the test machine and downloading the test tool from the control center server through the communication network; installing the test tool and configuring the test environment variables, including the header file directory, library file directory, and bin directory; the header file directory and library file directory are set by executing batch files;
wherein, in the step S3:
under the embedded development environment of the Windows operating system, the test machine configures the CCS3.3 environment variables and executes the dosrun.bat batch file to set the header file and library file directories matching the chip model;
under the embedded development environment of the Linux operating system, the test machine configures the Qt Creator environment variables and writes the bin directory and library file directory of Qt Creator into the batch file qt.sh;
s4) the control center server distributes items:
adding the IP addresses and test tools of all test machines in the control center server, and sending the project information parameters from the control center server to all test machines;
s5) starting an automatic test procedure:
after receiving the project information parameters sent by the control center server, the test machine combines the project information parameters and writes a test message queue in a scripting language; calls the environment-variable configuration file to configure the environment variables; sets up a plurality of executable test modules for controlling the units under test; and associates the test message queue with the executable test modules;
constructing a test item based on the executable test modules and compiling it; if compilation succeeds, the test item is tested and analyzed directly, and if compilation fails, error information is returned;
wherein in the step S5, the test message queue is associated with the executable test module, and includes:
each executable test module is bound to a test case, and the test message queue contains commands that reference the executable test modules;
in the step S5, the testing and analyzing the test item includes:
reading the commands in the test message queues of the test item, executing the corresponding executable test modules, and parsing and verifying the scripting language in the test message queues;
wherein, in the step S5, each test machine further includes an analysis monitor:
the analysis monitor monitors and stores test state information and periodically retrieves information from the test machine to determine whether it has failed; it identifies the capabilities of the test machine, determines whether the test machine provides the test resources required by the test project, and supplies the control center server with constraints on the tests to be executed;
the analysis monitor also records the run-time information of the test scripts and generates a test execution log file;
wherein in step S5, the test item includes a plurality of versions, each version being one or more scenarios associated with test code for a different aspect of the test item, and each version contains two file libraries, one for tracking compiled code and one for tracking the uncompiled files required by the tests;
S6) the control center server collects the test analysis results from the testing machines and displays them on a page; testing staff or developers check and process any problems; after confirmation, the problem processing results are stored in a database, and a test report is compiled from the test state information stored by the analysis monitor together with the problem processing results.
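Step S6's report generation can be sketched as merging the monitor-saved state with the confirmed problem-processing results; the record structure and status values are assumptions for illustration:

```python
def compile_test_report(monitor_states, problem_results):
    """Merge the state information saved by each analysis monitor with
    the confirmed problem-handling results into a single report."""
    report = {"machines": monitor_states, "problems": [], "summary": {}}
    for p in problem_results:
        # Only confirmed problem-processing results enter the report.
        if p.get("confirmed"):
            report["problems"].append(p)
    report["summary"]["open_problems"] = sum(
        1 for p in report["problems"] if p.get("status") != "resolved")
    return report
```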
2. The method for automatically configuring and calling a test tool to perform test analysis according to claim 1, wherein, in step S1:
the development language is the C language, the rule set used is MISRA C2004, and the report format is HTML.
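The configuration named in claim 2 reduces to three tool parameters; the key names below are assumptions, only the values come from the claim:

```python
def tool_config_for_claim2():
    """Parameter set corresponding to claim 2: C as the development
    language, MISRA C2004 as the checking rule set, HTML reports."""
    return {
        "development_language": "C",
        "rule_set": "MISRA C2004",
        "report_format": "html",
    }
```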
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110787250.4A CN113742215B (en) | 2021-07-13 | 2021-07-13 | Method and system for automatically configuring and calling test tool to perform test analysis |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113742215A CN113742215A (en) | 2021-12-03 |
CN113742215B true CN113742215B (en) | 2024-04-09 |
Family
ID=78728593
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110787250.4A Active CN113742215B (en) | 2021-07-13 | 2021-07-13 | Method and system for automatically configuring and calling test tool to perform test analysis |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113742215B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115190141B (en) * | 2022-06-15 | 2024-04-16 | 国网浙江省电力有限公司宁波供电公司 | Industrial control safety protection method based on electric power target range environment |
CN115629295B (en) * | 2022-11-30 | 2023-04-14 | 苏州萨沙迈半导体有限公司 | Chip automation test system, method and storage medium |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110046099A (en) * | 2019-04-11 | 2019-07-23 | 艾伯资讯(深圳)有限公司 | Intelligent software test macro and method |
CN111427762A (en) * | 2019-12-09 | 2020-07-17 | 北京关键科技股份有限公司 | Automatic calling tool analysis techniques |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8533676B2 (en) * | 2011-12-29 | 2013-09-10 | Unisys Corporation | Single development test environment |
Also Published As
Publication number | Publication date |
---|---|
CN113742215A (en) | 2021-12-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20180357146A1 (en) | Completing functional testing | |
US7299451B2 (en) | Remotely driven system for multi-product and multi-platform testing | |
US7503037B2 (en) | System and method for identifying bugs in software source code, using information from code coverage tools and source control tools to determine bugs introduced within a time or edit interval | |
KR101132560B1 (en) | System and method for automatic interface testing based on simulation for robot software components | |
US6473707B1 (en) | Test executive system and method including automatic result collection | |
US6401220B1 (en) | Test executive system and method including step types for improved configurability | |
US6397378B1 (en) | Test executive system and method including distributed type storage and conflict resolution | |
US7681180B2 (en) | Parameterized test driven development | |
US9465718B2 (en) | Filter generation for load testing managed environments | |
US8171460B2 (en) | System and method for user interface automation | |
Laranjeiro et al. | A black box tool for robustness testing of REST services | |
US20110202901A1 (en) | Automated software testing and validation system | |
US20030046029A1 (en) | Method for merging white box and black box testing | |
US20080320071A1 (en) | Method, apparatus and program product for creating a test framework for testing operating system components in a cluster system | |
CN113742215B (en) | Method and system for automatically configuring and calling test tool to perform test analysis | |
US20110126179A1 (en) | Method and System for Dynamic Patching Software Using Source Code | |
US8661414B2 (en) | Method and system for testing an order management system | |
CN112241360A (en) | Test case generation method, device, equipment and storage medium | |
Adamsen et al. | Practical initialization race detection for JavaScript web applications | |
US10229029B2 (en) | Embedded instruction sets for use in testing and error simulation of computing programs | |
US11977478B2 (en) | Compositional verification of embedded software systems | |
US9632912B1 (en) | Method and system for debugging a program | |
JP2002014847A (en) | Device for checking program and method for the same and recording medium with checking program stored | |
CN115374018B (en) | Automatic interface testing method and device | |
CN116932414B (en) | Method and equipment for generating interface test case and computer readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||