CN110232024B - Software automation test framework and test method - Google Patents
- Publication number
- CN110232024B (application CN201910442962.5A)
- Authority
- CN
- China
- Prior art keywords
- test
- result
- software
- module
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
- G—PHYSICS › G06—COMPUTING; CALCULATING OR COUNTING › G06F—ELECTRIC DIGITAL DATA PROCESSING › G06F11/00—Error detection; Error correction; Monitoring › G06F11/36—Preventing errors by testing or debugging software
- G06F11/3664—Environments for testing or debugging software
- G06F11/3668—Software testing › G06F11/3672—Test management
- G06F11/3684—Test management for test design, e.g. generating new test cases
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Quality & Reliability (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Debugging And Monitoring (AREA)
Abstract
The invention discloses a software automation test framework and a test method. The test framework comprises a test code generation module, an executor scheduling module, an interface parameter input processing module, a database monitoring module, an assertion scheduling module, and a test result processing and display platform. The automated software test comprises the following steps: step 1, automatically generating software test code or test cases; step 2, performing executor scheduling and parameter input processing according to the test type; step 3, setting up monitoring for the software testing process; step 4, setting assertions on the test results; step 5, counting, analyzing and displaying the test results. The framework of the invention can test software conveniently, has wide coverage, requires no manual coding, and saves software testing cost.
Description
Technical Field
The invention belongs to the technical field of data testing, and particularly relates to a software automation testing framework and a testing method.
Background
In the prior art, the automated testing of web software consumes a large amount of manpower and time because test cases must be written manually; manually written test cases have a high error rate and poor scenario coverage, which limits the software test and makes the test results insufficiently accurate. After a test finishes, the test database cannot be restored, so repeated regression testing is difficult. Software testing also places high demands on testers, and the high degree of manual involvement in the test process further reduces the accuracy of the test results.
Disclosure of Invention
The invention aims to provide a software automation testing framework that automatically generates test code and test cases during software testing, reducing the workload and error rate of technicians and improving test efficiency. It avoids the poor and incomplete coverage caused by manual case writing, monitors data changes before and after testing with a data monitoring technique to solve the problem of test data rollback, and uses a test report display platform to analyze and process test data and test results in a unified way, improving the efficiency and effect of data analysis and processing.
Another aim of the invention is to provide a software automation testing method that makes software testing simpler, more convenient and more controllable, allows test results to be checked and analyzed visually, and facilitates locating and fixing software errors.
The technical scheme adopted by the invention is a software automation test framework comprising a test code generation module, an executor scheduling module, an interface parameter input processing module, a database monitoring module, an assertion scheduling module, and a test result processing and display platform;
the test code generation module is used for finding the fields and attribute values required by the interface entry parameters through Java reflection and, according to these fields and attribute values, automatically generating test case code or a test case xls file suitable for testing from the original code of the item under test, wherein the xls file contains the parameters required by the test method and the expected results;
the executor scheduling module is used for scheduling a test engine according to the test type, with TestNG or JUnit selected as the underlying unit test framework of the test engine;
the interface parameter input processing module is used for permuting and combining the entry-parameter data according to the normal and abnormal values of the entry-parameter fields, obtaining entry-parameter data suitable for different test types;
the database monitoring module is used for monitoring the test system data during the software test and rolling back the data, according to the user's choice, after test execution finishes;
the assertion scheduling module is used for recording the return result of a test method or interface and verifying the return value of a repeated test against the recorded correct test result;
the test result processing and display platform is used for counting, screening, processing and analyzing all data in the software testing process, comparing the test result with historical test results, analyzing the causes of test errors, constructing reports and graphs from the test results, and displaying them in multiple dimensions.
The software automation testing method specifically comprises the following steps:
step 1, the test code generation module uses Maven to create a project and add dependencies and plug-ins, and a code or xls type is selected; in code mode the test code of the project under test is generated automatically, while if an xls file is selected a test case template is generated automatically and the test cases are designed in the template;
step 2, after the test code or test cases are designed, the executor scheduling module schedules a test engine according to the test case type, with TestNG or JUnit selected as the engine's underlying unit test framework; the interface parameter input processing module permutes and combines the entry-parameter data according to the normal and abnormal values of the entry-parameter fields to obtain entry-parameter data with different results, which is stored in a database;
step 3, whether to perform data monitoring is selected; if so, method annotations are added in the software testing framework, the entry-parameter data is fed into the test database for testing, the current database state is monitored before each test execution, and after execution finishes the user chooses whether to roll the data back;
step 4, during the software test the assertion scheduling module records the return result of the test method or interface, takes the recorded result as a benchmark test once it is confirmed accurate, verifies the return values of repeatedly executed tests, and judges whether the test result meets the expected requirement;
and step 5, the test result processing and display platform counts, screens, processes and analyzes all test results, and the processed results are made into reports and graphs for multi-dimensional display.
Furthermore, the test case types in step 2 include unit test, integration test and interface test, and scheduling the test engine according to the test case type means feeding test code of different test types into the corresponding test databases.
Further, the interface parameter input processing module in step 2 uses the equivalence class method and the boundary value method to permute and combine the entry-parameter fields, obtaining entry-parameter data applicable to the corresponding test type.
The invention has the following beneficial effects: 1. by automatically generating test code, the invention reduces the workload of technicians and the code error rate, and improves test efficiency; 2. the invention automatically generates test cases, avoiding the poor and incomplete coverage caused by manually written cases; 3. through data monitoring, the invention tracks data changes before and after the test and supports rolling test data back, preventing dirty data from affecting later runs; 4. the invention uses the test report display platform to analyze and display test data and test results in a unified way, with good display effect.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a flow chart of software automation testing.
FIG. 2 is an interface diagram of a software automation test framework.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments derived by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
The software automation testing framework comprises a test code generation module, an executor scheduling module, an interface parameter input processing module, a database monitoring module, an assertion scheduling module, and a test result processing and display platform.
The test code generation module finds the fields and attribute values required by the interface entry parameters through Java reflection and automatically generates test case code or an xls file of test cases, where the xls file contains the parameters required by the test method and the expected results. A tester can either adopt the automatically generated test case code to run parameter combination tests, or design different parameter values and expected results in the xls file; the test framework intelligently recognizes the test cases in the xls file, passes the different parameters to the method under test, and compares the actual results with the expected results of the test cases. The test code generation module greatly improves software testing efficiency, covers the different branch combinations of the software more effectively, and maximizes test coverage. At the same time, testing with xls files lowers the coding skill required of testers: a tester only needs to design and maintain the test cases in the xls file, permuting and combining the entry-parameter fields and expected results with methods such as equivalence classes and boundary values to obtain the entry-parameter data, while the test framework completes the recognition and testing work automatically.
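As a rough illustration of how such a module might discover entry parameters, the Java sketch below reflects over a request class. The patent does not disclose its actual code; `LoginRequest` and its fields are invented for this example.

```java
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.List;

public class ParamDiscovery {
    // Hypothetical entry-parameter object of an interface under test;
    // invented for this sketch, not taken from the patent.
    public static class LoginRequest {
        String username;
        String password;
        int retryCount;
    }

    // List the field names and types a test case must supply, found
    // via Java reflection as the test code generation module does.
    public static List<String> entryParamFields(Class<?> paramType) {
        List<String> out = new ArrayList<>();
        for (Field f : paramType.getDeclaredFields()) {
            out.add(f.getName() + ":" + f.getType().getSimpleName());
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(entryParamFields(LoginRequest.class));
    }
}
```

From descriptors like these, a generator could emit either test-method stubs or the column headers of an xls case template.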
After the test code is written or the xls file of test cases is designed, the executor scheduling module schedules a test engine according to the test type, that is, the test code is fed into different test databases according to the test type. TestNG or JUnit is selected as the engine's underlying framework, which is optimized and upgraded; multiple data sources are selected and scheduled during testing, test data is stored in MySQL or Redis, and the test project is built with Maven. The test types include unit test, integration test and interface test.
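The scheduling rule — route each test type to an engine and its own test database — could be sketched as a simple dispatch; the engine assignments and database names below are illustrative placeholders, not the patent's actual configuration:

```java
public class ExecutorScheduler {
    // Map each test type to an engine and a target test database,
    // mirroring the patent's scheduling rule (names are invented).
    public static String schedule(String testType) {
        switch (testType) {
            case "unit":        return "TestNG -> unit_test_db";
            case "integration": return "TestNG -> integration_test_db";
            case "interface":   return "JUnit -> interface_test_db";
            default:
                throw new IllegalArgumentException("unknown test type: " + testType);
        }
    }

    public static void main(String[] args) {
        System.out.println(schedule("unit"));
    }
}
```

Keeping the routing in one place makes it easy to add data sources (e.g. a Redis-backed store) without touching the generated test code.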
The interface parameter input processing module uses the equivalence class method and the boundary value method to permute and combine the entry-parameter fields according to the normal and abnormal values of those fields in the entry-parameter data, obtaining entry-parameter data with different results.
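A minimal sketch of this permutation-and-combination step, assuming each field's equivalence-class representatives and boundary values have already been enumerated. The plain Cartesian product shown here is one possible strategy; the patent does not fix the exact algorithm:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ParamCombiner {
    // Combine each field's candidate values into full parameter rows.
    public static List<Map<String, Object>> combine(Map<String, List<Object>> valuesPerField) {
        List<Map<String, Object>> rows = new ArrayList<>();
        rows.add(new LinkedHashMap<>());
        for (Map.Entry<String, List<Object>> e : valuesPerField.entrySet()) {
            List<Map<String, Object>> next = new ArrayList<>();
            for (Map<String, Object> row : rows) {
                for (Object v : e.getValue()) {
                    Map<String, Object> copy = new LinkedHashMap<>(row);
                    copy.put(e.getKey(), v);
                    next.add(copy);
                }
            }
            rows = next; // grow the product one field at a time
        }
        return rows;
    }

    public static void main(String[] args) {
        Map<String, List<Object>> fields = new LinkedHashMap<>();
        // Boundary values around a valid range plus one abnormal value.
        fields.put("age", Arrays.asList(0, 120, -1));
        fields.put("name", Arrays.asList("alice", ""));
        System.out.println(combine(fields).size()); // 3 * 2 = 6 rows
    }
}
```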
The database monitoring module lets the user choose whether to perform data rollback for a test. If rollback is selected, the current data of the test system is monitored before the software test starts, and after test execution completes the test system data is restored to its pre-test state according to the monitoring result, ensuring clean test environment data and preventing dirty data from affecting the test results. If rollback is not selected, the module only monitors the test process and stores the monitored data. Whether the database is monitored during a test can be controlled by a method annotation.
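The snapshot-and-restore behavior described above can be sketched in miniature with an in-memory map standing in for the monitored database; the real module would snapshot the MySQL or Redis tables selected by the method annotation:

```java
import java.util.HashMap;
import java.util.Map;

public class DbMonitor {
    // A Map stands in for the test database in this sketch.
    private final Map<String, String> db = new HashMap<>();
    private Map<String, String> snapshot;

    public void put(String k, String v) { db.put(k, v); }
    public String get(String k) { return db.get(k); }

    // Record the current database state before test execution.
    public void beforeTest() { snapshot = new HashMap<>(db); }

    // Roll the database back to the recorded state if the user opts in.
    public void afterTest(boolean rollback) {
        if (rollback && snapshot != null) {
            db.clear();
            db.putAll(snapshot);
        }
    }

    public static void main(String[] args) {
        DbMonitor m = new DbMonitor();
        m.put("user:1", "alice");
        m.beforeTest();
        m.put("user:2", "bob");          // dirty data written by the test
        m.afterTest(true);               // user chose rollback
        System.out.println(m.get("user:2")); // dirty data is gone
    }
}
```

Because the snapshot is taken before *each* execution, repeated regression runs always start from the same clean state, which is exactly the property the background section says the prior art lacks.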
The assertion scheduling module verifies whether a test result matches the expected result; without assertions, the accuracy of a test result is hard to guarantee. Existing test frameworks only compare the expected result with the actual result. On top of this comparison, the test framework of the invention records the return result of the test method or interface during the test, including each parameter value and type of the method's response. If the recorded test result is confirmed correct, it is defined as a benchmark test, and subsequent tests check the accuracy and consistency of repeatedly executed test return values against this benchmark, realizing automatic assertion of whether a software test passes.
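This record-then-verify scheme is essentially a golden-master check, and might look like the following in outline. The test ids and stored values are illustrative; the patent records each parameter value and type of the response:

```java
import java.util.HashMap;
import java.util.Map;

public class AssertionRecorder {
    // Benchmark results keyed by test id.
    private final Map<String, Object> baseline = new HashMap<>();

    // First (verified-correct) run: record the result as the benchmark.
    public void record(String testId, Object result) {
        baseline.put(testId, result);
    }

    // Later runs: auto-assert the replayed result against the benchmark.
    public boolean verify(String testId, Object result) {
        return baseline.containsKey(testId) && baseline.get(testId).equals(result);
    }

    public static void main(String[] args) {
        AssertionRecorder r = new AssertionRecorder();
        r.record("login-ok", "HTTP 200");
        System.out.println(r.verify("login-ok", "HTTP 200")); // consistent: pass
        System.out.println(r.verify("login-ok", "HTTP 500")); // regression: fail
    }
}
```

The tester's only manual duty is confirming the first recording is correct; every replay after that asserts itself.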
The test result processing and display platform counts, screens, processes and analyzes all data in the software testing process. The test result is organized into seven major sections: overview, categories, test suites, graphs, timeline, behaviors and packages. It supports customizing a variety of information, including attachments, defect links, case links, test steps, function modules, user stories, case names and case descriptions. The platform compares the test result with historical test results, automatically analyzes why a test case failed, and constructs reports and graphs from the output of the test result processing module, displaying the software test results visually.
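A toy version of the platform's statistics and historical comparison, reduced to bare pass counts; the real platform tracks far richer data, and the "regressed"/"stable" labels are invented for this sketch:

```java
import java.util.Arrays;
import java.util.List;

public class ResultStats {
    // Summarise a run as "passed/total" plus a trend flag computed
    // against the previous run's pass count (historical comparison).
    public static String summarize(List<Boolean> results, int previousPassed) {
        long passed = results.stream().filter(b -> b).count();
        String trend = passed < previousPassed ? "regressed" : "stable";
        return passed + "/" + results.size() + " " + trend;
    }

    public static void main(String[] args) {
        // Two of three cases passed; the previous run passed all three.
        System.out.println(summarize(Arrays.asList(true, true, false), 3));
    }
}
```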
The interface of the software testing framework is shown in FIG. 2. The advantages of the invention are that it automatically generates test code, manages the test case model, visually edits the test flow during software testing, checks the return result of a test method or interface in fine detail, and supports intelligent recognition, automatic driving and continuous integration during the software test. The framework uses one-stop editing to automatically generate test code or cases, adopts TestNG or JUnit as the underlying framework of the test engine, completes data scheduling with MySQL or Redis, and sets data monitoring and assertions on the software testing process: data monitoring can roll back the test data in the database, and assertions verify the accuracy and consistency of repeated test return values against the recorded test result. The test results are then fed into the test result processing and display platform for counting, screening, processing and analysis, and are made into reports and graphs for display.
The software unit test flow is shown in FIG. 1, and the example steps are as follows:
step 1, the test code generation module uses Maven to create a project and add dependencies and plug-ins, and a code or xls type is selected; in code mode the test code of the project under test is generated automatically, while if an xls file is selected a test case template is generated automatically and the test cases are designed in the template;
step 2, once the test case design is completed, the test framework intelligently parses the test case parameters and expected results; the executor scheduling module schedules the test engine according to the test case type, and the interface parameter input processing module processes the entry-parameter fields to obtain entry-parameter data with different results, which is stored in the database;
step 3, whether to perform data monitoring is selected; if so, method annotations are added in the test framework, the entry-parameter data is fed into the test database for testing, the monitoring database records the current database state before each test execution, and after execution finishes the user chooses whether to roll the data back;
step 4, during data monitoring the user can choose whether to record for assertions; if recording is selected, the test framework automatically records the test result for subsequent use, and the assertion scheduling module asserts on the test result according to the expected result of the test item to judge whether it meets the expected requirement; the playback function can also be selected, which checks the accuracy and consistency of repeated test results against the recorded test result, realizing automatic assertion of whether the software test passes, with the correctness of the recorded test result guaranteed during this process;
and step 5, the test result processing and display platform automatically selects TestNG or JUnit, calls the data analysis platform, counts, screens, processes and analyzes all test results, and makes the processed results into reports and graphs for multi-dimensional display.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.
Claims (4)
1. A software automation testing framework, characterized by comprising a test code generation module, an executor scheduling module, an interface parameter input processing module, a database monitoring module, an assertion scheduling module, and a test result processing and display platform;
the test code generation module is used for finding the fields and attribute values required by the interface entry parameters through Java reflection and, according to these fields and attribute values, automatically generating test case code or a test case xls file suitable for testing from the original code of the item under test, wherein the xls file contains the parameters required by the test method and the expected results;
the executor scheduling module is used for scheduling a test engine according to the test type, with TestNG or JUnit selected as the underlying unit test framework of the test engine;
the interface parameter input processing module is used for permuting and combining the entry-parameter data according to the normal and abnormal values of the entry-parameter fields, obtaining entry-parameter data suitable for different test types;
the database monitoring module is used for monitoring the test system data during the software test and rolling back the data, according to the user's choice, after test execution finishes;
the assertion scheduling module is used for recording the return result of a test method or interface and verifying the return value of a repeated test against the recorded correct test result;
the test result processing and display platform is used for counting, screening, processing and analyzing all data in the software testing process, comparing the test result with historical test results, analyzing the causes of test errors, constructing reports and graphs from the test results, and displaying them in multiple dimensions.
2. A software automation testing method applying the software automation testing framework of claim 1, characterized by comprising the following steps:
step 1, the test code generation module uses Maven to create a project and add dependencies and plug-ins, and a code or xls type is selected; in code mode the test code of the project under test is generated automatically, while if an xls file is selected a test case template is generated automatically and the test cases are designed in the template;
step 2, after the test code or test cases are designed, the executor scheduling module schedules a test engine according to the test case type, with TestNG or JUnit selected as the engine's underlying unit test framework; the interface parameter input processing module permutes and combines the entry-parameter data according to the normal and abnormal values of the entry-parameter fields to obtain entry-parameter data with different results, which is stored in a database;
step 3, whether to perform data monitoring is selected; if so, method annotations are added in the test framework, the entry-parameter data is fed into the test database for testing, the current database state is monitored before each test execution, and after execution finishes the user chooses whether to roll the data back;
step 4, during the software test the assertion scheduling module records the return result of the test method or interface, takes the recorded result as a benchmark test once it is confirmed accurate, verifies the return values of repeatedly executed tests, and judges whether the test result meets the expected requirement;
and step 5, the test result processing and display platform counts, screens, processes and analyzes all test results, and the processed results are made into reports and graphs for multi-dimensional display.
3. The method according to claim 2, wherein the test case types in step 2 include unit test, integration test and interface test, and scheduling the test engine according to the test case type means feeding test code of different test types into the corresponding test databases.
4. The software automation testing method according to claim 2, wherein the interface parameter input processing module in step 2 uses the equivalence class method and the boundary value method to permute and combine the entry-parameter fields, obtaining entry-parameter data applicable to the corresponding test type.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910442962.5A CN110232024B (en) | 2019-05-26 | 2019-05-26 | Software automation test framework and test method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910442962.5A CN110232024B (en) | 2019-05-26 | 2019-05-26 | Software automation test framework and test method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110232024A CN110232024A (en) | 2019-09-13 |
CN110232024B true CN110232024B (en) | 2020-02-28 |
Family
ID=67861611
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910442962.5A Expired - Fee Related CN110232024B (en) | 2019-05-26 | 2019-05-26 | Software automation test framework and test method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110232024B (en) |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110597725B (en) * | 2019-09-19 | 2023-05-05 | 浙江诺诺网络科技有限公司 | Mysql simulation return method, device and equipment |
CN110865941A (en) * | 2019-11-11 | 2020-03-06 | 中信百信银行股份有限公司 | Interface test case generation method, device and system |
CN110928796B (en) * | 2019-11-29 | 2023-05-30 | 宝付网络科技(上海)有限公司 | Automatic test platform |
CN110990276A (en) * | 2019-11-29 | 2020-04-10 | 泰康保险集团股份有限公司 | Automatic testing method and device for interface field and storage medium |
CN111209191A (en) * | 2019-12-29 | 2020-05-29 | 的卢技术有限公司 | Automatic testing method and system for realizing video classification |
CN111209216A (en) * | 2020-03-11 | 2020-05-29 | 山东汇贸电子口岸有限公司 | Distributed test framework based on plug-in and test method |
CN111324546A (en) * | 2020-03-20 | 2020-06-23 | 普信恒业科技发展(北京)有限公司 | Task testing method and device |
CN111459821B (en) * | 2020-04-01 | 2023-05-30 | 汇通达网络股份有限公司 | Software automation unit test method based on TestNG |
CN111930633A (en) * | 2020-08-19 | 2020-11-13 | 北京海益同展信息科技有限公司 | Data testing method, platform, electronic device and storage medium |
CN113760714B (en) * | 2020-10-30 | 2024-10-18 | 北京沃东天骏信息技术有限公司 | Software testing method and device |
CN112579439A (en) * | 2020-12-05 | 2021-03-30 | 西安翔腾微电子科技有限公司 | Formal verification method based on display control system |
CN112463644B (en) * | 2020-12-17 | 2024-05-17 | 深圳软牛科技有限公司 | Regression testing method, device and equipment of data recovery software and storage medium |
CN112905453A (en) * | 2021-02-03 | 2021-06-04 | 重庆富民银行股份有限公司 | Method for quickly generating database operation code in automatic test |
CN113176914A (en) * | 2021-06-03 | 2021-07-27 | 上海中通吉网络技术有限公司 | Modularized testing tool based on automatic Web end |
CN113467761B (en) * | 2021-06-03 | 2024-06-14 | 上海赛可出行科技服务有限公司 | Interface test template generation method based on Java reflection principle |
CN114880158A (en) * | 2022-07-11 | 2022-08-09 | 飞狐信息技术(天津)有限公司 | Redis instance diagnosis method and device |
CN115687161B (en) * | 2022-12-30 | 2023-06-23 | 云筑信息科技(成都)有限公司 | Method for automatically generating interface test case based on scanning maven warehouse |
CN118152204B (en) * | 2024-05-13 | 2024-07-26 | 华安证券股份有限公司 | TAF micro-service architecture-based interface intelligent testing method and device |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102141962A (en) * | 2011-04-07 | 2011-08-03 | 北京航空航天大学 | Safety distributed test framework system and test method thereof |
CN102841841A (en) * | 2011-06-20 | 2012-12-26 | 阿里巴巴集团控股有限公司 | Method and system for processing assertion in test |
CN106528425A (en) * | 2016-11-18 | 2017-03-22 | 南京南瑞继保电气有限公司 | Platform plug-in automatically testing method for microprocessor |
CN109614324A (en) * | 2018-12-03 | 2019-04-12 | 北京云测网络科技有限公司 | A kind of method for generating test case and device |
CN109766269A (en) * | 2018-12-18 | 2019-05-17 | 微梦创科网络科技(中国)有限公司 | Continuous integrating automated testing method, device, equipment and medium |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7865780B2 (en) * | 2007-10-05 | 2011-01-04 | Sap Ag | Method for test case generation |
CN101706753B (en) * | 2009-12-11 | 2013-04-10 | 武汉虹信通信技术有限责任公司 | Unit testing framework and method based on Perl |
US8842125B2 (en) * | 2011-10-26 | 2014-09-23 | Google Inc. | Automatically testing compatibility between a graphics card and a graphics program |
WO2015088316A1 (en) * | 2013-12-09 | 2015-06-18 | Mimos Berhad | Functional test automation framework using user defined ontology |
- 2019-05-26: application CN201910442962.5A filed; granted as CN110232024B (status: not active, Expired - Fee Related)
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102141962A (en) * | 2011-04-07 | 2011-08-03 | 北京航空航天大学 | Safety distributed test framework system and test method thereof |
CN102841841A (en) * | 2011-06-20 | 2012-12-26 | 阿里巴巴集团控股有限公司 | Method and system for processing assertion in test |
CN106528425A (en) * | 2016-11-18 | 2017-03-22 | 南京南瑞继保电气有限公司 | Platform plug-in automatically testing method for microprocessor |
CN109614324A (en) * | 2018-12-03 | 2019-04-12 | 北京云测网络科技有限公司 | A kind of method for generating test case and device |
CN109766269A (en) * | 2018-12-18 | 2019-05-17 | 微梦创科网络科技(中国)有限公司 | Continuous integrating automated testing method, device, equipment and medium |
Non-Patent Citations (4)
Title |
---|
Artest+Python+Selenium: a lightweight web automation test framework; Yikedan; https://zhuanlan.zhihu.com/p/60077060; 2019-03-22; pages 1-15 *
Unit testing with asserts; Matt Chernosky; http://www.electronvector.com/blog/unit-testing-with-asserts; 2018-07-17; pages 1-10 *
Research and implementation of an APP automated hybrid testing framework based on Java reflection; Du Wei; Mobile Communications; November 2016 (No. 22); pages 66-70 *
Improvement and implementation of a JUnit-based TDD automated testing framework; Zhou Yahui; China Masters' Theses Full-text Database, Information Science & Technology; 2016-03-15 (No. 3); pages I138-644 *
Also Published As
Publication number | Publication date |
---|---|
CN110232024A (en) | 2019-09-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110232024B (en) | Software automation test framework and test method | |
US11023358B2 (en) | Review process for evaluating changes to target code for a software-based product | |
CN103150249B (en) | A kind of method and system of automatic test | |
US5500941A (en) | Optimum functional test method to determine the quality of a software system embedded in a large electronic system | |
US8539438B2 (en) | System and method for efficient creation and reconciliation of macro and micro level test plans | |
US7673179B2 (en) | Online testing unification system with remote test automation technology | |
US9354867B2 (en) | System and method for identifying, analyzing and integrating risks associated with source code | |
CN103235759A (en) | Method and device for generating test cases | |
CN111400198B (en) | Self-adaptive software testing system | |
CN112817865A (en) | Coverage precision test method and system based on componentized distributed system | |
CN103631713A (en) | ERP software automated testing system and method | |
CN107861876A (en) | Method of testing, device, computer equipment and readable storage medium storing program for executing | |
CN101082876A (en) | Software automatically evaluating tool bag | |
CN104583789A (en) | Creation and scheduling of a decision and execution tree of a test cell controller | |
Söylemez et al. | Challenges of software process and product quality improvement: catalyzing defect root-cause investigation by process enactment data analysis | |
CN104364664A (en) | An algorithm and structure for creation, definition, and execution of an SPC rule decision tree | |
Zhao et al. | Research on international standardization of software quality and software testing | |
CN115657890A (en) | PRA robot customizable method | |
Silva Filho et al. | Supporting concern-based regression testing and prioritization in a model-driven environment | |
CN116954624A (en) | Compiling method based on software development kit, software development system and server | |
CN115629956A (en) | Software defect management method and system based on interface automatic test | |
JP2006059276A (en) | Source code evaluating system | |
Ramler et al. | Noise in bug report data and the impact on defect prediction results | |
CN113609015A (en) | Automatic test framework based on Bash Shell | |
CN114138628A (en) | Method and device for selecting regression test case based on layered test model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20200228 |