CN115687140A - Test case compiling method and system based on automatic test - Google Patents
Test case compiling method and system based on automatic test
- Publication number
- CN115687140A (application CN202211405771.XA)
- Authority
- CN
- China
- Prior art keywords
- variable
- tested
- automation
- pool
- automation step
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Debugging And Monitoring (AREA)
Abstract
The invention provides a test case writing method and system based on automated testing. The method comprises: writing an automation step to be tested; reading input variables from a variable pool based on the automation step to be tested; executing the automation step to be tested using the input variables; judging whether the automation step to be tested passes execution; if so, writing the variables obtained from the passed execution into the variable pool, taking the automation step to be tested as an automation step, and then writing and executing the next automation step to be tested; if not, rewriting the automation step to be tested and executing it again; and when all the automation steps to be tested corresponding to a function have passed execution, saving the passed automation steps together as a test case. This removes the need to repeatedly execute dependent cases or steps during debugging caused by inter-step variable dependence, and markedly improves case-writing efficiency.
Description
Technical Field
The invention relates to the technical field of software testing, and in particular to a test case writing method and system based on automated testing.
Background
Software must be tested before it goes live to ensure that it runs stably afterwards. However, software iteration cycles keep getting shorter, and product development time becomes increasingly tight as vendors race to reach the market first. When there is not enough time to test the software thoroughly and remedy its shortcomings, bugs frequently appear after the product goes live, causing user churn and even financial loss.
In view of this, the present specification provides a test case writing method and system based on automated testing to improve the efficiency and quality of software testing.
Disclosure of Invention
One object of the invention is to provide a test case writing method based on automated testing, comprising: writing an automation step to be tested; reading input variables from a variable pool based on the automation step to be tested, where the variable pool stores one or more variables obtained by executing historical automation steps; executing the automation step to be tested using the input variables; judging whether the automation step to be tested passes execution; if so, writing the variables obtained from the passed execution into the variable pool, taking the automation step to be tested as an automation step, and then writing and executing the next automation step to be tested; if not, rewriting the automation step to be tested and executing it again; and when all the automation steps to be tested corresponding to a function have passed execution, saving the passed automation steps as a test case.
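Purely as an illustration of the loop just described, and not as the patent's own implementation, the following Python sketch walks a function's steps through the write-execute-judge cycle; the AutomationStep, StepResult and write_next names and interfaces are assumptions introduced here for clarity.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional

@dataclass
class StepResult:
    passed: bool                                            # did the step achieve its expected function?
    outputs: Dict[str, str] = field(default_factory=dict)   # variables produced when execution passes

@dataclass
class AutomationStep:
    name: str
    requires: List[str]                                     # names of input variables read from the pool
    run: Callable[[Dict[str, str]], StepResult]              # executes the step with those inputs

def author_test_case(write_next: Callable[[Optional[AutomationStep]], Optional[AutomationStep]],
                     pool: Dict[str, str]) -> List[AutomationStep]:
    """Write-execute-debug loop: each step under test is debugged against the variable pool.

    write_next(None) writes the next automation step to be tested (or returns None once the
    function is fully covered); write_next(failed_step) rewrites a step that failed execution.
    """
    test_case: List[AutomationStep] = []
    step = write_next(None)                              # write the first automation step to be tested
    while step is not None:
        inputs = {n: pool[n] for n in step.requires}     # read input variables from the variable pool
        result = step.run(inputs)                        # execute the automation step to be tested
        if result.passed:                                # judge whether the execution passed
            pool.update(result.outputs)                  # write the obtained variables into the pool
            test_case.append(step)                       # the step under test becomes an automation step
            step = write_next(None)                      # write and execute the next step to be tested
        else:
            step = write_next(step)                      # rewrite the failed step and execute again
    return test_case                                     # the passed steps are saved as the test case
```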
Further, reading the input variables from the variable pool comprises: judging whether the automation step to be tested uses any variable; if so, judging whether the variable required by the automation step to be tested exists in the variable pool; and if it exists, extracting the default value of the required variable from the variable pool as the input variable.
Further, the variable pool is stored in a tree structure.
Further, writing the variables obtained from the passed execution into the variable pool comprises: judging whether a variable is output; if so, judging whether another automation step has output a variable with the same variable name; if it has, using the variable name as a parent node and adding the variable value and the variable source as a child node; and if it has not, adding the variable name, the variable value and the variable source to the variable pool.
Further, the parent node also holds a default variable value corresponding to the variable name, and the default variable value is the value of the same-name variable most recently added to the variable pool.
Another object of the invention is to provide a test case writing system based on automated testing, comprising a writing module, an obtaining module, an execution module, a judgment module, a determination module and a storage module. The writing module is used for writing the automation step to be tested, and for rewriting the automation step to be tested when it fails execution. The obtaining module is used for reading input variables from a variable pool based on the automation step to be tested; the variable pool stores one or more variables obtained by executing historical automation steps. The execution module is used for executing the automation step to be tested with the input variables, and for re-executing the rewritten automation step to be tested when the original one fails execution. The judgment module is used for judging whether the automation step to be tested passes execution. The determination module is used for, when the automation step to be tested passes execution, writing the variables obtained from the passed execution into the variable pool and taking the automation step to be tested as an automation step, after which the next automation step to be tested is written and executed. The storage module is used for saving the passed automation steps as a test case when all the automation steps to be tested corresponding to a function have passed execution.
Further, the obtaining module is also configured to: judge whether the automation step to be tested uses any variable; if so, judge whether the variable required by the automation step to be tested exists in the variable pool; and if it exists, extract the default value of the required variable from the variable pool as the input variable.
Further, the variable pool is stored in a tree structure.
Further, the determination module is also configured to judge whether a variable is output; if so, judge whether another automation step has output a variable with the same variable name; if it has, use the variable name as a parent node and add the variable value and the variable source as a child node; and if it has not, add the variable name, the variable value and the variable source to the variable pool.
Further, the parent node also holds a default variable value corresponding to the variable name, and the default variable value is the value of the same-name variable most recently added to the variable pool.
The technical solutions of the embodiments of the invention provide at least the following advantages and beneficial effects:
Some embodiments in this specification allow each automation step to be executed and debugged independently, without re-running the preceding steps or dependent test cases just to obtain the variables they produce. After a step is added or modified, it can be executed directly, reading the needed variables from the variable pool, which is enough to judge whether the step is correct. This removes the need to repeatedly execute dependent cases or steps during debugging caused by inter-step variable dependence, and markedly improves case-writing efficiency.
Drawings
FIG. 1 is an exemplary flowchart of a test case writing method based on automated testing according to some embodiments of the present invention;
FIG. 2 is an exemplary flowchart of reading input variables from a variable pool according to some embodiments of the present invention;
FIG. 3 is an exemplary flowchart of writing the variables obtained from a passed execution into the variable pool according to some embodiments of the present invention;
FIG. 4 is an exemplary flowchart of case writing using the test case writing method based on automated testing according to some embodiments of the present invention;
FIG. 5 is an exemplary block diagram of a test case writing system based on automated testing according to some embodiments of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Fig. 1 is an exemplary flowchart of a test case writing method based on automated testing according to some embodiments of the present invention. In some embodiments, process 100 may be performed by system 500. As shown in fig. 1, the process 100 may include the following steps:
and step 110, compiling an automation step to be tested. In some embodiments, step 110 may be performed by the writing module 510.
The automated step may refer to a step for automated testing of the software. The written automation step needs to be tested before being used for the system automation test, so the automation step which is not tested can be regarded as the automation step to be tested.
A variable pool is a database that stores variables. One or more variables are stored in the variable pool, each obtained by executing a historical automation step, that is, an automation step that has already been executed. Input variables are the variables required to execute the automation step to be tested; the step is executed with them, and when execution passes, the variables produced by the step are obtained. Once an automation step and its variables have been written into the variable pool, that step counts as a historical automation step for the next automation step to be tested, and its variables can serve as input variables. In some embodiments, the variable name of a variable required by the automation step to be tested is obtained, and the value of that variable is looked up in the variable pool by its name. For more on reading input variables from the variable pool, see FIG. 2 and its description.
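As a minimal sketch only, looking up a required input variable by its variable name could look like the snippet below; the flat dictionary is a stand-in for the variable pool rather than the tree structure described later, and the function name is introduced here for illustration. The values reuse the cookie and goods_id examples from this description.

```python
# Minimal sketch: a flat stand-in for the variable pool, keyed by variable name.
variable_pool = {
    "cookie": "EDNG10DFN3",   # produced by the user-login step
    "goods_id": "GOOD02",     # produced by the goods-shelving step
}

def read_input_variables(required_names):
    """Return {variable name: value} for the variables the step under test needs."""
    missing = [name for name in required_names if name not in variable_pool]
    if missing:
        raise KeyError(f"not in the variable pool, execute the preceding steps first: {missing}")
    return {name: variable_pool[name] for name in required_names}

print(read_input_variables(["cookie"]))   # {'cookie': 'EDNG10DFN3'}
```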
If the automation step to be tested achieves its expected function when executed, it is determined to have passed execution; if the expected function is not achieved, it is determined to have failed. For example, suppose the automation step to be tested logs in to an APP and its variables are a user name and a login password: if executing the step with the preset user name and password logs in to the APP successfully, the login step is determined to have passed; otherwise, it has failed.
For example, after the login step passes, a cookie containing the user name and login information may be generated; the cookie can be written into the variable pool as a variable, and the automation step to be tested that passed execution is treated as an automation step, indicating that it has passed. In some embodiments, variables and their corresponding automation steps may be written into the variable pool for use in subsequent software testing. For more on writing the variables obtained from a passed execution into the variable pool, see FIG. 3 and its description.
Step 160: if not, rewriting the automation step to be tested and executing it again. In some embodiments, step 160 may be performed by the writing module 510.
Step 170: when all the automation steps to be tested corresponding to the function have passed execution, saving the passed automation steps as a test case. In some embodiments, step 170 may be performed by the storage module 560.
A test case is a set of steps used to test the software. For example, the goods-shelving function may include a step of logging in to the APP, a step of creating goods and a step of shelving the goods; when all three steps have passed execution, these steps and their execution order are saved as a test case.
FIG. 2 is an exemplary flowchart of reading input variables from a variable pool according to some embodiments of the present invention. In some embodiments, the process 200 may be performed by the obtaining module 520. As shown in fig. 2, the process 200 may include the following steps:
the steps are performed separately. I.e. the automation step under test is performed separately without performing a preceding automation step.
And judging whether the automation step to be tested uses variables or not.
If not, directly executing the automatic step to be tested.
If yes, judging whether the variable required by the automation step to be detected exists in the variable pool.
When the automation step to be tested uses a variable, that variable is the required variable. In some embodiments, the variable name of the required variable is obtained and matched against the variable names already in the variable pool. If a variable with the same name exists, it is further checked whether the source of that variable in the pool matches the preceding automation step that produces the required variable; if it does, the required variable is considered to exist in the pool, and otherwise it does not. When the required variable does not exist in the variable pool, the preceding step is executed to obtain it. For example, suppose automation step 1 produces variable 1, automation step 2 produces variable 2, and automation step 3 to be tested requires variable 2. The pool is first searched for variable 2: if it exists, step 3 is executed directly with the value of variable 2; if it does not, automation step 2 is executed again to obtain variable 2. Before executing step 2, the pool is searched for variable 1; if variable 1 exists it is used directly, and if not, automation step 1 is executed again, and so on until variable 2 is obtained.
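A sketch of this fallback, under the assumption of a hypothetical producer_of mapping from a variable name to the preceding step that outputs it and a run_step callback that executes a step and returns its output variables (neither name comes from the patent):

```python
# Sketch: take a required variable from the pool, or re-run the preceding step
# that produces it, recursively resolving that step's own inputs first.
def resolve(name, pool, producer_of, run_step):
    """Return the value of variable `name`, executing preceding steps as needed."""
    if name in pool:
        return pool[name]                              # the variable already exists in the pool
    step = producer_of[name]                           # preceding automation step that outputs it
    inputs = {n: resolve(n, pool, producer_of, run_step)
              for n in step["requires"]}               # first resolve that step's own inputs
    pool.update(run_step(step["name"], inputs))        # execute it and add its outputs to the pool
    return pool[name]

# Toy usage mirroring the example: step 1 outputs variable 1, step 2 needs variable 1 and outputs variable 2.
producer_of = {
    "variable 1": {"name": "automation step 1", "requires": []},
    "variable 2": {"name": "automation step 2", "requires": ["variable 1"]},
}

def run_step(step_name, inputs):
    # stand-in execution: each step simply returns the variable it produces
    return {"variable 1": "v1"} if step_name == "automation step 1" else {"variable 2": "v2"}

pool = {}
print(resolve("variable 2", pool, producer_of, run_step))   # 'v2'; the pool now holds both variables
```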
If it exists, extract the default value of the required variable from the variable pool as the input variable.
During software testing, several variable values may exist under the same variable name; in that case one of them can be designated as the default value of the variable.
Fig. 3 is an exemplary flowchart of writing the variables obtained from a passed execution into the variable pool according to some embodiments of the present invention. In some embodiments, the process 300 may be performed by the determination module 550. In some embodiments, the variable pool may be stored in a tree structure. As shown in fig. 3, the process 300 may include the following steps:
the use case/step is executed. The method comprises the following steps of testing a case, and carrying out automation.
And judging whether a variable is output.
And if so, judging whether the other automation steps output the variables with the same variable name. Other steps may refer to other automation steps of the test case, and may also refer to automation steps of other test cases.
If yes, the variable name is used as a father node, and the variable value and the variable source are added into the child node. For example, for variable 1, the parent node is variable name 1, and the child nodes are variable value 1 [ variable source ] and variable value 2 [ variable source ].
In some embodiments, the parent node further includes a default variable value, the variable name corresponding to the default variable value, the default variable value being a value of the same-name variable that was last added to the variable pool in which the variable name resides. For example, for the variable goods _ id, the variable value 1: GOOD01 and variable 2: GOOD02, variable value 2 is the variable added to the variable pool where the variable name is last, so GOOD02 can be used as the value of variable GOODs _ id. For example, for the variable goods _ id, its parent node may be the goods _ id: GOOD02 [ merchandise management-merchandise shelving case-merchandise shelving ], and child nodes thereof may include GOOD01 [ merchandise management-merchandise shelving case-merchandise creation ] and GOOD02 [ merchandise management-merchandise shelving case-merchandise shelving ].
And if not, adding the variable name, the variable value and the variable source into the variable pool. For example, for variable 2, variable name 2 [ variable value and variable source ]. Illustratively, for variable cookies, the form in the variable pool is cookie: EDNG10DFN3 [ merchandise management-merchandise listing case-user login ].
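A minimal sketch of the insertion rule of FIG. 3, under the assumption that a nested dictionary stands in for the tree: the variable name acts as the parent node holding the default value, and each value/source pair is a child node. For brevity the sketch always nests, whereas the description above keeps a flat entry until a second same-name variable appears; the class and method names are illustrative only.

```python
class VariablePool:
    """Illustrative tree-structured variable pool (not the patent's API)."""

    def __init__(self):
        # {variable name: {"default": value, "children": [(value, source), ...]}}
        self._tree = {}

    def add(self, name, value, source):
        """Write a variable produced by a passed automation step into the pool."""
        node = self._tree.setdefault(name, {"default": None, "children": []})
        node["children"].append((value, source))   # a same-name output becomes another child node
        node["default"] = value                    # default value = the value added most recently

    def default_value(self, name):
        """Read the default value of a required variable as the input variable."""
        return self._tree[name]["default"]

pool = VariablePool()
pool.add("goods_id", "GOOD01", "goods management - goods shelving case - goods creation")
pool.add("goods_id", "GOOD02", "goods management - goods shelving case - goods shelving")
print(pool.default_value("goods_id"))   # GOOD02, the value added to the pool most recently
```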
In some embodiments, the variable pool also provides variable-editing functions, so that the variables in the pool can be modified or added to suit scenarios with different requirements.
FIG. 4 is an exemplary flowchart of case writing using the test case writing method based on automated testing according to some embodiments of the present invention. As shown in fig. 4, the process 400 writes a test case for shelving goods and may include:
Start creating the test case.
Write the user-login step and execute it using the user name and password in the variable pool; when it passes, write the create-goods step; when it fails, rewrite the user-login step.
Write the create-goods step and execute it using the cookie in the variable pool, where the cookie contains the user and the user information; when it passes, write the shelve-goods step; when it fails, rewrite the create-goods step.
Write the shelve-goods step and execute it using the goods_id in the variable pool, where goods_id identifies the goods and carries the goods information; when it passes, save the test case, completing its creation; when it fails, rewrite the shelve-goods step.
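For illustration only, the shelving flow of FIG. 4 can be run end to end as a self-contained toy script; the step bodies simply pretend to pass, and the cookie and goods_id values reuse the examples given earlier in this description.

```python
pool = {}   # flat stand-in for the variable pool

def login(inputs):
    # pretend the login step passed and produced a cookie
    return True, {"cookie": "EDNG10DFN3"}

def create_goods(inputs):
    assert "cookie" in inputs        # the create-goods step reads the cookie from the pool
    return True, {"goods_id": "GOOD01"}

def shelve_goods(inputs):
    assert "goods_id" in inputs      # the shelving step reads goods_id from the pool
    return True, {"goods_id": "GOOD02"}

test_case = []
for name, requires, step in [("user login", [], login),
                             ("create goods", ["cookie"], create_goods),
                             ("shelve goods", ["goods_id"], shelve_goods)]:
    inputs = {key: pool[key] for key in requires}   # read the input variables from the pool
    passed, outputs = step(inputs)                  # execute the step under test
    if passed:
        pool.update(outputs)                        # write the produced variables back into the pool
        test_case.append(name)                      # the passed step joins the test case
    # a failed step would be rewritten and re-executed here

print(test_case)   # ['user login', 'create goods', 'shelve goods'] is saved as the test case
```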
Fig. 5 is an exemplary block diagram of a test case writing system based on automated testing according to some embodiments of the present invention. As shown in FIG. 5, the system 500 includes a writing module 510, an obtaining module 520, an execution module 530, a judgment module 540, a determination module 550 and a storage module 560.
The writing module 510 is used for writing the automation step to be tested, and for rewriting it when it fails execution. For more on the writing module 510, see FIG. 1 and its description.
The obtaining module 520 is used for reading input variables from the variable pool based on the automation step to be tested; the variable pool stores one or more variables obtained by executing historical automation steps. The obtaining module 520 is also configured to: judge whether the automation step to be tested uses any variable; if so, judge whether the required variable exists in the variable pool; and if it exists, extract the default value of the required variable from the pool as the input variable. For more on the obtaining module 520, see FIG. 1 and its description.
The execution module 530 is used for executing the automation step to be tested with the input variables, and for re-executing the rewritten automation step to be tested when the original one fails execution. For more on the execution module 530, see FIG. 1 and its description.
The judgment module 540 is used for judging whether the automation step to be tested passes execution. For more on the judgment module 540, see FIG. 1 and its description.
The determination module 550 is used for, when the automation step to be tested passes execution, writing the variables obtained from the passed execution into the variable pool and taking the step as an automation step, after which the next automation step to be tested is written and executed. The determination module 550 is also configured to judge whether a variable is output; if so, judge whether another automation step has output a variable with the same variable name; if it has, use the variable name as a parent node and add the variable value and the variable source as a child node; and if it has not, add the variable name, the variable value and the variable source to the variable pool. For more on the determination module 550, see FIG. 1 and its description.
The storage module 560 is used for saving the passed automation steps as a test case when all the automation steps to be tested corresponding to the function have passed execution. For more on the storage module 560, see FIG. 1 and its description.
The above is only a preferred embodiment of the present invention and is not intended to limit it; various modifications and changes will occur to those skilled in the art. Any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall fall within its scope of protection.
Claims (10)
1. A test case writing method based on automated testing, characterized by comprising the following steps:
writing an automation step to be tested;
reading input variables from a variable pool based on the automation step to be tested, wherein one or more variables are stored in the variable pool and the one or more variables are obtained by executing historical automation steps;
executing the automation step to be tested using the input variables;
judging whether the automation step to be tested passes execution;
if so, writing the variables obtained from the passed execution into the variable pool, taking the automation step to be tested as an automation step, and writing and executing the next automation step to be tested;
if not, rewriting the automation step to be tested and executing it again; and
when all the automation steps to be tested corresponding to a function have passed execution, saving the passed automation steps as a test case.
2. The test case writing method based on automated testing according to claim 1, wherein reading the input variables from the variable pool comprises:
judging whether the automation step to be tested uses any variable;
if so, judging whether the variable required by the automation step to be tested exists in the variable pool; and
if it exists, extracting the default value of the required variable from the variable pool as the input variable.
3. The test case writing method based on automated testing according to claim 1, wherein the variable pool is stored in a tree structure.
4. The test case writing method based on automated testing according to claim 3, wherein writing the variables obtained from the passed execution into the variable pool comprises:
judging whether a variable is output;
if so, judging whether another automation step has output a variable with the same variable name;
if it has, using the variable name as a parent node and adding the variable value and the variable source as a child node; and
if it has not, adding the variable name, the variable value and the variable source to the variable pool.
5. The test case writing method based on automated testing according to claim 4, wherein the parent node further holds a default variable value corresponding to the variable name, and the default variable value is the value of the same-name variable most recently added to the variable pool.
6. A test case writing system based on automated testing, characterized by comprising a writing module, an obtaining module, an execution module, a judgment module, a determination module and a storage module, wherein:
the writing module is used for writing the automation step to be tested, and for rewriting the automation step to be tested when it fails execution;
the obtaining module is used for reading input variables from a variable pool based on the automation step to be tested, wherein one or more variables are stored in the variable pool and the one or more variables are obtained by executing historical automation steps;
the execution module is used for executing the automation step to be tested using the input variables, and for re-executing the rewritten automation step to be tested when the original one fails execution;
the judgment module is used for judging whether the automation step to be tested passes execution;
the determination module is used for, when the automation step to be tested passes execution, writing the variables obtained from the passed execution into the variable pool and taking the automation step to be tested as an automation step, after which the next automation step to be tested is written and executed; and
the storage module is used for saving the passed automation steps as a test case when all the automation steps to be tested corresponding to a function have passed execution.
7. The test case writing system based on automated testing according to claim 6, wherein the obtaining module is further configured to:
judge whether the automation step to be tested uses any variable;
if so, judge whether the variable required by the automation step to be tested exists in the variable pool; and
if it exists, extract the default value of the required variable from the variable pool as the input variable.
8. The test case writing system based on automated testing according to claim 6, wherein the variable pool is stored in a tree structure.
9. The test case writing system based on automated testing according to claim 8, wherein the determination module is further configured to: judge whether a variable is output;
if so, judge whether another automation step has output a variable with the same variable name;
if it has, use the variable name as a parent node and add the variable value and the variable source as a child node; and
if it has not, add the variable name, the variable value and the variable source to the variable pool.
10. The test case writing system based on automated testing according to claim 9, wherein the parent node further holds a default variable value corresponding to the variable name, and the default variable value is the value of the same-name variable most recently added to the variable pool.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211405771.XA CN115687140B (en) | 2022-11-10 | 2022-11-10 | Test case writing method and system based on automatic test |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115687140A true CN115687140A (en) | 2023-02-03 |
CN115687140B CN115687140B (en) | 2024-01-30 |
Family
ID=85052337
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211405771.XA Active CN115687140B (en) | 2022-11-10 | 2022-11-10 | Test case writing method and system based on automatic test |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115687140B (en) |
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106528395A (en) * | 2015-09-09 | 2017-03-22 | 阿里巴巴集团控股有限公司 | Test case generation method and apparatus |
CN107894952A (en) * | 2017-11-08 | 2018-04-10 | 中国平安人寿保险股份有限公司 | Generation method, device, equipment and the readable storage medium storing program for executing of interface testing use-case |
CN108287788A (en) * | 2017-12-26 | 2018-07-17 | 广东睿江云计算股份有限公司 | A kind of use-case step matching method based on test case, system |
CN108563539A (en) * | 2018-03-21 | 2018-09-21 | 广州视源电子科技股份有限公司 | Interface testing method, server, readable storage medium and system |
CN110968500A (en) * | 2018-09-30 | 2020-04-07 | 北京国双科技有限公司 | Test case execution method and device |
CN109697167A (en) * | 2018-12-27 | 2019-04-30 | 江苏满运软件科技有限公司 | Management method, system, electronic equipment and the storage medium of test variable |
CN110134612A (en) * | 2019-05-17 | 2019-08-16 | 深圳前海微众银行股份有限公司 | UI test data generating method, device, equipment and readable storage medium storing program for executing |
CN110287121A (en) * | 2019-06-28 | 2019-09-27 | 深圳前海微众银行股份有限公司 | A kind of test method and device automating use-case |
US11436130B1 (en) * | 2020-07-28 | 2022-09-06 | Amdocs Development Limited | System, method, and computer program for automating manually written test cases |
CN112328226A (en) * | 2020-09-17 | 2021-02-05 | 北京中数科技术有限公司 | Embedded system automatic test code generation method and device |
CN112579455A (en) * | 2020-12-23 | 2021-03-30 | 安徽航天信息有限公司 | Interface automatic testing method and device, electronic equipment and storage medium |
Non-Patent Citations (3)
Title |
---|
MAHFUZA KHATUN ET AL.: "Testing pairs of continuous random variables for independence: A simple heuristic", JOURNAL OF COMPUTATIONAL MATHEMATICS AND DATA SCIENCE, vol. 1 * |
ZHANG XU: "Research on Context-Based Fault Localization Methods", China Master's Theses Full-text Database, Information Science and Technology, no. 01 *
WANG SHUAIBING: "Research and Implementation of an Automated Software Testing Tool", China Master's Theses Full-text Database, Information Science and Technology, no. 03 *
Also Published As
Publication number | Publication date |
---|---|
CN115687140B (en) | 2024-01-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9047402B2 (en) | Automatic calculation of orthogonal defect classification (ODC) fields | |
CN107665171B (en) | Automatic regression testing method and device | |
CN112306855B (en) | Interface automation test method, device, terminal and storage medium | |
CN112256581B (en) | Log playback test method and device for high-simulation securities trade trading system | |
CN107741903A (en) | Application compatibility method of testing, device, computer equipment and storage medium | |
CN114139923A (en) | Task relevance analysis method and device and computer readable storage medium | |
CN112948473A (en) | Data processing method, device and system of data warehouse and storage medium | |
CN111125067A (en) | Data maintenance method and device | |
CN109697161A (en) | A kind of test method of storing process, storage medium and database server | |
CN116089260A (en) | SQL sentence detection method and device | |
CN104360939A (en) | Method, equipment and system for positioning fault | |
CN115687140A (en) | Test case compiling method and system based on automatic test | |
CN112035308A (en) | Method and device for generating system interface test table | |
US20140129879A1 (en) | Selection apparatus, method of selecting, and computer-readable recording medium | |
CN116089518A (en) | Data model extraction method and system, terminal and medium | |
CN118503139B (en) | Automatic test method, equipment and medium for three-dimensional CAD system | |
CN113450114B (en) | Data file acquisition method and device based on block chain | |
CN116594917B (en) | UI testing method and device, electronic equipment and machine-readable storage medium | |
CN118377712B (en) | Simulation model test integrated method and device | |
CN114328408B (en) | Log screening method, system, equipment and medium | |
CN113297093A (en) | Testing method and device for bank software | |
CN111061632A (en) | Automatic testing method and system for report data | |
CN117873863A (en) | Fuzzy playback method for automatic test, terminal and readable storage medium | |
CN118051529A (en) | Shell language-based data updating method | |
CN118689762A (en) | Code quality test automatic configuration method and related equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||