CN115373988A - Test case generation method, test method, electronic device, and storage medium
- Publication number: CN115373988A
- Application number: CN202211010698.6A
- Authority: CN (China)
- Legal status: Withdrawn (the legal status is an assumption by the database and is not a legal conclusion)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3684—Test management for test design, e.g. generating new test cases
Abstract
The application discloses a test case generation method, a test method for test cases, an electronic device, and a computer-readable storage medium. The test case generation method includes: acquiring a file to be tested; extracting a function to be tested from the file to be tested, together with annotation information of the function to be tested; determining at least one test value for an input parameter of the function to be tested based on the function to be tested and the annotation information; and combining and packaging the function to be tested and the test values based on a template file corresponding to the file to be tested, to generate a test case of the file to be tested. This scheme reduces the risk of incomplete test case coverage, lowers labor cost, and streamlines the case generation flow.
Description
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a test case generation method, a test case testing method, an electronic device, and a computer-readable storage medium.
Background
With the development of computer technology and the growing ubiquity of computers, software has taken on ever more functions, bringing greater convenience to more users. During software development, programmers need to perform unit testing on the code they write, that is, to write corresponding unit test code that verifies the correctness of the development code, thereby helping developers find problems and improving software quality.
Generally, unit testing is performed by developers manually writing unit test code and then running the test units to determine whether the code is correct. This approach depends entirely on the developers' own skill, lacks standardization, and makes the quality of test code difficult to measure and evaluate; moreover, without ready-made tooling, writing is inefficient, testing takes long, and the time cost of the whole software development rises. Alternatively, test cases can be written on top of a unit test library, with developers writing their own test case code against the library to improve development efficiency; however, the library's event response mechanism remains complicated, developers must adapt it to different software, and some test events are easily missed, leaving test case coverage incomplete.
Disclosure of Invention
The present application provides at least a test case generation method, a test method for test cases, an electronic device, and a computer-readable storage medium.
A first aspect of the present application provides a method for generating a test case, the method including: acquiring a file to be tested; extracting a function to be tested from the file to be tested, together with annotation information of the function to be tested; determining at least one test value for an input parameter of the function to be tested based on the function to be tested and the annotation information; and combining and packaging the function to be tested and the test values based on a template file corresponding to the file to be tested, to generate a test case of the file to be tested.
In this way, the annotation information corresponding to the function to be tested is used to assign test values to at least one input parameter of the function, and the function and its test values are packaged together into the test case, which enriches the test values of the function under test, reduces the risk of incomplete case coverage, lowers the cost of manual assignment, and streamlines the case generation flow.
In some embodiments, the annotation information includes input-parameter annotation information, and determining at least one test value for an input parameter of the function to be tested based on the function and the annotation information includes: determining a legal value range of at least one input-parameter test value based on the input-parameter annotation information; and determining the at least one input-parameter test value based on the legal value range.
In this way, the legal value range of an input-parameter test value is determined from the input-parameter annotation information, and the test value is then determined within that range, which strengthens the validity of the input-parameter test values, lowers the cost of manual value selection, and streamlines case generation.
In some embodiments, the input-parameter annotation information includes function annotation information, and determining the at least one input-parameter test value based on the legal value range includes: determining, according to the function annotation information, at least one test value corresponding to at least one input parameter at one or more positions among the inside, the boundary, and the outside of the legal value range.
In this way, the test values of an input parameter are determined at positions inside, on the boundary of, and outside its legal value range based on the function annotation information, which strengthens the validity of the test values, lowers the cost of manual value selection, and streamlines case generation.
In some embodiments, determining the at least one input-parameter test value based on the legal value range includes: in response to the data type of the function to be tested being a preset data type, setting at least one test value corresponding to at least one input parameter at one or more positions among the inside, the boundary, and the outside of the legal value range.
In this way, the test values of the input parameters are determined according to the data type of the function, which strengthens their validity, lowers the cost of manual value selection, and streamlines case generation.
In some embodiments, determining the at least one input-parameter test value based on the legal value range includes: in response to the data type of the function to be tested being a constructed data type, converting the data of the function to be tested into character string data, where the character string data includes at least one type of key-value pair data, the number of key-value pairs of each type is greater than or equal to one, each key-value pair corresponds to one input parameter of the function to be tested, and each key-value pair includes at least one entry key value; and determining, based on the legal value range, at least one entry key value of each key-value pair as a test value of the input parameter corresponding to that key-value pair.
In this way, the function to be tested is converted into character string data, and the test values of at least one input parameter are determined from the key-value pair data of the corresponding type in that character string data, which strengthens the validity of the input-parameter test values, lowers the cost of manual value selection, and streamlines case generation.
In some embodiments, extracting the function to be tested and its annotation information from the file to be tested includes: matching the file to be tested against a preset regular expression to obtain the function to be tested; and performing static analysis on the function to be tested to obtain its annotation information.
In this way, the function to be tested is obtained by matching with the preset regular expression and then statically analyzed to obtain its annotation information, which enhances the selectivity of functions under test, lowers the cost of manually analyzing and annotating them, and streamlines case generation.
In some embodiments, combining and packaging the function to be tested and the test values based on the template file corresponding to the file to be tested to generate the test case includes: assembling the function to be tested with each test value based on the template file to generate unit test data corresponding to that test value; and combining and packaging the units of test data to generate the test case.
In this way, the function to be tested is assembled with each test value separately using the template file to obtain the unit test data of each test value, and the units of test data are then combined and packaged into the test case, which enriches the unit test data of the function under test, reduces the risk of incomplete case coverage, lowers the cost of manual assignment, and streamlines the case generation flow.
In some embodiments, the method further includes: determining, based on the annotation information and the at least one input-parameter test value, an expected return value corresponding to each unit of test data.
In this way, the expected return value of each unit of test data is determined from the annotation information of the function to be tested and the at least one input-parameter test value, which strengthens the validity of the expected return values, lowers the cost of manual value selection, and streamlines case generation.
A second aspect of the present application provides a test method for test cases, the method including: obtaining a test case of a file to be tested and at least one expected return value corresponding to the test case, where the test case is obtained by the test case generation method above; testing the file to be tested based on the test case to obtain at least one test return value corresponding to the file to be tested, where the at least one test return value corresponds to the at least one expected return value; and matching the at least one test return value against the at least one expected return value to obtain a test result of the file to be tested.
In this way, running the test case against the file to be tested yields test return values that are matched against the expected return values prepared in advance to obtain the test result, which improves the accuracy of case testing, lowers the cost of manual data analysis, and streamlines the case generation flow.
A third aspect of the present application provides an electronic device including a processor and a memory, where the processor calls program data stored in the memory to execute the test case generation method or the test method described above.
A fourth aspect of the present application provides a computer-readable storage medium storing program instructions that, when executed, implement the test case generation method or the test method for test cases described above.
According to the above schemes, the annotation information corresponding to the function to be tested is used to assign test values to at least one input parameter of the function, and the function and its test values are packaged together into the test case, which enriches the test values of the function under test, reduces the risk of incomplete case coverage, lowers the cost of manual assignment, and thereby streamlines the case generation flow.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and, together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic structural diagram of an embodiment of an electronic device provided in the present application;
FIG. 2 is a flowchart illustrating an embodiment of a method for generating a test case provided in the present application;
FIG. 3 is a flowchart illustrating an embodiment of extracting a function to be tested and its annotation information according to the present application;
FIG. 4 is a flowchart illustrating an embodiment of determining an input-parameter test value according to the present application;
FIG. 5 is a flowchart illustrating another embodiment of determining an input-parameter test value according to the present application;
FIG. 6 is a flowchart illustrating an embodiment of combining and packaging a function to be tested and test values according to the present application;
FIG. 7 is a flowchart illustrating an embodiment of a test case testing method provided in the present application;
FIG. 8 is a flowchart illustrating an embodiment of a test case generation and testing method provided in the present application;
FIG. 9 is a flowchart illustrating a test case generating and testing method according to another embodiment of the present disclosure;
FIG. 10 is a schematic structural diagram of another embodiment of an electronic device provided in the present application;
FIG. 11 is a schematic structural diagram of an embodiment of a computer-readable storage medium provided in the present application.
Detailed Description
The embodiments of the present application will be described in detail below with reference to the drawings.
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, procedures, techniques, etc. in order to provide a thorough understanding of the present application.
The technical solutions in the embodiments of the present application are clearly and completely described with reference to the drawings in the embodiments of the present application. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be further noted that, for the convenience of description, only a part of the structure or the flow related to the present application is shown in the drawings, not all of the structure or the flow related to the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Reference in the application to "an embodiment" means that a particular feature, flow, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The steps in the embodiments of the present application need not be performed in the described order; they may be rearranged, deleted, or supplemented as required.
The term "and/or" in the embodiments of the present application merely describes an association between objects and covers any and all combinations of one or more of the associated listed items; it indicates that three relationships may exist: for A and/or B, A may exist alone, A and B may exist simultaneously, or B may exist alone. The character "/" herein generally indicates an "or" relationship between the preceding and following objects. Further, "plurality" herein means two or more. The term "at least one" means any one of multiple items or any combination of at least two of them; for example, "including at least one of A, B, and C" may mean including any one or more elements selected from the set consisting of A, B, and C. It should also be noted that the term "comprises/comprising", when used in this specification, specifies the presence of stated features, integers, steps, operations, elements, and/or components, but does not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The terms "first", "second", etc. in this application are used to distinguish different objects, and are not used to describe a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
In addition, although the terms "first", "second", etc. are used several times in this application to describe various operations (or elements, applications, instructions, or thresholds), these operations (or elements, applications, instructions, or thresholds) should not be limited by the terms; the terms serve only to distinguish one operation (or element, application, instruction, or threshold) from another. For example, a first type may be referred to as a second type, and a second type may likewise be referred to as a first type, without departing from the scope of the present application; the first type and the second type may both be collections of function types, only not collections of the same function types.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an embodiment of an electronic device provided in the present application.
In the embodiment of the present disclosure, the electronic device 10A includes a resource acquisition module 11A, a case generation module 12A, and a case testing module 13A.
In one embodiment, the electronic device 10A (e.g., a mobile terminal or a fixed terminal) may be implemented in various forms. The electronic device 10A may be a mobile terminal capable of acquiring and storing resource files and of generating and testing cases based on them, including, for example, a mobile phone, a smartphone, a notebook computer, a personal digital assistant (PDA), or a tablet computer (PAD); it may also be a fixed terminal with the same capabilities, such as a digital broadcast transmitter, a digital TV, a desktop computer, or a server. In the following, the electronic device 10A is assumed to be a fixed terminal. However, those skilled in the art will understand that the configuration according to the embodiments of the present application can also be applied to a mobile terminal, apart from operations or elements intended specifically for mobile use.
In one embodiment, the electronic device 10A may load application programs, program data, and an operating system, which may include various applications being executed, such as a code editing tool, a code packaging tool, a unit testing tool, middle-tier applications, a relational database management system (RDBMS), and the like.
By way of example, the code editing tool may be a code editor built for development engineers according to the type, structure, and relationships of the code being developed, and used to write and modify the code in resource files. Alternatively, the code editing tool may be a code editing platform or client, such as the Chocolat editing tool on the Mac system, or the Coda or Transmit tools for Web page design.
As an example, the operating system may include versions of Microsoft Windows, Apple macOS, and/or Linux, various commercial or UNIX-like operating systems (including but not limited to the various GNU/Linux operating systems, Google Chrome OS, etc.), and/or mobile operating systems such as iOS, Android, Windows Phone, and other operating systems.
In an embodiment, the resource acquisition module 11A is configured to acquire a file to be tested and a template file corresponding to the file to be tested.
In some embodiments, the file to be tested may be a source file and a header file written by a development engineer. The header file stores declarations of data classes (including declarations of class members and of functions in the data class), function prototypes, input-parameter information, return-value information, macro commands (#define), constants, and the like; the source file stores the concrete code of the functions declared in the header file together with code data information (including data types such as preset data types, constructed types, and structure types), and the like.
In some embodiments, the data formats of the source file and the header file may be converted by a code format converter or a code conversion model, for example into JSON array format, Java object format, HTML array format, MySQL array format, or XML object format. The data structures of the source file and header file may be, for example, "Python", "JavaScript", or "C++"; their values can be obtained through the code index and stored in a corresponding array or object for temporary storage.
In an embodiment, the case generation module 12A is configured to generate a test case from the acquired file to be tested and its corresponding template file, and to package the test case.
In some embodiments, the case generation module 12A may extract the functions to be tested in the file to be tested and their annotation information, determine multiple test values for multiple input parameters of the functions from the functions and the annotation information, and combine and package the functions and the test values through the template file corresponding to the file to be tested to generate the test case of the file.
In some embodiments, the case generation module 12A may serve various downstream code editing tools or code packaging tools to generate or package test cases. For example, test cases may be generated automatically, or converted into a data format, by a code editing tool without any manual coding. The test cases may be executed or compiled as an executable program run by one or more processors, or executed by an interpreted program such as a web browser, to carry out the case testing process.
In an embodiment, the case testing module 13A is configured to test the file to be tested based on the obtained test case to obtain at least one test return value corresponding to the file, and to match the obtained test return values against at least one pre-generated expected return value to obtain a test result of the test case corresponding to the file to be tested.
The obtained test return values correspond one-to-one to the expected return values. For example, if the expected return value of test case A is A1 and the test return value corresponding to A1 is A2, A1 and A2 are compared to obtain the test result of A, and whether the actual output of A meets expectations is then judged from that result.
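For illustration only, this one-to-one matching can be sketched as follows (a minimal Python sketch; the function, case ids, and values are hypothetical, not the patent's interfaces):

```python
# Minimal sketch of the case testing module's comparison step: each test
# return value is matched one-to-one against its expected return value.
# All names and values here are illustrative assumptions.

def match_results(expected: dict, actual: dict) -> dict:
    """Return PASS/FAIL per test case id."""
    results = {}
    for case_id, expected_value in expected.items():
        actual_value = actual.get(case_id)
        results[case_id] = "PASS" if actual_value == expected_value else "FAIL"
    return results

# Mirroring the text: expected return value A1 of case A vs. test return A2.
expected = {"A": 42}   # A1: expected return value
actual = {"A": 42}     # A2: value actually returned by the test run
print(match_results(expected, actual))  # {'A': 'PASS'}
```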
In some embodiments, the case testing module 13A may be a case testing program built by a development engineer according to the type, structure, relationships, and the like of the test cases. Optionally, the case testing module 13A may also be a case testing platform or client, such as a performance testing tool for C or C++ programs (e.g., google-perftools) or a performance testing tool for client/server architectures (e.g., LoadRunner).
In some embodiments, the case testing module 13A may be a case testing platform applied to a customer information system (Customer Integrated System, CIS), a digital customer service system capable of providing various services to customers in the form of a portal.
In some embodiments, the case testing platform may run on the Linux (GNU/Linux) system, the Mac (Macintosh) system, the Microsoft system, or the like. The operating system may be used for C-language development, QT (application development framework) interface editing, and application-layer use of the case testing module 13A; test cases may also be programmed with the classic combination GCC (GNU Compiler Collection) + Make/Makefile + GDB (GNU Project Debugger) + Valgrind (memory analysis tool) + a Vim/Emacs/gedit text editor. Common data structures and algorithms can be packaged during C-language development, and QT's interface library can be applied to secondary development of test cases.
In some embodiments, the case testing platform includes a foreground portal for providing one-to-one featured services to users and a plurality of background business systems; extending case testing to the client side lets users process test cases anywhere. Optionally, the foreground portal of the platform is a user interface for presenting and using the platform's featured services, for example for a user to select and read a test case, or to display a case testing result. The background business systems of the platform are servers, databases, and the like, used to process and store the platform's application services; for example, a background business system modifies and regenerates the test cases in the interface and stores the modified, regenerated test cases in the database. Optionally, the user may also input corresponding code data or control parameters to the platform through an input device to invoke the platform's featured services and display them in the user interface.
Referring to fig. 2, fig. 2 is a schematic flowchart of an embodiment of the test case generation method provided in the present application. The method is applied to and executed by the electronic device of the above embodiment, and may specifically include the following steps:
Step 11: acquire the file to be tested.
In one embodiment, the file to be tested may be a source file and a header file written by a development engineer. The header file stores declarations of the functions in the file to be tested (including declarations of data-class members and of functions in the data class), function prototypes, input-parameter information, return-value information, macro commands (#define), constants, and the like; the source file stores the concrete code of the functions declared in the header file together with code data information (including data types such as preset data types, constructed types, and structure types), and the like.
In an embodiment, the electronic device may obtain the file to be tested by connecting to and reading the source file and header file stored on the case testing platform or at a third party (such as a database or a cloud server).
In some embodiments, the data formats of the obtained source file and header file may be converted by a code format converter or a code conversion model, for example into JSON array format, Java object format, HTML array format, MySQL array format, or XML object format. The data structures of the source file and header file may be, for example, "Python", "JavaScript", or "C++"; their values can be obtained through the code index and stored in a corresponding array or object for temporary storage.
Step 12: extract the function to be tested and its annotation information from the file to be tested.
In an embodiment, the electronic device extracts the function to be tested from the file to be tested based on a static code analysis method, and then extracts the corresponding annotation information from the extracted function. The static code analysis may be executed by a static analysis model and comprises two parts: regular-expression matching over the header file and the source file, and generation of an analysis report for the function to be tested, the report containing the generated annotation information.
In some embodiments, the annotation information may have been written manually into the header file and the source file by the development engineer when writing the function to be tested, and the electronic device extracts this manually written annotation information based on the extracted function.
In some embodiments, the extracted functions to be tested include basic-type functions, enumeration types, void types, derived types, and the like; the extracted annotation information of a function includes the input-parameter data defined by the function, the input-parameter data types, the return-value type, and the like.
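As an illustration only, the extracted function and its annotation information might be held in a structure like the following (a Python sketch; the field names are assumptions mirroring the items listed above, not the patent's data model):

```python
# One possible in-memory shape for what step 12 extracts. The field names
# are assumptions mirroring the annotation items described in the text
# (input-parameter data, input-parameter data types, return-value type).
from dataclasses import dataclass, field

@dataclass
class ParamAnnotation:
    name: str          # input-parameter name
    type_name: str     # input-parameter data type, e.g. "int"
    meaning: str = ""  # input-parameter meaning taken from the comment

@dataclass
class FunctionUnderTest:
    name: str
    return_type: str
    params: list = field(default_factory=list)

f = FunctionUnderTest(
    name="clamp_gap",
    return_type="int",
    params=[ParamAnnotation("c_gap", "int", "gap value, 100 < c_gap <= 200")],
)
print(f.name, [p.type_name for p in f.params])  # clamp_gap ['int']
```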
Step 13: determine at least one input-parameter test value of the function to be tested based on the function and the annotation information.
In one embodiment, the electronic device determines a test value for at least one input parameter of the function to be tested based on at least one of the input-parameter data, the input-parameter data types, and the return-value type in the function and its annotation information. The file to be tested may contain at least one function to be tested, each function may contain at least one item of input-parameter data, and each item of input-parameter data may have at least one test value.
In some embodiments, the at least one input-parameter test value may also be determined and/or modified manually by the development engineer based on the obtained function and annotation information.
Step 14: combine and package the function to be tested and the test values based on the template file corresponding to the file to be tested, to generate the test case of the file to be tested.
In an embodiment, the electronic device or a third party (e.g., a database or a cloud storage center) stores template files corresponding to various kinds of files to be tested. After obtaining the file to be tested, the electronic device first identifies its type and then obtains the corresponding template file.
In an embodiment, the electronic device packages the function to be tested with each of its test values individually according to the requirements of the corresponding template file; that is, each test value is packaged separately with the corresponding function and template file to obtain one unit of test data, and the units of test data are then combined and packaged to generate the test case.
In an implementation scenario, the electronic device connects to a cloud storage platform and reads the source file and header file from the platform to obtain the file to be tested. The electronic device first matches all functions to be tested in the file using regular expressions as part of static code analysis, and then obtains a static analysis report of the functions, i.e., the corresponding annotation information, through a static analysis model. Based on the functions and the annotation information, the electronic device determines a first test value A2 and a second test value A3 for input parameter A1 of function A, and test values B2 and B4 for the first input parameter B1 and the second input parameter B3 of function B. Finally, the electronic device extracts the corresponding template file from its storage medium according to the type of the file to be tested, and combines and packages (A, A1, A2), (A, A1, A3), (B, B1, B2), (B, B3, B4) according to the requirements of the template file to generate the test case corresponding to the file to be tested.
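A minimal sketch of this combining-and-packaging step, using the scenario's tuples (the template text below is an invented stand-in, not the patent's actual template file):

```python
# Sketch of step 14 for the scenario above: each (function, parameter,
# test value) tuple is filled into a template to yield one unit of test
# data, and the units are then combined into a single test case.
# Template text and tuple contents are illustrative assumptions.

TEMPLATE = "TEST({func}_{param}_{idx}) {{ EXPECT_OK({func}({value})); }}"

tuples = [
    ("A", "A1", "A2"), ("A", "A1", "A3"),   # function A, param A1, two values
    ("B", "B1", "B2"), ("B", "B3", "B4"),   # function B, params B1 and B3
]

units = [
    TEMPLATE.format(func=func, param=param, idx=i, value=value)
    for i, (func, param, value) in enumerate(tuples)
]
test_case = "\n".join(units)   # combined packaging into one test case
print(test_case)
```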
According to the above scheme, the electronic device uses the annotation information of the function to be tested to assign test values to at least one of its input parameters, which reduces omitted functions and incomplete parameter combinations, thereby lowering the risk of incomplete case coverage, reducing the cost of manual assignment, and streamlining case generation.
Those skilled in the art will understand that in the above method the order in which the steps are written does not imply a strict execution order or impose any limitation on the implementation; the execution order of the steps should be determined by their function and possible inherent logic.
In a specific embodiment, the process of generating the test case further includes constructing a regular expression to match the file to be tested and determining multiple input-parameter test values from the description information of the input-parameter data; the test case of the file to be tested is then obtained based on the template file and the test values.
Referring to fig. 3, fig. 3 is a schematic flowchart of an embodiment of extracting a function to be tested and its annotation information according to the present application. Specifically, step 12 in the above embodiment may further include the following steps:
Step 121: match the file to be tested against a preset regular expression to obtain the function to be tested.
In an embodiment, the electronic device builds a regular expression from the data types of preset functions and the keywords of the functions, and matches the header file and the source file of the file to be tested against it to obtain the corresponding functions to be tested.
In some embodiments, the header file of the file to be tested is source code in ".h" format, and the source file is source code in ".c" format for a C program or ".cpp" format for a C++ program. After reading the header and source files in the corresponding formats, the electronic device parses them with a text parser into text strings; the text strings corresponding to the header and source files are then stored in the device's storage medium for temporary keeping, or read into variables for the next extraction operation.
In some embodiments, the data types of the preset functions include preset data types (e.g., the basic arithmetic data types Boolean, byte, char, string, int, long, float, and double); enumeration types (e.g., variables in arithmetic data defined so that only certain discrete integer values can be assigned to them in a program); the void type ("void" is a type specifier indicating that no value is available); derived types (including pointer, array, structure, and union types); and multi-level constructed types.
In some embodiments, the keywords of the preset functions include at least one of "metacharacters" with special meanings and ordinary "text characters". The metacharacters may include "[abc]", matching "a", "b", or "c"; "[0-9]", matching any single digit from 0 to 9, equivalent to [0123456789]; "[\u4e00-\u9fa5]", matching any Chinese character; "[^a1<]", matching any character other than "a", "1", and "<"; and "[^a-z]", matching any character other than a lowercase letter. The text characters may include at least one of common character-range abbreviations, the decimal point, escape characters, and quantifiers.
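As a hedged illustration of step 121, the following Python sketch matches C-style function declarations with one possible regular expression (the pattern and the sample header text are assumptions, not the patent's actual expression):

```python
# Illustrative sketch: one possible regular expression for matching
# C-style function declarations in header/source text. The pattern and
# sample input are assumptions chosen to mirror the text above.
import re

FUNC_DECL = re.compile(
    r"^\s*(?:int|long|float|double|char|void|bool)\s+"  # preset data types
    r"([A-Za-z_]\w*)\s*"                                # function name
    r"\(([^)]*)\)",                                     # parameter list
    re.MULTILINE,
)

header_text = """
/* gap value, 100 < c_gap <= 200 */
int clamp_gap(int c_gap);
void reset(void);
"""

for name, params in FUNC_DECL.findall(header_text):
    print(name, "->", params)   # clamp_gap -> int c_gap ; reset -> void
```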
Step 122: perform static analysis on the function to be tested to obtain its annotation information.
In an embodiment, the electronic device may analyze the function to be tested through the static analysis model, generate a static analysis report (i.e., a function information table), and extract the corresponding annotation information from the report. Alternatively, the development engineer annotates the header file and the source file manually when writing the function, and the electronic device extracts this manually written annotation information when statically analyzing the function.
In some embodiments, the annotation information extracted for the function to be tested includes function annotation information and function input-parameter annotation information. The function annotation information includes the data types defined by the function (including preset data types and constructed data types, where the constructed data types include data-class and structure types), the return-value type, identifiers, call relationships, the function's purpose, and the like; the function input-parameter annotation information includes the input-parameter data, input-parameter data types, input-parameter meanings, and the like of the function.
In some embodiments, the data types defined by the function to be tested include preset data types and constructed data types, where the constructed data types include data-class (or data-group) types and structure types.
In some embodiments, the static analysis model may scan functions to be tested written in C, C++, Java, or JNI languages based on a SAST tool to obtain the program source code of the function, namely bytecode (Byte Code) and native code (Native Code). The bytecode and native code pass through a lexical analyzer (Lexer) to yield tokens (Token) of different types, and an abstract syntax tree (AST) of the function is obtained after analysis and syntax checking by a parser (Parser). Finally, the static analysis model performs semantic or undefined-behavior analysis on the abstract syntax tree based on its intermediate representation (IR) tooling, and outputs the various annotation information of the function in combination with internally predefined rules or user-defined rules.
In some embodiments, the abstract syntax tree is a tree representation of the program source code structure: the root node represents the whole function program, the internal nodes are abstract syntax structures or tokens, and the elements of the input source code correspond one-to-one to the syntax elements in the AST. The IR can be obtained by converting the AST after type checking and normalization; it may use a control flow graph (CFG) to represent the control flow of the function program and static single assignment (SSA) form to represent the use-definition chains (Use-Def Chains) of the data in the program.
In an implementation scenario, the electronic device builds the regular expression "^[0-9]+abc$" from the data types and keywords of preset functions, where "^" marks the start position of the matched input string; "[0-9]+" matches multiple digits, i.e., "[0-9]" matches a single digit and "+" matches one or more of them; and "abc$" matches the letters "abc" at the end of the string, with "$" marking the end position of the matched input string. The electronic device matches the file to be tested against this regular expression to obtain the function to be tested. Further, the electronic device may, through the static analysis model, successively apply the SAST tool, the lexical analyzer (Lexer), the parser (Parser), and the intermediate representation (IR) tooling to the function, generate its static analysis report (i.e., function information table), and obtain from the report the function's annotation information, which includes function annotation information and function input-parameter annotation information.
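The lexer, parser, AST, extraction flow described above can be shown in miniature with Python's own ast module, purely as an analogue of the SAST pipeline (the patent itself targets C/C++/Java/JNI code, so this sketch only mirrors the structure of the flow):

```python
# Miniature analogue of the static analysis pipeline: lex and parse the
# source into an AST, then read "annotation information" (name, params,
# doc comment) out of it. Python's ast module stands in for the SAST
# toolchain; the sample function is invented.
import ast

source = '''
def clamp_gap(c_gap: int) -> int:
    """gap value, 100 < c_gap <= 200"""
    return min(max(c_gap, 101), 200)
'''

tree = ast.parse(source)                      # lexing + parsing -> AST
for node in ast.walk(tree):
    if isinstance(node, ast.FunctionDef):
        print("function:", node.name)
        print("params:", [a.arg for a in node.args.args])
        print("annotation info:", ast.get_docstring(node))
```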
According to the above scheme, the electronic device performs static code analysis on the function to be tested in combination with the preset regular expression to obtain its annotation information, which enhances the selectivity of functions under test, lowers the cost of manually analyzing and annotating them, and streamlines case generation.
Referring to fig. 4, fig. 4 is a flowchart illustrating an embodiment of determining an input-parameter test value in the present application. Specifically, step 13 in the above embodiment may further include the following steps:
Step 131: determine the legal value range of at least one input-parameter test value based on the input-parameter annotation information.
In an embodiment, the electronic device may use a reinforcement learning model to determine the legal value range of at least one input parameter of the function to be tested from the input-parameter data, input-parameter data types, and input-parameter meanings in the function's input-parameter annotation information.
In some embodiments, the reinforcement learning model learns the cumulative feedback value (i.e., cumulative input-parameter value) obtained after the current function takes different input-parameter data, data types, and meanings, and thereby obtains, for each initial state value (i.e., initial input-parameter value), the optimal feedback value range under the different action strategies (i.e., the input-parameter annotation information).
As an example, the reinforcement learning model sets the initial input-parameter values of all input-parameter data under each input-parameter data type as s_p = (x_1, x_2, ..., x_N), where s_p is the state of the input-parameter type, x_i ∈ [l_i, h_i] is the initial value of the i-th item of input-parameter data, and there are N (N a natural number) initial values to be learned in total. Over different input-parameter annotation information the model adopts the cumulative value r_t = score(s_{t+1}) - score(s_t), where the step size of each parameter adjustment is set to 1, i.e., a = (±1, ±1, ..., ±1), and score is the comprehensive optimal feedback value range determined by engineers under the current value setting.
For example, for an input parameter of int data type, the legal value range may be 100 < c_gap <= 200; for an input parameter of floating-point data type, the legal value range may be a < c_gap <= b with a = m × n^e and b = m × n^f, where f > e, m is an integer, and n is a decimal; for an input parameter of Boolean data type, the legal values may be True and False.
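A schematic sketch of such legal value ranges follows (the range class is an illustrative assumption; the int bounds are taken from the example above):

```python
# Schematic representation of a legal value range with an open or closed
# lower bound, as in "100 < c_gap <= 200". The class itself is an
# illustrative assumption, not part of the patent.
from dataclasses import dataclass

@dataclass
class LegalRange:
    low: float
    high: float
    low_open: bool = False   # True means low < x; False means low <= x

    def contains(self, x: float) -> bool:
        above = x > self.low if self.low_open else x >= self.low
        return above and x <= self.high

int_gap = LegalRange(100, 200, low_open=True)   # 100 < c_gap <= 200
print(int_gap.contains(100), int_gap.contains(150), int_gap.contains(200))
# False True True
```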
Step 132: determine the at least one input-parameter test value based on the legal value range.
In an embodiment, at least one test value of an input parameter of the function may be determined through an input-parameter value model based on the legal value range of the corresponding function. Alternatively, the development engineer may determine the test values of the input parameters from the legal value range of the function; each input parameter has at least one test value.
In some embodiments, the electronic device may input the input-parameter data type, input-parameter meaning, and legal value range of each input parameter of the function to be tested into a preset input-parameter value model (e.g., an Attention-based RNN, LSTM, or similar neural network) to output at least one test value corresponding to that input parameter.
After the input-parameter value model outputs the test values corresponding to the input parameters, developers can manually correct the test values at the corresponding positions of the function to be tested via the user interface of the electronic device, to enhance the validity of the test values.
In an implementation scenario, the electronic device feeds the input-parameter data, input-parameter data types, and input-parameter meanings in all input-parameter annotation information of the function to be tested into the reinforcement learning model, which outputs the legal value range [n^e, n^f] (with f > e) of input parameter A of the function and the legal value range [0, 100] of input parameter B. The electronic device then feeds the legal value ranges of input parameters A and B into the Attention-based RNN input-parameter value model for value prediction, determining the test value of input parameter A as n^t with e < t < f, and the test value of input parameter B as 35.
According to the above scheme, the electronic device uses the neural network model to determine the legal value range of each input parameter from the input-parameter annotation information, and then determines at least one test value per input parameter from that range, which strengthens the validity of the test values, lowers the cost of manual value selection, and streamlines case generation.
In another embodiment, the electronic device may determine at least one input-parameter test value at one or more positions among the inside, the boundary, and the outside of the legal value range according to the function annotation information, where the number of test values at each position is greater than or equal to one.
As an example, if the legal value range of an input-parameter test value is [a, b] and e < a < c < b < f, then "c" may be a first test value inside the legal value range, "a" and "b" may be second and third test values on its boundary, and "e" and "f" may be fourth and fifth test values outside it.
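A minimal sketch of this inside/boundary/outside selection (the choice of the midpoint and the offset of 1 are arbitrary assumptions, not the patent's rule):

```python
# Sketch of the inside/boundary/outside rule above: for a legal value
# range [a, b], pick one value inside, the two boundary values, and one
# value on each side outside the range.

def candidate_values(a: int, b: int, step: int = 1) -> dict:
    return {
        "inside":   [(a + b) // 2],       # c with a < c < b
        "boundary": [a, b],               # the second and third test values
        "outside":  [a - step, b + step], # e and f beyond the range
    }

print(candidate_values(0, 100))
# {'inside': [50], 'boundary': [0, 100], 'outside': [-1, 101]}
```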
In an embodiment, the electronic device inputs the function annotation information, such as the data types defined by the function to be tested, the return-value type, identifiers, call relationships, and purpose, together with the legal value range of each input-parameter test value, into the input-parameter value model, which outputs a test value for each input parameter of the function at one or more positions among the inside, the boundary, and the outside of the legal value range. The data types defined by the function include preset data types and constructed data types, where the constructed data types include data-class and structure types.
In an embodiment, the preset data types may include at least one of the arithmetic types Boolean, byte, char, string, int, long, float, and double.
After the input-parameter value model outputs all test values corresponding to each input parameter, developers can manually correct the test values at the corresponding positions of the function to be tested via the user interface of the electronic device, to enhance the validity of the test values.
In an implementation scenario, the data type defined by the function to be tested is a preset data type; the legal value range of input parameter A of the function is the float range [n^e, n^f] with f > e, the legal value range of input parameter B is the int range [0, 100], and the input-parameter value model is an Attention-based LSTM network. The electronic device inputs the data types, return-value type, identifiers, call relationships, and purpose defined by the function into the input-parameter value model for analysis, which outputs test value A1 = n^e and test value A2 = n^f for input parameter A, and test values B1 = -20, B2 = 0, B3 = 50, B4 = 100, and B5 = 120 for input parameter B.
According to the above scheme, the electronic device determines at least one test value per input parameter through the neural network model, based on the function annotation information, at positions inside, on the boundary of, and outside the legal value range, which strengthens the validity of the test values, lowers the cost of manual value selection, and streamlines case generation.
In another embodiment, in response to the data type of the function to be tested being a preset data type, the electronic device sets, through the input-parameter value model, at least one test value for each input parameter at one or more positions among the inside, the boundary, and the outside of the legal value range. After the model outputs all test values corresponding to the input parameters, developers can manually correct them at the corresponding positions of the function via the user interface of the electronic device, to enhance the validity of the test values.
In an embodiment, the preset data types may include at least one of the arithmetic types Boolean, byte, char, string, int, long, float, and double.
In an implementation scenario, the data type defined by the function to be tested is, for example, Boolean among the preset data types; the legal value range of input parameter C of the function is the Boolean range [True, False], and the input-parameter value model is an Attention-based LSTM network. The electronic device inputs the data types, return-value type, identifiers, call relationships, and purpose defined by the function into the model for analysis, which outputs test value C1 = False and test value C2 = True for input parameter C.
According to the above scheme, the electronic device determines at least one test value per input parameter through the neural network model based on the data type of the function, which strengthens the validity of the test values, lowers the cost of manual value selection, and streamlines case generation.
Referring to fig. 5, fig. 5 is a flowchart illustrating another embodiment of determining a test value for an input parameter in the present application. Specifically, step 132 in the above embodiment may further include the following steps:
step a1: and in response to the data type of the function to be tested being the construction data type, converting the data type of the function to be tested into character string data.
In one embodiment, the construction data types may include a data class (or data set) type and a structure type.
In some embodiments, the electronic device converts the function to be tested of the construction data type into character string data of a corresponding format through a code format converter or a code conversion model. The function to be tested of the construction data type can be in any one of a JSON object format, a Java object format, an HTML array format, a MySQL array format, or an XML object format, and the character string data can be in any other one of those formats that differs from the source code format corresponding to the function to be tested.
In some embodiments, the code format converter or code conversion model may utilize the json_decode() function of the PHP (Hypertext Preprocessor) language to convert a function to be tested in source code format into string data of a corresponding format. The json_decode() function can be configured through its basic syntax json_decode($json, $assoc = FALSE, $depth = 512, $options = 0) to convert the original code format.
As an example, $json in the json_decode() function represents the JSON character string to be converted by the function; $assoc is a Boolean variable, and if it is set to TRUE, the returned PHP objects are converted into associative arrays; $depth represents the recursion depth specified by the user (the default specified in the function is 512); and $options represents a bitmask of options used when the JSON object is decoded, including JSON_OBJECT_AS_ARRAY, which has the same effect as setting $assoc to TRUE, as well as JSON_BIGINT_AS_STRING and JSON_THROW_ON_ERROR, which are selected to convert string data of the large integer type (instead of the default float type) or to throw on a decoding error, respectively. The default value of $options is 0.
In an embodiment, the character string data includes at least one type of key-value pair data, the number of key-value pair data of each type is greater than or equal to one, each key-value pair datum corresponds to one input parameter in the function to be tested, and each key-value pair datum includes at least one input-parameter key value.
In one embodiment, the type of the key-value pair data includes at least one of a basic type (e.g., the preset arithmetic data types Boolean, byte, char, string, int, long, float, and double), an enumeration type (e.g., a variable used to define that only certain discrete integer values can be assigned in a program), a void type (i.e., a type specifier indicating that no value is available), a derived data type (including pointer, array, structure, and union types), and a multi-level construction type.
Step a2: and determining at least one entry key value corresponding to each key value pair data as a test value of entry parameters corresponding to the key value pair data based on the legal value range.
In an embodiment, the electronic device first obtains each key-value pair datum in the character string data corresponding to the function to be tested by using the input-parameter value model, then determines at least one corresponding key value according to the type of each key-value pair datum and based on the legal value range, and takes all the key values of each key-value pair datum as all the test values corresponding to the input parameter. After the input-parameter value model outputs all test values corresponding to the input parameters, developers can manually correct the test values at the positions corresponding to the function to be tested through a user interface of the electronic device, so as to enhance the effectiveness of the test values.
In an implementation scenario, the character string data corresponding to the function to be tested comprises key-value pair data A and key-value pair data B, wherein key-value pair data A is of the int type among the preset data types and key-value pair data B is of the array type among the derived types; the legal value range of input parameter A1 of key-value pair data A is [0, 100]; the legal value range of input parameter B1 of key-value pair data B is ["red", "blue", "green"]; and the input-parameter value model is an LSTM convolutional neural network with Attention. The electronic device first converts the function to be tested of the construction data type into character string data in JSON format through a code format converter, then inputs the JSON character string data into the input-parameter value model to extract key-value pair data A and key-value pair data B. Based on the corresponding legal value ranges [0, 100] and ["red", "blue", "green"], the input-parameter value model outputs, for input parameter A1, a test value A2 of -20, a test value A3 of 0, a test value A4 of 50, a test value A5 of 100, and a test value A6 of 120; and, for input parameter B1, a test value B2 of "red", a test value B3 of "blue", and a test value B4 of "green".
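The key-value pair extraction in this scenario can be sketched as follows, assuming the third-party nlohmann/json C++ library; the variable names and the sample data are illustrative only:

    #include <iostream>
    #include <string>
    #include <nlohmann/json.hpp>

    using json = nlohmann::json;

    int main() {
        // Character string data obtained in step a1 for a construction-type input.
        std::string data = R"({"A1": 50, "B1": ["red", "blue", "green"]})";
        json obj = json::parse(data);

        // Step a2: each key corresponds to one input parameter; its value(s)
        // become candidate test values for that parameter.
        for (auto& [key, value] : obj.items()) {
            if (value.is_array()) {
                for (auto& v : value)                        // enumerated legal values
                    std::cout << key << " <- " << v << '\n';
            } else {
                std::cout << key << " <- " << value << '\n'; // scalar value
            }
        }
    }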
According to the above scheme, the electronic device converts the function to be tested into character string data and, based on the type of the key-value pair data in that character string data, obtains all the test values of the at least one input parameter corresponding to the key-value pair data, which enhances the effectiveness of the input-parameter test values, reduces the cost of manual value assignment, and optimizes the case generation flow.
Referring to fig. 6, fig. 6 is a flow chart illustrating an embodiment of combining and packaging functions to be tested and test values according to the present application. Specifically, step 14 in the above embodiment may further include the following steps:
step 141: and assembling the function to be tested and each test value based on the template file to generate unit test data corresponding to each test value.
In one embodiment, the test case template file is a test script stored in a storage medium of the electronic device or at a third-party organization (e.g., a database or a cloud storage center); the function to be tested and each test value are assembled at corresponding positions in the test script, thereby generating unit test data.
In an embodiment, the electronic device separately assembles the function to be tested and each test value thereof according to the obtained test case template file, thereby generating a set of unit test data. Each unit test data in the group of unit test data corresponds to a test value, and each unit test data can be used as an independent test case.
In an embodiment, the independent assembly may be understood as independently packaging the function to be tested together with each of its test values: the attributes and implementation details of the object (i.e., the function to be tested) are hidden, only the template file interface is exposed externally, and the access level for reading and modifying attributes in the program is controlled. The abstracted data (i.e., the test values) and behavior (i.e., the template file functions) are combined into an organic whole; that is, the data and the source code that operates on the data are organically combined into a class, in which the data and the functions are both members.
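A minimal sketch of such independent assembly, using the gtest framework that a later embodiment names as a possible code test platform; the function clampPercent and the chosen test values are assumptions for illustration:

    #include <gtest/gtest.h>

    // Hypothetical function to be tested: clamps a value into [0, 100].
    int clampPercent(int x) { return x < 0 ? 0 : (x > 100 ? 100 : x); }

    // Data (test values) and behavior (the test body) are combined into one
    // class, as described above; implementation details stay hidden behind it.
    class ClampTest : public ::testing::TestWithParam<int> {};

    TEST_P(ClampTest, StaysInLegalRange) {
        int result = clampPercent(GetParam());
        EXPECT_GE(result, 0);
        EXPECT_LE(result, 100);
    }

    // Each value yields one independent unit of test data, i.e. one
    // independently runnable test case.
    INSTANTIATE_TEST_SUITE_P(Boundary, ClampTest,
                             ::testing::Values(-20, 0, 50, 100, 120));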
In another embodiment, after obtaining the unit test data for each test value, the electronic device further determines an expected return value corresponding to each unit test data based on the annotation information and the at least one input-parameter test value.
In an embodiment, the electronic device inputs the annotation information of the function to be tested and at least one of its input-parameter test values into the prediction model for analysis, so as to output an expected return value corresponding to each test value, thereby obtaining the expected return value corresponding to each unit test data.
In one embodiment, the test value of the function to be tested is the input data of its corresponding unit test data, and the expected return value is the expected output data of that unit test data. The input data is defined as the initial values of the internal data read by the function to be tested; for example, the initial value may be 0 or 100 of int type, or n^e or n^f of float type. The expected output data is defined as the expected result value after the function to be tested performs its "write operation" according to its program design; for example, the expected result value may be a Boolean variable True or False, a discrete integer value of an enumerated type, or one of the various specifiers of the void type.
In an embodiment, the "write operation" executed by the function to be tested according to the programming thereof may include the function to be tested sequentially performing initialization operation and assignment on input data according to the programming thereof, and performing expected assertion on the member variable, thereby obtaining an expected result value.
In some embodiments, the prediction model that outputs the expected return value may be a reinforcement learning model. The model obtains the optimal return-value range for different action strategies (namely, annotation information and test values) under each initial state value (namely, each initial return value) by learning the accumulated feedback value (namely, the accumulated return value) obtained when the current function to be tested adopts different input types, input meanings, input value ranges, identifiers, calling relations, functionality, and test values.
As an example, the reinforcement learning model sets the initial amount of annotation information for each function to be tested as s_p = (x_1, x_2, ..., x_n), where s_p is the annotation-information state, x_i ∈ [l_i, h_i] is the initial amount of the i-th item of annotation information, and the total number of initial amounts to be learned is n ∈ N. The reinforcement learning model takes the accumulated return value for different annotation information and test values as r_t = score(s_{t+1}) - score(s_t), where the step size of each input parameter is set to 1, i.e., a = (±1, ±1, ..., ±1), and score sets a comprehensive optimal return-value range, determined by the engineer, around the current return value; any feasible return value within this range can be considered an expected return value.
In an implementation scenario, the electronic device inputs annotation information A0 of a function to be tested and test values A2 and A3 of input parameter A1 into the prediction model for analysis, so as to obtain an optimal return-value range of [0, 50] for test value A2 and an optimal return-value range of [50, 100] for test value A3. The electronic device therefore uses 25 as the expected return value of test value A2 and 75 as the expected return value of test value A3, thereby obtaining the expected return values 25 and 75 of the unit test data corresponding to test value A2 and test value A3.
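The selection of an expected return value from the optimal range can be sketched as below; the midpoint heuristic and the function name expectedReturnValue are assumptions, since the application leaves the choice of any feasible value within the range to the engineer:

    #include <utility>

    // Given the optimal return-value range output by the prediction model,
    // any feasible value inside it may serve as the expected return value;
    // this sketch deterministically picks the midpoint.
    int expectedReturnValue(std::pair<int, int> optimalRange) {
        return (optimalRange.first + optimalRange.second) / 2;
    }

    // expectedReturnValue({0, 50})   == 25, the expected return of test value A2
    // expectedReturnValue({50, 100}) == 75, the expected return of test value A3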
According to the above scheme, the expected return value of the corresponding unit test data is determined from the annotation information of the function to be tested and at least one input-parameter test value, which enhances the effectiveness of the expected return value, reduces the cost of manual value assignment, and optimizes the case generation flow.
Step 142: and combining and packaging the test data of each unit to generate a test case.
In an embodiment, the electronic device performs combined packaging on at least one unit test data corresponding to each function to be tested (i.e., performs uniform templated packaging on all unit test data), so as to obtain a test case (or a test case group) corresponding to the function to be tested, and adds the test case into a storage medium for storage, or inputs the test case into a case test platform for case test, and obtains a test result.
In one embodiment, the test case includes at least one of the following parameters: the interface verification layer class name; the case method name; the interface information to be called (including the interface packaging layer class name, the request method, and the parameters required for resource creation); the test data layer information (including the data test layer class name, the data provider method name, and whether the test data is transmitted through the test data layer); the interface data variable array; and the expected information (including the expected return result, the expected return code, the verification field, and the test case success condition).
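For illustration, the listed parameters could be grouped into a plain data structure such as the following; all member names are assumptions, as the application does not fix a schema:

    #include <string>
    #include <vector>

    // Sketch of one packaged test case, mirroring the parameter list above.
    struct TestCase {
        std::string verificationClassName;            // interface verification layer class name
        std::string caseMethodName;                   // case method name
        struct {                                      // interface information to be called
            std::string packagingClassName;           //   interface packaging layer class name
            std::string requestMethod;                //   request method
            std::vector<std::string> creationParams;  //   parameters required for resource creation
        } interfaceInfo;
        struct {                                      // test data layer information
            std::string dataLayerClassName;           //   data test layer class name
            std::string dataProviderMethod;           //   data provider method name
            bool viaDataLayer;                        //   whether data passes through the test data layer
        } testDataInfo;
        std::vector<std::string> dataVariables;       // interface data variable array
        struct {                                      // expected information
            std::string expectedResult;               //   return result expectation
            int expectedReturnCode;                   //   return code expectation
            std::string verifiedField;                //   verification field
            std::string successCondition;             //   test case success condition
        } expectation;
    };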
In an implementation scenario, the electronic device separately assembles the function P to be tested with its test value A and its test value B according to the obtained test case template file, thereby obtaining unit test data A1 and unit test data B2. The electronic device inputs the annotation information of the function P to be tested, test value A, and test value B into the prediction model for analysis, so as to obtain an optimal return value of 25 for test value A and an optimal return value of 100 for test value B. Finally, the electronic device packages unit test data A1 and unit test data B2 in combination, thereby obtaining the test case of the function P to be tested.
According to the above scheme, the function to be tested and each test value are independently assembled using the template file to obtain unit test data for each test value, and the unit test data are then combined and packaged to obtain the test case. This enriches the unit test data of the function to be tested, reduces the risk of incomplete case coverage, reduces the cost of manual assignment, and optimizes the case generation flow.
It will be understood by those skilled in the art that in the method of the present invention, the order of writing the steps does not imply a strict order of execution and any limitations on the implementation, and the specific order of execution of the steps should be determined by their function and possible inherent logic.
In a specific embodiment, after the electronic device generates and packages the test case, it further performs a process of testing the test case, so as to obtain the effectiveness rate of the generated and packaged test cases and the correctness rate of the header file and source file corresponding to the file to be tested.
Referring to fig. 7, fig. 7 is a flowchart illustrating an embodiment of a test method for a test case provided in the present application. The method is applied to the electronic device in the above embodiment to be executed by the electronic device, and specifically, the method may include the following steps:
step 21: and acquiring a test case of the file to be tested and at least one expected return value corresponding to the test case.
In an embodiment, the electronic device may read the test cases of the file to be tested and all corresponding expected return values through its own storage medium or by connecting to a third-party organization (e.g., a database, a cloud data storage center, etc.).
In an embodiment, the test case and the expected return value are obtained based on the method for generating the test case in the foregoing embodiment, and are not described here again.
Step 22: and testing the file to be tested based on the test case to obtain at least one test return value corresponding to the file to be tested.
In an embodiment, the electronic device performs a serialized test on each unit test data in the test case based on the code test platform, so as to call the function to be tested and its test value for that unit test data during the platform test and thereby obtain at least one corresponding test return value, wherein each test return value corresponds to a previously prepared expected return value.
Step 23: and matching the at least one test return value with the at least one expected return value to obtain a test result of the file to be tested.
In an embodiment, the code testing platform compares all the obtained testing return values corresponding to the files to be tested with the corresponding expected return values respectively to obtain return value comparison results, and obtains the testing results of the files to be tested based on the distribution conditions of all the comparison results.
In an embodiment, the code test platform compares whether the two return values are the same; if so, the comparison result is a test success, and if not, the comparison result is a test failure. Based on the input-parameter data and test-value types corresponding to the successful comparison results, the distribution of successful tests over input-parameter data and over test-value types is obtained, so as to derive the test result of the file to be tested.
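A hedged sketch of this comparison step; the UnitResult type and its field names are assumptions introduced only for illustration:

    #include <string>
    #include <vector>

    struct UnitResult {
        std::string name;    // e.g. "unit test data A"
        int expectedReturn;  // expected return value, prepared in advance
        int testReturn;      // test return value from the serialized run
    };

    // Compare each test return value with its expected return value; equal
    // values count as a test success, from which the distribution of results
    // over input-parameter data and test-value types can then be derived.
    int countPassed(const std::vector<UnitResult>& results) {
        int passed = 0;
        for (const auto& r : results)
            if (r.testReturn == r.expectedReturn)
                ++passed;
        return passed;
    }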
In an implementation scenario, the electronic device may read the test case of the file to be tested and all its corresponding expected return values from its own storage medium. The test case comprises unit test data A and unit test data B; the expected return value corresponding to unit test data A is A1, and the expected return value corresponding to unit test data B is B1. The electronic device performs a serialized test on unit test data A and unit test data B based on the code test platform, thereby obtaining a test return value A2 corresponding to unit test data A and a test return value B2 corresponding to unit test data B. The electronic device then compares expected return value A1 with test return value A2 and expected return value B1 with test return value B2 through the code test platform, finding that the comparison result of unit test data A is a test failure and that of unit test data B is a test success. Finally, the code test platform derives the distribution of the test results of unit test data A and unit test data B over their input-parameter data and test-value types, so as to obtain the test result of the file to be tested.
According to the scheme, the test case of the file to be tested is tested to obtain the test return value, and then the test return value is matched and compared with the expected return value prepared in advance to obtain the test result, so that the test accuracy of the test case is improved, the cost of manually analyzing data is reduced, and the case generation flow is optimized.
Referring to fig. 8, fig. 8 is a schematic flowchart illustrating a method for generating a test case and testing the test case according to an embodiment of the present disclosure. The method is applied to the electronic device in the above embodiment to be executed by the electronic device, and specifically, the method may include the following steps:
step 31: a header file and a source file of a file to be tested are prepared.
In one embodiment, the header file is used to store declarations of functions in the file to be tested (including declarations of data-class members and of functions within data classes), function prototypes, argument information, return value information, macro commands (#define), constants, and the like; the source file is used to store the concrete code data of the functions already declared in the header file, together with code data information (including data types such as the preset data types, construction types, and structure types), and the like.
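As a minimal illustration of the prepared inputs (the function name, range annotation, and body are assumptions), such a header/source pair might look like:

    // gap.h -- header file: declaration plus annotation information
    #ifndef GAP_H
    #define GAP_H

    // @param c_gap  legal value range: 100 < c_gap <= 200
    // @return       the normalized gap as an int
    int normalizeGap(int c_gap);

    #endif

    // gap.cpp -- source file: concrete code data for the declared function
    #include "gap.h"

    int normalizeGap(int c_gap) {
        return c_gap - 100;
    }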
Step 32: and performing AI deep analysis on the header file and the source file.
Step 33: and extracting all methods in the header file and the source file and annotation information such as parameter data, parameter types, return value types, data classes, structural bodies and the like of all the methods based on the deep analysis result.
In an embodiment, the AI deep analysis includes the electronic device extracting all methods in the header file and the source file based on a code static analysis method, and further extracting the annotation information corresponding to those methods. The code static analysis method can be executed through a static analysis model and comprises two parts: regular-expression matching of the header file and source file, and generation of an analysis report for all methods, where the analysis report includes the generated annotation information.
In some embodiments, the annotation information of all methods can be manually annotated in the header file and the source file by a development engineer when writing the methods, and the electronic device then re-extracts the manually written annotation information on the basis of the extracted methods.
In some embodiments, the data classes or structures of all methods are extracted, including basic-type functions, enumeration types, void types, derived types, and the like.
Step 34: and constructing a test case and a data expected result based on all the methods and the annotation information thereof.
In one embodiment, the electronic device individually assembles each method and its parameters into a test case template file according to the annotation information corresponding to all methods (i.e., functions) to form a set of unit test data. Each unit test data in the group of unit test data corresponds to one input parameter value, and each unit test data can be used as an independent test case.
In an embodiment, the test case template file is a test script stored in a storage medium of the electronic device or in a third-party organization (e.g., a database, a cloud storage center), all methods and input parameters thereof are assembled at corresponding positions in the test script to form unit test data, and the electronic device performs unified template packaging on all the unit test data to obtain test cases of corresponding methods, and adds the test cases into the storage medium for storage, or inputs the test cases into a case test platform for case testing.
In one embodiment, the input parameters of each method take different values; the test case comprises a plurality of permuted and combined unit test data, and each unit test data corresponds to one expected data result.
Step 35: and combining the unit test framework to test the test cases in a serialization mode to obtain a data test result, and comparing the data test result with a data expected result to obtain a unit test result.
In an embodiment, the electronic device performs a serialized test on each unit test data in the test case based on the code test platform, so as to call the method corresponding to the unit test data and its input-parameter value during the platform test and thereby obtain the at least one corresponding test return value. The code test platform can be a gtest testing framework based on the C++ language; each test return value corresponds to a previously prepared expected return value.
In an embodiment, the code testing platform compares all the obtained test return values with the corresponding expected return values respectively to obtain return value comparison results, and obtains the test results of the files to be tested based on the distribution conditions of all the comparison results.
In an embodiment, the code test platform compares whether the two return values are the same; if so, the comparison result is a test success, and if not, the comparison result is a test failure. According to the comparison results, the distribution of successful tests over the input-parameter data and over the input-parameter data types of the methods is obtained, so as to derive the final test results of the header file and the source file.
Referring to fig. 9, fig. 9 is a schematic flowchart illustrating a method for generating and testing a test case according to another embodiment of the present application. The method is applied to the electronic device in the above embodiment to be executed by the electronic device, and specifically, the method may include the following steps:
step 41: reading a source file and a header file into text character strings, and matching preset data types and special keywords in the text character strings by using a regular expression.
In an embodiment, the preset data types include int, char, string, bool, char16_t, char32_t, and the like; the special keywords include typedef, class, union, struct, macro, and the like.
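A sketch of this matching step with C++ std::regex; the pattern is a deliberately simplified assumption and far from a production-grade matcher:

    #include <iostream>
    #include <regex>
    #include <string>

    int main() {
        // Text character string read from a header or source file.
        std::string text = "typedef struct Point { int x; bool ok; } Point;";

        // Preset data types and special keywords as regex alternations.
        std::regex pattern(R"(\b(int|char|bool|char16_t|char32_t|typedef|class|union|struct)\b)");

        for (auto it = std::sregex_iterator(text.begin(), text.end(), pattern);
             it != std::sregex_iterator(); ++it)
            std::cout << "matched: " << it->str() << '\n';  // typedef, struct, int, bool
    }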
In an embodiment, the electronic device may obtain the file to be tested by connecting and reading the use case testing platform or a third party entity (such as a database, a cloud server, and the like).
In some embodiments, the data format of the obtained source file and header file may be converted through a code format converter or a code conversion model, yielding, for example, a JSON array format, a Java object format, an HTML array format, a MySQL array format, or an XML object format. The data structures of the source file and header file can be, for example, "Python", "JavaScript", or "C++" structures; their values can be obtained through the code index and further stored in a corresponding array or object for temporary storage.
Step 42: and respectively matching all the methods, matching the input parameter type and the type of each method, matching the return value type of each method, matching the output class, the structural body and other complex types.
In an embodiment, the electronic device may analyze the matching results through the static analysis model to generate a static analysis report (i.e., a table of all method information), and then extract the annotation information corresponding to each method from the static analysis report. Alternatively, when a development engineer writes the methods, the information is manually annotated in the header files and source files, and the electronic device extracts the manually written annotation information when statically analyzing all the methods.
Step 43: and analyzing the annotation information corresponding to each method, and matching the legal value range of the input parameter data corresponding to each method by using a regular expression.
In an embodiment, the electronic device may analyze annotation information corresponding to each method through a reinforcement learning model to determine a legal value range of the parameter data corresponding to each method.
In some embodiments, the reinforcement learning model obtains the optimal feedback-value range of different action strategies (i.e., input-parameter annotation information) at each initial state value (i.e., initial input-parameter value) by learning the accumulated feedback value (i.e., the accumulated return value) obtained after the current method takes different input-parameter data, input-parameter data types, and input-parameter meanings.
If the input parameter data type is a preset data type in a simple scenario, the input-parameter data of the int type is directly constructed, and the legal value range is generated according to the annotation information; for example, it may be 100 < c_gap <= 200.
And if the input parameter data type is a construction data type in a complex scenario, the method and its annotation information are constructed as data of the JSON character-string type, and the legal value range corresponding to each key-value pair datum is determined, according to the type of the key-value pair data in the character string and the method's annotation information, as the legal value range corresponding to the input-parameter data. That is, the key value corresponding to the key-value pair datum is used as the input-parameter value; the value of each input parameter is the key value of each combined key-value pair data group, and each key value is different.
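For the simple scenario two paragraphs above, extracting such a legal value range from annotation text can be sketched with a regular expression; the annotation format and the capture-group layout are assumptions for this example:

    #include <iostream>
    #include <regex>
    #include <string>

    int main() {
        std::string annotation = "100 < c_gap <= 200";

        // Capture the lower bound, the parameter name, and the upper bound.
        std::regex range(R"((\d+)\s*<=?\s*(\w+)\s*<=?\s*(\d+))");
        std::smatch m;
        if (std::regex_search(annotation, m, range)) {
            int lo = std::stoi(m[1].str());
            int hi = std::stoi(m[3].str());
            // Bound strictness is not captured here; this prints the range
            // for the c_gap example above.
            std::cout << m[2] << " in (" << lo << ", " << hi << "]\n";
        }
    }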
Step 44: and determining the input parameter value of each input parameter data based on the legal value range, and performing combined packaging on each method and the corresponding input parameter value based on the corresponding template file to obtain the test case.
In an embodiment, the electronic device determines at least one parameter value of each parameter data based on a legal parameter value range of the parameter data, and the parameter value may be a legal parameter value or an illegal parameter value, and is used as a positive example or a negative example of the test. And the electronic equipment independently assembles each method and the corresponding input parameter according to the test case template file to form a group of unit test data, and encapsulates all the unit test data to obtain the test case.
After the electronic device has obtained at least one value for each input parameter, a developer can manually correct the values at the data positions corresponding to the method through the user interface of the electronic device, so as to enhance the effectiveness of the test case.
Referring to fig. 10, fig. 10 is a schematic structural diagram of another embodiment of the electronic device provided in the present application, where the electronic device 100 includes a processor 101 and a memory 102 connected to the processor 101, where the memory 102 stores program data, and the processor 101 calls the program data stored in the memory 102 to execute the method for generating the test case or the method for testing the test case.
Optionally, in an embodiment, the processor 101 is configured to execute the program data to implement the following method: acquiring a file to be tested; extracting a function to be tested in the file to be tested and annotation information of the function to be tested; determining at least one input-parameter test value of the function to be tested based on the function to be tested and the annotation information; and combining and packaging the function to be tested and the test values based on the template file corresponding to the file to be tested to generate the test case of the file to be tested.
According to the scheme, the electronic device 100 assigns the test value to at least one entry parameter of the function to be tested by using the annotation information corresponding to the function to be tested, so that the function to be tested and the test value are packaged into the test case in a combined manner, the test values of the function to be tested are enriched, the risk of incomplete case coverage is reduced, the cost of manual assignment is reduced, and the generation flow of the case is optimized.
The processor 101 may also be referred to as a Central Processing Unit (CPU). The processor 101 may be an electronic chip having signal processing capabilities. The processor 101 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. In addition, the processor 101 may be implemented jointly by integrated circuit chips.
The memory 102 may be a memory stick, a TF card, etc., and can store all information in the electronic device 100, including input raw data, computer programs, intermediate operation results, and final operation results. It stores and retrieves information based on the locations specified by the processor 101. With the memory 102, the electronic device 100 has a memory function and can operate normally. By purpose, the memory 102 of the electronic device 100 can be classified into main memory (internal memory) and auxiliary memory (external memory). External memory is usually a magnetic medium, an optical disc, or the like, and can store information for long periods. Internal memory refers to the storage component on the motherboard, which holds the data and programs currently being executed; it is used only for temporary storage, and its contents are lost when the power is turned off.
In the several embodiments provided in the present application, it should be understood that the disclosed methods and apparatus may be implemented in other ways. For example, the embodiment of the electronic device 100 described above is merely illustrative: the division into steps such as determining the at least one legal value range of an input parameter based on the input-parameter annotation information and determining the at least one input-parameter test value based on the legal value range is only one logical division, and there may be other division manners in actual implementation; for example, multiple steps or features may be combined or integrated into another system, or some features may be omitted or not executed.
In addition, functional units (such as a database and a test case platform) in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
Referring to fig. 11, fig. 11 is a schematic structural diagram of an embodiment of a computer-readable storage medium provided in the present application, and the computer-readable storage medium 110 stores therein a program instruction 111 capable of implementing the method for generating the test case or the method for testing the test case.
If the unit into which the functional units of the embodiments of the present application are integrated is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in the computer-readable storage medium 110. Based on such understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product: the computer-readable storage medium 110 includes several program instructions 111 to enable a computer device (which may be a personal computer, a system server, a network device, etc.), an electronic device (for example, an MP3 or MP4 player, an intelligent terminal such as a mobile phone, a tablet computer, or a wearable device, or a desktop computer, etc.), or a processor to execute all or part of the steps of the methods of the embodiments of the present application.
Optionally, in an embodiment, the program instructions 111, when executed by the processor, are configured to implement the following method: acquiring a file to be tested; extracting a function to be tested in a file to be tested and annotation information of the function to be tested; determining at least one test value of the function to be tested, which is entered into the parameter, based on the function to be tested and the annotation information; and combining and packaging the function to be tested and the test value based on the template file corresponding to the file to be tested to generate a test case of the file to be tested.
In the above scheme, the computer-readable storage medium 110 assigns a test value to at least one entry parameter of the function to be tested by using the annotation information corresponding to the function to be tested, so as to package the function to be tested and the test value in a combined manner as a test case, which is beneficial to enriching the test values of the function to be tested, thereby reducing the risk of incomplete case coverage and reducing the cost of manual assignment, and further optimizing the flow of case generation.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-readable storage media 110 (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It is to be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by the computer-readable storage medium 110. These computer-readable storage media 110 may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the program instructions 111, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer-readable storage media 110 may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the program instructions 111 stored in the computer-readable storage media 110 produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer-readable storage media 110 may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the program instructions 111 executing on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one embodiment, these programmable data processing devices include a processor and memory thereon. The processor may also be referred to as a CPU (Central Processing Unit). The processor may be an electronic chip having signal processing capabilities. The processor may also be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory may be a memory stick, TF card, etc. that stores and retrieves information based on the location specified by the processor. The memory is classified into a main memory (internal memory) and an auxiliary memory (external memory) according to the purpose, and also into an external memory and an internal memory. The external memory is usually a magnetic medium, an optical disk, or the like, and can store information for a long period of time. The memory refers to a storage component on the main board, which is used for storing data and programs currently being executed, but is only used for temporarily storing the programs and the data, and the data is lost when the power is turned off or the power is cut off.
The above description is only an embodiment of the present application, and is not intended to limit the scope of the present application, and all equivalent structures or equivalent processes performed according to the contents of the specification and the drawings, or applied directly or indirectly to other related technical fields, are all included in the scope of the present application.
Claims (11)
1. A method for generating a test case is characterized by comprising the following steps:
acquiring a file to be tested;
extracting a function to be tested in the file to be tested and annotation information of the function to be tested;
determining a test value of at least one parameter of the function to be tested based on the function to be tested and the annotation information;
and combining and packaging the function to be tested and the test value based on the template file corresponding to the file to be tested, and generating a test case of the file to be tested.
2. The method of claim 1, wherein the annotation information comprises input-parameter annotation information;
the determining a test value of at least one input parameter of the function to be tested based on the function to be tested and the annotation information includes:
determining a legal value range of the at least one input parameter based on the input-parameter annotation information;
and determining the at least one input-parameter test value based on the legal value range.
3. The method of claim 2, wherein the input-parameter annotation information comprises function annotation information;
the determining the at least one input-parameter test value based on the legal value range includes:
determining, according to the function annotation information, the at least one input-parameter test value at one or more of the inside, the boundary, and the outside of the legal value range.
4. The method according to claim 2 or 3,
the determining the at least one input-parameter test value based on the legal value range includes:
in response to the data type of the function to be tested being a preset data type, setting the at least one test value of the at least one input parameter at one or more of the inside, the boundary, and the outside of the legal value range; wherein each input parameter comprises at least one test value.
5. The method according to any one of claims 2 to 4,
the determining the at least one input-parameter test value based on the legal value range includes:
in response to the data type of the function to be tested being a construction data type, converting the function to be tested into character string data; wherein the character string data comprises at least one type of key-value pair data, the number of key-value pair data of each type is greater than or equal to one, each key-value pair datum corresponds to one input parameter in the function to be tested, and each key-value pair datum comprises at least one key value;
and determining, based on the legal value range, at least one key value corresponding to each key-value pair datum as the test value of the input parameter corresponding to that key-value pair datum.
6. The method according to any one of claims 1-5, wherein the extracting the function to be tested in the file to be tested and the annotation information of the function to be tested comprises:
matching the files to be tested based on a preset regular expression to obtain the functions to be tested in the files to be tested;
and carrying out static analysis on the function to be tested to obtain the annotation information of the function to be tested.
7. The method according to any one of claims 1 to 6, wherein the generating a test case of the file to be tested by performing combined packaging on the function to be tested and the test value based on the template file corresponding to the file to be tested comprises:
assembling the function to be tested and each test value based on the template file to generate unit test data corresponding to each test value;
and combining and packaging the unit test data to generate the test case.
8. The method of claim 7, further comprising:
based on the annotation information and the at least one entered test value, an expected return value for each of the unit test data is determined.
9. A test method of a test case is characterized in that the method comprises the following steps:
acquiring a test case of a file to be tested and at least one expected return value corresponding to the test case; the test case is obtained based on the test case generation method of any one of claims 1 to 8;
testing the file to be tested based on the test case to obtain at least one test return value corresponding to the file to be tested; wherein the at least one test return value corresponds to the at least one expected return value;
and matching the at least one test return value with the at least one expected return value to obtain a test result of the file to be tested.
10. An electronic device, comprising a processor and a memory coupled to the processor, wherein the memory stores program data, and the processor retrieves the program data stored in the memory to perform the method for generating a test case according to any one of claims 1 to 8 and/or the method for testing a test case according to claim 9.
11. A computer-readable storage medium having stored therein program instructions, wherein the program instructions are executed to implement the test case generation method according to any one of claims 1 to 8 and/or the test case testing method according to claim 9.