CN114253851A - Test data processing method, device, equipment and storage medium - Google Patents

Test data processing method, device, equipment and storage medium

Info

Publication number
CN114253851A
CN114253851A (application CN202111567467.0A / CN202111567467A)
Authority
CN
China
Prior art keywords
target
result
test data
analysis
program
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111567467.0A
Other languages
Chinese (zh)
Inventor
尹智伟
齐蓉
钟玉兴
林浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd ICBC filed Critical Industrial and Commercial Bank of China Ltd ICBC
Priority to CN202111567467.0A priority Critical patent/CN114253851A/en
Publication of CN114253851A publication Critical patent/CN114253851A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3692Test management for test results analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention discloses a test data processing method, device, equipment and storage medium. The method comprises: analyzing a program to be tested with a target analysis method to obtain an analysis result, where the target analysis method includes at least one of a static analysis method, a keyword analysis method and an association analysis method; generating target test data based on the analysis result; and operating the target test data to obtain a target operation result. The invention solves the technical problems of heavy test workload and low test efficiency caused by big data testing having to be performed manually in the prior art.

Description

Test data processing method, device, equipment and storage medium
Technical Field
The invention relates to the technical field of big data, in particular to the technical field of big data analysis and test processing, and specifically relates to a test data processing method, a test data processing device, test data processing equipment and a storage medium.
Background
A big data system runs a large volume of data, or data services, as its main line, and places high requirements on data quality and efficiency, which are the key points and difficulties of big data testing. For example, apart from syntax errors in data quality testing, which can be located from logs, most other problems require testers to manually find the parts with data omissions and faulty exception handling through analysis, statistics and list comparison of the source data before the specific cause can be located. This consumes a large amount of manpower and time, raises the testing threshold, and these problems run through the whole test life cycle.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the invention provide a test data processing method, device, equipment and storage medium, to at least solve the technical problems of heavy test workload and low test efficiency caused by big data testing having to be performed manually in the prior art.
According to an aspect of the embodiments of the present invention, a test data processing method is provided, which includes analyzing a program to be tested by using a target analysis method to obtain an analysis result, where the target analysis method includes at least one of: a static analysis method, a keyword analysis method, an association analysis method; generating target test data based on the analysis result; and operating the target test data to obtain a target operation result.
Optionally, analyzing the program to be tested by using the static analysis method to obtain the analysis result, including: detecting whether the program to be tested has a compiling error; and/or judging whether grammar errors exist in the program to be tested through a target grammar rule to obtain the analysis result.
Optionally, analyzing the program to be tested by using the keyword analysis method to obtain the analysis result, including: generating an initial test case according to an initial parameter combination in the program to be tested, wherein the initial parameter combination at least comprises: presetting keywords and a plurality of test vectors corresponding to the preset keywords; combining a plurality of keywords and a plurality of test vectors to obtain a target parameter combination; and taking the equivalent test case generated based on the target parameter combination as the analysis result.
Optionally, analyzing the program to be tested by using the incidence relation analysis method to obtain the analysis result, including: scanning a code field in the program to be tested to obtain a scanning result; and determining the dependency relationship between the original table and the corresponding result table in the code field according to the scanning result, and taking the dependency relationship as the analysis result.
Optionally, generating target test data based on the analysis result includes: optimizing the analysis result according to a preset data format to obtain an optimized test result; and generating the target test data based on the optimized test result.
Optionally, the method further includes: presetting a call-up flow according to the relationship between the test cases in the target test data, where the call-up flow includes serial invocation and parallel invocation; and/or presetting the start time of the target test data.
Optionally, after obtaining the target operation result, the method further includes: comparing the target operation result with a preset operation result to obtain a comparison result; and displaying the comparison result.
According to another aspect of the embodiments of the present invention, there is also provided a test data processing apparatus, including: the analysis module is used for analyzing the program to be tested by adopting a target analysis method to obtain an analysis result, wherein the target analysis method comprises at least one of the following steps: a static analysis method, a keyword analysis method, an association analysis method; the generating module is used for generating target test data based on the analysis result; and the operation module is used for operating the target test data to obtain a target operation result.
According to another aspect of the embodiments of the present invention, there is also provided a non-volatile storage medium storing a plurality of instructions, the instructions being adapted to be loaded by a processor and to perform any one of the above-mentioned test data processing methods.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including a memory and a processor, where the memory stores a computer program, and the processor is configured to execute the computer program to perform any one of the above test data processing methods.
In the embodiment of the present invention, a test data processing manner is adopted, and an analysis result is obtained by analyzing a program to be tested by using a target analysis method, wherein the target analysis method includes at least one of the following steps: a static analysis method, a keyword analysis method, an association analysis method; generating target test data based on the analysis result; the target test data are operated to obtain a target operation result, and the purpose of automatically testing the program to be tested is achieved, so that the technical effects of effectively reducing the data test workload and improving the data test efficiency are achieved, and the technical problems of large test workload and low test efficiency caused by the fact that manual large data test is needed in the prior art are solved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a flow chart of a method of test data processing according to an embodiment of the invention;
FIG. 2 is a flow diagram of an alternative test data processing method according to an embodiment of the invention;
FIG. 3 is a diagram illustrating a correspondence between an optional original table and a result table according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a test data processing apparatus according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an alternative test data processing apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
A big data system runs a large volume of data, or data services, as its main line, and places high requirements on data quality and efficiency, which are the key points and difficulties of big data testing. Big data testing mainly adopts a grey-box test method: on one hand it pays attention to the final result of the code, to verify the correctness of indexes; on the other hand it reads the source code to verify the internal logic processing, while also paying attention to code performance, such as time consumption under different data volumes, common code optimizations, and so on. Apart from syntax errors in data quality testing, which can be located through logs, most other problems require finding the parts with data omissions and faulty exception handling through analysis, statistics and list comparison of the source data, so as to locate the specific cause.
In the prior-art automated big data testing method, the pre-test stage must verify data resources from various sources, to ensure that correct data is loaded into the system and that the source data can be extracted correctly; during data analysis and calculation, business logic verification is carried out on each node to ensure that data subject aggregation is implemented or that isolation rules behave normally; in the output verification stage, data integrity is checked and the data must load successfully into the target system. These test processes involve a large amount of structured and unstructured data, a specific test method is needed for such massive data, and problems such as the large amount of manpower and time consumed by functional and non-functional testing and the high testing threshold run through the whole test life cycle.
In view of the foregoing, it should be noted that the steps illustrated in the flowchart of the accompanying figures may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than presented herein.
Fig. 1 is a flowchart of a test data processing method according to an embodiment of the present invention, as shown in fig. 1, the method including the steps of:
step S102, analyzing the program to be tested by adopting a target analysis method to obtain an analysis result, wherein the target analysis method comprises at least one of the following steps: a static analysis method, a keyword analysis method, an association analysis method;
step S104, generating target test data based on the analysis result;
and step S106, operating the target test data to obtain a target operation result.
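As a minimal illustrative sketch of steps S102 to S106 (all function names here are hypothetical, since the patent does not publish source code), the three-step flow can be expressed as:

```python
# Minimal sketch of steps S102-S106; every name is hypothetical.

def analyze(program, methods):
    """S102: apply each selected target analysis method to the program."""
    return {name: method(program) for name, method in methods.items()}

def generate_test_data(analysis):
    """S104: derive target test data from the analysis result (placeholder logic)."""
    return [f"case derived from {name} analysis" for name in analysis]

def run(test_data):
    """S106: operate the target test data and collect operation results."""
    return [(case, "ok") for case in test_data]

analysis = analyze("SELECT * FROM src_table", {"static": lambda p: "no errors"})
cases = generate_test_data(analysis)
print(run(cases))  # [('case derived from static analysis', 'ok')]
```

The dictionary of methods reflects the "at least one of" wording: any subset of the three analysis methods may be plugged in.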
Optionally, the static analysis method is used to detect whether a compiling error and/or a syntax error exists in the program to be tested; the keyword analysis method is used to expand an initial test case into equivalence-class test cases; the association analysis method is used to determine the dependency relationship between an original table and its corresponding result table in a code field of the program to be tested.
Optionally, the test data is obtained in one of the following ways: manual entry, interface connection, or file import. For example, the program to be tested is entered manually, with keyword association and an initialization structure provided to assist the user in formatting the input script; or the program to be tested is acquired directly through an interface, with the data extracted and collected; or the program to be tested is acquired directly through file import.
In the embodiment of the present invention, a test data processing manner is adopted, and an analysis result is obtained by analyzing a program to be tested by using a target analysis method, wherein the target analysis method includes at least one of the following steps: a static analysis method, a keyword analysis method, an association analysis method; generating target test data based on the analysis result; the target test data are operated to obtain a target operation result, and the purpose of automatically testing the program to be tested is achieved, so that the technical effects of effectively reducing the data test workload and improving the data test efficiency are achieved, and the technical problems of large test workload and low test efficiency caused by the fact that manual large data test is needed in the prior art are solved.
In an optional embodiment, analyzing the program to be tested by using the static analysis method to obtain the analysis result includes:
step S202, detecting whether a compiling error exists in the program to be tested; and/or judging whether grammar errors exist in the program to be tested through a target grammar rule to obtain the analysis result.
Optionally, the static analysis method is used to detect whether a compiling error and/or a syntax error exists in the program to be tested.
It should be noted that a compiling error in a program seriously affects testing efficiency and the accuracy of the test result. Therefore, flexibly configurable rules are used to detect whether a compiling error exists in the program to be tested, so as to improve testing efficiency and result accuracy; alternatively, the target grammar rules are used to judge whether the program to be tested contains statements that would trigger them.
Optionally, the target grammar rules may include, but are not limited to: case-when statements are incomplete; two consecutive single quotation marks appear in a limiting condition inside IN values; the BETWEEN...AND value range is invalid; OR statement conditions must be enclosed in parentheses; when two or more unequal values of the same field are compared, OR cannot be used; an INSERT into a partition table must contain part_it_key; field separation must use English commas; and so on. The correspondence between the target grammar rules and their check scenarios is shown in Table 1.
TABLE 1
[Table 1 is provided as images in the original publication: the correspondence between target grammar rules and check scenarios.]
Optionally, an alarm indication is issued when a compiling error and/or a syntax error is detected in the program to be tested, replacing the complex and inefficient process of monitoring for such errors manually.
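A rule-driven static check of this kind could be sketched as follows. The regular expressions are illustrative approximations of two of the grammar rules listed above; the patent's actual rule engine and full rule set are not disclosed.

```python
import re

# Hypothetical regex approximations of two listed grammar rules.
RULES = [
    ("incomplete case-when",
     re.compile(r"\bcase\b(?![\s\S]*?\bwhen\b)", re.IGNORECASE)),
    ("doubled single quotes in IN list",
     re.compile(r"\bin\b\s*\([^)]*''", re.IGNORECASE)),
]

def static_check(sql):
    """Return the names of all target grammar rules the statement triggers."""
    return [name for name, pattern in RULES if pattern.search(sql)]

print(static_check("SELECT case col end FROM t"))
# ['incomplete case-when']
print(static_check("SELECT * FROM t WHERE c IN ('a','')"))
# ['doubled single quotes in IN list']
```

Returning the list of triggered rule names is what makes the alarm indication possible: a non-empty result can be raised to the tester directly.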
As an alternative embodiment, fig. 2 is a flowchart of an alternative test data processing method according to an embodiment of the present invention, and as shown in fig. 2, analyzing the program to be tested by using the keyword analysis method to obtain the analysis result includes:
step S302, generating an initial test case according to an initial parameter combination in the program to be tested, where the initial parameter combination at least includes: presetting keywords and a plurality of test vectors corresponding to the preset keywords;
step S304, combining a plurality of keywords and a plurality of test vectors to obtain a target parameter combination;
step S306, using the equivalent test case generated based on the target parameter combination as the analysis result.
Optionally, the keyword analysis method is used to expand the initial test case and generate an equivalence class test case.
Optionally, the preset keywords may include, but are not limited to: if, case when, join, and, or, etc.; the test vectors may include, but are not limited to: dimensions and target parameters. The initial parameter combination comprises a preset keyword and a plurality of test vectors corresponding to that keyword, for example the initial parameter combination composed of the keyword, test-vector dimension and target parameter shown in Table 2.
TABLE 2
Serial number | Keyword | Dimension | Target parameter
1             | M1      | D1        | Data1
2             | M2      | D2        | Data2
3             | M3      | D3        | Data3
In an optional embodiment, the test vectors shown in Table 2 are expanded through pairwise parameter testing: the keywords, dimensions and target parameters are combined in pairs (keyword with dimension, keyword with target parameter, and dimension with target parameter), each combination outputting one arrangement. The combination result is shown in Table 3, from which the equivalence-class test cases are generated.
TABLE 3
Serial number | Keyword | Dimension | Target parameter
1             | M1      | D1        | Data1
2             | M2      | D2        | Data1
3             | M2      | D3        | Data1
4             | M2      | D2        | Data2
5             | M2      | D1        | Data2
6             | M1      | D3        | Data2
7             | M1      | D2        | Data3
8             | M2      | D3        | Data3
9             | M2      | D1        | Data3
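The expansion from Table 2 to Table 3 is an all-pairs (pairwise) combination: far fewer rows than the full product, yet every pair of parameter values appears at least once. A greedy all-pairs generator can sketch the idea; this is illustrative only, since the patent does not disclose its combination algorithm, and a greedy pick may emit slightly more than the minimal nine rows.

```python
from itertools import combinations, product

def pairs(case):
    """All (factor-index, value) pairs covered by one combination."""
    return {((i, case[i]), (j, case[j]))
            for i, j in combinations(range(len(case)), 2)}

def pairwise_cases(factors):
    """Greedy all-pairs generation: repeatedly pick the candidate
    combination that covers the most not-yet-covered value pairs."""
    uncovered = set()
    for (i, a), (j, b) in combinations(enumerate(factors), 2):
        uncovered |= {((i, x), (j, y)) for x in a for y in b}
    candidates = list(product(*factors))
    cases = []
    while uncovered:
        best = max(candidates, key=lambda c: len(pairs(c) & uncovered))
        cases.append(best)
        uncovered -= pairs(best)
    return cases

# Keywords, dimensions and target parameters from Table 2.
factors = [["M1", "M2", "M3"], ["D1", "D2", "D3"], ["Data1", "Data2", "Data3"]]
cases = pairwise_cases(factors)
print(len(cases))  # well below the 27 rows of the full cartesian product
```

For three factors of three values each there are 27 value pairs to cover, and each row covers three of them, so at least nine rows are required, matching Table 3.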
In an optional embodiment, analyzing the program to be tested by using the incidence relation analysis method to obtain the analysis result includes:
step S402, scanning the code field in the program to be tested to obtain a scanning result;
step S404, determining a dependency relationship between the original table and the corresponding result table in the code field according to the scanning result, and using the dependency relationship as the analysis result.
Optionally, the association analysis method is used to determine the dependency relationship between an original table and its corresponding result table in a code field of the program to be tested; the dependency relationship is determined by reading and scanning the code field. It should be noted that a dependency relationship exists between the same original table and at least one result table: in the diagram of the correspondence between original tables and result tables shown in FIG. 3, each original table has a dependency relationship with a plurality of result tables.
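The scanning step could be sketched as a simple regular-expression pass over INSERT ... SELECT statements. This is illustrative only: all table names are hypothetical, and production code would need a real SQL parser rather than regexes.

```python
import re

# Hypothetical scanner for original-table -> result-table dependencies.
INSERT_RE = re.compile(r"insert\s+(?:into|overwrite\s+table)\s+(\w+)", re.IGNORECASE)
SOURCE_RE = re.compile(r"(?:from|join)\s+(\w+)", re.IGNORECASE)

def scan_dependencies(code):
    """Map each original (source) table to the result tables built from it."""
    deps = {}
    for stmt in code.split(";"):
        target = INSERT_RE.search(stmt)
        if not target:
            continue  # statement produces no result table
        for src in SOURCE_RE.findall(stmt):
            deps.setdefault(src, set()).add(target.group(1))
    return deps

code = """
insert into result_a select k, sum(v) from orig_t join dim_d on orig_t.k = dim_d.k group by k;
insert into result_b select * from orig_t
"""
print(sorted(scan_dependencies(code)))  # ['dim_d', 'orig_t']
```

The resulting map is exactly the one-original-table-to-many-result-tables relationship illustrated in FIG. 3: here `orig_t` feeds both `result_a` and `result_b`.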
In an alternative embodiment, generating the target test data based on the analysis result includes:
step S502, optimizing the analysis result according to a preset data format to obtain an optimized test result;
step S504, generating the target test data based on the optimized test result.
Optionally, the preset data format at least includes: the preset data type and the preset data amount.
Optionally, test data meeting the conditions is generated according to the preset data type and the preset data amount, that is, the target test data is obtained.
In an optional embodiment, before running the target test data, the method further includes:
step S602, presetting a call-up flow according to the relationship between the test cases in the target test data, where the call-up flow includes serial invocation and parallel invocation; and/or presetting the start time of the target test data.
Optionally, the data-node flow of the target test data is customized through serial and/or parallel invocation.
Optionally, a Quartz timed task for the target test data is configured in advance: a crontab expression sets the schedule (run times in minutes, seconds, hours, days, months, years and so on) and the test flow is started automatically.
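The two call-up modes can be sketched as follows (illustrative Python; the scheduler described in the patent is a Quartz timed task, a Java component, where a cron expression such as "0 0 2 * * ?" would start the flow daily at 02:00):

```python
import threading

# Hypothetical sketch of serial vs. parallel invocation of test cases.
results = []

def run_case(name):
    results.append(name)  # stand-in for actually operating a test case

def serial_call_up(cases):
    """Serial invocation: each case starts only after the previous finishes."""
    for case in cases:
        run_case(case)

def parallel_call_up(cases):
    """Parallel invocation: independent cases run concurrently."""
    threads = [threading.Thread(target=run_case, args=(c,)) for c in cases]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

serial_call_up(["case1", "case2"])
parallel_call_up(["case3", "case4"])
print(sorted(results))  # ['case1', 'case2', 'case3', 'case4']
```

Serial invocation suits cases with data dependencies between them; parallel invocation suits independent cases, which is why the flow is preset from the relationship between the test cases.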
In an optional embodiment, after obtaining the target operation result, the method further includes:
step S702, comparing the target operation result with a preset operation result to obtain a comparison result;
step S704, displaying the comparison result.
Optionally, the comparison result at least includes: the analysis result and the target operation result.
Optionally, the program to be tested, the comparison result, the target test data, and pre-configured data (such as a call-up flow and a call-up time) are stored in a target storage unit.
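The comparison of the target operation result against the preset (expected) operation result can be sketched as a field-level check; the field granularity and the report shape are assumptions, as the patent does not specify them.

```python
# Hypothetical field-level comparison producing a displayable report.
def compare_results(actual, expected):
    """Compare each expected field against the actual operation result."""
    report = {}
    for key in expected:
        got = actual.get(key)
        report[key] = {"expected": expected[key], "actual": got,
                       "match": got == expected[key]}
    return report

report = compare_results({"rows": 100, "sum": 41}, {"rows": 100, "sum": 42})
print(report["rows"]["match"], report["sum"]["match"])  # True False
```

A report keyed by field makes the display step straightforward: mismatching fields can be highlighted alongside their expected and actual values.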
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
Example 2
According to an embodiment of the present invention, an embodiment of an apparatus for implementing the test data processing method is further provided, and fig. 4 is a schematic structural diagram of a test data processing apparatus according to an embodiment of the present invention, as shown in fig. 4, the test data processing apparatus includes: an analysis module 80, a generation module 82, an operation module 84, wherein:
the analysis module 80 is configured to analyze a program to be tested by using a target analysis method to obtain an analysis result, where the target analysis method includes at least one of the following: a static analysis method, a keyword analysis method, an association analysis method;
the generating module 82 is configured to generate target test data based on the analysis result;
the operation module 84 is configured to operate the target test data to obtain a target operation result.
It should be noted that the above modules may be implemented by software or hardware, for example, for the latter, the following may be implemented: the modules can be located in the same processor; alternatively, the modules may be located in different processors in any combination.
It should be noted here that the analyzing module 80, the generating module 82, and the operating module 84 correspond to steps S102 to S106 in embodiment 1, and the modules are the same as the corresponding steps in the implementation example and application scenario, but are not limited to the disclosure in embodiment 1. It should be noted that the modules described above may be implemented in a computer terminal as part of an apparatus.
As an alternative embodiment, an embodiment of the present invention further provides another embodiment of an apparatus for implementing the test data processing method, and fig. 5 is a schematic structural diagram of an alternative test data processing apparatus according to an embodiment of the present invention, as shown in fig. 5, where the test data processing apparatus includes: the test system comprises a test program recording module 11, an intelligent analysis module 12 of a program to be tested, an automatic test data generation module 13, a test driving module 14, a test result checking module 15 and a data storage module 16, wherein:
the test program entry module 11 is configured to collect data to be tested, and includes an interface entry collection unit 111, an API collection unit 112, and a file import unit 113, which correspond to three test program acquisition modes of interface entry, API entry, and file import, respectively, and are applicable to different demand scenarios; the to-be-tested program intelligent analysis module 12 is configured to further analyze the program acquired by the to-be-tested program entry module 11, and start the program static analysis unit 121, the keyword analysis unit 122, and the association analysis unit 123 to analyze the to-be-tested program in a multi-dimensional manner; the to-be-tested program intelligent analysis module 12 is configured to generate to-be-tested data (i.e., target to-be-tested data) meeting conditions according to the user-defined data type and data amount; the test driver module 14 is configured to predefine a call-up procedure and a call-up time of the data to be tested that meet the condition, and run the data to be tested that meet the condition to obtain a running result, where the call-up time is preset by the custom call-up time unit 141, and the call-up procedure is preset by the custom call-up mode unit 142; the test result checking module 15 is configured to compare and check the operation result with an expected operation result to obtain a check result, and display the check result, where the check result is obtained by the test result checking unit 151, and is displayed by the test result displaying unit; the data storage module 16 is used for storing data related to the apparatus, including program storage, structured data, collation results, user configuration data, and the like.
The embodiments of the invention can achieve at least the following technical effects. Driven from the tester's perspective, the test process, test emphases and test difficulties of big data statistical analysis are integrated into one tool, standardizing the whole test process, guiding the team to carry out test and verification work in a self-service manner, and resolving the pain points of data preparation, verification and test case management in the test process. The threshold of big data development testing is lowered and the quality and efficiency of big data testing are improved. Taking a project development process as an example: once requirements are clarified, developers can code the program content in time while testers synchronously define and configure the driving modes of the programs to be tested; test verification can then be carried out through the intelligent analysis test device, greatly shortening the test-result feedback cycle after development and submission, and achieving the goal of completing all-round software quality testing with one complete set of software testing tools.
It should be noted that, reference may be made to the relevant description in embodiment 1 for alternative or preferred embodiments of this embodiment, and details are not described here again.
The test data processing apparatus may further include a processor and a memory, and the analyzing module 80, the generating module 82, the running module 84, and the like are stored in the memory as program units, and the processor executes the program units stored in the memory to implement corresponding functions.
The processor comprises a kernel, and the kernel calls a corresponding program unit from the memory, wherein one or more than one kernel can be arranged. The memory may include volatile memory in a computer readable medium, Random Access Memory (RAM) and/or nonvolatile memory such as Read Only Memory (ROM) or flash memory (flash RAM), and the memory includes at least one memory chip.
According to an embodiment of the present application, there is also provided an embodiment of a non-volatile storage medium. Optionally, in this embodiment, the nonvolatile storage medium includes a stored program, and the device where the nonvolatile storage medium is located is controlled to execute any one of the test data processing methods when the program runs.
Optionally, in this embodiment, the non-volatile storage medium may be located in any computer terminal of a computer terminal group in a computer network, or in any mobile terminal of a mobile terminal group, and the non-volatile storage medium includes a stored program.
Optionally, when the program runs, the device in which the non-volatile storage medium is located is controlled to perform the following functions: analyzing the program to be tested by using a target analysis method to obtain an analysis result, wherein the target analysis method includes at least one of the following: a static analysis method, a keyword analysis method, and an association analysis method; generating target test data based on the analysis result; and running the target test data to obtain a target operation result.
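The analyze → generate → run flow above can be sketched in a few lines. This is a minimal illustrative skeleton, not the patented implementation; all names (`AnalysisResult`, `analyze`, `generate_test_data`, `run`) are assumptions introduced for the example.

```python
# Illustrative skeleton of the three-step method: analyze the program under
# test, generate target test data from the analysis result, run the data to
# obtain a target operation result. All names are hypothetical.
from dataclasses import dataclass, field


@dataclass
class AnalysisResult:
    syntax_ok: bool = True
    findings: list = field(default_factory=list)


def analyze(program_text: str) -> AnalysisResult:
    # Stand-in for static/keyword/association analysis: flag an empty program.
    result = AnalysisResult()
    if not program_text.strip():
        result.syntax_ok = False
        result.findings.append("empty program")
    return result


def generate_test_data(analysis: AnalysisResult) -> list:
    # One case per finding, or a single default case when the program is clean.
    return [{"case": f} for f in analysis.findings] or [{"case": "default"}]


def run(test_data: list) -> dict:
    # Stand-in for execution: report how many cases were run.
    return {"executed": len(test_data)}


result = run(generate_test_data(analyze("SELECT 1")))
```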
Optionally, when the program runs, the device in which the non-volatile storage medium is located is controlled to perform the following functions: detecting whether a compilation error exists in the program to be tested; and/or judging, through a target grammar rule, whether a syntax error exists in the program to be tested, to obtain the analysis result.
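A minimal static check in the spirit of this step might look as follows. The sketch assumes the program under test is Python source and uses the interpreter's own parser as the "target grammar rule"; a big-data pipeline would substitute a parser for its own language (for example SQL), so this is illustrative only.

```python
# Static analysis sketch: detect syntax/compilation errors in the program
# under test. Assumes Python source; swap in a SQL parser for ETL code.
import ast


def static_analyze(source: str) -> dict:
    try:
        ast.parse(source)  # parser acts as the "target grammar rule"
        return {"syntax_ok": True, "error": None}
    except SyntaxError as exc:
        return {"syntax_ok": False, "error": f"line {exc.lineno}: {exc.msg}"}


ok = static_analyze("x = 1 + 2")
bad = static_analyze("x = = 1")
```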
Optionally, when the program runs, the device in which the non-volatile storage medium is located is controlled to perform the following functions: generating an initial test case according to an initial parameter combination in the program to be tested, wherein the initial parameter combination at least comprises: preset keywords and a plurality of test vectors corresponding to the preset keywords; combining the plurality of keywords and the plurality of test vectors to obtain a target parameter combination; and taking the equivalent test case generated based on the target parameter combination as the analysis result.
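The keyword-analysis step can be sketched as a cross product of preset keywords and their test vectors, yielding one equivalent test case per target parameter combination. The keyword names and values are hypothetical, and a production tool might prune the cross product (for example with pairwise coverage) rather than enumerate it fully.

```python
# Keyword analysis sketch: combine preset keywords with their test vectors
# into target parameter combinations, one equivalent test case each.
from itertools import product

# Hypothetical preset keywords, each with its test vectors.
keywords = {"currency": ["CNY", "USD"], "channel": ["online", "counter"]}


def equivalent_cases(params: dict) -> list:
    keys = sorted(params)
    # Full cross product of all test vectors; each combo is one test case.
    return [dict(zip(keys, combo)) for combo in product(*(params[k] for k in keys))]


cases = equivalent_cases(keywords)
```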
Optionally, when the program runs, the device in which the non-volatile storage medium is located is controlled to perform the following functions: scanning a code field in the program to be tested to obtain a scanning result; and determining, according to the scanning result, the dependency relationship between the original table and the corresponding result table in the code field, and taking the dependency relationship as the analysis result.
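One way to picture the association-analysis step is a scan of SQL text for the dependency between source tables and the result table. The regex below only handles the simple `INSERT ... SELECT ... FROM/JOIN` shape and is an assumption for illustration; a real scanner would use a proper SQL parser.

```python
# Association analysis sketch: scan a SQL code field and extract the
# dependency between the original (source) tables and the result table.
import re


def table_dependencies(sql: str) -> dict:
    # Result table from the INSERT target; source tables from FROM/JOIN.
    target = re.search(r"INSERT\s+INTO\s+(\w+)", sql, re.I)
    sources = re.findall(r"(?:FROM|JOIN)\s+(\w+)", sql, re.I)
    return {"result_table": target.group(1) if target else None,
            "source_tables": sources}


dep = table_dependencies(
    "INSERT INTO daily_report SELECT a.id, b.amt FROM txn a JOIN acct b ON a.id = b.id"
)
```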
Optionally, when the program runs, the device in which the non-volatile storage medium is located is controlled to perform the following functions: optimizing the analysis result according to a preset data format to obtain an optimized test result; and generating the target test data based on the optimized test result.
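Normalizing the analysis result into a preset data format can be sketched as mapping loose records onto a fixed column schema with defaults filled in. The schema here (`case_id`, `table`, `expected_rows`) is purely an assumed example of such a format.

```python
# Sketch of optimizing the analysis result into a preset data format:
# project loose analysis records onto a fixed, hypothetical column schema.
PRESET_COLUMNS = ("case_id", "table", "expected_rows")


def to_preset_format(raw: list) -> list:
    # Missing columns are filled with an empty default so every row matches
    # the preset format exactly.
    return [tuple(item.get(col, "") for col in PRESET_COLUMNS) for item in raw]


rows = to_preset_format([{"case_id": 1, "table": "txn"}])
```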
Optionally, when the program runs, the device in which the non-volatile storage medium is located is controlled to perform the following functions: presetting a calling process according to the relation between the test cases in the target test data, wherein the calling process comprises: serial calling and parallel calling; and/or presetting the starting time of the target test data.
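The serial versus parallel calling process might be sketched with a thread pool: dependent cases run in declaration order, independent cases run concurrently. The case runner below is a placeholder, not the disclosed scheduler.

```python
# Scheduling sketch: serial calling for dependent test cases, parallel
# calling (thread pool) for independent ones. run_case is a placeholder.
from concurrent.futures import ThreadPoolExecutor


def run_case(name: str) -> str:
    return f"{name}:ok"


def run_serial(cases: list) -> list:
    # Dependent cases: preserve declaration order strictly.
    return [run_case(c) for c in cases]


def run_parallel(cases: list) -> list:
    # Independent cases: execute concurrently; map still returns in order.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(run_case, cases))


serial = run_serial(["t1", "t2"])
parallel = run_parallel(["t3", "t4"])
```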
Optionally, when the program runs, the device in which the non-volatile storage medium is located is controlled to perform the following functions: comparing the target operation result with a preset operation result to obtain a comparison result; and displaying the comparison result.
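The final comparison step can be pictured as a field-by-field check of the target operation result against the preset (expected) result, producing a displayable diff. The field names are illustrative.

```python
# Comparison sketch: check the actual run result against the preset
# expected result and build a comparison report for display.
def compare(actual: dict, expected: dict) -> dict:
    mismatches = {k: (expected[k], actual.get(k))
                  for k in expected if actual.get(k) != expected[k]}
    return {"passed": not mismatches, "mismatches": mismatches}


report = compare({"rows": 100, "sum": 5000}, {"rows": 100, "sum": 4999})
```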
According to an embodiment of the present application, there is also provided an embodiment of a processor. Optionally, in this embodiment, the processor is configured to run a program, and the program, when running, executes any one of the above test data processing methods.
There is further provided, in accordance with an embodiment of the present application, an embodiment of a computer program product adapted, when executed on a data processing device, to execute a program initializing the steps of any one of the above test data processing methods.
Optionally, the computer program product is adapted, when executed on a data processing device, to execute a program initializing the following method steps: analyzing the program to be tested by using a target analysis method to obtain an analysis result, wherein the target analysis method includes at least one of the following: a static analysis method, a keyword analysis method, and an association analysis method; generating target test data based on the analysis result; and running the target test data to obtain a target operation result.
According to an embodiment of the present application, there is also provided an embodiment of an electronic device, including a memory and a processor, where the memory stores a computer program, and the processor is configured to run the computer program to execute any one of the above test data processing methods.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable non-volatile storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a non-volatile storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned non-volatile storage medium includes: a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, and other media capable of storing program code.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that those skilled in the art can make various modifications and refinements without departing from the principles of the present invention; such modifications and refinements shall also fall within the protection scope of the present invention.

Claims (10)

1. A method for processing test data, comprising:
analyzing the program to be tested by adopting a target analysis method to obtain an analysis result, wherein the target analysis method comprises at least one of the following: a static analysis method, a keyword analysis method, and an association analysis method;
generating target test data based on the analysis result;
and running the target test data to obtain a target operation result.
2. The method of claim 1, wherein analyzing the program to be tested using the static analysis method to obtain the analysis result comprises:
detecting whether a compilation error exists in the program to be tested; and/or judging, through a target grammar rule, whether a syntax error exists in the program to be tested, to obtain the analysis result.
3. The method of claim 1, wherein analyzing the program to be tested by the keyword analysis method to obtain the analysis result comprises:
generating an initial test case according to an initial parameter combination in the program to be tested, wherein the initial parameter combination at least comprises: preset keywords and a plurality of test vectors corresponding to the preset keywords;
combining the plurality of keywords and the plurality of test vectors to obtain a target parameter combination;
and taking the equivalent test case generated based on the target parameter combination as the analysis result.
4. The method of claim 1, wherein analyzing the program to be tested by the association analysis method to obtain the analysis result comprises:
scanning a code field in the program to be tested to obtain a scanning result;
and determining the dependency relationship between the original table and the corresponding result table in the code field according to the scanning result, and taking the dependency relationship as the analysis result.
5. The method of claim 1, wherein generating target test data based on the analysis results comprises:
optimizing the analysis result according to a preset data format to obtain an optimized test result;
and generating the target test data based on the optimized test result.
6. The method of claim 1, wherein prior to running the target test data, the method further comprises:
presetting a calling process according to the relation among the test cases in the target test data, wherein the calling process comprises: serial calling and parallel calling; and/or
And presetting the starting time of the target test data.
7. The method of claim 1, wherein after obtaining the target operational result, the method further comprises:
comparing the target operation result with a preset operation result to obtain a comparison result;
and displaying the comparison result.
8. A test data processing apparatus, comprising:
the analysis module is used for analyzing the program to be tested by adopting a target analysis method to obtain an analysis result, wherein the target analysis method comprises at least one of the following: a static analysis method, a keyword analysis method, and an association analysis method;
a generation module for generating target test data based on the analysis result;
and the operation module is used for operating the target test data to obtain a target operation result.
9. A non-volatile storage medium, characterized in that it stores a plurality of instructions adapted to be loaded by a processor and to execute the test data processing method according to any one of claims 1 to 7.
10. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is configured to execute the computer program to perform the test data processing method of any one of claims 1 to 7.
CN202111567467.0A 2021-12-20 2021-12-20 Test data processing method, device, equipment and storage medium Pending CN114253851A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111567467.0A CN114253851A (en) 2021-12-20 2021-12-20 Test data processing method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111567467.0A CN114253851A (en) 2021-12-20 2021-12-20 Test data processing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114253851A true CN114253851A (en) 2022-03-29

Family

ID=80793340

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111567467.0A Pending CN114253851A (en) 2021-12-20 2021-12-20 Test data processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114253851A (en)

Similar Documents

Publication Publication Date Title
CN106951364B (en) Test method and device
CN105630656B (en) System robustness analysis method and device based on log model
CN111767350A (en) Data warehouse testing method and device, terminal equipment and storage medium
CN111427928A (en) Data quality detection method and device
US7496898B1 (en) Error analysis and diagnosis for generic function calls
CN110688173B (en) Positioning method and device of components in cross-platform interface framework and electronic equipment
CN114253851A (en) Test data processing method, device, equipment and storage medium
CN111767213A (en) Method and device for testing database check points, electronic equipment and storage medium
CN111309584B (en) Data processing method, device, electronic equipment and storage medium
CN110968518A (en) Analysis method and device for automatic test log file
CN114416581A (en) Method, device and equipment for determining test failure reason
CN103577758A (en) Program code verification method and device
CN113377675A (en) Feedback-based reduction method for SMT solver performance test cases
CN112631946A (en) Software product running environment detection method and system
CN111767222A (en) Data model verification method and device, electronic equipment and storage medium
CN113220594B (en) Automatic test method, device, equipment and storage medium
CN114860549B (en) Buried data verification method, buried data verification device, buried data verification equipment and storage medium
CN112069749B (en) Power supply connection verification method and device, electronic equipment and storage medium
CN113626304A (en) Program change verification method and device, electronic equipment and readable storage medium
CN117033176A (en) FPGA software defect analysis method based on multidimensional association rule mining
CN116955322A (en) Performance comparison method and device for stored program, electronic equipment and storage medium
CN116680106A (en) Abnormality locating method, device, equipment and storage medium
CN115858390A (en) mock testing method, mock testing device, storage medium and computer equipment
CN116991724A (en) Interface testing method and device based on monitoring log, electronic equipment and storage medium
CN115756907A (en) Uncertain test result judgment method and system based on keywords

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination