WO2023103640A1 - Method and apparatus for generating a test case, electronic device and storage medium - Google Patents

Method and apparatus for generating a test case, electronic device and storage medium

Info

Publication number
WO2023103640A1
Authority
WO
WIPO (PCT)
Prior art keywords
product
sub
standard data
test
test cases
Prior art date
Application number
PCT/CN2022/128119
Other languages
English (en)
Chinese (zh)
Inventor
陈惠娟
Original Assignee
中兴通讯股份有限公司 (ZTE Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 (ZTE Corporation)
Publication of WO2023103640A1 publication Critical patent/WO2023103640A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3676Test management for coverage analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Definitions

  • the embodiments of the present application relate to the field of software testing, and in particular to a method, apparatus, electronic device and storage medium for generating test cases.
  • test design is an important part of the test work.
  • the test plan and test cases produced at this stage determine the direction and content of test execution.
  • the completeness and redundancy of test plans and test cases directly determine the quality and efficiency of test work.
  • the content that test cases need to cover must come from at least two sources: the requirement specification and the code implementation. If the test cases focus only on the software code, identifying test points and designing use cases from the code, it is difficult to find behaviors that are specified but not implemented by the software, and 100% code test coverage cannot guarantee the usability of the software. Conversely, if the test cases focus only on the requirement specification, identifying test points and designing use cases from the requirements, it is difficult to find behaviors that are not specified in the requirements but are implemented by the software (for example, a Trojan horse virus), and there may be serious redundancy among the test cases.
  • the completeness and redundancy of test cases are generally measured by collecting code coverage statistics after the test cases are executed, and then corrected by updating the test cases.
  • Common code coverage includes: line coverage, branch coverage, condition coverage, path coverage, etc.
  • however, this method can only judge the completeness and redundancy of test cases from the perspective of the code; it cannot measure them from the perspective of the requirement specification. This is especially true for large-scale wireless communication systems, which are governed by a large number of industry protocol specifications, so the measure is less effective, which in turn reduces the effectiveness of the finally generated test cases.
  • the main purpose of the embodiments of the present application is to propose a test case generation method, apparatus, electronic device and storage medium, aiming to measure the completeness and redundancy of test cases from the two perspectives of code implementation and requirement specification, and to improve the effectiveness of the test cases.
  • the embodiment of the present application provides a method for generating test cases, including: obtaining the standardized input and output data of each software entity of the product to be tested as product standard data; determining the features of the product to be tested according to the functions of the product to be tested and the value of those functions to the user, where a feature represents a product function and its value; and, for each feature, obtaining the sub-functions used to realize the feature and generating test cases for each sub-function, where the sub-functions and the test cases of the sub-functions are described by the product standard data.
  • the embodiment of the present application also proposes a test case generation apparatus, including: an acquisition module configured to acquire the standardized input and output data of each software entity of the product to be tested as product standard data; a determination module configured to determine the features of the product to be tested according to the functions of the product to be tested and the value of those functions to the user, where a feature represents a product function and its value; and a generation module configured to obtain, for each feature, the sub-functions used to realize the feature and to generate test cases for each sub-function, where the sub-functions and the test cases of the sub-functions are described by the product standard data.
  • an embodiment of the present application also proposes an electronic device, including: at least one processor; and a memory communicatively connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor so that the at least one processor can execute the method for generating test cases as described above.
  • the embodiment of the present application also proposes a computer-readable storage medium storing a computer program, and when the computer program is executed by a processor, the method for generating test cases as described above is implemented.
  • in the test case generation method provided above, before the test cases are generated, a product standard data set is generated from the standardized input and output data of each software entity of the product to be tested. When the test cases are generated, all the features contained in the product to be tested are determined according to the functions of the product and the value of those functions to the user; the sub-functions needed to realize each feature are then determined, each sub-function is described with the pre-generated product standard data, and test cases described by the product standard data are generated for each sub-function.
  • because the test cases are generated from product standard data derived from the standardized input and output data of each software entity of the product under test, the generated test cases are mapped to and linked with the functions of the product under test. Because the features of the product under test are enumerated, the sub-functions needed by each feature are obtained and test cases are generated for those sub-functions, the completeness with which the test cases cover the expected content of the requirements can be measured. The completeness and redundancy of the test cases can therefore be measured from the two perspectives of code implementation and requirement specification, improving the effectiveness of the finally generated test cases.
  • Fig. 1 is a flowchart of the test case generation method in an embodiment of the present application.
  • Fig. 2 is a schematic diagram of the product standard data generation process in the embodiment of the present application.
  • FIG. 3 is a schematic diagram of the timing interaction of the DRX feature in an embodiment of the present application.
  • Fig. 4 is a schematic diagram of the sub-function splitting of a feature in an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of a test case generation device in another embodiment of the present application.
  • Fig. 6 is a schematic structural diagram of an electronic device in another embodiment of the present application.
  • related test case generation methods can only judge the completeness and redundancy of test cases from the perspective of the code, and cannot measure them from the perspective of the requirement specification.
  • as a result, the generated test cases and the measurement method are less effective. Therefore, how to establish test cases whose completeness and redundancy can be measured from the perspectives of both the code and the requirement specification is an urgent problem to be solved.
  • to this end, the embodiment of the present application provides a method for generating test cases, including: obtaining the standardized input and output data of each software entity of the product to be tested as product standard data; determining the features of the product to be tested according to the functions of the product and the value of those functions to the user, where a feature represents a product function and its value; and, for each feature, obtaining the sub-functions used to realize the feature and generating test cases for each sub-function, where the sub-functions and the test cases of the sub-functions are described by the product standard data.
  • in this test case generation method, before the test cases are generated, a product standard data set is generated from the standardized input and output data of each software entity of the product to be tested. When the test cases are generated, all the features contained in the product to be tested are determined according to the functions of the product and the value of those functions to the user; the sub-functions needed to realize each feature are then determined, each sub-function is described with the pre-generated product standard data, and test cases described by the product standard data are generated for each sub-function.
  • because the test cases are generated from product standard data derived from the standardized input and output data of each software entity of the product under test, the generated test cases are mapped to and linked with the functions of the product under test. Because the features of the product under test are enumerated, the sub-functions needed by each feature are obtained and test cases are generated for those sub-functions, the completeness with which the test cases cover the expected content of the requirements can be measured. The completeness and redundancy of the test cases can therefore be measured from the two perspectives of code implementation and requirement specification, improving the effectiveness of the finally generated test cases.
  • the first aspect of the embodiment of the present application provides a method for generating test cases.
  • refer to FIG. 1 for the specific flow of the method for generating test cases.
  • the method for generating test cases is applied to a terminal device with analysis capabilities.
  • such a terminal device may be a computer, a tablet, a mobile phone or another electronic device; this embodiment takes a computer as an example for illustration. The method for generating test cases includes at least, but is not limited to, the following steps:
  • Step 101: acquire the standardized input and output data of each software entity of the product to be tested as product standard data.
  • when the computer generates the test cases of the product to be tested, it first obtains the software entities contained in the product to be tested, and then uses the standardized input and output data of each software entity as the product standard data that describes the test cases.
  • specifically, the computer obtains the standardized input and output data of each software entity of the product to be tested as product standard data by: decomposing the implementation architecture of the product to be tested into software entities; and extracting, according to the input and output data that each software entity exchanges with its environment, the standardized input and output data of each software entity from the software product version package; wherein each software entity obtained after the decomposition includes one of the following or any combination thereof: system, subsystem, module, and submodule.
  • each software entity obtained after decomposition includes one of the following or any combination thereof: system, subsystem, module, and submodule.
  • when the computer acquires the product standard data, it needs to obtain the overall structure and hierarchy with which the product under test realizes its functions, and decompose the implementation architecture of the product under test into structured, hierarchical software entities, thereby obtaining software entities of various granularities.
  • in this way, the product standard data can support the generation of test cases for each sub-function, and the completeness with which the generated test cases cover the expected content of the requirements can conveniently be measured.
  • for the specific flow of generating the product standard data, refer to Figure 2.
  • the product to be tested is split into several subsystems.
  • for example, if the product to be tested is a communication product, it can be split into subsystems such as configuration management, control plane, user plane, wireless scheduling, and platform.
  • each subsystem is then further split into several modules (for example, the control plane subsystem of a communication product is further split into modules such as cell management, physical resource allocation, and process management), so that software entities of the communication product to be tested are obtained at different granularities.
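  • By way of illustration only, the following minimal Python sketch shows one possible way to represent such a decomposition as a hierarchy of software entities; the class name, field names and entity names are illustrative assumptions and are not part of the embodiment.
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class SoftwareEntity:
          # One node of the decomposition: a system, subsystem, module or submodule.
          name: str
          level: str  # "system" | "subsystem" | "module" | "submodule"
          children: List["SoftwareEntity"] = field(default_factory=list)

      # Hypothetical decomposition of a communication product under test.
      product = SoftwareEntity("communication_product", "system", children=[
          SoftwareEntity("control_plane", "subsystem", children=[
              SoftwareEntity("cell_management", "module"),
              SoftwareEntity("physical_resource_allocation", "module"),
              SoftwareEntity("process_management", "module"),
          ]),
          SoftwareEntity("user_plane", "subsystem"),
          SoftwareEntity("wireless_scheduling", "subsystem"),
      ])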
  • next, the computer sorts out the content of, and the rules for, the interaction between each software entity and its peripheral environment. According to these rules, the standardized input and output data of each software entity is extracted from the software product version package as product standard data.
  • the computer extracts the standardized input and output data of each software entity from the software product version package, including: extracting the standardized input and output data of each software entity from the software product version package according to the category of the input and output data interacting with the outside;
  • the categories include one or any combination of the following: configuration parameters, key indicators, abnormal alarms, protocol cells, and software interfaces.
  • before the computer extracts the standardized input and output data of each software entity, it classifies the product standard data that needs to be extracted. According to their purpose and the way they are managed in software products, the data can be divided into categories such as configuration parameters, key indicators, abnormal alarms, protocol cells, and software interfaces.
  • the main product standard data that can be extracted include: configuration parameters, protocol cells, software interfaces, test logs, counters, key indicators, alarms, etc.
  • the respective data attributes are sorted out, and the storage format of each type of data is determined in combination with the attributes.
  • the data attributes of key indicators include: indicator number, indicator name, measurement type, indicator meaning, calculation formula, unit, etc.
  • extraction rules for automatically extracting data from version packages and code bases are then defined. The standardized data and their attributes are automatically extracted from the version data package according to these rules and stored at the storage address assigned to that type of data, forming a standard input and output data set that reflects the expected content of the requirement specification of the communication system under test.
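  • As a non-authoritative illustration, the following Python sketch shows how extracted items could be grouped by category into a product standard data set; the item format and function name are assumptions, and the actual parsing of the version package according to the extraction rules is not shown.
      from collections import defaultdict

      # Categories of product standard data named above.
      CATEGORIES = {"configuration_parameter", "key_indicator", "abnormal_alarm",
                    "protocol_cell", "software_interface", "counter", "test_log"}

      def build_standard_data_set(extracted_items):
          # `extracted_items` is assumed to be an iterable of dicts such as
          # {"category": "key_indicator", "name": "ul_throughput",
          #  "attributes": {"indicator_number": "KPI-001", "unit": "Mbit/s"}}.
          data_set = defaultdict(list)
          for item in extracted_items:
              if item["category"] in CATEGORIES:
                  data_set[item["category"]].append(item)
          return data_set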
  • Step 102: determine the features of the product under test according to the functions of the product under test and the value of those functions to the user.
  • specifically, the computer analyzes the functions provided by the product to be tested and the value of these functions to users, then classifies the functions according to the specific functions and their value to users and splits out a series of features of the product to be tested, where a feature represents a product function and its value.
  • for example, typical features of a communication system include: providing uplink power control services to ensure demodulation, reduce interference and save power; and providing DRX (Discontinuous Reception) configuration services to achieve a balance between service delay and terminal power saving.
  • the computer can also clarify the effect of each feature; the implementation scheme of each feature is described using the decomposed systems, subsystems, modules and submodules and the product standard data they exchange, so that the product standard data involved in each feature is determined.
  • when requirements change, the modified content of the requirements is assigned to one or several of the features that have already been determined, and new features can be extracted if necessary, so that the functions and values of the software product are always described through the feature system. The software entities and product standard data of the affected features are revised according to the changes of the new requirements. At the same time, the product standard data set involved in a requirement can also be maintained at the granularity of that requirement.
  • for example, when a new requirement concerns DRX (Discontinuous Reception), the requirement can be assigned to the feature "providing DRX (Discontinuous Reception) configuration services to achieve a balance between service delay and terminal power saving", and the product standard data involved in this feature is revised according to the change introduced by this requirement.
  • Step 103: for each feature, obtain the sub-functions used to realize the feature, and generate test cases for each sub-function.
  • after the computer has divided the product to be tested into features, it analyzes, according to the function and value of each feature, the functions that each of the decomposed software entities needs to complete, and splits each feature into sub-functions step by step.
  • the sub-functions form a systematic test framework for the product to be tested, and test cases are designed for each sub-function in the test framework, where each sub-function of the product to be tested and its test cases are described by the product standard data.
  • because test cases are generated for the sub-functions of the features, the completeness with which the test cases cover the expected content of the requirements can be measured, and the completeness and redundancy of the test cases can then be measured from the perspectives of both code implementation and requirement specification.
  • when generating test cases, the computer draws, for each feature, the timing interaction diagram of the subsystems and modules involved, and uses the drawn timing interaction diagram to split the feature into sub-functions.
  • in this way, a systematic test framework of the communication system under test is constructed. Test cases are designed for each sub-function in the test framework, and the input and output of each sub-function and of its use cases are described by the above-mentioned product standard data.
  • the timing interaction diagram of the feature "providing DRX (Discontinuous Reception) configuration services to achieve a balance between service delay and terminal power saving" (referred to as the DRX feature) is shown in Figure 3.
  • the computer splits this feature into sub-functions step by step in the order of the software bearing entity, from system -> multiple subsystems -> single subsystem -> single module; the sub-function splitting of each feature is shown schematically in Figure 4, where the feature is split down to the finest-grained sub-function level. These sub-functions make up the test framework for this feature. Use cases are designed for each sub-function, and the input and output of each sub-function and of its use cases are described by the product standard data. All features and their sub-functions constitute the systematic test framework of the communication system under test, and the use cases of all sub-functions form its systematic test case set.
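  • The structure described above (features split into sub-functions, each with use cases described by product standard data) could be modelled, for example, as in the following Python sketch; all class and field names are illustrative assumptions.
      from dataclasses import dataclass, field
      from typing import List, Set

      @dataclass
      class TestCase:
          name: str
          inputs: Set[str]   # product standard data items used as input
          outputs: Set[str]  # product standard data items checked as output

      @dataclass
      class SubFunction:
          name: str
          bearer_entities: Set[str]  # software entities bearing the sub-function
          inputs: Set[str]
          outputs: Set[str]
          test_cases: List[TestCase] = field(default_factory=list)

      @dataclass
      class Feature:
          name: str
          sub_functions: List[SubFunction] = field(default_factory=list)

      # All features and their sub-functions form the systematic test framework;
      # the use cases of all sub-functions form the systematic test case set.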
  • if, while designing the sub-functions and use cases of a feature, the computer finds that the extracted product standard data is insufficient, it can further decompose the software entities of the product to be tested and add specific product standard data based on the standardized input and output data of the newly added software entities; if it is found that the functions of some subsystems or modules cannot be assigned to the extracted feature set, the functions and values of the product to be tested can be further expanded to obtain new features.
  • after the computer generates the test cases for each sub-function, the method further includes: checking the test completeness of the product to be tested according to the product standard data, and/or checking the redundancy of the generated test cases according to the product standard data; and supplementing the test cases according to the result of the completeness check, and/or consolidating the test cases according to the result of the redundancy check.
  • specifically, after the computer generates the initial test cases for each sub-function based on the product standard data, it can check the test completeness of the product to be tested against the product standard data in a preset order; alternatively, during the continuous integration of product requirements and the continuous design of use cases, the latest standardized data set of each version is automatically extracted and compared with the standardized data and attributes covered by the current version of the use cases, the test completeness of the product to be tested is checked, and test cases are supplemented if the completeness is insufficient. While checking the test completeness, the computer can also check the redundancy between the generated test cases according to the product standard data and consolidate the redundant test cases. Because test completeness and test case redundancy are detected from the product standard data during test case generation, the test framework and test cases can be improved and optimized already in the design process, which improves the efficiency of use-case refinement.
  • the check of the test completeness of the product to be tested based on the product standard data includes one of the following or any combination thereof: checking the completeness of the test framework according to the quantity of product standard data covered by the sub-functions and the quantity of all product standard data; checking the completeness of the test cases according to the quantity of product standard data covered by the test cases and the quantity of all product standard data; checking the completeness of a feature's test framework according to the quantity of product standard data covered by the sub-functions of the feature and the quantity of all product standard data involved in the feature; checking the completeness of a feature's test cases according to the quantity of product standard data covered by the feature's test cases and the quantity of all product standard data involved in the feature; checking the completeness of a requirement's test framework according to the quantity of product standard data covered by the sub-functions of the requirement and the quantity of all product standard data involved in the requirement; and checking the completeness of a requirement's test cases according to the quantity of product standard data covered by the use cases of the requirement and the quantity of all product standard data involved in the requirement.
  • that is, based on the product standard data, the test completeness can be measured from multiple angles, for example:
  • completeness of the requirement test framework = quantity of product standard data covered by the sub-functions of the requirement / quantity of all product standard data involved in the requirement;
  • completeness of the requirement test cases = quantity of product standard data covered by the use cases of the requirement / quantity of all product standard data involved in the requirement.
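  • As a minimal sketch, assuming the covered product standard data and the data in scope are available as sets of item names (the function name is illustrative), the ratio above can be computed as follows:
      def completeness(covered_items: set, items_in_scope: set) -> float:
          # For example: covered_items = product standard data referenced by the
          # requirement's use cases; items_in_scope = all product standard data
          # involved in the requirement.
          if not items_in_scope:
              return 1.0
          return len(covered_items & items_in_scope) / len(items_in_scope)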
  • the computer can also supplement sub-functions or test cases if the test completeness does not meet the requirements. For example, for each item of the various categories of product standard data of the communication system to be tested, such as configuration parameters, protocol cells, software interfaces, counters, key indicators, and alarms, the computer counts in turn the number of sub-functions and use cases that reference the item as input or output. When the completeness of the overall test framework or of the test cases is low, the product standard data that has not yet been covered is examined; the features to which the unreferenced product standard data belongs are determined according to the feature splitting scheme, sub-functions are supplemented by splitting those features further, and test cases are supplemented for the supplemented sub-functions, so as to complete the test coverage of the product standard data that is currently not covered.
  • the computer checks the redundancy of the generated test cases according to the product standard data by: detecting the similarity between sub-functions according to the bearing entity of each sub-function and its product standard data; and judging that there is redundancy between the test cases of sub-functions whose similarity is greater than a preset threshold.
  • when the computer checks the redundancy of test cases, it characterizes the redundancy directly by the functional similarity between the sub-functions corresponding to the test cases. Using the bearing entity of each sub-function and its product standard data, the computer calculates the similarity between different sub-functions according to a sub-function similarity algorithm, identifies the sub-functions whose similarity is higher than the preset threshold, judges that there is redundancy among the test cases of those sub-functions, and consolidates the sub-functions and use cases above the preset threshold to remove the redundancy, reducing the redundancy of the test cases.
  • in other words, by checking the similarity of sub-functions and identifying similar sub-functions, the test cases that are likely to be mutually redundant are found, and these test cases are then consolidated to reduce the redundancy of the test case set.
  • in the similarity calculation, S_Input represents the input similarity, S_Output represents the output similarity, and S_Bearer represents the bearer entity similarity.
  • the input similarity, output similarity, and bearer entity similarity can be calculated by the following formulas:
  • S_Input = |M1_Input ∩ M2_Input| / |M1_Input ∪ M2_Input|;
  • S_Output = |M1_Output ∩ M2_Output| / |M1_Output ∪ M2_Output|;
  • S_Bearer = |M1_Bearer ∩ M2_Bearer| / |M1_Bearer ∪ M2_Bearer|;
  • M1_Input and M2_Input represent the input product standard data sets of sub-functions M1 and M2 respectively
  • M1_Output and M2_Output represent the output product standard data sets of sub-functions M1 and M2 respectively
  • M1_Bearer and M2_Bearer represent the sets of software bearing entities of sub-functions M1 and M2 respectively.
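  • A possible reading of these formulas as a Python sketch is given below: each similarity is an intersection-over-union ratio of the corresponding sets. How the three similarities are combined into a single sub-function similarity is not specified above, so the equal weighting and the threshold value used here are only assumptions, and the attribute names follow the SubFunction sketch further above.
      def jaccard(a: set, b: set) -> float:
          # |A ∩ B| / |A ∪ B|; two empty sets are treated as identical.
          return len(a & b) / len(a | b) if (a or b) else 1.0

      def sub_function_similarity(m1, m2) -> float:
          # m1 and m2 are assumed to expose `inputs`, `outputs` and
          # `bearer_entities` sets of product standard data / software entities.
          s_input = jaccard(m1.inputs, m2.inputs)
          s_output = jaccard(m1.outputs, m2.outputs)
          s_bearer = jaccard(m1.bearer_entities, m2.bearer_entities)
          return (s_input + s_output + s_bearer) / 3  # assumed equal weighting

      def redundant_pairs(sub_functions, threshold=0.8):
          # Pairs whose similarity exceeds the preset threshold are candidates
          # for test case consolidation; 0.8 is only an illustrative value.
          return [(m1.name, m2.name)
                  for i, m1 in enumerate(sub_functions)
                  for m2 in sub_functions[i + 1:]
                  if sub_function_similarity(m1, m2) > threshold]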
  • after the computer generates test cases for each sub-function, the method further includes: identifying the cross-influence between sub-functions according to the relationship between the product standard data of the different sub-functions, and supplementing test cases accordingly. Specifically, for the sub-functions in the test framework, their mutual cross-influence can be identified based on the relationship between their corresponding product standard data. After the test framework is created, the cross-influence between sub-functions can be detected, and supplementary test cases are used to cover the interactions between sub-functions that are not yet covered, improving the completeness of the test cases.
  • for example, if M1_Output ∩ M2_Input ≠ ∅, part of the input of sub-function M2 is determined by M1; after sub-function M1 changes, the use cases need to be revised synchronously to cover its impact on sub-function M2;
  • if M1_Input_Resource ∩ M2_Input_Resource ≠ ∅, where M1_Input_Resource and M2_Input_Resource represent the resource-type product standard inputs of sub-functions M1 and M2, sub-function M1 and sub-function M2 share some resources; when one of the sub-functions increases or decreases its occupancy of the shared resources, the use cases need to be revised synchronously to cover the impact on the other sub-function;
  • if M1_Output ∩ M2_Output ≠ ∅, some product standard outputs are jointly determined by sub-function M1 and sub-function M2; it is necessary to analyze the interaction between the operation timing and operation proportion of M1 and M2 for this part of the output, and to design use cases to cover it.
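  • The three overlap conditions above can be checked mechanically, for example as in the following sketch; the attribute names are assumptions carried over from the earlier sketches, with `resource_inputs` standing in for the resource-type product standard inputs.
      def cross_influences(m1, m2):
          # Returns the cross-influence conditions that hold between two
          # sub-functions, so that use cases covering the interaction can be
          # supplemented where coverage is still missing.
          findings = []
          if m1.outputs & m2.inputs:
              findings.append("part of M2's input is determined by M1's output")
          if m1.resource_inputs & m2.resource_inputs:
              findings.append("M1 and M2 share resource-type inputs")
          if m1.outputs & m2.outputs:
              findings.append("M1 and M2 jointly determine some outputs")
          return findings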
  • the method further includes: when the product standard data changes, supplementing the test cases according to the changed product standard data.
  • the computer needs to maintain the test case set.
  • the requirements and expectations of the product under test may change between versions, which causes items to be added to or removed from the various categories of product standard data and causes data attributes to change. The test case set therefore needs to be maintained so that the test cases can support the new requirements while keeping the redundancy of the test cases as low as possible.
  • the computer extracts the current product standard data from the software version package or code library according to the predefined extraction rules, extracting a new data set each time.
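  • One way to support this maintenance step, sketched under the assumption that each extracted data set is available as a set of item names and that the data covered by the current use cases is known (function and parameter names are illustrative):
      def maintenance_report(latest_data: set, previous_data: set, covered_by_cases: set):
          # Differences between versions plus items that the current use cases do
          # not yet cover, as input for supplementing or revising the test cases.
          return {
              "added": latest_data - previous_data,
              "removed": previous_data - latest_data,
              "not_covered": latest_data - covered_by_cases,
          }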
  • another aspect of the embodiments of the present application relates to an apparatus for generating test cases; referring to FIG. 5, the apparatus includes:
  • An acquisition module 501 configured to acquire standardized input and output data of each software entity of the product to be tested as product standard data
  • a determining module 502 configured to determine each characteristic of the product to be tested according to the function of the product to be tested and the value of the function to the user, where the characteristic represents the function and value of the product;
  • a generating module 503 configured to obtain, for each feature, the sub-functions used to realize the feature and to generate test cases for each sub-function, where the sub-functions and the test cases of the sub-functions are described by the product standard data.
  • this embodiment is an apparatus embodiment corresponding to the method embodiment, and this embodiment can be implemented in cooperation with the method embodiment.
  • the relevant technical details mentioned in the method embodiments are still valid in this embodiment, and will not be repeated here in order to reduce repetition.
  • the related technical details mentioned in this embodiment can also be applied in the method embodiment.
  • the modules involved in this embodiment are logical modules.
  • in practical applications, a logical unit may be a physical unit, a part of a physical unit, or a combination of multiple physical units.
  • units that are not closely related to solving the technical problem proposed by the present invention are not introduced in this embodiment, but this does not mean that there are no other units in this embodiment.
  • another aspect of the embodiments of the present application relates to an electronic device; referring to FIG. 6, it includes: at least one processor 601; and a memory 602 communicatively connected to the at least one processor 601; wherein the memory 602 stores instructions executable by the at least one processor 601, and the instructions are executed by the at least one processor 601 so that the at least one processor 601 can execute the method for generating test cases described in any one of the above method embodiments.
  • the memory 602 and the processor 601 are connected by a bus, and the bus may include any number of interconnected buses and bridges, and the bus connects one or more processors 601 and various circuits of the memory 602 together.
  • the bus may also connect together various other circuits such as peripherals, voltage regulators, and power management circuits, all of which are well known in the art and therefore will not be further described herein.
  • the bus interface provides an interface between the bus and the transceivers.
  • a transceiver may be a single element or multiple elements, such as multiple receivers and transmitters, providing means for communicating with various other devices over a transmission medium.
  • the data processed by the processor 601 is transmitted on the wireless medium through the antenna; the antenna also receives data and passes it to the processor 601.
  • Processor 601 is responsible for managing the bus and general processing, and may also provide various functions including timing, peripheral interface, voltage regulation, power management, and other control functions. And the memory 602 may be used to store data used by the processor 601 when performing operations.
  • Another aspect of the embodiments of the present application also provides a computer-readable storage medium storing a computer program.
  • the above method embodiments are implemented when the computer program is executed by the processor.
  • the storage medium includes several instructions that cause a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Abstract

Disclosed in the present application are a method and apparatus for generating a test case, as well as an electronic device and a storage medium. The method comprises: acquiring standardized input and output data of each software entity of a product to be tested, and using the standardized input and output data as product standard data; determining features of said product according to the functions of said product and the value of those functions to a user, the features representing the functions and values of the product; and, for each feature, acquiring each sub-function for realizing the feature and generating a test case for each sub-function, the sub-function and the test case of the sub-function being described by the product standard data.
PCT/CN2022/128119 2021-12-08 2022-10-28 Procédé et appareil permettant de générer un cas test, dispositif électronique et support de stockage WO2023103640A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111493037.9A CN116302902A (zh) 2021-12-08 2021-12-08 测试用例的生成方法、装置、电子设备和存储介质
CN202111493037.9 2021-12-08

Publications (1)

Publication Number Publication Date
WO2023103640A1 (fr)

Family

ID=86729585

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/128119 WO2023103640A1 (fr) 2021-12-08 2022-10-28 Procédé et appareil permettant de générer un cas test, dispositif électronique et support de stockage

Country Status (2)

Country Link
CN (1) CN116302902A (fr)
WO (1) WO2023103640A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101916225A (zh) * 2010-09-02 2010-12-15 于秀山 图形用户界面软件功能覆盖测试方法
CN102360331A (zh) * 2011-10-09 2012-02-22 中国航空无线电电子研究所 基于形式化描述的测试程序自动生成方法
CN103279415A (zh) * 2013-05-27 2013-09-04 哈尔滨工业大学 基于组合测试的嵌入式软件测试方法
CN105260300A (zh) * 2015-09-24 2016-01-20 四川长虹电器股份有限公司 基于会计准则通用分类标准应用平台的业务测试方法
US20180095867A1 (en) * 2016-10-04 2018-04-05 Sap Se Software testing with minimized test suite
CN111752833A (zh) * 2020-06-23 2020-10-09 南京领行科技股份有限公司 一种软件质量体系准出方法、装置、服务器及存储介质

Also Published As

Publication number Publication date
CN116302902A (zh) 2023-06-23

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22903064

Country of ref document: EP

Kind code of ref document: A1