US10872034B2 - Method, device and computer program product for executing test cases - Google Patents

Method, device and computer program product for executing test cases

Info

Publication number: US10872034B2
Application number: US16/705,686
Other versions: US20200117586A1 (en)
Authority: US (United States)
Prior art keywords: test, cases, subsets, case, executing
Inventors: Shuo Lv, Quanhong Wang
Current assignee: EMC Corp (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: EMC IP Holding Co LLC
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)

Event history:
  • Application filed by EMC IP Holding Co LLC
  • Assigned to EMC IP Holding Company LLC (assignor: Shuo Lv)
  • Assigned to EMC Corporation (assignor: Quanhong Wang)
  • Assigned to EMC IP Holding Company LLC (assignor: EMC Corporation)
  • Patent security agreement (notes) in favor of The Bank of New York Mellon Trust Company, N.A., as collateral agent (assignors: Dell Products L.P.; EMC IP Holding Company LLC)
  • Security agreement in favor of Credit Suisse AG, Cayman Islands Branch (assignors: Dell Products L.P.; EMC IP Holding Company LLC)
  • Publication of US20200117586A1
  • Security agreement in favor of The Bank of New York Mellon Trust Company, N.A. (assignors: Credant Technologies Inc.; Dell International L.L.C.; Dell Marketing L.P.; Dell Products L.P.; Dell USA L.P.; EMC Corporation; EMC IP Holding Company LLC; Force10 Networks, Inc.; Wyse Technology L.L.C.)
  • Security interest in favor of The Bank of New York Mellon Trust Company, N.A., as collateral agent (assignors: Dell Products L.P.; EMC Corporation; EMC IP Holding Company LLC)
  • Publication of US10872034B2
  • Application granted
  • Release of security interest at reel 052243, frame 0773, by Credit Suisse AG, Cayman Islands Branch (to EMC IP Holding Company LLC and Dell Products L.P.)
  • Release of security interest in patents previously recorded at reel/frame 052216/0758, by The Bank of New York Mellon Trust Company, N.A., as notes collateral agent (to EMC IP Holding Company LLC and Dell Products L.P.)
  • Release of security interest in patents previously recorded at reel/frame 053311/0169, by The Bank of New York Mellon Trust Company, N.A., as notes collateral agent (to EMC Corporation, EMC IP Holding Company LLC and Dell Products L.P.)

Classifications

    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites
    • G06F 11/3612: Software analysis for verifying properties of programs by runtime analysis
    • G06F 11/3664: Environments for testing or debugging software
    • G06F 11/3684: Test management for test design, e.g. generating new test cases
    • G06N 5/02: Knowledge representation; symbolic representation

    (The G06F entries all fall under G06F 11/36, "Preventing errors by testing or debugging software"; the G06N entry falls under computing arrangements using knowledge-based models.)


Abstract

A method and device for executing test cases involve obtaining a set of test cases to be executed and determining, based on a knowledge base, a test platform type and a test script associated with each test case in the set. The set of test cases may be divided into a plurality of test subsets, or test suites, based on the test platform type, and the test cases in each test subset are executed using the respective test environment and test script. The test suites may be generated automatically based on the knowledge base, with a respective test environment and test script used to execute each suite. Automatic generation and execution of the test suites can improve the efficiency of executing test cases.

Description

RELATED APPLICATIONS
This application is a continuation of U.S. application Ser. No. 16/173,077, filed Oct. 29, 2018, which claims priority from Chinese Patent Application Number CN 201711025184.7, filed on Oct. 27, 2017 at the State Intellectual Property Office, China, titled "METHOD, DEVICE AND COMPUTER PROGRAM PRODUCT FOR PERFORMING TEST CASES," the contents of which are incorporated herein by reference in their entirety.
BACKGROUND
Embodiments of the present disclosure generally relate to the field of testing, and more specifically to a method, a device and a computer program product for executing test cases.
A test case refers to a group of test inputs, execution conditions and expected results compiled for a specific purpose, so as to test whether a certain component, module or function satisfies a predetermined requirement. To execute a test case automatically, it is necessary to write a corresponding test script, which usually takes the form of a series of program instructions. Generally, the test script is written in a script language, that is, a programming language, such as Ruby, Perl, Python, Shell or C#, that is interpreted upon execution.
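By way of a concrete illustration only (the disclosure does not prescribe any data structure), a test case can be pictured as a small record tying test inputs, execution conditions and expected results to an executable script; all field names and values below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    """Illustrative record: a test case ties inputs, execution
    conditions and expected results to an executable test script."""
    name: str         # e.g. "Test 1"
    inputs: dict      # test inputs fed to the component under test
    conditions: dict  # execution conditions (platform, configuration, ...)
    expected: dict    # expected results to compare against
    script: str       # path to the test script, e.g. a Ruby file

case = TestCase(
    name="Test 1",
    inputs={"volume_size_gb": 10},   # hypothetical input
    conditions={"platform": "AAA"},
    expected={"status": "pass"},
    script="/bin/test1.rb",          # hypothetical path
)
```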
Generally, each object to be tested comprises a plurality of components, and one or more test cases may exist for each component. Hence, a plurality of test cases, such as a set of test cases, usually need to be executed for the object. A test suite refers to a group of associated test cases in the set of test cases, and is also referred to as a test subset. By use of a test suite, a series of test cases serving the same test target, or run under the same running environment, can be executed in combination.
SUMMARY
Embodiments of the present disclosure provide a method, device and a computer program product for executing test cases.
According to one aspect of the present disclosure, there is provided a method for executing a test case. The method comprises: obtaining a set of test cases to be executed, wherein the set of test cases is divided into one or more test subsets; for a test case in the set of test cases, determining a test platform type and a test script associated with the test case based on a knowledge base; determining a test subset to which the test case belongs based on the test platform type; and executing the test case in the test subset based on the test script.
According to another aspect of the present disclosure, there is provided a device for executing a test case. The device comprises a processing unit and a memory coupled to the processing unit and storing instructions thereon. The instructions, when executed by the processing unit, perform the following acts: obtaining a set of test cases to be executed, wherein the set of test cases is divided into one or more test subsets; for a test case in the set of test cases, determining a test platform type and a test script associated with the test case based on a knowledge base; determining a test subset to which the test case belongs based on the test platform type; and executing the test case in the test subset based on the test script.
According to a further aspect of the present disclosure, there is provided a computer program product that is tangibly stored on a non-transient computer readable medium and includes machine-executable instructions. The machine-executable instructions, when executed, cause a computer to execute the method according to embodiments of the present disclosure.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other features, advantages and aspects of embodiments of the present disclosure will be made more apparent by describing the present disclosure in more detail with reference to the figures. In the figures, the same or like reference signs represent the same or like elements, wherein:
FIG. 1 illustrates a schematic view of architecture of a system for executing a test case according to an embodiment of the present disclosure;
FIG. 2 illustrates a flow chart of a method for executing a test case according to an embodiment of the present disclosure;
FIG. 3 illustrates a schematic diagram of generation of a test suite according to an embodiment of the present disclosure;
FIG. 4 illustrates a schematic diagram of running information for a test script according to an embodiment of the present disclosure;
FIG. 5 illustrates a schematic diagram for creating a test case table in a knowledge base according to an embodiment of the present disclosure;
FIG. 6 illustrates a schematic diagram for creating a test case execution record table in a knowledge base according to an embodiment of the present disclosure;
FIG. 7 illustrates a schematic diagram of retry of test cases according to an embodiment of the present disclosure; and
FIG. 8 illustrates a block diagram of a device that may be used to implement an embodiment of the present disclosure.
DETAILED DESCRIPTION OF EMBODIMENTS
Preferred embodiments of the present disclosure will be described below in more detail with reference to the figures. Although the figures show preferred embodiments of the present disclosure, it should be appreciated that the present disclosure may be implemented in various forms and should not be limited by the embodiments stated herein. On the contrary, these embodiments are provided to make the present disclosure more thorough and complete, and to convey the scope of the present disclosure fully to those skilled in the art.
As used herein, the term "include" and its variants are to be read as open terms that mean "include, but is not limited to." Unless otherwise specified, the term "or" represents "and/or." The term "based on" is to be read as "based at least in part on." The term "an implementation" is to be read as "at least one implementation," and the term "another implementation" as "at least one other implementation." The terms "first" and "second" may refer to different or identical objects. In addition, the term "test suite" described herein represents a group of relevant test cases in the test set; such a group may be formed based on a predetermined policy, and may also be called a test subset, test subtask or test sub-job. Other explicit or implicit definitions may further be included hereunder.
Conventionally, for a test task including a plurality of test cases, a tester needs to manually combine one or more test cases to generate various test suites, manually collect the test scripts, and then execute the tests. During testing, the tester further needs to manually handle test exceptions and collect test results one by one. Since the number of test cases may be very large and the conventional test procedure is complicated, a great deal of manual effort is required and testing efficiency is very low.
Embodiments of the present disclosure propose a solution for executing test cases. According to embodiments described herein, after a set of test cases to be executed is obtained, the test platform type of each test case is determined based on a knowledge base, thereby automatically dividing the set of test cases into a plurality of test suites. Then, a respective test environment and test script are used to automatically execute each test suite. Accordingly, embodiments of the present disclosure can implement automatic generation and execution of test suites, and can improve the efficiency of test execution.
In addition, information related to each test case is stored in the knowledge base according to embodiments of the present disclosure, so the set of test cases to be executed can be obtained from the knowledge base very conveniently. Meanwhile, in embodiments of the present disclosure, when the test suites are divided, both the test platform types of the test cases and the total runtime of each test suite are taken into account, so that the runtimes of the test suites do not differ greatly, enabling parallel processing among a plurality of test suites.
A regression test refers to testing performed again after existing code is amended, to confirm that the amendment neither introduces new errors nor causes errors in other code. Regression testing aims to ensure that old components or modules still operate normally when new components or modules are added to the system, and to ensure the steady advancement of new system versions. Since the workload of regression testing is very large and test cases that were already tested previously are used again, applying the technology of embodiments of the present disclosure to regression testing can substantially improve test efficiency. Those skilled in the art should appreciate that although some implementations of the present disclosure are applied to regression testing, embodiments of the present disclosure may also be applied to ordinary hardware and software testing.
Reference is made below to FIG. 1 through FIG. 8 to illustrate the basic principles and several example implementations of the present disclosure. It should be appreciated that these example embodiments are presented to enable those skilled in the art to better understand and implement embodiments of the present disclosure, not to limit the scope of the present disclosure in any manner.
FIG. 1 illustrates a schematic view of the architecture of a system 100 for executing test cases according to an embodiment of the present disclosure. As shown in FIG. 1, the system 100 comprises a device 110 and a knowledge base 120. The device 110 may be any of various computing devices with processing capability, including but not limited to a personal computer and a server. An example implementation of the device 110 is described with reference to FIG. 8. The knowledge base 120 may be a database or a server; it may be local to the device 110 or remote from it (e.g., in the cloud), and may communicate with the device 110 via a network. Example implementations of tables in the knowledge base are described with reference to FIG. 5, FIG. 6 and Table 1.
Referring to FIG. 1, the device 110 comprises a test suite generation module 130 and a test suite execution module 140. The test suite generation module 130 may dynamically generate one or more test suites according to the set of test cases (an example implementation of dynamically generating a plurality of test suites is described below with reference to FIG. 3). The test suite execution module 140 may include one or more test execution engines for executing the test suites, as will be described below in detail. When a plurality of test suites are present (each test suite may include one or more test cases), the test execution engines in the test suite execution module 140 may execute the test suites in parallel, and each test execution engine may execute all test cases in its test suite in series.
As shown by arrow 150, the test suite generation module 130 may generate a plurality of test suites based on the knowledge base. As shown by arrow 160, the test suite generation module 130 may transfer the generated plurality of test suites to the test suite execution module 140 for execution. As shown by arrow 170, the test suite execution module 140 executes the plurality of test suites in parallel by querying the knowledge base 120, and may write a test execution record and a test report into the knowledge base 120.
FIG. 2 illustrates a flow chart of a method 200 for executing test cases according to an embodiment of the present disclosure. Those skilled in the art should appreciate that the method 200 may be executed by the device 110 described above with reference to FIG. 1 or by the device 800 described below with reference to FIG. 8.
At 202, a set of test cases to be executed is obtained, and the set of test cases is divided into one or more test subsets (namely, test suites) based on a predetermined policy. For example, the test suite generation module 130 may determine the set of test cases to be executed according to the user's selection, e.g., the user may directly select some test cases. In some embodiments, the test suite generation module 130 may further obtain the set of test cases based on a set of components to be tested, by querying the knowledge base 120. For example, the user may select the components to be tested. Table 1 below shows an example data table in the knowledge base 120.
TABLE 1
Example data table in the knowledge base

Test case name | Component name | Test platform type | Test script address | Runtime
Test 1         | Component X    | AAA                | /.../bin/test1.rb   | 182 s
Test 2         | Component X    | AAA                | /.../bin/test2.rb   | 255 s
Test 3         | Component X    | BBB                | /.../bin/test3.rb   |  72 s
Test 4         | Component X    | AAA                | /.../bin/test4.rb   |  68 s
Test 5         | Component Y    | BBB                | /.../bin/test5.rb   | 165 s
Test 6         | Component Y    | CCC                | /.../bin/test6.rb   | 230 s
Test 7         | Component Z    | DDD                | /.../bin/test7.rb   | 168 s
Test 8         | Component Z    | AAA                | /.../bin/test8.rb   | 293 s
Test 9         | Component Z    | AAA                | /.../bin/test9.rb   | 117 s
Test 10        | Component Z    | BBB                | /.../bin/test10.rb  |  85 s

The example data table in Table 1 includes the name of each test case, the name of the component to which it belongs, the required test platform type, the address of the corresponding test script, and the runtime spent executing the test. The test cases in the table may be pre-collected test cases. For example, in an automatic regression testing scenario, the test cases in the table may be historical test cases which have already been executed.
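For illustration only, obtaining the set of test cases for user-selected components might look like the sketch below, assuming the knowledge base is exposed as a SQL table mirroring Table 1 (the table name test_case, the column names, and the sqlite3 backend are all assumptions, not part of the disclosure):

```python
import sqlite3

def get_test_cases(conn: sqlite3.Connection, components: list[str]) -> list[dict]:
    """Query the knowledge base for all test cases belonging to the
    user-selected components (cf. step 202 and Table 1)."""
    placeholders = ", ".join("?" * len(components))
    rows = conn.execute(
        "SELECT case_name, component, platform_type, script, runtime_s "
        f"FROM test_case WHERE component IN ({placeholders})",
        components,
    ).fetchall()
    return [
        {"case_name": r[0], "component": r[1], "platform_type": r[2],
         "script": r[3], "runtime_s": r[4]}
        for r in rows
    ]

# Selecting Component X and Component Y would return TEST 1 through
# TEST 6 from Table 1.
```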
At 204, for a test case in the set of test cases, a test platform type and a test script associated with the test case are determined based on the knowledge base. For example, the test suite generation module 130 may determine the test platform type, test script address, component name and running information of each test case by querying the knowledge base 120 (a reference implementation of running information is described below with reference to FIG. 4). The knowledge base 120 may store a plurality of test cases written in various programming languages, and the test platform varies from test case to test case; here, the test platform is the running platform of a test case, for example a hardware system and/or an operating system.
At 206, the test subset (namely, test suite) to which the test case belongs is determined according to the test platform type. Optionally, the test suite generation module 130 may group test cases of the same test platform type into one test suite. Alternatively, when the test suites are divided, both the test platform types of the test cases and the total runtime of each test suite may be considered, so that the runtimes of the test suites do not differ greatly and the suites are suitable for parallel processing. In addition, since some test cases may damage the system during execution, such test cases cannot share a test platform. Hence, the division of test suites needs to consider not only the improvement of test efficiency but also other internal requirements, as sketched below.
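One way to realize such a division, offered purely as a sketch (the disclosure does not prescribe a particular algorithm): group the cases by test platform type, then pack each group greedily, longest case first, so that no suite's total runtime exceeds a target. The function name and the target_runtime_s parameter are assumptions:

```python
from collections import defaultdict

def divide_into_suites(cases: list[dict], target_runtime_s: int = 300) -> list[list[dict]]:
    """Divide a set of test cases into test suites (cf. step 206):
    cases of different platform types never share a suite, and each
    suite's total runtime stays near target_runtime_s so that suites
    can run in parallel with comparable durations."""
    by_platform = defaultdict(list)
    for case in cases:
        by_platform[case["platform_type"]].append(case)

    suites = []
    for platform_cases in by_platform.values():
        # Longest-first greedy packing keeps suite runtimes balanced.
        platform_cases.sort(key=lambda c: c["runtime_s"], reverse=True)
        open_suites = []  # [total_runtime, suite] pairs for this platform
        for case in platform_cases:
            for entry in open_suites:
                if entry[0] + case["runtime_s"] <= target_runtime_s:
                    entry[0] += case["runtime_s"]
                    entry[1].append(case)
                    break
            else:
                open_suites.append([case["runtime_s"], [case]])
        suites.extend(suite for _, suite in open_suites)
    return suites
```

With the Table 1 rows for Component X and Component Y and a 300 s target, this sketch happens to reproduce the division of FIG. 3: {TEST 2}, {TEST 1, TEST 4}, {TEST 3, TEST 5} and {TEST 6}.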
At 208, the test case(s) in the test subset are executed based on the test script(s). For example, the test suite execution module 140 may instantiate a test environment for each test platform type and, for each test suite, sequentially execute the test scripts associated with the test cases in that suite.
In some embodiments, the test suite execution module 140 may further obtain running parameters. While a test is running, the test suite execution module 140 performs parameter substitution, and automatically carries out exception processing and fault retry. In some embodiments, after execution of the test subset completes, the test suite execution module 140 may collect test results from the running log, and generate a test report based on the collected test results.
In some embodiments, a plurality of test execution engines in the test suite execution module 140 execute a plurality of test suites in parallel, thereby improving overall test execution speed. In this way, the test suites are executed in parallel, while the test cases of each test suite are executed in series, as sketched below.
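A minimal execution sketch under the following assumptions: each test script can be launched as an operating-system process (ruby is invoked here only because the sample scripts in Table 1 are Ruby files), and per-platform test environment instantiation is elided. One thread plays the role of one test execution engine:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_suite(suite: list[dict]) -> list[tuple[str, bool]]:
    """One test execution engine: execute the test cases of a single
    suite serially (cf. step 208)."""
    results = []
    for case in suite:
        # Each test script runs as an independent process.
        proc = subprocess.run(["ruby", case["script"]], capture_output=True)
        results.append((case["case_name"], proc.returncode == 0))
    return results

def run_all(suites: list[list[dict]]) -> list[list[tuple[str, bool]]]:
    """Execute the suites in parallel, one engine (thread) per suite."""
    with ThreadPoolExecutor(max_workers=max(1, len(suites))) as pool:
        return list(pool.map(run_suite, suites))
```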
FIG. 3 illustrates a schematic diagram 300 of generating test suites according to an embodiment of the present disclosure. It should be appreciated that the knowledge base 302 in FIG. 3 may be the knowledge base 120 described in FIG. 1, and it is assumed that the knowledge base 302 stores the data table shown in Table 1. As shown in FIG. 3, the knowledge base 302 includes a plurality of test cases, and each test case is stored in association with the name of the component to which it corresponds. For example, TEST 1 is performed for Component X, and TEST 5 is performed for Component Y.
When the tester needs to test Component X and Component Y (e.g., the user selects Component X and Component Y through a visualized interface), all test cases associated with Component X and Component Y are obtained to form a set of test cases 304, including TEST 1, TEST 2, TEST 3, TEST 4, TEST 5 and TEST 6. By querying the knowledge base 302, the test platform type needed by each test case can be determined. As shown at 306, TEST 1, TEST 2 and TEST 4 run on the AAA platform, TEST 3 and TEST 5 run on the BBB platform, and TEST 6 runs on the CCC platform, wherein the platforms AAA, BBB and CCC may be, for example, different hardware systems and/or operating systems.
As shown in FIG. 3, based on the test platform types and the total runtime of each test suite, the set of test cases 304 may be divided into a plurality of test suites, namely a test suite 308 including TEST 1 and TEST 4, a test suite 310 including TEST 2, a test suite 312 including TEST 3 and TEST 5, and a test suite 314 including TEST 6 (if TEST 1, TEST 2 and TEST 4 were all grouped into one test suite, the runtime of that test suite would be much larger than those of the other test suites). It should be appreciated that the total runtime of the test suites may also be disregarded, with only the test platform type considered; in that case, the test suite 308 and the test suite 310, both for test platform type AAA, would be merged into one test suite.
It should be appreciated that each test script is run as an independent process with corresponding parameters. While a test script runs, test script information and certain control information (e.g., timeout information) are needed. In some embodiments, the test script information may include the test case name, the name of the component to which the test case belongs, the storage address or location of the test script, and running parameters including variables to be replaced during execution. In some embodiments, the control information may include pass or abort information for key test cases, minimum version control, maximum version control, log verbosity level, and other information used for control. This information may be combined to form the running information (also called the "hash line") for the test script, which may be interpreted by the test suite execution module 140 during execution. In some embodiments, deduplication may be performed on the running information before the set of test cases is divided into a plurality of test suites.
FIG. 4 illustrates a schematic diagram of running information for a test script according to an embodiment of the present disclosure. For example, the test suite generation module 130 may obtain the running information of the test script based on the knowledge base 120. As shown in FIG. 4, the running information of a test script may include the component name (component field), test case name (testname field), test script address or location (script field), running parameters (params field), minimum version control (minlimitver field), timeout information (timeout field), and so on.
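As a sketch, the running information might be assembled into a small mapping and deduplicated before suite division. The field names follow FIG. 4; the default values and the choice of deduplication key are assumptions:

```python
def build_running_info(case: dict) -> dict:
    """Combine test script information and control information into the
    running information ("hash line") for one test script (cf. FIG. 4)."""
    return {
        "component":   case["component"],
        "testname":    case["case_name"],
        "script":      case["script"],
        "params":      case.get("params", {}),    # variables substituted at run time
        "minlimitver": case.get("minlimitver"),   # minimum version control
        "timeout":     case.get("timeout", 600),  # seconds; illustrative default
    }

def dedupe(infos: list[dict]) -> list[dict]:
    """Drop duplicate running information before the set of test cases
    is divided into test suites."""
    seen, unique = set(), []
    for info in infos:
        key = (info["testname"], info["script"])  # assumed deduplication key
        if key not in seen:
            seen.add(key)
            unique.append(info)
    return unique
```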
FIG. 5 illustrates a schematic diagram of creating a test case table in a knowledge base according to an embodiment of the present disclosure. During automatic generation of test suites, the test suite generation module 130 may store, in the knowledge base 120, a mapping relationship between test case name and running information (hash line). As shown in FIG. 5, the test case table may include the test case name (case_name field), component name (component field), running information (hash_line field), and test platform type (platform_type field), wherein the test case name is the primary key.
FIG. 6 illustrates a schematic diagram of creating a test case execution record table in a knowledge base according to an embodiment of the present disclosure. After completing execution of each test suite, the test suite execution module 140 analyzes the generated test log and stores a test record for each test case into the knowledge base. As shown in FIG. 6, the test case execution record table may include the test case name (case_name field), version number (build_no field), post time (post_time field), test status (status field), running duration (duration field), command ID (cmd_id field), log details (log_details field), digest identification (digest_id field), and job name (job_name field).
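For illustration, the two knowledge-base tables of FIG. 5 and FIG. 6 could be created as follows, with SQLite standing in for the knowledge base; the column types are assumptions inferred from the field descriptions:

```python
import sqlite3

conn = sqlite3.connect("knowledge_base.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS test_case (                    -- cf. FIG. 5
    case_name     TEXT PRIMARY KEY,  -- test case name is the primary key
    component     TEXT,              -- component name
    hash_line     TEXT,              -- running information
    platform_type TEXT               -- test platform type
);
CREATE TABLE IF NOT EXISTS test_case_execution_record (   -- cf. FIG. 6
    case_name   TEXT,     -- test case name
    build_no    TEXT,     -- version number
    post_time   TEXT,     -- post time
    status      TEXT,     -- test status (pass/fail)
    duration    INTEGER,  -- running duration, seconds
    cmd_id      TEXT,     -- command ID
    log_details TEXT,     -- log details
    digest_id   TEXT,     -- digest identification
    job_name    TEXT      -- job name
);
""")
conn.commit()
```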
After execution of all test suites completes, the system retrieves the test results and generates a final report, which may then be sent to the user in the form of a visualized document (e.g., an HTML file). For example, the content of the test report may include the test result status (pass or fail) of each test case, log details, test execution duration, fault situations, and so on. In some embodiments, fault information from a fault tracking tool may be stored in the knowledge base. In this way, when the same fault happens again, fault-related reference information can be retrieved from the knowledge base.
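Report generation might then be sketched as below, rendering the collected execution records into a simple HTML table; the record field names match the execution record table above, while the layout is an assumption:

```python
def render_report(records: list[dict]) -> str:
    """Render collected test results as a visualized HTML document
    listing status and duration per test case."""
    rows = "\n".join(
        f"<tr><td>{r['case_name']}</td><td>{r['status']}</td>"
        f"<td>{r['duration']} s</td></tr>"
        for r in records
    )
    return (
        "<html><body><h1>Test Report</h1>"
        "<table><tr><th>Test case</th><th>Status</th><th>Duration</th></tr>"
        + rows + "</table></body></html>"
    )

# html = render_report(records); the resulting file could then be sent
# to the user.
```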
During system testing, execution of certain test cases may depend on other test cases. Hence, if faults happen during the test, the order of the failed test cases may be adjusted before they are retried. For example, in response to a plurality of faults happening during execution of the test subset, the test suite execution module 140 may adjust the order so that a first test case associated with a first fault of the plurality of faults is executed after a second test case associated with a second fault. Then, in the adjusted order, the test suite execution module 140 may execute the test cases associated with the plurality of faults again.
FIG. 7 illustrates a schematic diagram 700 of retry of test cases according to an embodiment of the present disclosure. As shown in FIG. 7, on a time axis 780, during the running of the first round of tests from a time point 781 to a time point 785, seven test cases 710, 720, 730, 740, 750, 760 and 770 in a certain test suite are executed sequentially. However, at the time point 782, a test case 720 fails; at a time point 783, a test case 740 fails; and at a time point 784, a test case 760 fails.
According to embodiments of the present disclosure, the execution order of the failed test cases may be adjusted, and the tests then executed again. For example, the test case 720 that failed earliest is adjusted to be executed last. Further referring to FIG. 7, during a second round of execution from a time point 785 to a time point 786, the test case 740, the test case 760 and the test case 720 are executed in the adjusted order, thereby enabling all three test cases to pass successfully. Those skilled in the art should appreciate that if faults still happen after the adjustment, the above procedure may be repeated to further adjust the running order of the failed test cases.
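The retry behavior of FIG. 7 might be sketched as follows: each round, the earliest failure is rotated to the end of the retry order and the failed cases are executed again. The rotation rule, the stop condition and the run_single helper are assumptions consistent with the figure:

```python
import subprocess

def run_single(case: dict) -> bool:
    """Assumed helper: run one test case and report pass/fail."""
    proc = subprocess.run(["ruby", case["script"]], capture_output=True)
    return proc.returncode == 0

def retry_failed(failed: list[dict], max_rounds: int = 3) -> list[dict]:
    """Retry failed test cases, moving the earliest failure to the end
    of the order each round (cf. FIG. 7, where test case 720 is retried
    last and all three failed cases then pass)."""
    for _ in range(max_rounds):
        if not failed:
            break
        # Earliest failure goes last: [720, 740, 760] -> [740, 760, 720].
        reordered = failed[1:] + failed[:1]
        failed = [case for case in reordered if not run_single(case)]
    return failed  # test cases still failing after all rounds, if any
```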
Hence, according to the techniques of the embodiments of the present disclosure, the user only needs to select the components or test cases to be tested, regardless of how the test suites are specifically generated, thereby implementing automatic generation and execution of the test suites. Accordingly, embodiments of the present disclosure can improve test running efficiency, implement reliable fault processing, and output a detailed test report.
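Combining the above, the suite generation step can be sketched end to end: given the components selected by the user, the matching test cases are looked up in the knowledge base and grouped into platform-specific test subsets, loosely following the selection and grouping steps recited in claim 1. The runtime-balancing refinement is omitted for brevity, and all names are illustrative:

import sqlite3
from collections import defaultdict

def generate_test_suites(components):
    """Select test cases for the given components and group them into
    platform-specific test subsets (a sketch, not the disclosed code)."""
    conn = sqlite3.connect("knowledge_base.db")  # hypothetical file name
    placeholders = ",".join("?" * len(components))
    rows = conn.execute(
        f"SELECT case_name, test_platform_type FROM test_case "
        f"WHERE component IN ({placeholders})",
        tuple(components),
    ).fetchall()
    subsets = defaultdict(list)
    for case_name, platform_type in rows:
        subsets[platform_type].append(case_name)
    # Each subset can then be dispatched to a test platform of its type
    # and the subsets executed in parallel.
    return dict(subsets)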
FIG. 8 illustrates a block diagram of a device 800 which can be used to implement embodiments of the present disclosure. As shown in the figure, the device 800 comprises a central processing unit (CPU) 801 that may perform various appropriate actions and processing based on computer program instructions stored in a read-only memory (ROM) 802 or computer program instructions loaded from a memory unit 808 into a random access memory (RAM) 803. The RAM 803 further stores various programs and data needed for the operation of the device 800. The CPU 801, the ROM 802 and the RAM 803 are connected to one another via a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
Various components in the device 800 are connected to the I/O interface 805, including: an input unit 806 such as a keyboard, a mouse and the like; an output unit 807 such as various kinds of displays, loudspeakers and the like; a memory unit 808 such as a magnetic disk, an optical disk and the like; and a communication unit 809 such as a network card, a modem, a wireless communication transceiver and the like. The communication unit 809 allows the device 800 to exchange information/data with other devices through a computer network such as the Internet and/or various kinds of telecommunications networks.
Various processes and processing described above may be executed by the CPU 801. For example, in some embodiments, the method may be implemented as a computer software program that is tangibly embodied on a machine readable medium, e.g., the memory unit 808. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 800 via the ROM 802 and/or the communication unit 809. When the computer program is loaded into the RAM 803 and executed by the CPU 801, one or more steps of the method described above may be executed.
In some embodiments, the method 200 described above may be implemented as a computer program product. The computer program product may include a computer readable storage medium which carries computer readable program instructions for executing aspects of the present disclosure.
The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer readable program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language, and conventional procedural programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
These computer readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The descriptions of the various embodiments of the present disclosure have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (20)

What is claimed is:
1. A method for testing a multi-component system, comprising:
maintaining a knowledge base containing test case records for corresponding test cases, the test case record for each test case including a component identifier for a component tested by the test case, and a test platform identifier for a test platform type required for the test case; and
based on an identification of a set of components to be tested, using the knowledge base to identify a set of test cases and group the set of test cases into platform-specific test subsets, by:
first selecting those test cases whose test case records contain component identifiers for the components to be tested;
then grouping the selected test cases into the test subsets based on test platform types according to the test platform identifiers; and
executing the test subsets in parallel on respective test platforms of the required types.
2. The method according to claim 1, wherein the test case record for each test case further includes runtime information for the test case, and wherein the grouping of the selected test cases is further based on runtime according to the runtime information.
3. The method according to claim 2, wherein the grouping of the selected test cases into the test subsets is performed such that differences in runtime among all of the test subsets are within a threshold range.
4. The method according to claim 3, wherein the grouping produces two or more test subsets for a given test platform type in order to bring the respective runtimes of the test subsets into the threshold range.
5. The method according to claim 1, wherein the executing the test subsets comprises:
executing, by a plurality of test execution engines, the plurality of test subsets in parallel.
6. The method according to claim 1, wherein the executing the test subsets comprises:
instantiating respective test environments for the test platform types based on the test platform identifiers; and
executing, in each test environment, a plurality of test scripts associated with the corresponding test subset sequentially.
7. The method according to claim 6, wherein the executing the plurality of test scripts sequentially comprises:
in response to a plurality of faults including a first fault and a second fault occurring during the execution of the test subset, adjusting a first test case associated with the first fault to be after a second test case associated with the second fault in order; and
re-executing, for the adjusted plurality of faults, test cases associated with the plurality of faults.
8. The method according to claim 1, further comprising:
after completion of the execution of the test subsets, collecting test results from running logs; and
generating a test report based on the collected test results.
9. The method according to claim 1, wherein the method is used for automatic regression test, and the knowledge base is built by collecting historically executed test cases.
10. A device for executing a test case, comprising:
a processing unit; and
a memory coupled to the processing unit and storing instructions thereon, the instructions, when executed by the processing unit, perform the following acts:
maintaining a knowledge base containing test case records for corresponding test cases, the test case record for each test case including a component identifier for a component tested by the test case, and a test platform identifier for a test platform type required for the test case; and
based on an identification of a set of components to be tested, using the knowledge base to identify a set of test cases and group the set of test cases into platform-specific test subsets, by:
first selecting those test cases whose test case records contain component identifiers for the components to be tested;
then grouping the selected test cases into the test subsets based on test platform types according to the test platform identifiers; and
executing the test subsets in parallel on respective test platforms of the required types.
11. The device according to claim 10, wherein the test case record for each test case further includes runtime information for the test case, and wherein the grouping of the selected test cases is further based on runtime according to the runtime information.
12. The device according to claim 11, wherein the grouping of the selected test cases into the test subsets is performed such that differences in runtime among all of the test subsets are within a threshold range.
13. The device according to claim 12, wherein the grouping produces two or more test subsets for a given test platform type in order to bring the respective runtimes of the test subsets into the threshold range.
14. The device according to claim 10, wherein the executing the test subsets comprises:
executing, by a plurality of test execution engines, the plurality of test subsets in parallel.
15. The device according to claim 10, wherein the executing the test subsets comprises:
instantiating respective test environments for the test platform types based on the test platform identifiers; and
executing, in each test environment, a plurality of test scripts associated with the corresponding test subset sequentially.
16. The device according to claim 15, wherein the executing the plurality of test scripts sequentially comprises:
in response to a plurality of faults including a first fault and a second fault occurring during the execution of the test subset, adjusting a first test case associated with the first fault to be after a second test case associated with the second fault in order; and
re-executing, for the adjusted plurality of faults, test cases associated with the plurality of faults.
17. The device according to claim 10, wherein the acts further comprise:
after completion of the execution of the test subsets, collecting test results from running logs; and
generating a test report based on the collected test results.
18. The device according to claim 10, wherein the acts are used for automatic regression test, and the knowledge base is built by collecting historically executed test cases.
19. A computer program product that is tangibly stored on a non-transient computer readable medium and includes machine-executable instructions for testing a multi-component system, the testing comprising:
maintaining a knowledge base containing test case records for corresponding test cases, the test case record for each test case including a component identifier for a component tested by the test case, and a test platform identifier for a test platform type required for the test case; and
based on an identification of a set of components to be tested, using the knowledge base to identify a set of test cases and group the set of test cases into platform-specific test subsets, by:
first selecting those test cases whose test case records contain component identifiers for the components to be tested;
then grouping the selected test cases into the test subsets based on test platform types according to the test platform identifiers; and
executing the test subsets in parallel on respective test platforms of the required types.
20. The computer program product of claim 19, wherein the test case record for each test case further includes runtime information for the test case, and wherein the grouping of the selected test cases is further based on runtime according to the runtime information.
US16/705,686 2017-10-27 2019-12-06 Method, device and computer program product for executing test cases Active US10872034B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/705,686 US10872034B2 (en) 2017-10-27 2019-12-06 Method, device and computer program product for executing test cases

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
CN201711025184 2017-10-27
CN201711025184.7A CN109726093B (en) 2017-10-27 2017-10-27 Method, apparatus and computer program product for executing test cases
CN201711025184.7 2017-10-27
US16/173,077 US10534699B2 (en) 2017-10-27 2018-10-29 Method, device and computer program product for executing test cases
US16/705,686 US10872034B2 (en) 2017-10-27 2019-12-06 Method, device and computer program product for executing test cases

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/173,077 Continuation US10534699B2 (en) 2017-10-27 2018-10-29 Method, device and computer program product for executing test cases

Publications (2)

Publication Number Publication Date
US20200117586A1 US20200117586A1 (en) 2020-04-16
US10872034B2 true US10872034B2 (en) 2020-12-22

Family

ID=66243024

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/173,077 Active US10534699B2 (en) 2017-10-27 2018-10-29 Method, device and computer program product for executing test cases
US16/705,686 Active US10872034B2 (en) 2017-10-27 2019-12-06 Method, device and computer program product for executing test cases

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/173,077 Active US10534699B2 (en) 2017-10-27 2018-10-29 Method, device and computer program product for executing test cases

Country Status (2)

Country Link
US (2) US10534699B2 (en)
CN (1) CN109726093B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11099837B2 (en) 2019-10-29 2021-08-24 EMC IP Holding Company LLC Providing build avoidance without requiring local source code

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110297757A (en) * 2019-05-16 2019-10-01 平安科技(深圳)有限公司 Test cases management method, device, equipment and computer readable storage medium
CN110297767B (en) * 2019-06-03 2024-02-23 平安科技(深圳)有限公司 Automatic execution method, device, equipment and storage medium for test cases
CN112180890B (en) * 2019-07-05 2022-01-07 北京新能源汽车股份有限公司 Test case generation method, device and equipment
CN110377520B (en) * 2019-07-22 2024-03-15 中国工商银行股份有限公司 Transaction scenario testing method and device, electronic equipment and readable storage medium
CN111208798B (en) * 2019-12-26 2021-07-27 深圳市优必选科技股份有限公司 Robot testing method and device, electronic equipment and storage medium
CN111246286B (en) * 2020-01-10 2022-06-10 北京百度网讯科技有限公司 Test case obtaining method and device and electronic equipment
CN113377572A (en) * 2020-02-25 2021-09-10 伊姆西Ip控股有限责任公司 Method, electronic device and computer program product for managing backup jobs
US11182282B2 (en) * 2020-02-28 2021-11-23 International Business Machines Corporation Executing tests in deterministic order
CN111651350B (en) * 2020-05-29 2024-03-08 泰康保险集团股份有限公司 Test case processing method, device, equipment and computer readable storage medium
CN111813685B (en) * 2020-07-17 2023-12-05 京东科技控股股份有限公司 Automatic test method and device
US11782823B2 (en) * 2020-07-27 2023-10-10 International Business Machines Corporation Automatically capturing weather data during engineering tests
CN112015638A (en) * 2020-07-30 2020-12-01 西安雷风电子科技有限公司 Automatic testing method and system
CN114268569B (en) * 2020-09-16 2023-10-31 中盈优创资讯科技有限公司 Configurable network operation and maintenance acceptance test method and device
CN112286806B (en) * 2020-10-28 2023-10-03 成都佰维存储科技有限公司 Automatic test method and device, storage medium and electronic equipment
CN112416706A (en) * 2020-11-16 2021-02-26 珠海格力电器股份有限公司 Power consumption testing method, device and system, storage medium and electronic device
CN112732556A (en) * 2020-12-29 2021-04-30 北京浪潮数据技术有限公司 Automatic testing method, device, equipment and storage medium for distributed system
US11537508B1 (en) 2021-02-23 2022-12-27 State Farm Mutual Automobile Insurance Company Software testing in parallel threads with a record-locking database
US11714745B1 (en) 2021-02-23 2023-08-01 State Farm Mutual Automobile Insurance Company Software testing in parallel with different database instances
US11816023B1 (en) 2021-02-23 2023-11-14 State Farm Mutual Automobile Insurance Company Test conflict guard for parallel software testing
US11720482B1 (en) 2021-02-23 2023-08-08 State Farm Mutual Automobile Insurance Company Retrying failed test cases in software testing using parallel threads
CN113297083B (en) * 2021-05-27 2022-08-19 山东云海国创云计算装备产业创新中心有限公司 Cross-platform IC test method, device, equipment and medium
US20230325306A1 (en) * 2022-04-08 2023-10-12 Micro Focus Llc Machine-based source code assessment
CN114760235B (en) * 2022-04-24 2024-03-22 青岛海尔科技有限公司 Method and device for executing dial testing task, storage medium and electronic device


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101052020B (en) * 2007-05-21 2010-06-09 中兴通讯股份有限公司 Monitor method and system for automatically measuring executing process
CN102075381A (en) * 2010-12-14 2011-05-25 云海创想信息技术(北京)有限公司 Automatic test platform server and system applied to cloud storage
CN103441900B (en) * 2013-08-27 2016-04-27 上海新炬网络技术有限公司 Centralized cross-platform automatization test system and control method thereof
CN104778118B (en) * 2013-12-30 2018-08-28 深圳键桥通讯技术股份有限公司 The improved method of automatization testing technique
US10049031B2 (en) * 2014-12-09 2018-08-14 International Business Machines Corporation Correlation of violating change sets in regression testing of computer software
CN107168880A (en) * 2017-05-31 2017-09-15 中标软件有限公司 Virtual machine method of testing and instrument

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100083053A1 (en) * 2008-10-01 2010-04-01 Narayanan Ajikumar Thaitharani System and method for generating an orthogonal array for software testing
US8949793B1 (en) 2012-12-20 2015-02-03 Emc Corporation Test bed design from customer system configurations using machine learning techniques
US9482464B1 (en) 2013-10-18 2016-11-01 EMC IP Holding Company, LLC Controlling temperature of a test chamber which is equipped with a refrigerant cooling subsystem and a liquid nitrogen cooling subsystem
US20150378873A1 (en) * 2014-06-25 2015-12-31 Hcl Technologies Ltd Automatically recommending test suite from historical data based on randomized evolutionary techniques
US20160077956A1 (en) * 2014-09-11 2016-03-17 Wipro Limited System and method for automating testing of software
US10169206B2 (en) * 2016-11-15 2019-01-01 Accenture Global Solutions Limited Simultaneous multi-platform testing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Kevin P. Twomey, et al.; "Testing Electronic Products Using Smart Allocation of Test Resources via Resource Mappings," U.S. Appl. No. 14/578,902, filed Dec. 22, 2014.


Also Published As

Publication number Publication date
US10534699B2 (en) 2020-01-14
US20200117586A1 (en) 2020-04-16
CN109726093B (en) 2022-03-22
CN109726093A (en) 2019-05-07
US20190129833A1 (en) 2019-05-02

Similar Documents

Publication Publication Date Title
US10872034B2 (en) Method, device and computer program product for executing test cases
US10552301B2 (en) Completing functional testing
US11144439B2 (en) Emulation-based testing of a microservices architecture
US8990778B1 (en) Shadow test replay service
US10565096B2 (en) Generation of test scenarios based on risk analysis
US9836388B1 (en) Software testing environment that includes a duplicating proxy service
US7664986B2 (en) System and method for determining fault isolation in an enterprise computing system
CN109800258B (en) Data file deployment method, device, computer equipment and storage medium
US20150347212A1 (en) Error classification in a computing system
KR20140043081A (en) Application security testing
US20180357143A1 (en) Testing computing devices
US20100218049A1 (en) Method of Creating Signatures for Classifying Program Failures
CN107704369B (en) Operation log recording method, electronic device, storage medium and system
US20150089296A1 (en) Derivation of generalized test cases
US11169910B2 (en) Probabilistic software testing via dynamic graphs
US20150248344A1 (en) Z/os workload mapper for potential problem areas using modules and defect data
CN111309570A (en) Pressure testing method, medium, device and computing equipment
US11663113B2 (en) Real time fault localization using combinatorial test design techniques and test case priority selection
CN112988578A (en) Automatic testing method and device
CN112799939A (en) Incremental code coverage rate testing method and device, storage medium and electronic equipment
US9354962B1 (en) Memory dump file collection and analysis using analysis server and cloud knowledge base
US11656977B2 (en) Automated code checking
CN111290942A (en) Pressure testing method, device and computer readable medium
US20180225165A1 (en) Configurable system wide tests
CN115454856A (en) Multi-application security detection method, device, medium and electronic equipment

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: EMC CORPORATION, MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WANG, QUANHONG;REEL/FRAME:051605/0213

Effective date: 20161125

Owner name: EMC IP HOLDING COMPANY LLC, MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LV, SHUO;REEL/FRAME:051605/0209

Effective date: 20181019

AS Assignment

Owner name: EMC IP HOLDING COMPANY LLC, MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EMC CORPORATION;REEL/FRAME:051779/0001

Effective date: 20160906

AS Assignment

Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS COLLATERAL AGENT, TEXAS

Free format text: PATENT SECURITY AGREEMENT (NOTES);ASSIGNORS:DELL PRODUCTS L.P.;EMC IP HOLDING COMPANY LLC;REEL/FRAME:052216/0758

Effective date: 20200324

AS Assignment

Owner name: CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH, NORTH CAROLINA

Free format text: SECURITY AGREEMENT;ASSIGNORS:DELL PRODUCTS L.P.;EMC IP HOLDING COMPANY LLC;REEL/FRAME:052243/0773

Effective date: 20200326

AS Assignment

Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., TEXAS

Free format text: SECURITY AGREEMENT;ASSIGNORS:CREDANT TECHNOLOGIES INC.;DELL INTERNATIONAL L.L.C.;DELL MARKETING L.P.;AND OTHERS;REEL/FRAME:053546/0001

Effective date: 20200409

AS Assignment

Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS COLLATERAL AGENT, TEXAS

Free format text: SECURITY INTEREST;ASSIGNORS:DELL PRODUCTS L.P.;EMC CORPORATION;EMC IP HOLDING COMPANY LLC;REEL/FRAME:053311/0169

Effective date: 20200603

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: EMC IP HOLDING COMPANY LLC, TEXAS

Free format text: RELEASE OF SECURITY INTEREST AF REEL 052243 FRAME 0773;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058001/0152

Effective date: 20211101

Owner name: DELL PRODUCTS L.P., TEXAS

Free format text: RELEASE OF SECURITY INTEREST AF REEL 052243 FRAME 0773;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058001/0152

Effective date: 20211101

AS Assignment

Owner name: EMC IP HOLDING COMPANY LLC, TEXAS

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (052216/0758);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:060438/0680

Effective date: 20220329

Owner name: DELL PRODUCTS L.P., TEXAS

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (052216/0758);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:060438/0680

Effective date: 20220329

Owner name: EMC IP HOLDING COMPANY LLC, TEXAS

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (053311/0169);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:060438/0742

Effective date: 20220329

Owner name: EMC CORPORATION, MASSACHUSETTS

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (053311/0169);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:060438/0742

Effective date: 20220329

Owner name: DELL PRODUCTS L.P., TEXAS

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (053311/0169);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:060438/0742

Effective date: 20220329