CN116401139A - Software testing method and device - Google Patents

Software testing method and device

Info

Publication number
CN116401139A
Authority
CN
China
Prior art keywords
test
information
function
software testing
test case
Prior art date
Legal status
Pending
Application number
CN202111627026.5A
Other languages
Chinese (zh)
Inventor
付瑶 (Fu Yao)
Current Assignee
Shenyang Jingyi Zhijia Technology Co ltd
Original Assignee
Shenyang Jingyi Zhijia Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenyang Jingyi Zhijia Technology Co ltd
Priority to CN202111627026.5A
Publication of CN116401139A
Pending legal status: Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 - Error detection; Error correction; Monitoring
    • G06F11/36 - Preventing errors by testing or debugging software
    • G06F11/3668 - Software testing
    • G06F11/3672 - Test management
    • G06F11/3688 - Test management for test execution, e.g. scheduling of test suites
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 - Error detection; Error correction; Monitoring
    • G06F11/36 - Preventing errors by testing or debugging software
    • G06F11/3668 - Software testing
    • G06F11/3672 - Test management
    • G06F11/3684 - Test management for test design, e.g. generating new test cases

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention provides a software testing method, a software testing device, and a computer-readable storage medium. The software testing method comprises the following steps: obtaining a test case of a function to be tested; generating a test script according to the test case; executing the test script and acquiring response data of the function to be tested; determining a test result and/or a failure reason of the function to be tested according to the response data; and filling the test result and/or the failure reason into the test case to generate a test report of the function to be tested. By executing these steps, the software testing method can automate the software testing process, thereby improving software testing efficiency, shortening the software testing flow, and reducing the professional requirements for testers.

Description

Software testing method and device
Technical Field
The present invention relates to software testing technology, and in particular, to a software testing method, a software testing device, and a computer readable storage medium.
Background
With the rapid growth of software engineering, demand for software testing keeps increasing. However, existing software testing techniques generally require a tester to manually write a test script, execute the test script, check the test result, and backfill the test results one by one to generate a test report. Such a manually implemented testing scheme suffers from a long and tedious workflow and low testing efficiency, cannot keep up with the growing demand for software testing, and places high professional requirements on testers, resulting in a scarcity of qualified testers and high personnel costs.
In order to overcome the above-mentioned drawbacks of the prior art, there is a need in the art for a software testing technique that improves software testing efficiency, shortens the software testing process, and reduces the professional requirements for testers.
Disclosure of Invention
The following presents a simplified summary of one or more aspects in order to provide a basic understanding of such aspects. This summary is not an extensive overview of all contemplated aspects, and is intended to neither identify key or critical elements of all aspects nor delineate the scope of any or all aspects. Its sole purpose is to present some concepts of one or more aspects in a simplified form as a prelude to the more detailed description that is presented later.
In order to overcome the above-mentioned drawbacks of the prior art, the present invention provides a software testing method, a software testing apparatus, and a computer readable storage medium.
Specifically, the software testing method provided according to the first aspect of the present invention includes the following steps: obtaining a test case of a function to be tested; generating a test script according to the test case; executing the test script and acquiring response data of the function to be tested; determining a test result and/or a failure reason of the function to be tested according to the response data; and filling the test result and/or the failure reason into the test case to generate a test report of the function to be tested. By executing the steps, the software testing method can realize the automation of the software testing process, thereby improving the software testing efficiency, shortening the software testing process and reducing the professional requirements for testers.
In addition, the software testing device provided in the second aspect of the invention comprises a memory and a processor. The processor is connected to the memory and is configured to implement the software testing method provided in the first aspect of the invention. By implementing the software testing method, the software testing device can realize the automation of the software testing process, thereby improving the software testing efficiency, shortening the software testing process and reducing the professional requirements on testers.
Further, the above-described computer-readable storage medium according to the third aspect of the present invention has stored thereon computer instructions. The computer instructions, when executed by a processor, implement the software testing method provided in the first aspect of the present invention. By implementing the software testing method, the computer readable storage medium can realize the automation of the software testing process, thereby improving the software testing efficiency, shortening the software testing process and reducing the professional requirements for testers.
Drawings
The above features and advantages of the present invention will be better understood after reading the detailed description of embodiments of the present disclosure in conjunction with the following drawings. In the drawings, the components are not necessarily to scale and components having similar related features or characteristics may have the same or similar reference numerals.
Fig. 1 illustrates an architecture diagram of a software testing apparatus provided according to some embodiments of the invention.
Fig. 2 illustrates a flow diagram of a software testing method provided in accordance with some embodiments of the invention.
FIG. 3 illustrates a schematic diagram of extracting information from a configuration file, provided in accordance with some embodiments of the invention.
FIG. 4 illustrates a schematic diagram of generating a test script provided in accordance with some embodiments of the invention.
Fig. 5A and 5B are schematic diagrams illustrating test result feedback interfaces provided according to some embodiments of the invention.
FIG. 6 illustrates a schematic diagram of generating a troubleshooting task sheet provided in accordance with some embodiments of the present invention.
Detailed Description
Further advantages and effects of the present invention will become apparent to those skilled in the art from the disclosure of the present specification, which describes embodiments of the invention by way of specific examples. While the invention will be described in connection with preferred embodiments, this is not intended to limit its features to those embodiments. On the contrary, the description of the invention in connection with the embodiments is intended to cover alternatives and modifications that may be encompassed by the claims. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. The invention may, however, be practiced without these specific details. Furthermore, some specific details are omitted from the description in order to avoid obscuring the invention.
In the description of the present invention, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "coupled" are to be construed broadly: a connection may, for example, be fixed, detachable, or integral; it may be mechanical or electrical; and it may be direct, indirect through an intermediate medium, or a communication between two elements. The specific meaning of the above terms in the present invention will be understood by those of ordinary skill in the art on a case-by-case basis.
In addition, the terms "upper", "lower", "left", "right", "top", "bottom", "horizontal", and "vertical" used in the following description should be understood as referring to the orientation depicted in the associated drawings. These relative terms are used for convenience of description only and do not require that the apparatus be constructed or operated in a particular orientation, and therefore should not be construed as limiting the invention.
It will be understood that, although the terms "first," "second," "third," etc. may be used herein to describe various elements, regions, layers and/or sections, these elements, regions, layers and/or sections should not be limited by these terms and these terms are merely used to distinguish between the various elements, regions, layers and/or sections. Accordingly, a first component, region, layer, and/or section discussed below could be termed a second component, region, layer, and/or section without departing from some embodiments of the present invention.
As described above, the existing software testing technology generally requires a tester to manually write a test script, execute the test script, check the test result, and backfill the test results piece by piece to generate a test report. Such a manually implemented testing scheme suffers from a long and tedious workflow and low testing efficiency, cannot keep up with the growing demand for software testing, and places high professional requirements on testers, leading to a scarcity of qualified testers and high personnel costs.
In order to overcome the above defects in the prior art, the invention provides a software testing method, a software testing device, and a computer-readable storage medium, which can automate the software testing process, thereby improving software testing efficiency, shortening the software testing flow, and reducing the professional requirements for testers.
Referring to fig. 1, fig. 1 is a schematic diagram illustrating an architecture of a software testing apparatus according to some embodiments of the invention.
As shown in fig. 1, in some non-limiting embodiments, the software testing method provided in the first aspect of the present invention may be implemented by the software testing apparatus 10 provided in the second aspect of the present invention. The software testing device 10 is provided with a memory 11 and a processor 12. The memory 11 includes, but is not limited to, the above-described computer-readable storage medium provided by the third aspect of the present invention, on which computer instructions are stored. The processor 12 is coupled to the memory 11 and is configured to execute computer instructions stored on the memory 11 to implement the software testing method provided in the first aspect of the present invention.
The principle of operation of the software testing device 10 described above will be described below in connection with some embodiments of the software testing method. It will be appreciated by those skilled in the art that these examples of software testing methods are merely some non-limiting embodiments provided by the present invention, and are intended to clearly illustrate the general concepts of the present invention and to provide some embodiments that are convenient for public implementation, rather than to limit the overall functionality or overall manner of operation of the software testing apparatus 10 described above. Similarly, the software testing device 10 is only a non-limiting embodiment provided by the present invention, and does not limit the implementation of each step in the software testing method.
Referring to fig. 2, fig. 2 is a flow chart illustrating a software testing method according to some embodiments of the invention.
As shown in fig. 2, in performing a software test, a tester may first select one or more corresponding test cases on the software test interface according to the one or more software functions to be tested. Each test case may be pre-written by a technician and may comprise function information of the function under test, such as its function name and business scenario, and/or test information such as its request interface and incoming message. The software testing device 10 can obtain the operation instructions input by the tester through the software test interface and determine the test cases to be used according to the operation instructions.
Taking the "add to favorites" cloud interface function as an example, this function may further comprise one or more business scenarios such as adding music to favorites, adding a local radio station to favorites, adding a network radio station to favorites, adding news to favorites, adding a video to favorites, adding an album to favorites, adding a song to favorites, and the like. According to the operation instruction input by the tester, the software testing device 10 can determine the test cases of all business scenarios under the "add to favorites" function that are required for the test.
Then, in response to an operation instruction generated by the user clicking the operation button on the software test interface, the software testing device 10 may generate corresponding test scripts one by one from the obtained test cases.
In some embodiments, the above flow of generating a test script from the acquired test case may be implemented automatically by a JMeter interface test plug-in written in Java. JMeter is a Java-based stress testing tool developed by the Apache organization for stress testing software. JMeter supports protocols and transports such as Web (HTTP, HTTPS), SOAP, FTP, JDBC, mail, MongoDB, TCP, native commands, and shell scripts. It can be used to test static and dynamic resources such as static files, Java servlets, CGI scripts, Java objects, databases, and FTP servers, and can simulate heavy loads on a server, network, or object to test their strength and analyze overall performance under different load types. In addition, JMeter can perform functional/regression testing of an application by creating scripts with assertions to verify whether the program returns the expected results; for maximum flexibility, JMeter allows assertions to be created using regular expressions. JMeter can also load-test any database (via JDBC); being pure Java, it is highly portable; it provides lightweight component support packages as precompiled JARs; it is fully multi-threaded, so multiple threads can sample different functions concurrently through independent thread groups; its timing is precise; it supports caching, offline analysis, and replay of test results; and, being fully open source, it supports secondary development so that plug-ins required by a particular service can be added.
Referring to Table 1, Table 1 shows a test case provided in accordance with some embodiments of the invention. As shown in Table 1, in some embodiments of the present invention, a pre-written test case may include a request interface field and/or an incoming message field, in which the request interface information and/or incoming message information required for performing the software test are recorded.
TABLE 1
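Because Table 1 is reproduced only as an image in the original publication, the following plain-Java record sketches the test-case fields referred to in the surrounding description; the exact column set and field names are illustrative assumptions, not the published table.

// Sketch of a pre-written test case as described around Table 1.
// Field names are assumptions introduced for illustration only.
public record TestCase(
        String functionName,      // function information, e.g. "add to favorites"
        String businessScenario,  // e.g. "add music to favorites"
        String requestInterface,  // request interface field; may be blank initially
        String incomingMessage,   // incoming message field; may be blank initially
        String executionResult,   // execution result column, filled after the test
        String failureCause) {    // failure cause column, filled when the test fails
}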
In some embodiments, in the process of generating the test script, the software testing device 10 equipped with the JMeter interface test plug-in may extract the request interface information directly from the request interface field of the test case, extract the incoming message information from the incoming message field of the test case, and generate the test script from the extracted request interface information and incoming message information.
Optionally, in other embodiments, a pre-written test case may include function information fields, such as a function name and a business scenario, together with a blank request interface field and/or incoming message field. The function information fields record function information of the function under test, such as its function name and business scenario.
In the process of generating the test script, the software testing device 10 equipped with the JMeter interface test plug-in may first extract function information, such as the function name and business scenario, from the function information fields of the test case, and then, based on the extracted function information, extract the request interface information and/or request parameter information of the function under test from the corresponding configuration file (for example, an API file) to serve as the data basis of the software test.
Referring specifically to fig. 3, fig. 3 is a schematic diagram illustrating information extraction from a configuration file according to some embodiments of the present invention.
As shown in fig. 3, the software testing apparatus 10 may first locate the corresponding content in the API file according to the extracted function information (e.g., user favorites), and then extract information such as the request address from that content to serve as the request interface information of the function under test. In addition, the software testing device 10 may further extract at least one piece of request parameter information, such as the content (content) and type (type) entries of the request object, from the same content, and generate the incoming message information of the function under test from the extracted request parameter information.
For example, for the "add music to favorites" function under test, after extracting the request parameter information for the favorite content (content), the software testing apparatus 10 may randomly generate a 10-character string (for example, drawn from the characters "12345abCD") in accordance with the format or content requirements of that request parameter. The software testing device 10 may then generate the incoming message information of the "add music to favorites" function under test, for example POST /user/questions {"content": "${__RandomString(10,collection12345abcd,content)}", "type": "1"}, from the extracted request parameter information, such as the favorite content (content) and favorite type (type), together with the randomly generated string information.
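A minimal Java sketch of this step is shown below. It assumes that "12345abCD" denotes the allowed character set for the random content string and that the incoming message is a plain JSON body with "content" and "type" fields; this is an interpretation of the example above rather than a definitive implementation.

import java.security.SecureRandom;

// Sketch of generating the incoming message information for the
// "add music to favorites" function from the extracted request parameters.
public final class IncomingMessageBuilder {

    private static final String ALLOWED_CHARS = "12345abCD"; // assumed character set
    private static final SecureRandom RANDOM = new SecureRandom();

    // Randomly generate a string of the given length from the allowed character set.
    static String randomContent(int length) {
        StringBuilder sb = new StringBuilder(length);
        for (int i = 0; i < length; i++) {
            sb.append(ALLOWED_CHARS.charAt(RANDOM.nextInt(ALLOWED_CHARS.length())));
        }
        return sb.toString();
    }

    // Assemble the incoming message from the extracted request parameters.
    static String buildMessage(String type) {
        return String.format("{\"content\": \"%s\", \"type\": \"%s\"}", randomContent(10), type);
    }

    public static void main(String[] args) {
        // Prints something like {"content": "3a1C5b24dD", "type": "1"}.
        System.out.println(buildMessage("1"));
    }
}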
By providing the function of automatically extracting and generating the request interface information and/or incoming message information from configuration files such as API files, the invention can further reduce the workload of test case writers, further lower the professional requirements placed on them, avoid constructing an erroneous script from a correct test case, and improve testing efficiency.
Further, in some embodiments, after extracting the request interface information from the configuration file, the software testing apparatus 10 may backfill the extracted request interface information into the blank request interface field of the test case, so that it can be read directly from the test case during troubleshooting and/or the next software test. Likewise, after generating the incoming message information from the request parameter information extracted from the configuration file, the software testing apparatus 10 may backfill the generated incoming message information into the blank incoming message field of the test case for the same purpose.
By backfilling the request interface information and/or incoming message information extracted from the configuration file into the test case, the invention allows this information to be read directly from the test case during troubleshooting and/or the next software test, thereby further improving the efficiency of test-anomaly diagnosis and of software testing. It also lets the software developers responsible for troubleshooting see clearly the request interface information, incoming message information, random string information, and other data used in the test, providing clearer guidance for the troubleshooting stage and helping to improve its efficiency and success rate.
Referring next to fig. 4, fig. 4 illustrates a schematic diagram of generating a test script provided in accordance with some embodiments of the present invention.
As shown in fig. 4, after determining the request interface information and the incoming message information of the function under test, the software testing device 10 may automatically generate the test script for the software test from them. Specifically, the software testing apparatus 10 may determine the request path of the test script from the request interface information, determine the message body data of the test script from the incoming message information, and fill the request path and message body data into the corresponding positions of an HTTP request, thereby automatically generating the test script.
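As an illustration only, a sampler carrying the request path and message body could be assembled with JMeter's Java API roughly as follows; the surrounding test plan, thread group, and assertion wiring is omitted, and this sketch should not be read as the plug-in's actual code.

import java.net.URI;
import org.apache.jmeter.protocol.http.sampler.HTTPSamplerProxy;

// Sketch of filling the request path and message body data into an HTTP request
// using JMeter's Java API; names and defaults are assumptions for illustration.
public final class HttpSamplerFactory {

    public static HTTPSamplerProxy build(String requestInterface, String messageBody) {
        URI uri = URI.create(requestInterface);

        HTTPSamplerProxy sampler = new HTTPSamplerProxy();
        sampler.setName("generated test script");
        sampler.setProtocol(uri.getScheme());
        sampler.setDomain(uri.getHost());
        sampler.setPort(uri.getPort() == -1
                ? ("http".equals(uri.getScheme()) ? 80 : 443)
                : uri.getPort());
        // Request path of the test script, taken from the request interface information.
        sampler.setPath(uri.getPath());
        sampler.setMethod("POST");
        // Message body data of the test script, taken from the incoming message information.
        sampler.setPostBodyRaw(true);
        sampler.addNonEncodedArgument("", messageBody, "");
        return sampler;
    }
}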
It will be appreciated by those skilled in the art that the above-mentioned scheme of generating the corresponding test script online according to the test case selected by the user is merely a non-limiting embodiment provided by the present invention, and is intended to clearly illustrate the main concept of the present invention and provide a specific scheme for public implementation, not to limit the scope of protection of the present invention.
Alternatively, in other embodiments, the software testing device 10 may also generate a corresponding test script in advance according to the test case of each function to be tested of the software to be tested, and construct a test script library of the software to be tested. Then, in the process of software testing, a tester can select one or more corresponding test scripts from a pre-constructed test script library through a software test interface according to one or more to-be-tested software functions, and directly execute the selected test scripts to test the to-be-tested functions, so that the efficiency of software testing is further improved.
As shown in fig. 2, after generating the test script, the software testing device 10 may automatically execute the generated test script to acquire response data of the function under test, and determine a test result and/or a failure cause of the function under test according to the acquired response data.
In some embodiments, the above-mentioned process of executing the test script and obtaining the response data may also be automatically implemented based on the JMeter interface test plug-in written in Java code. Specifically, in the process of executing the test script and acquiring the response data, the software test device 10 installed with the JMeter interface test plug-in may send an HTTP request to the corresponding request interface according to the request path of the test script, and acquire the response data of the HTTP request via the request interface. The software testing device 10 may then automatically parse the response data of the HTTP request to determine the test result and/or the failure cause of the function under test.
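A simplified sketch of this execution step, using the JDK's built-in HTTP client in place of the JMeter engine, is given below; the request path and message body are assumed to come from the generated test script as described above.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

// Sketch of "execute the test script and obtain the response data".
public final class ScriptExecutor {

    private static final HttpClient CLIENT = HttpClient.newBuilder()
            .connectTimeout(Duration.ofSeconds(10))
            .build();

    // Send an HTTP request to the request interface and return the response data.
    public static String execute(String requestPath, String messageBody) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(requestPath))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(messageBody))
                .build();
        HttpResponse<String> response = CLIENT.send(request, HttpResponse.BodyHandlers.ofString());
        // The response body is the response data analysed in the next step.
        return response.body();
    }
}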
Please refer to fig. 5A and fig. 5B in combination. Fig. 5A and 5B are schematic diagrams illustrating test result feedback interfaces provided according to some embodiments of the invention.
As shown in FIG. 5A, if the test result of the function under test (e.g., adding music to favorites) is a failure, the response data will include the test result information "system error" together with the related error information. The JMeter interface test plug-in can parse the response data to extract the test result information and the error information, determine the test result of the function under test from the test result information, and determine the failure cause of the function under test from the error information.
In contrast, as shown in FIG. 5B, if the test result of the function under test (e.g., adding a local radio station to favorites) is successful, the response data will include the test result information "request successful". The JMeter interface test plug-in can parse the response data to extract this test result information and determine the test result of the function under test accordingly.
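A minimal sketch of this analysis, assuming the response is a text body in which the markers "request successful" and "system error" appear as in FIGS. 5A and 5B, might look as follows; the actual response format of the interface is not specified in the disclosure.

// Sketch of deciding the test result and, on failure, the failure cause
// from the response data.
public final class ResponseAnalyzer {

    public record Outcome(boolean passed, String failureCause) {}

    public static Outcome analyze(String responseData) {
        if (responseData.contains("request successful")) {
            // FIG. 5B case: the function under test passed.
            return new Outcome(true, null);
        }
        if (responseData.contains("system error")) {
            // FIG. 5A case: keep the error information as the failure cause.
            return new Outcome(false, responseData);
        }
        // Anything else is treated conservatively as a failure with the raw response attached.
        return new Outcome(false, responseData);
    }
}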
As shown in fig. 2, after determining the test result and/or the failure cause of the function to be tested, the software testing apparatus 10 may automatically fill the obtained test result and/or the failure cause into the test case via the JMeter interface test plug-in, so as to generate a test report of the function to be tested.
Specifically, the test cases written in advance may include blank execution result fields and/or failure cause fields. In response to a successful test result, the software testing device 10 may populate the execution result column of the test case with successful execution result information to generate a test report. Otherwise, in response to the failed test result, the software testing apparatus 10 may fill the failed execution result information into the execution result column of the test case, and fill the acquired response data into the failure cause column of the test case, so as to generate the test report.
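The backfilling of the two result columns can be sketched as follows, assuming the test case is held as a mutable map of column names to values; the column names follow the description above and are otherwise assumptions.

import java.util.Map;

// Sketch of backfilling the execution result and failure cause columns
// of the test case to produce the test report.
public final class ReportFiller {

    public static void fill(Map<String, String> testCase, boolean passed, String responseData) {
        if (passed) {
            // Successful test result: only the execution result column is filled.
            testCase.put("execution result", "success");
        } else {
            // Failed test result: fill the execution result column and copy the
            // response data into the failure cause column.
            testCase.put("execution result", "failure");
            testCase.put("failure cause", responseData);
        }
    }
}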
Referring to Table 2, Table 2 shows a test report provided in accordance with some embodiments of the invention. As shown in Table 2, in some embodiments of the invention, the generated test report may include an execution result field and/or a failure cause field, in which the test result and/or failure cause of the software test are recorded. In the troubleshooting stage after the software test, the developer of the function under test can quickly and conveniently determine the cause of the test failure from the function information (function name, business scenario, etc.), the test information (request interface, incoming message, etc.), and the result information (test result, failure cause, etc.) recorded in the test report, thereby improving troubleshooting efficiency.
TABLE 2
Based on the above description, the present invention provides a software testing method, a software testing device, and a computer-readable storage medium, which can automatically generate the test script required for a test, automatically execute the test script, obtain the response data of the function under test, and automatically backfill the response data into the test case to generate the test report of the function under test. The invention can therefore test the function under test automatically, using the software testing device 10 to replace the manual operations of writing test scripts and uploading test results, thereby improving software testing efficiency, shortening the software testing flow, and reducing the professional requirements for testers.
Further, in some embodiments of the present invention, after completing the test flow of each function under test of the software to be tested, the software testing device 10 may further screen the test reports of the functions under test according to their test results, so as to identify the functions whose test result is a failure. Then, in response to each failed test result, the software testing device 10 may generate a troubleshooting task for the corresponding function under test according to its failure cause, and send the troubleshooting task to the corresponding assignee, thereby further automating the test troubleshooting procedure.
Specifically, the software testing apparatus 10 may first generate a troubleshooting task sheet for each faulty function under test on the basis of the response data recorded in the failure cause column of each screened test report. Then, the software testing apparatus 10 may extract relevant information, such as tester information, developer information, priority information, problem level information, precondition information, incoming message information, operation step information, failure cause information, and expected result information, from the test report of each such function, and obtain the current time information for generating the troubleshooting task sheet.
Then, the software testing apparatus 10 may generate the title summary of the troubleshooting task sheet in the format "[function name]-subfunction-business scenario-operation step"; determine the reporter of the troubleshooting task according to the acquired tester information; determine the assignee of the troubleshooting task according to the acquired developer information; determine the priority of the troubleshooting task according to the acquired priority information; determine the problem source of the troubleshooting task based on a default problem source (e.g., internal test); determine the problem level of the troubleshooting task according to a default problem level (e.g., general); determine the precondition of the troubleshooting task according to the acquired precondition information; determine the test data of the troubleshooting task according to the acquired incoming message information; determine the test steps of the troubleshooting task according to the acquired operation step information; determine the actual result of the troubleshooting task according to the acquired failure cause information; determine the expected result of the troubleshooting task according to the acquired expected result information; determine the start date of the troubleshooting task according to the acquired current time information; and/or determine the due date of the troubleshooting task according to the acquired current time information and a preset troubleshooting period (e.g., 7 days), so as to generate the troubleshooting task sheet shown in fig. 6.
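Assembling such a task sheet can be sketched in Java as below; the report is assumed to be a map of field names to values, the title format and the 7-day default deadline follow the description above, and everything else is an illustrative assumption.

import java.time.LocalDate;
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of assembling a troubleshooting task sheet from a failed test report.
public final class TaskSheetBuilder {

    public static Map<String, String> build(Map<String, String> report) {
        LocalDate today = LocalDate.now();
        Map<String, String> sheet = new LinkedHashMap<>();
        // Title summary in the "[function name]-subfunction-business scenario-operation step" format.
        sheet.put("title", "[" + report.get("function name") + "]-" + report.get("subfunction")
                + "-" + report.get("business scenario") + "-" + report.get("operation step"));
        sheet.put("reporter", report.get("tester"));             // from tester information
        sheet.put("assignee", report.get("developer"));          // from developer information
        sheet.put("priority", report.get("priority"));
        sheet.put("problem source", "internal test");            // default problem source
        sheet.put("problem level", "general");                   // default problem level
        sheet.put("precondition", report.get("precondition"));
        sheet.put("test data", report.get("incoming message"));  // from incoming message information
        sheet.put("test steps", report.get("operation step"));
        sheet.put("actual result", report.get("failure cause"));
        sheet.put("expected result", report.get("expected result"));
        sheet.put("start date", today.toString());
        sheet.put("due date", today.plusDays(7).toString());     // default 7-day troubleshooting period
        return sheet;
    }
}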
Then, the software testing device 10 can automatically send each troubleshooting task sheet to its assignee according to the assignee information recorded in the task sheet, so as to automatically prompt each assignee to troubleshoot the corresponding function under test. The invention can thus further automate fault management of the function under test, using the software testing device 10 to replace the manual operation of submitting troubleshooting tasks one by one, thereby improving the working efficiency of testers, improving the continuity of the testing flow, and sparing testers repetitive operations.
While, for purposes of simplicity of explanation, the methodologies are shown and described as a series of acts, it is to be understood and appreciated that the methodologies are not limited by the order of acts, as some acts may, in accordance with one or more embodiments, occur in different orders and/or concurrently with other acts, whether or not those other acts are shown and described herein, as will be understood and appreciated by those skilled in the art.
Those of skill in the art would understand that information, signals, and data may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
Although the software testing apparatus 10 described in the above embodiments may be implemented by a combination of software and hardware, it will be appreciated that the software testing device 10 may also be implemented in software or hardware alone. For a hardware implementation, the software testing apparatus 10 may be implemented in one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, other electronic devices for performing the above functions, or a selected combination thereof. For a software implementation, the software testing apparatus 10 may be implemented by separate software modules, such as program modules (procedures) and function modules (functions), running on a common chip, each of which performs one or more of the functions and operations described herein.
The various illustrative logical modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The previous description of the disclosure is provided to enable any person skilled in the art to make or use the disclosure. Various modifications to the disclosure will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other variations without departing from the spirit or scope of the disclosure. Thus, the disclosure is not intended to be limited to the examples and designs described herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (16)

1. A method of testing software, comprising the steps of:
obtaining a test case of a function to be tested;
generating a test script according to the test case;
executing the test script and acquiring response data of the function to be tested;
determining a test result and/or a failure reason of the function to be tested according to the response data; and
filling the test result and/or the failure reason into the test case to generate a test report of the function to be tested.
2. The software testing method of claim 1, wherein the step of generating a test script from the test case comprises:
determining request interface information and parameter entering message information of the function to be tested according to the test case; and
generating the test script according to the request interface information and the parameter entering message information.
3. The software testing method according to claim 2, wherein the test case includes a request interface field and/or a parameter entering message field, and the step of determining the request interface information and the parameter entering message information of the function to be tested according to the test case includes:
extracting the request interface information from the request interface column of the test case; and/or
Extracting the parameter entering message information from the parameter entering message column of the test case.
4. The software testing method of claim 2, wherein the test case includes a function information field, and the step of determining the request interface information and the parameter entering message information of the function to be tested according to the test case includes:
extracting functional information from the functional information field of the test case; and
extracting the request interface information of the function to be tested from the corresponding configuration file according to the function information.
5. The software testing method according to claim 2 or 4, wherein the test case includes a function information field, and the step of determining the request interface information and the parameter entering message information of the function to be tested according to the test case includes:
extracting functional information from the functional information field of the test case;
extracting at least one piece of request parameter information of the function to be tested from a corresponding configuration file according to the function information; and
generating the parameter entering message information according to the at least one piece of request parameter information.
6. The software testing method of claim 5, wherein the step of generating the parameter entering message information according to the at least one piece of request parameter information comprises:
randomly generating character string information indicating test contents; and
generating the parameter entering message information according to the at least one piece of request parameter information and the character string information.
7. The software testing method of claim 6, further comprising the steps of:
filling the request interface information extracted from the configuration file into a request interface column of the test case; and/or
filling the generated parameter entering message information into a parameter entering message column of the test case.
8. The software testing method of claim 2, wherein the step of generating the test script according to the request interface information and the parameter entering message information comprises:
determining a request path of the test script according to the request interface information; and
determining the message body data of the test script according to the parameter entering message information.
9. The software testing method of claim 8, wherein the function to be tested comprises a cloud interface function, and the step of executing the test script and acquiring the response data of the function to be tested comprises:
sending an HTTP request to a corresponding request interface according to the request path of the test script;
acquiring response data of the HTTP request through the request interface; and
analyzing the response data to determine the test result and/or the failure reason of the function to be tested.
10. The software testing method according to claim 1, wherein the test case includes a failure cause column, and the step of filling the test result and/or the failure cause into the test case to generate the test report of the function under test includes:
in response to a failed test result, filling the response data into the failure cause column of the test case to generate the test report.
11. The software testing method of claim 10, wherein the test case further includes an execution result field, and the step of filling the test result and/or the failure cause into the test case to generate the test report of the function under test further includes:
in response to a successful test result, filling successful execution result information into the execution result column of the test case to generate the test report; and
in response to a failed test result, filling failed execution result information into the execution result column of the test case to generate the test report.
12. The software testing method of claim 1, further comprising the steps of:
in response to a failed test result, generating a troubleshooting task for the function to be tested according to the failure reason.
13. The software testing method of claim 12, wherein the test case further includes developer information of the function under test, and the step of generating a troubleshooting task of the function under test according to the failure cause includes:
generating a troubleshooting task sheet for the function to be tested according to the failure reason;
determining an assignee of the troubleshooting task sheet according to the developer information; and
sending the troubleshooting task sheet to the assignee to prompt the assignee to troubleshoot the function to be tested.
14. The software testing method of claim 1, wherein the step of obtaining test cases for functions under test comprises:
acquiring an operation instruction input by a user; and
determining the test case according to the operation instruction.
15. A software testing apparatus, comprising:
a memory; and
a processor coupled to the memory and configured to implement the software testing method of any one of claims 1-14.
16. A computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement a software testing method according to any of claims 1 to 14.
CN202111627026.5A 2021-12-28 2021-12-28 Software testing method and device Pending CN116401139A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111627026.5A CN116401139A (en) 2021-12-28 2021-12-28 Software testing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111627026.5A CN116401139A (en) 2021-12-28 2021-12-28 Software testing method and device

Publications (1)

Publication Number Publication Date
CN116401139A true CN116401139A (en) 2023-07-07

Family

ID=87011023

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111627026.5A Pending CN116401139A (en) 2021-12-28 2021-12-28 Software testing method and device

Country Status (1)

Country Link
CN (1) CN116401139A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination