CN115687156A - Jmeter-based interface automatic testing method and device - Google Patents

Jmeter-based interface automatic testing method and device

Info

Publication number: CN115687156A
Application number: CN202211577391.4A
Authority: CN (China)
Prior art keywords: test, interface, jmeter, tested, script
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 王如迅, 任党恩, 闫亚菊
Current Assignee: China International Financial Ltd By Share Ltd
Original Assignee: China International Financial Ltd By Share Ltd
Priority date / Filing date: 2022-12-09
Publication date: 2023-02-03

Abstract

The present disclosure provides a Jmeter-based interface automated testing method, which includes: performing the following configuration steps for the Jmeter test plan: adding components such as user-defined variables, HTTP request default values, an HTTP header manager, a view results tree, and an aggregate report; adding a corresponding thread group for the item to be tested in the test plan, wherein the thread group uniquely corresponds to the test plan; configuring a plurality of condition controllers under the thread group, the condition controllers being used to determine the environment in which the item to be tested is located; and adding a Beanshell sampler under each condition controller to configure data values for testing; then adding specific interface test tasks, generating a Jmeter test script according to the configured test plan, and performing an interface test based on the script to obtain a test result. The present disclosure also provides a Jmeter-based interface automated testing apparatus.

Description

Jmeter-based interface automatic testing method and device
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a Jmeter-based interface automated testing method and apparatus, a computing device, a computer-readable storage medium, and a computer program product.
Background
Jmeter is a Java-based stress testing tool for load-testing software. Jmeter can be used to perform automated interface testing, and test scripts can be written using Jmeter components such as: "HTTP Cookie manager", "HTTP request defaults", "HTTP header manager", "database configuration", "view results tree", "aggregate report", "thread group", "HTTP request", "assertion", and the like.
In the related art, a stress test on a server interface can be performed by simulating ordinary users sending requests: a thread group is added to configure parameters such as the concurrency number, loop count, and thread ramp-up time; parameterization is then set up via a csv data file so that the interface test is more accurate; a Beanshell script is then written or an auxiliary jar package is imported; and finally configured listeners and assertions are added to determine whether the response to a request matches the expected result, completing the automated interface test. However, scripts of existing solutions cannot distinguish between different environments (e.g., development, test, pre-release, and production environments), and thus cannot support multiple environments with one set of scripts. In addition, the scripts of existing solutions do not provide logical isolation of different items.
Disclosure of Invention
In view of the above, the present disclosure provides a method and apparatus for Jmeter-based interface automated testing, a computing device, a computer-readable storage medium, and a computer program product to mitigate, alleviate, or even eliminate the above-mentioned problems.
According to one aspect of the disclosure, a Jmeter-based interface automated testing method is provided, the method comprising: performing the following configuration steps for the Jmeter test plan: adding user-defined variables, HTTP request default values, an HTTP header manager, a view results tree, and an aggregate report; adding a corresponding thread group for the item to be tested in the test plan, wherein the added thread group uniquely corresponds to the test plan; configuring a plurality of condition controllers under the thread group, the condition controllers being used to determine the environment in which the item to be tested is located; and adding a Beanshell sampler under each of the plurality of condition controllers to configure data values for testing; and then adding specific interface test tasks, generating a Jmeter test script according to the configured test plan, and performing an interface test based on the Jmeter test script to obtain a test result.
According to some embodiments of the disclosure, the configuring step further comprises: establishing a corresponding loop controller for each code module of the item to be tested so as to logically isolate the code modules.
According to some embodiments of the present disclosure, the interface test is a single interface test or a business process test, and the name of the loop controller is determined according to the foreground/background location of the interface to be tested and the module function.
According to some embodiments of the present disclosure, the interface test is a business process test, and the configuring step further comprises: establishing a transaction controller under the loop controller to determine an overall performance metric for all HTTP requests under the transaction controller.
According to some embodiments of the present disclosure, the name of the HTTP request corresponding to the item to be tested is determined according to: the environment of the item to be tested, the serial number of the interface test case and the name of the interface.
According to some embodiments of the present disclosure, the interface test is a single interface test covering add, query, modify, and delete operations, and the interface test is performed by: adding target test data; querying the added target test data; obtaining the ID of the target test data via a JSON extractor and modifying the target test data to obtain modified test data; and deleting the modified test data according to the ID of the target test data.
According to some embodiments of the disclosure, the configuring step further comprises: adding an assertion expected value in the view results control; and wherein said performing an interface test based on said Jmeter test script comprises: verifying a plurality of interfaces based on the Jmeter test script to generate a corresponding plurality of assertion results; determining a first proportion of null assertions among the plurality of assertion results according to the corresponding assertion expected values; and determining the health degree of the assertions according to at least the first proportion.
According to some embodiments of the disclosure, the method further comprises: normalizing the test result to obtain an interface test report, wherein the interface test report comprises a first path, a second path and a third path, and the first path is used for storing a data processing related table; the second path is used for storing test script related files, wherein the test script related files comprise jmx files, jtl files and log files; and the third path is used for storing the test result in the form of an HTML Report.
According to some embodiments of the present disclosure, performing an interface test based on the Jmeter test script to obtain a test result includes: submitting the Jmeter test script to a code repository; setting up an automated test task with a continuous integration tool, downloading the Jmeter test script from the code repository and building it with a corresponding build tool to generate a Jmeter test report; and feeding back preset content to an administrator, the preset content comprising at least one of: the number of tested interfaces, the number of failures, the success rate, the average response time, and the URL of the interface.
According to some embodiments of the disclosure, the environment in which the item to be tested is located includes any one of: a development environment, a test environment, a pre-release environment, and a production environment.
According to another aspect of the present disclosure, there is provided a Jmeter-based interface automated testing apparatus, comprising: a test script configuration module configured to: add user-defined variables, HTTP request default values, an HTTP header manager, a view results tree, and an aggregate report; add a corresponding thread group for the item to be tested in the test plan, wherein the added thread group uniquely corresponds to the test plan; configure a plurality of condition controllers under the thread group, the condition controllers being used to determine the environment in which the item to be tested is located; and add a Beanshell sampler under each of the plurality of condition controllers to configure data values for testing; and a test result generation module configured to add specific interface test tasks, generate a Jmeter test script according to the configured test plan, and perform an interface test based on the Jmeter test script to obtain a test result.
According to yet another aspect of the present disclosure, there is provided a computing device, characterized in that the computing device includes: a memory configured to store computer-executable instructions; a processor configured to perform any of the methods provided in accordance with the foregoing aspects of the disclosure when the computer-executable instructions are executed by the processor.
According to yet another aspect of the present disclosure, there is provided a computer-readable storage medium, characterized in that the computer-readable storage medium stores computer-executable instructions that, when executed, perform any of the methods provided according to the previous aspects of the present disclosure.
According to yet another aspect of the present disclosure, there is provided a computer program product, characterized in that the computer program product comprises computer executable instructions which, when executed by a processor, perform any of the methods provided according to the preceding aspects of the present disclosure.
According to the Jmeter-based interface automated testing method provided by the present disclosure, the following configuration steps can be performed for a Jmeter test plan: adding user-defined variables, HTTP request default values, an HTTP header manager, a view results tree, and an aggregate report; adding a corresponding thread group for the item to be tested in the test plan, wherein the added thread group uniquely corresponds to the test plan, so that different items can be logically isolated; configuring a plurality of condition controllers under the thread group, the condition controllers being used to determine the environment in which the item to be tested is located, so that different environments (e.g., development, test, pre-release, and production environments) can be distinguished and a single set of scripts can support multiple environments; and adding a Beanshell sampler under each of the plurality of condition controllers to configure data values for testing; and then adding specific interface test tasks, generating a Jmeter test script according to the configured test plan, and performing an interface test based on the Jmeter test script to obtain a test result.
These and other aspects of the disclosure will be apparent from and elucidated with reference to the embodiments described hereinafter.
Drawings
Further details, features and advantages of the disclosed solution are disclosed in the following description of exemplary embodiments with reference to the accompanying drawings, in which:
FIG. 1 schematically illustrates an example flow diagram of a method for Jmeter-based interface automation testing, according to some embodiments of this disclosure;
FIG. 2 schematically illustrates an example flow diagram of a Jmeter-based interface automation testing method according to further embodiments of the present disclosure;
FIG. 3 schematically illustrates an example flow diagram of a method for Jmeter-based interface automation testing, according to further embodiments of the present disclosure;
FIG. 4 shows a schematic diagram of the Jmeter-based interface automated testing method of FIG. 3;
FIG. 5 schematically illustrates an example block diagram of a Jmeter-based interface automation testing device, in accordance with some embodiments of the present disclosure;
fig. 6 illustrates an example system that includes an example computing device that represents one or more systems and/or devices that may implement the various techniques described herein.
Detailed Description
Several embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings in order to enable those skilled in the art to practice the technical solutions of the present disclosure. The technical solutions of the present disclosure may be embodied in many different forms and purposes, and should not be limited to the embodiments set forth herein. These embodiments are provided to make the technical solutions of the present disclosure clear and complete, but the described embodiments do not limit the scope of the present disclosure.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and/or the present specification and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
FIG. 1 schematically illustrates an example flow diagram of a Jmeter-based interface automated testing method (hereinafter simply referred to as interface testing method 100 for brevity) in accordance with some embodiments of the present disclosure.
Configuration step 110 is illustratively performed for a Jmeter test plan; specifically, configuration step 110 includes steps 111-114. In step 111, components such as user-defined variables, HTTP request default values, an HTTP header manager, a view results tree, and an aggregate report are added. It should be noted that, according to the needs of the actual application scenario, other components may also be added, such as csv data file settings, database connection configuration, and the like. In step 112, a corresponding thread group is added for the item to be tested in the test plan, wherein the added thread group uniquely corresponds to the test plan. In step 113, a plurality of condition controllers are configured under the thread group, the condition controllers being used to determine the environment in which the item to be tested is located. In step 114, a Beanshell sampler is added under each of the plurality of condition controllers to configure the data values for testing. Next, in step 120, specific interface test tasks are added, a Jmeter test script is generated according to the configured test plan, and an interface test is performed based on the Jmeter test script to obtain a test result. In addition, the script used for the test, the initialization data, and the configuration information may be version-controlled together with the test result to facilitate subsequent use (e.g., regression testing).
Illustratively, the user-defined variable env indicates whether the current run corresponds to a development environment, a test environment, a pre-release environment, or a production environment. A plurality of condition controllers (If Controllers) are arranged below the thread group. A condition controller controls the flow of Jmeter script execution: the samplers under it run only when its condition is true. In the present disclosure, the number of condition controllers may be determined by the number of environment types; for example, when the item to be tested may be located in any of four environments (development, test, pre-release, and production), four condition controllers may be used to distinguish them. A name and value may be set in the user-defined variables, e.g., the name may be "env", and the expression "${env}" may then be referenced in each condition controller. For example, the condition controllers may specify "${env}" == "development environment", "${env}" == "test environment", "${env}" == "pre-release environment", and "${env}" == "production environment", respectively, where "${env}" describes the current environment. A given environment branch is used when the value of the user-defined variable matches the value in the corresponding condition controller.
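As a minimal sketch (not part of the original disclosure; the literal environment names are assumptions), the comparison each If Controller evaluates can be expressed in Beanshell as follows:

```java
// Beanshell sketch of the environment check an If Controller performs.
// In the actual test plan this comparison is written as the controller's
// condition expression, e.g. "${env}" == "test environment".
String env = vars.get("env");   // value comes from the user-defined variables
boolean isTestEnv = "test environment".equals(env);
log.info("env = " + env + ", test-environment branch active: " + isTestEnv);
```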
The Beanshell sampler added under each of the plurality of condition controllers is used to define variables, including the request protocol, request address, request port, random values, and the like. Defining these variables facilitates parameter referencing by HTTP requests, which in turn helps a single set of code (script) to be reused in different environments: each variable needs to be defined only once in the Beanshell sampler inside a condition controller and can then be referenced many times. Defining variables with a Beanshell sampler is also more flexible than plain user-defined variables, since code can be written, giving stronger extensibility.
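For illustration, a minimal Beanshell sampler body for one environment branch might look like the following; the variable names (protocol, host, port, randomVal) and their values are assumptions made for the sketch, not the disclosure's actual configuration:

```java
// Beanshell sampler: define per-environment variables once; HTTP requests
// then reference them as ${protocol}, ${host}, ${port}, ${randomVal}.
vars.put("protocol", "https");           // request protocol
vars.put("host", "test.example.com");    // request address for this environment
vars.put("port", "8443");                // request port
// random value, e.g. to make created test data unique per run
vars.put("randomVal", String.valueOf(System.currentTimeMillis() % 100000));
```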
Through the interface testing method 100, different items can be logically isolated by adding corresponding thread groups for the items to be tested in the test plan, and the different environments in which an item to be tested may be located (e.g., development, test, pre-release, and production environments) are distinguished by configuring a plurality of condition controllers under the thread group (the condition controllers being used to determine the environment of the item to be tested), so that a single set of scripts can support multiple environments, thereby facilitating automated interface testing and obtaining corresponding test results.
In some embodiments, the configuring step 110 further comprises: establishing a corresponding Loop Controller for each code module of the item to be tested so as to logically isolate the code modules. The loop controller is used to control the corresponding code module to execute multiple loop iterations (the loop count can be set according to actual needs). By logically isolating individual code modules using loop controllers, interface testing (e.g., single interface testing or business process testing) may be performed more efficiently.
in some embodiments, the interface test is a single interface test or a business process test, and the name of the cycle controller is determined according to a foreground and background position corresponding to the interface to be tested and a module function. For example, for single interface testing, the name of the cycle controller may be: "foreground-function module", for example: "foreground-student registration", "background-school management"; for business process testing, the name of the cycle controller may be: $ env interface test case number- "business process Chinese name", for example: "test environment-XX studio test cases on cloud 01-favorites". It should be noted that what module in the foreground and background of the test interface needs to be indicated in the name. In addition, each business process test can correspond to a test case number.
In some embodiments, the interface test is a business process test, and the configuring step further comprises: a Transaction Controller is established under the loop controller to determine overall performance metrics for all HTTP requests under the transaction controller (e.g., the 90th-percentile response time, request error rate, maximum response time, and TPS value). When there are multiple requests and it is desirable to see the test result of a transaction as a whole (e.g., total time of all requests, total throughput, etc.), this can be handled by the transaction controller.
In some embodiments, the name of the HTTP request corresponding to the item to be tested is determined according to: the environment of the item to be tested, the serial number of the interface test case, and the name of the interface. In the related art, because no naming convention for HTTP requests is clearly defined, it cannot be determined from the test results which environment a result belongs to, and when an interface test fails the specific code module cannot be accurately located. These problems are addressed by the above naming method for HTTP requests. For example, the name of the HTTP request corresponding to the item to be tested may be "${env}-interface test case number-name of the single interface". With this naming method, which specific test cases failed in which environment can be checked by inspecting the HTML Report.
In some embodiments, the interface test is a single interface test covering add, query, modify, and delete operations, and the interface test is performed by: adding target test data; querying the added target test data; obtaining the ID of the target test data via a JSON extractor and modifying the target test data to obtain modified test data; and deleting the modified test data according to the ID of the target test data. This standardized scheme for the add/query/modify/delete single interface test ensures that no database connection is required during the test, which simplifies configuration and leaves no dirty data behind.
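The hand-off between the query step and the modify/delete steps can be sketched as follows, assuming (hypothetically) that the JSON extractor writes the record ID to a variable named targetId with a default of NOT_FOUND when nothing matches:

```java
// Beanshell PostProcessor sketch attached to the query request.
// "targetId" and its NOT_FOUND default are illustrative assumptions.
String targetId = vars.get("targetId");
if (targetId == null || "NOT_FOUND".equals(targetId)) {
    // fail the query sample so the modify/delete steps are not run blindly
    prev.setSuccessful(false);
    prev.setResponseMessage("target test data ID was not extracted");
}
```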
FIG. 2 schematically illustrates an example flow diagram of a Jmeter-based interface automated testing method (hereinafter referred to as interface testing method 200 for brevity) according to further embodiments of the present disclosure. In the configuration step 210 performed for the Jmeter test plan, steps 211-214 correspond to steps 111-114, respectively, described above with respect to FIG. 1. In addition, the configuring step 210 further includes step 215: an assertion expected value is added in the view results control so that the expected value is displayed regardless of whether the assertion succeeds or fails (the FailureMessage in the current view results control only shows the assertion expected value when the assertion fails; it should also be shown when the test succeeds). For example, the view results control may be configured by adding an item indicating the assertion expected value, e.g., named "Assertion expected", so that interface expected values can be displayed centrally according to the unique binding between an interface ID and its expected value; interface testers can then visually spot null assertions and conveniently check in one place whether assertions are written correctly. Illustratively, each HTTP request may be tagged with the environment and test number used, e.g., as "${env}-interface test case number-'name in the business process'". Furthermore, labels in the assertion results correspond one-to-one with assertion expected values, enabling centralized inspection of the expected values. Step 220 is for performing the interface test and includes steps 221-224.
Specifically, in step 221, specific interface test tasks are added, and a Jmeter test script is generated according to the configured test plan. Then, at step 222, a plurality of interfaces are verified based on the Jmeter test script to generate a corresponding plurality of assertion results.
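As an illustration (a sketch, not the disclosure's implementation), a Beanshell Assertion producing such an assertion result might read as follows; "expected" is a hypothetical variable name, and the custom "Assertion expected" item described for step 215 is what makes the expected value visible even when the assertion succeeds:

```java
// Beanshell Assertion sketch; "expected" is a hypothetical variable bound
// to this interface's assertion expected value. Failure, FailureMessage,
// and ResponseData are the standard Beanshell Assertion bindings.
String expected = vars.get("expected");
String body = new String(ResponseData);       // sampled response body
Failure = (expected == null) || !body.contains(expected);
FailureMessage = "Assertion expected: " + expected;
```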
At step 223, a first proportion of null assertions among the plurality of assertion results is determined according to the corresponding assertion expected values. Illustratively, when an assertion expected value is null, the corresponding assertion can be determined to be a null assertion; the number of null assertions can thus be counted from the expected values in the plurality of assertion results, and the proportion of null assertions among the assertion results is taken as the first proportion.
At step 224, the health degree of the assertions is determined based at least on the first proportion. For example, a second proportion, of non-null assertions among the plurality of assertion results, may be determined from the first proportion and taken as the health degree. Illustratively, let N denote the total number of null assertions and T the total number of assertion results; the first proportion is then N/T, and the second proportion equals 1 - N/T = (T - N)/T, so (T - N)/T can be taken as the health degree. The higher the health degree, the smaller the number of null assertions, and the more reliable (healthier) the assertion results.
As another example, the health degree of the assertions may be determined based on the first proportion and weights of the null assertions, where the weight of a null assertion is determined at least according to the type of interface test to which it corresponds. The type of interface test corresponding to a null assertion may be a single interface test or a business process test, and the weight of a null assertion may be any number between 0 and 1. Illustratively, letting w_i denote the weight of the i-th null assertion, the health degree of the assertions can be expressed as:

Health = 1 - (Σ_{i=1}^{N} w_i) / T

which reduces to (T - N)/T when every w_i = 1.
it should be noted that, in the present disclosure, the weight of the null assertion may be determined according to the type of the interface test corresponding to the null assertion, or may be determined by combining the type of the interface test corresponding to the null assertion and the importance of the business process. For example, a null assertion corresponding to a business process with higher importance is given higher weight.
Through the interface test method 200, the health degree of the assertions can be measured more accurately by counting the proportion of null assertions; in addition, in the interface test method 200, the assertions of the test script can be assigned weights, further refining the evaluation of assertion health.
In some embodiments, the interface test method 100 further comprises: normalizing the test result to obtain an interface test report, wherein the interface test report comprises a first path, a second path, and a third path. The first path is used for storing data-processing-related tables; the second path is used for storing test-script-related files, including the jmx file (the test plan script file), the jtl file (from which an HTML-form test report can be generated), and log files; and the third path is used for storing the test result in the form of an HTML Report, which can be viewed in a browser. Illustratively, the first, second, and third paths may take the form of folders.
FIG. 3 schematically illustrates an example flow diagram of a Jmeter-based interface automated testing method (hereinafter referred to as interface testing method 300 for brevity) according to further embodiments of the disclosure. In the configuration step 310 performed for the Jmeter test plan, steps 311-314 correspond to steps 111-114, respectively, described above with respect to FIG. 1. Step 320 is for performing the interface test and includes steps 321-324.
Specifically, in step 321, specific interface test tasks are added, and a Jmeter test script is generated according to the configured test plan. At step 322, the Jmeter test script is submitted to a code repository. Illustratively, the code repository may be a GitLab repository, in which case the Jmeter test script is uploaded to GitLab. GitLab is an open-source repository management system, a Web service built on Git as the code management tool. Besides a GitLab repository, other repositories capable of holding source code may be used (including but not limited to local repositories, GitHub, Bitbucket, etc.). At step 323, an automated test task is set up using a continuous integration tool (e.g., Jenkins, Tekton, etc.), and the Jmeter test script is downloaded from the code repository and built using a corresponding build tool (e.g., Ant) to generate a Jmeter test report. At step 324, preset content is fed back to the administrator, the preset content including at least one of: the number of tested interfaces, the number of failures, the success rate, the average response time, and the URL of the interface. A schematic diagram of the interface testing method 300 of FIG. 3 is further described below in conjunction with FIG. 4.
As shown in FIG. 4, the Jmeter server 410 may submit the generated Jmeter test script to the code repository 420, so that the Jenkins server 430, on which the corresponding automated test task is set, downloads the Jmeter test script from the code repository 420 and builds it using the corresponding build tool, thereby generating a Jmeter test report. Finally, preset content may be fed back to the administrator 440 by the Jenkins server 430, the preset content including at least one of: the number of tested interfaces, the number of failures, the success rate, the average response time, and the URL of the interface.
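For concreteness, the build step ultimately amounts to a non-GUI JMeter run. The following Java sketch shows the equivalent invocation a CI task might execute; the file names and the assumption that the jmeter executable is on the PATH are illustrative, not part of the disclosure:

```java
import java.io.IOException;

// Sketch of the non-GUI JMeter invocation a CI job would run after
// downloading the test script from the code repository.
public class RunJmeterCli {
    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "jmeter", "-n",              // non-GUI mode
                "-t", "plan.jmx",            // generated Jmeter test script
                "-l", "result.jtl",          // raw results (the jtl file)
                "-e", "-o", "html-report");  // HTML Report output folder
        pb.inheritIO();                      // stream JMeter output to the console
        int exit = pb.start().waitFor();
        System.out.println("JMeter exited with code " + exit);
    }
}
```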
FIG. 5 schematically illustrates an example block diagram of a Jmeter-based interface automated testing apparatus (hereinafter simply referred to as interface automated testing apparatus 500 for brevity) in accordance with some embodiments of the disclosure. As shown in fig. 5, the interface automation test apparatus 500 includes a test script configuration module 510 and a test result generation module 520.
In particular, the test script configuration module 510 may be configured to: add user-defined variables, HTTP request default values, an HTTP header manager, a view results tree, and an aggregate report; add a corresponding thread group for the item to be tested in the test plan, wherein the added thread group uniquely corresponds to the test plan; configure a plurality of condition controllers under the thread group, the condition controllers being used to determine the environment in which the item to be tested is located; and add a Beanshell sampler under each of the plurality of condition controllers to configure data values for testing. The test result generation module 520 may be configured to: add specific interface test tasks, generate a Jmeter test script according to the configured test plan, and perform an interface test based on the Jmeter test script to obtain a test result.
It should be understood that the interface automated test equipment 500 may be implemented in software, hardware, or a combination of software and hardware to support either of the C/S mode and the B/S mode. A plurality of different modules in the interface automated test equipment 500 may be implemented in the same software or hardware configuration, or one module may be implemented by a plurality of different software or hardware configurations.
Moreover, the interface automatic testing device 500 can be used to implement the interface testing method 100 described above, and the relevant details thereof have been described in detail above, and are not repeated here for the sake of brevity. The interface automated test equipment 500 may have the same features and advantages as described with respect to the previously described interface test method 100.
Fig. 6 illustrates an example system that includes an example computing device 600 that represents one or more systems and/or devices that may implement the various techniques described herein. Computing device 600 may be, for example, a server of a service provider, a device associated with a server, a system on a chip, and/or any other suitable computing device or computing system. Any one or more of the above-described interface automated test equipment 500 may take the form of a computing device 600. Alternatively, the interface automated testing apparatus 500 may be implemented as a computer program in the form of an application 616.
The example computing device 600 as shown in fig. 6 includes a processing system 611, one or more computer-readable media 612, and one or more I/O interfaces 613 communicatively coupled to each other. Although not shown, the computing device 600 may also include a system bus or other data and command transfer system that couples the various components to one another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. Various other examples are also contemplated, such as control and data lines.
Processing system 611 represents functionality to perform one or more operations using hardware. Thus, the processing system 611 is illustrated as including hardware elements 614 that may be configured as processors, functional blocks, and the like. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 614 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, a processor may be comprised of semiconductor(s) and/or transistors (e.g., electronic Integrated Circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions.
Computer-readable medium 612 is illustrated as including memory/storage 615. Memory/storage 615 represents memory/storage capacity associated with one or more computer-readable media. Memory/storage 615 may include volatile media (such as Random Access Memory (RAM)) and/or nonvolatile media (such as Read Only Memory (ROM), flash memory, optical disks, magnetic disks, and so forth). Memory/storage 615 may include fixed media (e.g., RAM, ROM, a fixed hard drive, etc.) as well as removable media (e.g., flash memory, a removable hard drive, an optical disk, and so forth). The computer-readable medium 612 may be configured in various other ways as further described below.
One or more I/O interfaces 613 represent functionality that allows a user to enter commands and information to computing device 600 using various input devices and optionally also allows information to be presented to the user and/or other components or devices using various output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone (e.g., for voice input), a scanner, touch functionality (e.g., capacitive or other sensors configured to detect physical touch), a camera (e.g., motion that may not involve touch may be detected as gestures using visible or invisible wavelengths such as infrared frequencies), and so forth. Examples of output devices include a display device (e.g., a display or projector), speakers, a printer, a network card, a haptic response device, and so forth. Thus, the computing device 600 may be configured in various ways to support user interaction, as described further below.
Computing device 600 also includes applications 616. The application 616 may be, for example, a software instance of the interface automation test equipment 500 and implement the techniques described herein in combination with other elements in the computing device 600.
Various techniques may be described herein in the general context of software hardware elements or program modules. Generally, these modules include routines, programs, elements, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The terms "module," "functionality," and "component" as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of computing platforms having a variety of processors.
An implementation of the described modules and techniques may be stored on or transmitted across some form of computer readable media. Computer readable media can include a variety of media that can be accessed by computing device 600. By way of example, and not limitation, computer-readable media may comprise "computer-readable storage media" and "computer-readable signal media".
"computer-readable storage medium" refers to a medium and/or device, and/or a tangible storage apparatus, capable of persistently storing information, as opposed to mere signal transmission, carrier wave, or signal per se. Accordingly, computer-readable storage media refers to non-signal bearing media. Computer-readable storage media include hardware such as volatile and nonvolatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer-readable instructions, data structures, program modules, logic elements/circuits or other data. Examples of computer readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital Versatile Disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage devices, tangible media, or an article of manufacture suitable for storing the desired information and accessible by a computer.
"computer-readable signal medium" refers to a signal-bearing medium configured to transmit instructions to the hardware of computing device 600, such as via a network. Signal media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave, data signal or other transport mechanism. Signal media also includes any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
As previously mentioned, the hardware elements 614 and computer-readable media 612 represent instructions, modules, programmable device logic, and/or fixed device logic implemented in hardware form that, in some embodiments, may be used to implement at least some aspects of the techniques described herein. The hardware elements may include integrated circuits or systems-on-chips, Application-Specific Integrated Circuits (ASICs), Field-Programmable Gate Arrays (FPGAs), Complex Programmable Logic Devices (CPLDs), and other implementations in silicon or components of other hardware devices. In this context, a hardware element may serve as a processing device that performs program tasks defined by instructions, modules, and/or logic embodied by the hardware element, as well as a hardware device for storing instructions for execution, such as the computer-readable storage medium described previously.
Combinations of the foregoing may also be used to implement the various techniques and modules described herein. Thus, software, hardware, or program modules and other program modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage medium and/or by one or more hardware elements 614. Computing device 600 may be configured to implement particular instructions and/or functions corresponding to software and/or hardware modules. Thus, implementing modules as modules executable by the computing device 600 as software may be implemented at least partially in hardware, for example, using computer-readable storage media of a processing system and/or hardware elements 614. The instructions and/or functions may be executable/operable by one or more articles of manufacture (e.g., one or more computing devices 600 and/or processing systems 611) to implement the techniques, modules, and examples described herein.
In various implementations, the computing device 600 may assume a variety of different configurations. For example, the computing device 600 may be implemented as a computer-like device including a personal computer, a desktop computer, a multi-screen computer, a laptop computer, a netbook, and so forth. The computing device 600 may also be implemented as a mobile device-like device including mobile devices, such as a mobile telephone, a portable music player, a portable gaming device, a tablet computer, a multi-screen computer, and so on. Computing device 600 may also be implemented as a television-like device that includes devices with or connected to generally larger screens in casual viewing environments. These devices include televisions, set-top boxes, game consoles, and the like.
The techniques described herein may be supported by these various configurations of computing device 600 and are not limited to specific examples of the techniques described herein. Functionality may also be implemented in whole or in part on "cloud" 620 using a distributed system, such as through platform 622 as described below.
Cloud 620 includes and/or is representative of a platform 622 for resources 624. The platform 622 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 620. The resources 624 may include applications and/or data that may be used when executing computer processes on servers remote from the computing device 600. Resources 624 may also include services provided over the internet and/or over a subscriber network such as a cellular or Wi-Fi network.
Platform 622 may abstract resources and functionality to connect computing device 600 with other computing devices. Platform 622 may also serve to abstract the scaling of resources so as to provide a corresponding level of scale for encountered demand on the resources 624 implemented via platform 622. Thus, in interconnected device embodiments, implementation of the functions described herein may be distributed throughout the system 600. For example, the functionality may be implemented in part on the computing device 600 and in part by the platform 622 that abstracts the functionality of the cloud 620.
It will be appreciated that embodiments of the disclosure have been described with reference to different functional units for clarity. However, it will be apparent that the functionality of each functional unit may be implemented in a single unit, in a plurality of units or as part of other functional units without departing from the disclosure. For example, functionality illustrated to be performed by a single unit may be performed by a plurality of different units. Thus, references to specific functional units are only to be seen as references to suitable units for providing the described functionality rather than indicative of a strict logical or physical structure or organization. Thus, the present disclosure may be implemented in a single unit or may be physically and functionally distributed between different units and circuits.
It will be understood that, although the terms first, second, third, etc. may be used herein to describe various devices, elements, components or sections, these devices, elements, components or sections should not be limited by these terms. These terms are only used to distinguish one device, element, component or section from another device, element, component or section.
Although the present disclosure has been described in connection with some embodiments, it is not intended to be limited to the specific form set forth herein. Rather, the scope of the present disclosure is limited only by the accompanying claims. Additionally, although individual features may be included in different claims, these may possibly advantageously be combined, and the inclusion in different claims does not imply that a combination of features is not feasible and/or advantageous. The order of features in the claims does not imply any specific order in which the features must be worked. Furthermore, in the claims, the word "comprising" does not exclude other elements, and the terms "a" or "an" do not exclude a plurality. Reference signs in the claims are provided merely as a clarifying example and shall not be construed as limiting the scope of the claims in any way.
The present disclosure provides a computer-readable storage medium having computer-readable instructions stored thereon, which when executed, implement the above-mentioned Jmeter-based interface automation testing method.
The present disclosure provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The computer instructions are read by a processor of the computing device from the computer-readable storage medium, and the processor executes the computer instructions to enable the computing device to execute the Jmeter-based interface automated testing method provided in the various optional implementation modes.
Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed subject matter, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the word "a" or "an" does not exclude a plurality. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims (14)

1. A Jmeter-based interface automatic testing method is characterized by comprising the following steps:
the following configuration steps are performed for the Jmeter's test plan:
adding user-defined variables, HTTP request default values, an HTTP header manager, a view results tree, and an aggregate report;
adding a corresponding thread group for the item to be tested in the test plan, wherein the added thread group uniquely corresponds to the test plan;
configuring a plurality of condition controllers under the thread group, wherein the condition controllers are used for determining the environment of the item to be tested; and
adding a Beanshell sampler under each of the plurality of condition controllers to configure data values for testing;
adding a specific interface test task, generating a Jmeter test script according to the configured test plan, and carrying out interface test based on the Jmeter test script to obtain a test result.
2. The method of claim 1, wherein the configuring step further comprises: establishing a corresponding loop controller for each code module of the item to be tested so as to logically isolate the code modules.
3. The method of claim 2, wherein the interface test is a single interface test or a business process test, and the name of the loop controller is determined according to the foreground/background location of the interface to be tested and the module function.
4. The method of claim 2, wherein the interface test is a business process test, and wherein the configuring step further comprises:
establishing a transaction controller under the loop controller to determine an overall performance metric for all HTTP requests under the transaction controller.
5. The method of claim 1, wherein the name of the HTTP request corresponding to the item to be tested is determined according to:
the environment of the item to be tested, the serial number of the interface test case and the name of the interface.
6. The method of claim 1, wherein the interface test is a single interface test covering add, query, modify, and delete operations, and the interface test is performed by:
adding target test data;
querying the added target test data;
obtaining the ID of the target test data via a JSON extractor, and modifying the target test data to obtain modified test data;
and deleting the modified test data according to the ID of the target test data.
7. The method of claim 1, wherein the configuring step further comprises: adding an assertion expected value in the view results control;
and wherein said performing an interface test based on said Jmeter test script comprises:
verifying a plurality of interfaces based on the Jmeter test script to generate a corresponding plurality of assertion results;
determining a first proportion of null assertions among the plurality of assertion results according to the corresponding assertion expected values; and
determining the health degree of the assertions according to at least the first proportion.
8. The method of claim 1, further comprising:
normalizing the test result to obtain an interface test report, wherein the interface test report comprises a first path, a second path, and a third path, wherein
the first path is used for storing data-processing-related tables;
the second path is used for storing test-script-related files, the test-script-related files comprising jmx files, jtl files, and log files; and
the third path is used for storing the test result in the form of an HTML Report.
9. The method of claim 1, wherein performing an interface test based on the Jmeter test script to obtain a test result comprises:
submitting the Jmeter test script to a code repository;
setting up an automated test task using a continuous integration tool, downloading the Jmeter test script from the code repository and building it using a corresponding build tool to generate a Jmeter test report; and
feeding back preset content to an administrator, wherein the preset content comprises at least one of the following items: the number of test interfaces, the number of failures, the success rate, the average response time, and the URL of the interface.
10. The method of claim 1, wherein the environment in which the item to be tested is located comprises any one of: a development environment, a test environment, a pre-release environment, and a production environment.
11. A Jmeter-based interface automated testing device, characterized in that the device comprises:
a test script configuration module configured to: add user-defined variables, HTTP request default values, an HTTP header manager, a view results tree, and an aggregate report; add a corresponding thread group for the item to be tested in the test plan, wherein the added thread group uniquely corresponds to the test plan; configure a plurality of condition controllers under the thread group, the condition controllers being used to determine the environment in which the item to be tested is located; and add a Beanshell sampler under each of the plurality of condition controllers to configure data values for testing; and
the test result generation module is configured to add a specific interface test task, generate a Jmeter test script according to the configured test plan, and perform an interface test based on the Jmeter test script to obtain a test result.
12. A computing device, wherein the computing device comprises:
a memory configured to store computer-executable instructions;
a processor configured to perform the method of any one of claims 1 to 10 when the computer-executable instructions are executed by the processor.
13. A computer-readable storage medium having computer-executable instructions stored thereon that, when executed, perform the method of any one of claims 1 to 10.
14. A computer program product comprising computer executable instructions, wherein the computer executable instructions when executed by a processor perform the method of any one of claims 1 to 10.
CN202211577391.4A 2022-12-09 2022-12-09 Jmeter-based interface automatic testing method and device Pending CN115687156A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211577391.4A CN115687156A (en) 2022-12-09 2022-12-09 Jmeter-based interface automatic testing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211577391.4A CN115687156A (en) 2022-12-09 2022-12-09 Jmeter-based interface automatic testing method and device

Publications (1)

Publication Number Publication Date
CN115687156A true CN115687156A (en) 2023-02-03

Family

ID=85055564

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211577391.4A Pending CN115687156A (en) 2022-12-09 2022-12-09 Jmeter-based interface automatic testing method and device

Country Status (1)

Country Link
CN (1) CN115687156A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117251382A (en) * 2023-11-17 2023-12-19 北京安锐卓越信息技术股份有限公司 Automatic performance pressure measurement period execution method and device


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination