CN116126719A - Interface testing method and device, electronic equipment and storage medium - Google Patents
- Publication number: CN116126719A
- Application number: CN202310140233.0A
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06F11/3688 — Test management for test execution, e.g. scheduling of test suites
- G06F11/3684 — Test management for test design, e.g. generating new test cases
- Y02D10/00 — Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The disclosure provides an interface testing method and apparatus, an electronic device, and a storage medium, relating to the field of artificial intelligence and in particular to testing technology. The implementation scheme is as follows: in response to receiving a test instruction, at least one target test case for the interface to be tested is determined according to the test instruction; a test task is determined according to the at least one target test case; whether the test task satisfies a remote processing condition is determined according to local remaining resource information and the resources required to execute the test task; and when the test task satisfies the remote processing condition, remote resources are invoked to process the test task so as to obtain test information for the interface to be tested.
Description
Technical Field
The present disclosure relates to the field of artificial intelligence, more particularly to testing technology, and specifically to an interface testing method, apparatus, electronic device, storage medium, and computer program product.
Background
As software systems grow more complex, the narrow coverage and late timing of conventional functional testing drive up test cost, greatly reduce test efficiency, and make it difficult to guarantee project quality and schedule through functional testing alone. Interface testing allows a test team to get involved in a project earlier and more deeply, so that testers can uncover deep system problems at the initial stage of the project, reducing the time cost of fixing them.
Disclosure of Invention
The present disclosure provides an interface testing method, apparatus, electronic device, storage medium, and computer program product.
According to an aspect of the present disclosure, there is provided an interface testing method, including: in response to receiving a test instruction, determining at least one target test case for the interface to be tested according to the test instruction; determining a test task according to the at least one target test case; determining whether the test task satisfies a remote processing condition according to local remaining resource information and the resources required to execute the test task; and, when the test task satisfies the remote processing condition, invoking remote resources to process the test task so as to obtain test information for the interface to be tested.
According to another aspect of the present disclosure, there is provided an interface testing apparatus including a first determining module, a second determining module, a third determining module, and an invoking module. The first determining module is configured to, in response to receiving a test instruction, determine at least one target test case for the interface to be tested according to the test instruction. The second determining module is configured to determine a test task according to the at least one target test case. The third determining module is configured to determine whether the test task satisfies a remote processing condition according to local remaining resource information and the resources required to execute the test task. The invoking module is configured to, when the test task satisfies the remote processing condition, invoke remote resources to process the test task so as to obtain test information for the interface to be tested.
According to another aspect of the present disclosure, there is provided an electronic device including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the methods provided by the present disclosure.
According to another aspect of the present disclosure, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method provided by the present disclosure.
According to another aspect of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the method provided by the present disclosure.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The drawings are for a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
fig. 1 is an application scenario schematic diagram of an interface testing method and apparatus according to an embodiment of the present disclosure;
FIG. 2 is a schematic flow chart diagram of an interface testing method according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a single interface test method according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a multi-interface test method according to an embodiment of the present disclosure;
FIG. 5 is a schematic block diagram of an interface testing apparatus according to an embodiment of the present disclosure; and
fig. 6 is a block diagram of an electronic device for implementing an interface testing method of an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments to facilitate understanding and should be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present disclosure. Descriptions of well-known functions and structures are likewise omitted below for clarity and conciseness.
Fig. 1 is an application scenario schematic diagram of an interface testing method and apparatus according to an embodiment of the present disclosure.
It should be noted that fig. 1 is only an example of a system architecture to which embodiments of the present disclosure may be applied, intended to help those skilled in the art understand the technical content of the present disclosure; it does not mean that embodiments of the present disclosure cannot be used in other devices, systems, environments, or scenarios.
As shown in fig. 1, a system architecture 100 according to this embodiment may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired and/or wireless communication links, and the like.
The user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103 to receive or send messages or the like. The terminal devices 101, 102, 103 may be a variety of electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablets, laptop and desktop computers, and the like.
The server 105 may be a server providing various services, such as a background management server (by way of example only) providing support for websites browsed by users using the terminal devices 101, 102, 103. The background management server may analyze and process the received data such as the user request, and feed back the processing result (for example, test information generated according to the configuration information of the front-end page) to the terminal device.
It should be noted that the interface testing method provided by the embodiments of the present disclosure may generally be performed by the server 105, and the corresponding interface testing apparatus may generally be disposed in the server 105. The method may also be performed by, and the apparatus disposed in, a different server or server cluster that is capable of communicating with the terminal devices 101, 102, 103 and/or the server 105.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Fig. 2 is a schematic flow chart of an interface testing method according to an embodiment of the present disclosure.
As shown in fig. 2, the interface testing method 200 of this embodiment tests at least one interface; it may be used for a single-interface test or a multi-interface test. The interface testing method 200 may include operations S210 to S240.
In response to receiving the test instruction, at least one target test case for the interface to be tested is determined according to the test instruction in operation S210.
For example, a user may input, select, click, or drag on the front-end page to configure the required target information, where the target information includes, for example, configuration information and/or a test case identifier. After completing the configuration, the user may click a confirm or save button on the front-end page, which triggers a test instruction containing the target information configured on the page. The electronic device then receives the test instruction and determines the target test case according to the target information in it.
For example, the number of test instructions may be at least one. For example, a plurality of users respectively operate on front-end pages, and the front-end pages can trigger a plurality of test instructions based on the operations.
For example, the test instruction may include configuration information, which may be used to determine the target test case. As another example, the test instruction may include a test case identifier, and the stored test case corresponding to that identifier may be retrieved and determined as the target test case.
For single interface testing, the number of test cases used in the test may be 1. For multi-interface testing, the number of test cases used in the testing is at least two, and the number of test cases corresponding to each interface is at least one.
In operation S220, a test task is determined according to at least one target test case.
The test task may be a task that invokes a target test case for interface testing. For single interface testing, a test task may be determined from a test case corresponding to a single interface. For multi-interface testing, a testing task can be determined according to a plurality of testing cases corresponding to scenes to which a plurality of interfaces belong.
In operation S230, it is determined whether the test task satisfies the remote processing condition according to the local remaining resource information and the required resource information required to perform the test task.
By way of example, the local remaining resource information may include the local remaining resource amount, such as the remaining computing resources (e.g., CPU) and storage resources (e.g., memory). It may also include the proportion of remaining resources, such as CPU utilization or memory utilization, or the margin of the local task queue: for example, if the local task queue has a length of 100 and currently holds 90 tasks, the margin is 10 tasks.
For example, the required amount of resources for a test task may include the amount of resources that need to be used to process the test task.
For another example, the required resource amount of a test task may include the estimated time required to process it. For a timed task, the average duration of its previous executions may be used as the estimate; for a test task that has never been executed, the average duration of tasks executed over a recent period may be used instead. In addition, a correspondence between estimated duration and resource amount may be constructed in advance, so that durations can be converted into resource amounts.
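The historical-average estimate described above can be sketched in Python as follows (the function and parameter names are illustrative, not taken from the patent):

```python
def estimate_duration(task_name, history, fallback_window):
    """Estimate how long a test task will take, in seconds.

    history maps a task name to the list of its past execution durations;
    fallback_window is a list of recent durations across all tasks, used
    for tasks that have never been executed before.
    """
    past = history.get(task_name)
    if past:
        # Timed / previously executed task: average of its own runs.
        return sum(past) / len(past)
    # Never executed before: fall back to the recent overall average.
    return sum(fallback_window) / len(fallback_window)

print(estimate_duration("api_smoke", {"api_smoke": [2.0, 4.0]}, [3.0]))  # 3.0
```

A precomputed duration-to-resource mapping could then convert this estimate into a resource amount.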
As another example, the amount of required resources may be the number of test tasks.
For another example, when multiple test tasks are to be processed, they may be ordered by at least one of creation time, name, required resource amount, estimated duration, and so on. The required resource information of a test task may then include its execution order among the tasks, e.g., it is the 10th task to be processed locally.
For example, the remote processing condition may be that the difference between the local remaining resources characterized by the local remaining resource information and the required resources characterized by the required resource information is less than or equal to a resource threshold; if so, the test task satisfies the remote processing condition, and otherwise it does not.
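A minimal sketch of this threshold check, assuming all three quantities are expressed in the same resource unit (names are illustrative):

```python
def meets_remote_condition(local_remaining, required, resource_threshold):
    """Return True when the task should be sent to remote resources.

    Per the scheme above: remote processing is triggered when the local
    remaining resources minus the task's required resources fall at or
    below the threshold, i.e. the local margin left after scheduling
    this task would be too thin.
    """
    return (local_remaining - required) <= resource_threshold

# 100 units free, task needs 95, threshold 10 -> margin 5 <= 10 -> go remote.
print(meets_remote_condition(100, 95, 10))  # True
```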
In operation S240, in case it is determined that the test task satisfies the remote processing condition, the remote resource is invoked to process the test task to obtain test information for the interface to be tested.
For example, when invoking remote resources to process the test tasks, the remote resources may be used for all of the test tasks or only for a part of them.
According to the technical scheme of this embodiment of the disclosure, remote resources can be invoked to process test tasks, which avoids a backlog of unprocessable test tasks caused by insufficient local remaining resources and improves test efficiency.
In some embodiments, the test task may be processed locally when it is determined that the remote processing condition is not satisfied, for example by using local resources to invoke a JMeter test engine to execute the test cases of a single interface or of multiple interfaces.
The implementation means for determining at least one target test case according to the information in the test instruction is described below.
In one example, the test instruction is used to test a single interface and may include configuration information for that interface, such as global variables, signature information, extraction expressions for parameters, and assertion information. An extraction expression is used to extract a target parameter from the interface output; for example, if the target parameter is the status code, the extraction expression may be "$.code". The assertion information includes a true/false logical judgment used to determine whether the actual result output by the interface matches the expected result: for example, if the expected value of the target parameter is "200", then after the parameter code is extracted by the extraction expression, it is compared with "200" to verify that the interface behaves correctly.
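The extraction-plus-assertion step can be sketched with a toy dotted-path resolver (a real implementation would likely use a JSONPath library; the helper names here are illustrative):

```python
def extract(expression, response):
    """Resolve a dotted extraction expression such as "$.code"
    against a decoded JSON response (a plain dict here)."""
    value = response
    for key in expression.lstrip("$.").split("."):
        value = value[key]
    return value

def assertion_passes(expression, expected, response):
    """Assertion check: does the extracted actual value match the
    expected result configured for the test case?"""
    return extract(expression, response) == expected

resp = {"code": "200", "data": {"id": 7}}
print(assertion_passes("$.code", "200", resp))  # True
```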
For a single-interface test scenario, after the test instruction is received, a test case may be generated from the configuration information. For example, a case template may be preconfigured and stored; after the test instruction is received, the configuration information is obtained from it and assembled with the case template to produce the test case. The resulting test case is then added to the at least one target test case.
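The template-assembly step can be sketched as follows. The template format and field names are hypothetical; the patent does not specify what a case template looks like.

```python
import json
from string import Template

# Hypothetical stored case template; the fields are illustrative only.
CASE_TEMPLATE = Template(
    '{"url": "$url", "method": "$method",'
    ' "assert": {"expr": "$expr", "expected": "$expected"}}'
)

def build_test_case(config):
    """Assemble a test case by filling the stored case template with the
    configuration information carried in the test instruction."""
    return CASE_TEMPLATE.substitute(config)

case = build_test_case({
    "url": "/api/login",
    "method": "POST",
    "expr": "$.code",    # extraction expression for the status code
    "expected": "200",   # expected result used by the assertion
})
print(json.loads(case)["assert"]["expr"])  # $.code
```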
In another example, the test instructions may be for testing a multi-interface scenario, the test instructions may include at least one configuration information for at least one interface, and/or include at least one test case identification for at least one interface. For a multi-interface test scenario, the target test case may be determined according to configuration information and/or a test case identifier included in the test instruction.
For example, suppose a multi-interface test needs to test interfaces A, B, and C. The user can configure configuration information a for interface A, configuration information b for interface B, and an identifier c for a test case of interface C on the front-end page. A test case TC_A for interface A can then be generated from configuration information a, and a test case TC_B for interface B from configuration information b, in the manner described above. In addition, according to the test case identifier and mapping information representing the relation between test cases and their identifiers, the test case TC_C corresponding to identifier c can be looked up in the case library. Test cases TC_A, TC_B, and TC_C are then determined as the set of test cases for the multi-interface test, the set corresponding to one test scenario. Referencing existing test cases by identifier spares the user from configuring them again, improving operational efficiency.
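The mixed resolution of configured and referenced cases can be sketched as follows (data shapes and names are illustrative):

```python
def resolve_test_cases(items, case_library, build_case):
    """Assemble the test-case set for a multi-interface scenario.

    items: one entry per interface, each either
      ("config", {...})  -> generate a new case from configuration, or
      ("id", "c")        -> reference an existing case in the case
                            library through the identifier mapping.
    build_case: the case-generation function (e.g. template assembly).
    """
    cases = []
    for kind, value in items:
        if kind == "config":
            cases.append(build_case(value))
        else:
            cases.append(case_library[value])
    return cases

# Toy illustration mirroring interfaces A, B (configured) and C (referenced).
library = {"c": {"name": "TC_C"}}
cases = resolve_test_cases(
    [("config", {"name": "TC_A"}), ("config", {"name": "TC_B"}), ("id", "c")],
    library,
    build_case=lambda cfg: cfg,  # stand-in for real case generation
)
print([c["name"] for c in cases])  # ['TC_A', 'TC_B', 'TC_C']
```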
According to another embodiment of the present disclosure, during both single-interface and multi-interface testing, the user may configure the required configuration information on the front-end page, and the electronic device generates test cases based on the configuration information in the test instruction. Moreover, after new test cases are generated in single-interface or multi-interface testing, they can be added to the case library, keeping it up to date. Storing new test cases means that when a user later needs an already-configured test case for another test, it can be referenced directly by its identifier rather than configured again, improving test efficiency.
Next, a detailed description will be given of determining whether the test task satisfies the remote processing condition in connection with the following example.
In one example, the resources characterized by the local remaining resource information may include a local remaining resource amount, e.g., a computing resource margin, a memory margin. The demand resources characterized by the demand resource information may include an amount of demand resources required to perform the test task. Accordingly, determining whether the remote processing condition is satisfied may include: and determining whether the first remote processing condition is met according to the local residual resource amount, the required resource amount and the threshold value.
For example, for a timed task, the average of the amount of resources used to previously execute the test task may be taken as the required amount of resources. For a test task that has not been executed before, an average of the amount of resources required to execute other tasks over a period of time may be taken as the required amount of resources for the test task.
For example, the first remote processing condition may be that a difference between the local remaining amount of resources and the required amount of resources is less than or equal to a resource amount threshold. For another example, the sum of the local remaining resource amount and the required resource amount may be used as the first data, and the first remote processing condition may be that a ratio between the first data and the local resource amount is greater than or equal to a ratio threshold.
This example computes the first data from the local remaining resource amount and the required resource amount and compares it against a threshold, so whether the test task satisfies the remote processing condition can be determined accurately.
In another example, the resources characterized by the local remaining resource information may include the local CPU utilization, and the required resources characterized by the required resource information may include the total number of test tasks. Accordingly, determining whether the remote processing condition is satisfied may include: determining the maximum execution number from the local CPU utilization and the target length of the local task queue, and then determining whether the total number of test tasks is greater than the maximum execution number. If so, the test task satisfies the second remote processing condition; otherwise it does not.
For example, the target length of the local task queue represents the maximum number of tasks in the local task queue. For example, the target length of the local task queue may be a predetermined value, e.g., the target length is 100.
For another example, the target length of the local task queue may be determined based on a predetermined wait time, a task processing time, and a number of threads local. For example, the predetermined wait time may represent a maximum wait time acceptable to the user, and the value of the predetermined wait time may be preconfigured. The task processing duration may represent an average duration value required to execute a task over a period of time. For example, a ratio of the predetermined waiting time period to the task processing time period may be taken as the second data, and then a difference between the second data and the local thread number may be taken as the target length of the local task queue.
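The dynamic queue-length formula just described can be sketched as follows (the zero floor is an added safety assumption, not stated in the patent):

```python
def queue_target_length(max_wait, avg_task_time, thread_count):
    """Dynamic target length of the local task queue.

    second_data = maximum acceptable waiting time / average task
    processing time; the target length is second_data minus the local
    thread count (floored at zero here as a safety measure).
    """
    second_data = max_wait / avg_task_time
    return max(0, int(second_data) - thread_count)

# With a 550 s acceptable wait, 5 s average task time and 10 threads,
# the queue should hold 550/5 - 10 = 100 tasks.
print(queue_target_length(550, 5, 10))  # 100
```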
In this embodiment, the target length of the local task queue is determined dynamically from the predetermined waiting time, the task processing time, and the local thread count. Compared with using a fixed preset value, the target length can adjust as the task processing time or the predetermined waiting time changes, avoiding both a local backlog of test tasks caused by setting the value too large and idle local resources caused by setting it too small. Test efficiency is thus improved while local resources remain efficiently utilized.
The target length of the local task queue is described above, and a scheme of determining the maximum execution number based on the target length of the local task queue is described next.
For example, the ratio of 1 to the CPU utilization may be used as the third data. Alternatively, the ratio of I/O time to CPU time plus 1 may be used, or the number of threads in the local thread pool. The sum of the third data and the target length of the local task queue is then determined as the maximum execution number.
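Using the third of the listed variants (thread-pool size as the third data), the maximum execution number and the second remote processing condition can be sketched as:

```python
def max_execution_number(thread_count, queue_length):
    """Maximum number of tasks the local side can take on: the threads
    able to run tasks plus the queue slots behind them. (This uses one
    of the three 'third data' variants above: the local thread pool
    size.)"""
    return thread_count + queue_length

def meets_second_condition(total_tasks, thread_count, queue_length):
    """Second remote processing condition: more pending test tasks than
    the local side can execute."""
    return total_tasks > max_execution_number(thread_count, queue_length)

print(meets_second_condition(200, 10, 100))  # True: 200 > 110
```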
For example, there may be multiple unprocessed test tasks. Tasks are handled by a local thread pool; a test task being handled by a thread in the pool is in the in-process state. A local task queue may also be set up, in which test tasks wait in the queuing state. When a thread in the pool finishes its current test task, it takes a test task from the local task queue, whose state then switches from queuing to in-process.
The present embodiment determines the maximum execution number according to the utilization rate and the target length of the local task queue, and then determines whether the remote processing condition is satisfied according to the relationship between the total number of test tasks and the maximum execution number. Therefore, a plurality of test tasks can be reasonably distributed to a remote place, so that the test efficiency is ensured.
The process of scheduling a remote resource to process a portion of a plurality of test tasks is described below in connection with a specific example.
For example, the plurality of test tasks may be split into a first task set and a second task set according to the local remaining resource information and the required resource information. The first task set is then executed with the local remaining resources to obtain first test information, while for the second task set, which the local resources cannot carry, the remote resources are invoked to obtain second test information.
For example, when splitting the plurality of test tasks, the required resource amount of the first task set may be made to approach, without exceeding, the local remaining resource amount, so that the difference between the two is less than a threshold. The remaining test tasks are then divided into the second task set.
As another example, the division may be performed using the maximum execution number: the test tasks up to the maximum execution number are divided into the first task set, and the remaining test tasks into the second task set.
For example, the number of unprocessed test tasks is 200, and the number of local threads may be 10, indicating that 10 test tasks can be in process. The target length of the local task queue is 100, indicating that 100 test tasks are allowed to queue. In this embodiment, the maximum execution number may be 110, and 110 test tasks may be processed locally, for example the first 110 test tasks in execution order. The remote resource may be invoked to process the other 90 test tasks, for example those after the 110th in execution order. In other embodiments, all 200 test tasks may also be processed using remote resources.
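The 200-task example above can be sketched as a simple split; slicing by execution order is an assumption consistent with the example, not a requirement of the embodiment.

```python
def split_tasks(tasks, max_execution):
    """Divide pending test tasks into a locally executed first set and a
    remotely executed second set, by position in execution order."""
    local_set = tasks[:max_execution]
    remote_set = tasks[max_execution:]
    return local_set, remote_set

tasks = list(range(200))          # 200 unprocessed test tasks
local_set, remote_set = split_tasks(tasks, 110)
# 110 tasks are handled locally; the remaining 90 go to remote resources.
```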
Fig. 3 is a schematic diagram of a single interface test method according to an embodiment of the present disclosure.
The working principle of the single interface test method will be described below with reference to fig. 3 by taking a single interface test as an example.
For example, a user may perform configuration at the front-end page 305. Configuration information 301 may include global variables, signature information, extraction expressions for parameters, assertion information, etc., and a single test case 302 may then be generated based on this configuration information 301. After the test case 302 is obtained, a test task 303 may be determined and executed to obtain test information 304. For example, the Jmeter test engine may be invoked locally or remotely to execute the single-interface test case 302, and the test information 304 of the Jmeter test engine is then obtained. The test information 304 may include an execution log and execution results, and is then pushed to the front-end page 305 through SSE (Server-Sent Events).
Fig. 4 is a schematic diagram of a multi-interface test method according to an embodiment of the present disclosure.
The working principle of the multi-interface test method will be described below with reference to fig. 4 by taking multi-interface test as an example.
For example, a user may perform configuration at the front-end page 410, and configuration information 401 may include global variables, signature information, extraction expressions for parameters, assertion information, and the like. Test case identifiers 402 may also be configured at the front-end page 410, as may timing information for executing the timed tasks 404.
Next, a plurality of test cases may be generated according to the configuration information 401 and the test case identifiers 402. The plurality of test cases may relate to the same scene 403 and may be arranged according to actual needs to obtain the scene 403. The orchestrated content may include dependencies among the interfaces, based on which the return value of a previous interface serves as the input of the next interface.
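The chaining of interface dependencies can be sketched as follows; the step schema, the `extract` mapping, and `fake_call` are all illustrative stand-ins, not the patent's actual data format.

```python
def run_scenario(steps, call):
    """Run orchestrated test cases in order; the return value of the
    previous interface feeds the input of the next one."""
    context = {}
    result = None
    for step in steps:
        # Resolve each parameter against values extracted so far.
        params = {name: context.get(ref, ref) for name, ref in step["params"].items()}
        result = call(step["url"], params)
        # 'extract' maps a context variable to a key of the response.
        for var, key in step.get("extract", {}).items():
            context[var] = result[key]
    return result

# A stub standing in for real interface calls.
def fake_call(url, params):
    if url == "/login":
        return {"token": "abc123"}
    return {"user": params["token"] + "-profile"}

steps = [
    {"url": "/login", "params": {}, "extract": {"token": "token"}},
    {"url": "/profile", "params": {"token": "token"}},
]
final = run_scenario(steps, fake_call)  # {'user': 'abc123-profile'}
```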
Next, a thread pool 405 may be enabled to determine whether the test task satisfies the remote processing condition according to the local remaining resource information and the required resource information of the test task to be processed. It will be appreciated that after the scene 403 is created, no resources need be invoked if the test tasks for that scene 403 do not need to be run.
If the remote processing condition is not satisfied, the test task may be performed locally. For example, the Jmeter engine is started using local resources, and test tasks are performed using the Jmeter engine to obtain test information 408.
If the remote processing condition is satisfied, the Node resource pool 406 may be invoked, the number of containers 407 (e.g., Docker containers) to start may be calculated according to the number of scenes 403 to be created, and each container 407 may start a Jmeter test engine to execute a scene 403. For example, the container 407 may be created with a predetermined amount of resources, i.e., according to a predetermined number of CPU cores and a predetermined amount of memory.
When creating the containers 407, the number of timed tasks 404 is known, so the number of containers 407 required can be determined from the number of tasks. For example, a user may configure 100 timed tasks 404; the test tasks are split, and multiple containers 407 are then created at once to process the timed test tasks at the specified time. In addition, when the user temporarily adds test tasks, the number of tasks cannot be estimated in advance, so the number of containers 407 may be increased continuously according to the newly added test tasks until the remote resource is less than or equal to a threshold. When the remote resource is less than or equal to the threshold, the front-end test instruction may be rejected and the user may be notified that resources are insufficient and waiting is required.
It will be appreciated that one container 407 may handle a plurality of test tasks, the number of containers 407 being determined based on the number of test tasks. For example, each container 407 may process 10 additional test tasks: when the number of test tasks requiring remote resource processing is 3, only one container 407 needs to be started; when the number is 30, 3 containers 407 need to be started.
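The container-count rule above amounts to a ceiling division; the per-container capacity of 10 comes from the example, and a real deployment would make it configurable.

```python
import math

def containers_needed(task_count: int, tasks_per_container: int = 10) -> int:
    """Each container can process up to tasks_per_container test tasks
    (10 here, matching the example in the description)."""
    if task_count <= 0:
        return 0
    return math.ceil(task_count / tasks_per_container)
```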
After the test information 408 is obtained, it may be output to a message queue 409, which may be Kafka. The test information 408 is then pushed by the message queue 409 to the front-end page 410 through SSE (Server-Sent Events).
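A minimal sketch of the SSE wire format used for the push to the front-end page; the event name and payload fields are assumptions for illustration.

```python
import json

def sse_event(data: dict, event: str = "test-info") -> str:
    """Serialize a payload in the Server-Sent Events wire format:
    an 'event:' line, a 'data:' line, and a blank line terminator."""
    return f"event: {event}\ndata: {json.dumps(data)}\n\n"

message = sse_event({"result": "pass", "log": "done"})
```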
According to another embodiment of the present disclosure, the test information may be obtained by performing a test task: executing a local task may obtain test information for the interface corresponding to the local task, and executing a remote task may obtain test information for the interface corresponding to the remote task. After the test information is obtained, it can be output and then displayed on the front-end page so that the user learns the test information. The test information may include an execution log and a test result; the execution log may record related information such as a user identifier, an execution time, task time consumption, and whether an exception occurred during execution. It should be noted that the system may record the execution logs of the test tasks of multiple users, and when the execution logs are displayed, only the test information related to a target user may be displayed to that user.
In one example, the user may operate on the front-end page, for example clicking an output button to generate an output instruction; alternatively, the output instruction may be generated automatically after the test is completed. The output instruction may include a query condition, which may include at least one of a user identifier and time information, and target test information for the target user may be determined from a plurality of pieces of test information according to the user identifier in the output instruction. For example, when a plurality of users perform interface tests, the execution log records the relevant information of the plurality of tests, and the target test information corresponding to the user identifier can be screened out of the execution log. Test information for test tasks performed during a specific time period may likewise be screened from the execution log based on the time information.
In another example, the test instruction corresponds to test time information, e.g., a test task is created based on the test instruction, which test task is executed for a first period of time, so that the test instruction corresponds to the first period of time. The target test information generated during the first period may be screened from the execution log.
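The screening by user identifier and time period can be sketched as a simple filter; the log-entry fields below are illustrative, not the patent's actual log schema.

```python
def screen_test_info(log_entries, user_id=None, start=None, end=None):
    """Screen target test information from a shared execution log by
    user identifier and/or time period; None means 'no constraint'."""
    selected = []
    for entry in log_entries:
        if user_id is not None and entry["user"] != user_id:
            continue
        if start is not None and entry["time"] < start:
            continue
        if end is not None and entry["time"] > end:
            continue
        selected.append(entry)
    return selected

log = [
    {"user": "u1", "time": 10, "result": "pass"},
    {"user": "u2", "time": 20, "result": "fail"},
    {"user": "u1", "time": 30, "result": "pass"},
]
```

Each target user thus sees only entries matching his or her own identifier and the queried period.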
After the target test information is obtained, the target test information can be output to the front-end page, so that the target test information is displayed to the target user by using the front-end page.
According to this embodiment, the target test information for the target user can be screened out of an execution log with a large data volume based on the user identifier and the time information. The target user can thus conveniently view the information related to his or her own interface tests without seeing the information of other users' interface tests, which reduces the amount of information displayed to the target user and at the same time prevents test-related information from being leaked to other users.
FIG. 5 is a schematic block diagram of an interface testing apparatus according to an embodiment of the present disclosure.
as shown in fig. 5, the interface test apparatus 500 may include a first determination module 510, a second determination module 520, a third determination module 530, and a calling module 540.
The first determining module 510 is configured to determine, in response to receiving the test instruction, at least one target test case for the interface to be tested according to the test instruction.
The second determining module 520 is configured to determine a test task according to at least one target test case.
The third determining module 530 is configured to determine whether the test task meets the remote processing condition according to the local remaining resource information and the required resource information required for executing the test task.
The calling module 540 is configured to call the remote resource to process the test task to obtain the test information for the interface to be tested, if it is determined that the test task meets the remote processing condition.
According to another embodiment of the present disclosure, the number of test tasks is plural, the local remaining resource information includes a local CPU utilization, and the demand resource information includes a total number of tasks of the test tasks; the third determination module includes: a first determination sub-module and a second determination sub-module. The first determining submodule is used for determining the maximum execution quantity according to the local CPU utilization rate and the target length of the local task queue; the second determination submodule is used for determining that the test tasks meet the remote processing condition in response to detecting that the total number of the tasks is larger than the maximum execution number.
According to another embodiment of the present disclosure, the third determining module further includes: and the third determining submodule is used for determining the target length of the local task queue according to the preset waiting time, the task processing time and the local thread number.
According to another embodiment of the present disclosure, the local remaining resource information includes a local remaining resource amount, and the demand resource information includes an estimated resource amount required to perform the test task; the third determination module includes: and a fourth determination sub-module for determining that the test task satisfies the remote processing condition in response to detecting that a difference between the local remaining resource amount and the estimated resource amount is less than or equal to a resource amount threshold.
According to another embodiment of the present disclosure, the number of test tasks is a plurality; the calling module comprises: a splitting sub-module, an execution sub-module, and a calling sub-module. The splitting sub-module is used for splitting a plurality of test tasks into a first task set and a second task set according to the local remaining resource information and the required resource information; the execution sub-module is used for executing the first task set by utilizing the local remaining resources to obtain first test information corresponding to the first task set; the calling sub-module is used for calling the remote resource to execute the second task set to obtain second test information corresponding to the second task set.
According to another embodiment of the present disclosure, determining at least one target test case from information in the test instructions includes at least one of: responding to the detection that the test instruction comprises configuration information, generating test cases according to the configuration information, and adding the generated test cases into at least one target test case; and responding to the detection that the test instruction comprises the test case identification, searching the test case corresponding to the test case identification from the case library according to the test case identification, and adding the searched test case into at least one target test case.
According to another embodiment of the present disclosure, the above apparatus further includes: and the adding module is used for adding the generated test cases into the case library after the test cases are generated.
According to another embodiment of the present disclosure, the configuration information includes at least one of: global variables, signature information, extraction expressions for parameters, assertion information.
According to another embodiment of the present disclosure, the above apparatus further includes: the device comprises a screening module and an output module. The screening module is used for responding to the received output instruction after the test information is obtained, and screening target test information aiming at the target object from the plurality of test information according to at least one of the object identification and the test time information in the output instruction; the output module is used for outputting target test information.
In the technical scheme of the present disclosure, the collection, storage, use, processing, transmission, provision, disclosure, and other processing of the user's personal information all comply with the relevant laws and regulations and do not violate public order and good customs.
In the technical scheme of the disclosure, the authorization or consent of the user is obtained before the personal information of the user is obtained or acquired.
According to an embodiment of the present disclosure, the present disclosure also provides an electronic device including at least one processor; and a memory communicatively coupled to the at least one processor; the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the interface test method described above.
According to an embodiment of the present disclosure, the present disclosure also provides a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the above-described interface test method.
According to an embodiment of the present disclosure, there is also provided a computer program product comprising a computer program which, when executed by a processor, implements the above-described interface test method.
Fig. 6 is a block diagram of an electronic device for implementing an interface testing method of an embodiment of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 6, the apparatus 600 includes a computing unit 601 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 602 or a computer program loaded from a storage unit 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data required for the operation of the device 600 may also be stored. The computing unit 601, ROM 602, and RAM 603 are connected to each other by a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
Various components in the device 600 are connected to the I/O interface 605, including: an input unit 606 such as a keyboard, mouse, etc.; an output unit 607 such as various types of displays, speakers, and the like; a storage unit 608, such as a magnetic disk, optical disk, or the like; and a communication unit 609 such as a network card, modem, wireless communication transceiver, etc. The communication unit 609 allows the device 600 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The computing unit 601 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 601 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 601 performs the various methods and processes described above, such as the interface test method. For example, in some embodiments, the interface test method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as storage unit 608. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 600 via the ROM 602 and/or the communication unit 609. When the computer program is loaded into RAM 603 and executed by the computing unit 601, one or more steps of the interface test method described above may be performed. Alternatively, in other embodiments, the computing unit 601 may be configured to perform the interface test method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems On Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs, which may be executed and/or interpreted on a programmable system including at least one programmable processor; the programmable processor may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps recited in the present disclosure may be performed in parallel or sequentially or in a different order, provided that the desired results of the technical solutions of the present disclosure are achieved, and are not limited herein. The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.
Claims (19)
1. An interface testing method, comprising:
in response to receiving a test instruction, determining at least one target test case for an interface to be tested according to the test instruction;
determining a test task according to the at least one target test case;
determining whether the test task meets remote processing conditions according to the local residual resource information and the required resource information required by executing the test task; and
and under the condition that the test task meets the remote processing condition, calling a remote resource to process the test task so as to obtain the test information aiming at the interface to be tested.
2. The method of claim 1, wherein the number of test tasks is a plurality, the local remaining resource information includes local CPU utilization, and the demand resource information includes a total number of tasks of the test tasks; the determining whether the test task meets the remote processing condition according to the local residual resource information and the required resource information required for executing the test task comprises the following steps:
determining the maximum execution quantity according to the local CPU utilization rate and the target length of the local task queue; and
In response to detecting that the total number of tasks is greater than the maximum number of executions, determining that the test task satisfies the remote processing condition.
3. The method of claim 2, wherein the target length of the local task queue is determined by:
and determining the target length of the local task queue according to the preset waiting time, the task processing time and the local thread number.
4. The method of claim 1, wherein the local remaining resource information comprises a local remaining resource amount, the demand resource information comprises an estimated resource amount required to perform the test task; the determining whether the test task meets the remote processing condition according to the local residual resource information and the required resource information required for executing the test task comprises the following steps:
in response to detecting that the difference between the local remaining resource amount and the estimated resource amount is less than or equal to a resource amount threshold, determining that the test task satisfies the remote processing condition.
5. The method of any of claims 1-4, wherein the number of test tasks is a plurality; invoking a remote resource to process the test task includes:
Splitting a plurality of test tasks into a first task set and a second task set according to the local residual resource information and the required resource information;
executing the first task set by utilizing the local residual resources to obtain first test information corresponding to the first task set; and
and calling the remote resource to execute the second task set to obtain second test information corresponding to the second task set.
6. The method of any of claims 1 to 4, wherein the determining at least one target test case for an interface to be tested according to the test instructions comprises at least one of:
responding to the detection that the test instruction comprises configuration information, generating a test case according to the configuration information, and adding the generated test case into the at least one target test case; and
and responding to the detection that the test instruction comprises a test case identifier, searching a test case corresponding to the test case identifier from a case library according to the test case identifier, and adding the searched test case into the at least one target test case.
7. The method of claim 6, wherein the configuration information comprises at least one of: global variables, signature information, extraction expressions for parameters, assertion information.
8. The method of claim 1, further comprising:
after the test information is obtained, in response to receiving an output instruction, screening target test information for a target object from a plurality of test information according to at least one of object identification and test time information in the output instruction; and
and outputting the target test information.
9. An interface testing apparatus, comprising:
the first determining module is used for responding to the received test instruction and determining at least one target test case aiming at the interface to be tested according to the test instruction;
the second determining module is used for determining a test task according to the at least one target test case;
the third determining module is used for determining whether the test task meets the remote processing condition according to the local residual resource information and the required resource information required by executing the test task; and
and the calling module is used for calling a remote resource to process the test task under the condition that the test task meets the remote processing condition so as to obtain the test information aiming at the interface to be tested.
10. The apparatus of claim 9, wherein the number of test tasks is a plurality, the local remaining resource information comprises local CPU utilization, and the demand resource information comprises a total number of tasks of the test tasks; the third determination module includes:
The first determining submodule is used for determining the maximum execution quantity according to the local CPU utilization rate and the target length of the local task queue; and
and a second determination sub-module configured to determine that the test task satisfies the remote processing condition in response to detecting that the total number of tasks is greater than the maximum number of executions.
11. The apparatus of claim 10, wherein the third determination module further comprises:
and the third determining submodule is used for determining the target length of the local task queue according to the preset waiting time, the task processing time and the local thread number.
12. The apparatus of claim 9, wherein the local remaining resource information comprises a local remaining resource amount, the demand resource information comprises an estimated resource amount required to perform the test task; the third determination module includes:
and a fourth determination sub-module configured to determine that the test task satisfies the remote processing condition in response to detecting that a difference between the local remaining resource amount and the estimated resource amount is less than or equal to a resource amount threshold.
13. The apparatus of any of claims 9 to 12, wherein the number of test tasks is a plurality; the calling module comprises:
The splitting module is used for splitting a plurality of test tasks into a first task set and a second task set according to the local residual resource information and the required resource information;
the execution sub-module is used for executing the first task set by utilizing the local residual resources so as to obtain first test information corresponding to the first task set; and
and the calling sub-module is used for calling the remote resource to execute the second task set so as to obtain second test information corresponding to the second task set.
14. The apparatus of any of claims 9 to 12, wherein the determining the at least one target test case from information in the test instructions comprises at least one of:
responding to the detection that the test instruction comprises configuration information, generating a test case according to the configuration information, and adding the generated test case into the at least one target test case; and
and responding to the detection that the test instruction comprises a test case identifier, searching a test case corresponding to the test case identifier from a case library according to the test case identifier, and adding the searched test case into the at least one target test case.
15. The apparatus of claim 14, wherein the configuration information comprises at least one of: global variables, signature information, extraction expressions for parameters, assertion information.
16. The apparatus of claim 9, further comprising:
the screening module, which is used for, after the test information is obtained, responding to a received output instruction and screening target test information for a target object from a plurality of test information according to at least one of object identification and test time information in the output instruction; and
and the output module is used for outputting the target test information.
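The screening of claim 16 amounts to filtering stored test information by object identifier and/or test time; a minimal sketch under assumed record keys:

```python
def screen_test_information(records, object_id=None, test_time=None):
    """Screen target test information from a plurality of test
    information records, by object identifier and/or test time
    information taken from an output instruction."""
    result = records
    if object_id is not None:
        result = [r for r in result if r["object_id"] == object_id]
    if test_time is not None:
        result = [r for r in result if r["test_time"] == test_time]
    return result


records = [
    {"object_id": "svc-a", "test_time": "2023-02-08", "result": "pass"},
    {"object_id": "svc-b", "test_time": "2023-02-08", "result": "fail"},
    {"object_id": "svc-a", "test_time": "2023-02-09", "result": "pass"},
]
print(len(screen_test_information(records, object_id="svc-a")))  # 2
```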
17. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1 to 8.
18. A non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method of any one of claims 1 to 8.
19. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310140233.0A CN116126719A (en) | 2023-02-08 | 2023-02-08 | Interface testing method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116126719A true CN116126719A (en) | 2023-05-16 |
Family
ID=86308053
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310140233.0A Pending CN116126719A (en) | 2023-02-08 | 2023-02-08 | Interface testing method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116126719A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116955030A (en) * | 2023-06-28 | 2023-10-27 | 珠海妙存科技有限公司 | Test plan distribution method and system, electronic device and storage medium |
CN116955030B (en) * | 2023-06-28 | 2024-02-23 | 珠海妙存科技有限公司 | Test plan distribution method and system, electronic device and storage medium |
Similar Documents
Publication | Title |
---|---|
CN116303013A (en) | Source code analysis method, device, electronic equipment and storage medium | |
CN113778644B (en) | Task processing method, device, equipment and storage medium | |
CN116126719A (en) | Interface testing method and device, electronic equipment and storage medium | |
CN116541224A (en) | Performance test method, device, electronic equipment and readable storage medium | |
CN116009847A (en) | Code generation method, device, electronic equipment and storage medium | |
EP3832985B1 (en) | Method and apparatus for processing local hot spot, electronic device and storage medium | |
CN115017047A (en) | Test method, system, equipment and medium based on B/S architecture | |
CN117290113B (en) | Task processing method, device, system and storage medium | |
CN117112162B (en) | Data processing method, device, equipment and storage medium | |
CN118567870B (en) | Batch data processing method, device, equipment and storage medium | |
CN117076332B (en) | Test case testing method and device, electronic equipment and storage medium | |
CN116204441B (en) | Performance test method, device, equipment and storage medium of index data structure | |
CN115495312B (en) | Service request processing method and device | |
CN115858921A (en) | Model processing method, device, equipment and storage medium | |
CN118733149A (en) | Data interaction method and device, electronic equipment and storage medium | |
CN117539719A (en) | Application operation monitoring method, device, equipment and medium | |
CN115859300A (en) | Vulnerability detection method and device, electronic equipment and storage medium | |
CN115967638A (en) | Equipment simulation system, method, equipment and storage medium | |
CN116801001A (en) | Video stream processing method and device, electronic equipment and storage medium | |
CN118013896A (en) | Multi-engine-based chip diagnosis method, frame, device, equipment and storage medium | |
CN117609064A (en) | Unit test method and device, electronic equipment and storage medium | |
CN116579914A (en) | Execution method and device of graphic processor engine, electronic equipment and storage medium | |
CN116010744A (en) | Page data processing method and device, electronic equipment and readable storage medium | |
CN116991737A (en) | Software testing method, system, electronic equipment and storage medium | |
CN116341663A (en) | Extension method, device, equipment and medium of deep learning reasoning framework |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |