CN111176985B - Software interface performance testing method and device, computer equipment and storage medium - Google Patents


Info

Publication number
CN111176985B
CN111176985B (application number CN201911283900.0A)
Authority
CN
China
Prior art keywords
performance
test
historical
target interface
white list
Prior art date
Legal status
Active
Application number
CN201911283900.0A
Other languages
Chinese (zh)
Other versions
CN111176985A (en)
Inventor
曹江岭
Current Assignee
Shenzhen Ping An Medical Health Technology Service Co Ltd
Original Assignee
Shenzhen Ping An Medical Health Technology Service Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Ping An Medical Health Technology Service Co Ltd filed Critical Shenzhen Ping An Medical Health Technology Service Co Ltd
Priority to CN201911283900.0A priority Critical patent/CN111176985B/en
Publication of CN111176985A publication Critical patent/CN111176985A/en
Application granted granted Critical
Publication of CN111176985B publication Critical patent/CN111176985B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management

Abstract

The embodiment of the invention provides a method and a device for testing the performance of a software interface, a computer device, and a storage medium. In one aspect, the method comprises: establishing a performance baseline of a target interface according to historical test data of the target interface; executing a performance test on the target interface according to the performance baseline to obtain a test result; and generating early warning information of a corresponding type according to the test result. The method and the device solve the prior-art technical problem of inaccurate results in performance testing and improve test efficiency.

Description

Software interface performance testing method and device, computer equipment and storage medium
[Technical Field]
The invention relates to the field of computers, in particular to a method and a device for testing the performance of a software interface, computer equipment and a storage medium.
[Background of the Invention]
In computer software development, developed software usually requires extensive testing to assess its quality. Software testing is the process, performed manually or by automated tooling, of verifying that software meets specified requirements and of clarifying the difference between expected and actual results.
In the prior art, conventional automated interface testing mainly verifies the correctness of interface functions. However, software quality control covers not only software functionality but also software performance. Because performance-testing the many interfaces of a program consumes time and labor, performance testing is often neglected during rapid iterative version development, which increases the risk of production accidents.
In view of the above problems in the related art, no effective solution has been found so far.
[Summary of the Invention]
In view of this, embodiments of the present invention provide a method and an apparatus for testing performance of a software interface, a computer device, and a storage medium.
In one aspect, an embodiment of the present invention provides a method for testing performance of a software interface, where the method includes: establishing a performance baseline of a target interface according to historical test data of the target interface; executing performance test on the target interface according to the performance baseline to obtain a test result; and generating corresponding types of early warning information according to the test results.
Optionally, creating the performance baseline of the target interface according to the historical test data of the target interface includes: reading a historical test record of the target interface, where the historical test record includes white list entries, a white list entry being a test result that reaches a preset condition; and creating the performance baseline of the target interface according to the number of white list entries.
Optionally, creating the performance baseline of the target interface according to the number of white list entries includes one of: setting the performance baseline to positive infinity when the number of white list entries is 0; when the number of white list entries is less than 10, calculating the performance baseline according to the formula P = historical lowest performance value + preset tolerance × historical lowest performance value; and when the number of white list entries is greater than or equal to 10, performing a normal distribution mean calculation on the performance values within a predetermined range in the historical test records to obtain the performance baseline, where the predetermined range is the historical test records with performance values lower than (performance highest value + performance lowest value)/2.
Optionally, reading the historical test record of the target interface includes at least one of: reading a first historical test record generated by the target interface executing the functional test; and reading a second historical test record generated by the target interface executing the performance test.
Optionally, creating a performance baseline of the target interface according to the historical test data of the target interface includes: reading a historical test record of the target interface, wherein the historical test record comprises a white list item and a black list item, the white list item is a test result when the test result reaches a preset condition, and the black list item is a test result when the test result does not reach the preset condition; judging whether the total number of the items of the historical test records is greater than a first preset number, and/or judging whether the white list items in the historical test records are greater than a second preset number; when the total number of the items of the historical test records is greater than a first preset number, and/or when the white list items in the historical test records are greater than a second preset number, establishing a performance baseline of the target interface according to all the test records before the current test; and when the total number of the entries of the historical test record is less than or equal to a first preset number, and/or when the white list entries in the historical test record are less than or equal to a second preset number, setting the performance baseline of the target interface as a fixed value.
Optionally, generating the corresponding type of early warning information according to the test result includes: calculating the early warning level to which the test result belongs, where each early warning level corresponds to a numerical range based on the performance baseline; and generating early warning information corresponding to the early warning level, where the early warning information carries tag information indicating the early warning level.
Optionally, after generating the corresponding type of early warning information according to the test result, the method further includes: judging the early warning type of the early warning information, wherein the early warning type comprises a first type and a second type, the first type is used for indicating that the interface performance has no defects, and the second type is used for indicating that the interface performance has defects; when the early warning type is the first type, recording the test result as a white list item in a test record; when the early warning type is the second type, recording the test result as a blacklist item in a test record; and counting all the white list entries and the black list entries of the target interface, and generating a visual report on a user interface.
In another aspect, an embodiment of the present invention provides a performance testing apparatus for a software interface, where the apparatus includes: the system comprises a creating module, a judging module and a judging module, wherein the creating module is used for creating a performance baseline of a target interface according to historical test data of the target interface; the test module is used for executing performance test on the target interface according to the performance baseline to obtain a test result; and the generating module is used for generating the early warning information of the corresponding type according to the test result.
Optionally, the creating module includes: the first reading unit is used for reading a historical test record of the target interface, wherein the historical test record comprises a white list item, and the white list item is a test result when the test result reaches a preset condition; a first creating unit, configured to create a performance baseline of the target interface according to the number of the white list entries.
Optionally, the first creating unit includes one of: a first creating subunit, configured to set the performance baseline to positive infinity when the number of white list entries is 0; a second creating subunit, configured to, when the number of white list entries is less than 10, calculate the performance baseline according to the formula P = historical lowest performance value + preset tolerance × historical lowest performance value; and a third creating subunit, configured to, when the number of white list entries is greater than or equal to 10, perform a normal distribution mean calculation on the performance values within a predetermined range in the historical test records to obtain the performance baseline, where the predetermined range is the historical test records with performance values lower than (performance highest value + performance lowest value)/2.
Optionally, the first reading unit includes at least one of: the first reading subunit is used for reading a first historical test record generated by the target interface executing the functional test; and the second reading subunit is used for reading a second historical test record generated by the target interface executing the performance test.
Optionally, the creating module includes: the second reading unit is used for reading a historical test record of the target interface, wherein the historical test record comprises a white list item and a black list item, the white list item is a test result when the test result reaches a preset condition, and the black list item is a test result when the test result does not reach the preset condition; the judging unit is used for judging whether the total number of the items of the historical test record is greater than a first preset number and/or judging whether the white list items in the historical test record are greater than a second preset number; a second creating unit, configured to create a performance baseline of the target interface according to all test records before the current test when a total number of entries in the historical test record is greater than a first preset number, and/or white list entries in the historical test record are greater than a second preset number; and when the total number of the entries of the historical test record is less than or equal to a first preset number, and/or when the white list entries in the historical test record are less than or equal to a second preset number, setting the performance baseline of the target interface as a fixed value.
Optionally, the generating module includes: the calculation unit is used for calculating the early warning levels to which the test results belong, wherein each early warning level corresponds to a numerical range based on a performance baseline; and the generating unit is used for generating early warning information corresponding to the early warning level, wherein the early warning information carries label information used for indicating the early warning level.
Optionally, the apparatus further comprises: the judging module is used for judging the early warning type of the early warning information after the generating module generates the early warning information of the corresponding type according to the test result, wherein the early warning type comprises a first type and a second type, the first type is used for indicating that the interface performance has no defects, and the second type is used for indicating that the interface performance has defects; the recording module is used for recording the test result as a white list item in a test record when the early warning type is the first type; when the early warning type is the second type, recording the test result as a blacklist item in a test record; and the display module is used for counting all the white list entries and the black list entries of the target interface and generating a visual report on a user interface.
According to a further embodiment of the present invention, there is also provided a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
According to yet another embodiment of the present invention, there is also provided an electronic device, including a memory in which a computer program is stored and a processor configured to execute the computer program to perform the steps in any of the above method embodiments.
According to the invention, a performance baseline of the target interface is established according to the historical test data of the target interface, a performance test is then performed on the target interface according to the performance baseline to obtain a test result, and finally early warning information of a corresponding type is generated according to the test result. Since the historical test data of the target interface iterates correspondingly as the number of tests increases, the performance baseline can be dynamically adjusted in an iterative manner, making the test result more accurate. This solves the prior-art technical problem of inaccurate results in performance testing and improves test efficiency.
[Description of the Drawings]
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed for the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a block diagram of a hardware architecture of a performance testing computer for a software interface according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method for performance testing of a software interface according to an embodiment of the invention;
FIG. 3 is a flow diagram illustrating the creation of a performance baseline based on dynamic values, in accordance with an embodiment of the present invention;
FIG. 4 is a schematic illustration of test response times for an embodiment of the present invention;
fig. 5 is a block diagram of a performance testing apparatus for a software interface according to an embodiment of the present invention.
[Detailed Description of the Embodiments]
The invention will be described in detail hereinafter with reference to the accompanying drawings in conjunction with embodiments. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
Example 1
The method provided by the first embodiment of the present application may be executed on a mobile terminal, a test terminal, a computer, or a similar computing device. Taking the software interface running on a computer as an example, fig. 1 is a hardware structure block diagram of a performance testing computer for a software interface according to an embodiment of the present invention. As shown in fig. 1, computer 10 may include one or more (only one shown in fig. 1) processors 102 (processor 102 may include, but is not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA) and a memory 104 for storing data, and optionally may also include a transmission device 106 for communication functions and an input-output device 108. It will be appreciated by those of ordinary skill in the art that the configuration shown in FIG. 1 is illustrative only and is not intended to limit the configuration of the computer described above. For example, computer 10 may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
The memory 104 may be used to store computer programs, for example, software programs and modules of application software, such as a computer program corresponding to the performance testing method of the software interface in the embodiment of the present invention, and the processor 102 executes various functional applications and data processing by running the computer programs stored in the memory 104, so as to implement the above-mentioned method. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, memory 104 may further include memory located remotely from processor 102, which may be connected to computer 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used for receiving or transmitting data via a network. Specific examples of such networks may include wireless networks provided by the communications provider of computer 10. In one example, the transmission device 106 includes a Network adapter (NIC), which can be connected to other Network devices through a base station so as to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used for communicating with the internet in a wireless manner.
In this embodiment, a performance testing method for a software interface is provided, and fig. 2 is a flowchart of the performance testing method for a software interface according to the embodiment of the present invention, as shown in fig. 2, the flowchart includes the following steps:
step S202, establishing a performance baseline of a target interface according to historical test data of the target interface;
the performance baseline of this embodiment is a metric for measuring the performance of the software interface, and as the number of times the software interface is operated increases, the performance of the software interface is degraded, and the performance baseline also changes.
Step S204, executing performance test on the target interface according to the performance baseline to obtain a test result;
and step S206, generating early warning information of corresponding types according to the test result.
According to the scheme of this embodiment, a performance baseline of the target interface is established according to the historical test data of the target interface, a performance test is then performed on the target interface according to the performance baseline to obtain a test result, and finally early warning information of a corresponding type is generated according to the test result. Since the historical test data of the target interface iterates correspondingly as the number of tests increases, the performance baseline can be dynamically adjusted in an iterative manner, making the test result more accurate. This solves the prior-art technical problem of inaccurate results in performance testing and improves test efficiency.
In this embodiment, the performance baseline of an interface can be created in two ways, corresponding to a fixed value and a dynamic value respectively. The two schemes can be switched arbitrarily or configured according to priority.
The automated interface tests are executed periodically or triggered manually. After execution, if a fixed performance baseline exists, it is taken as the standard; if not, the dynamic value is taken as the performance baseline. Different early warnings are then sent according to the warning-level rules. This embodiment grades test results into levels: L1: negligible; L2: optimization suggested; L3: severe performance problem; L4: may cause downtime. These correspond respectively to the following ranges relative to the baseline: less than or equal to 150%, less than or equal to 250%, less than or equal to 400%, and greater than 400%.
In an implementation manner of this embodiment, in the scheme of creating the performance baseline according to the fixed value, creating the performance baseline of the target interface according to the historical test data of the target interface includes:
s11, reading a historical test record of the target interface, wherein the historical test record comprises a white list item and a black list item, the white list item is a test result when the test result reaches a preset condition, and the black list item is a test result when the test result does not reach the preset condition;
in this embodiment, the reading of the historical test record of the target interface may be, but is not limited to: reading a first historical test record generated by the target interface executing function test; and reading a second historical test record generated by the target interface executing the performance test.
S12, judging whether the total number of the items of the historical test record is larger than a first preset number, and/or judging whether the white list items in the historical test record are larger than a second preset number;
s13, when the total number of the items of the historical test records is larger than a first preset number and/or white list items in the historical test records are larger than a second preset number, establishing a performance baseline of the target interface according to all the test records before the current test; and when the total number of the entries of the historical test record is less than or equal to a first preset number, and/or when the white list entries in the historical test record are less than or equal to a second preset number, setting the performance baseline of the target interface as a fixed value.
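The decision in S12 and S13 can be sketched in code as follows. This is a hedged illustration only: the thresholds, the fixed fallback value of 200 ms, and all function names are assumptions, not values fixed by this embodiment.

```python
# Hedged sketch of the S12-S13 decision: build the baseline from history
# when enough records exist, otherwise fall back to a fixed value.
# Thresholds and the fixed default are illustrative assumptions.
def choose_baseline(history, whitelist, fixed_default=200.0,
                    min_total=10, min_white=5):
    if len(history) > min_total or len(whitelist) > min_white:
        # enough data: derive the baseline from all prior test records
        return baseline_from_history(whitelist)
    # too little data: use the fixed performance baseline
    return fixed_default

def baseline_from_history(whitelist_times):
    # stand-in for the dynamic-value computation described later;
    # here simply the lowest historical value plus a 20% tolerance
    return min(whitelist_times) * 1.2 if whitelist_times else float("inf")
```

With only three historical records, `choose_baseline([1] * 3, [100])` falls back to the fixed default; with twenty, the dynamic value is used.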
Certain software interfaces may have performance requirements specified during the design phase, for example: a single-request response time of N milliseconds, an interval of M seconds between consecutive responses, a restart recovery time of K seconds, interface throughput, interface concurrency, and so on. For interfaces of this type, a fixed performance baseline can be set.
In another implementation of this embodiment, in the scheme of creating a performance baseline from dynamic values, if no fixed value exists, the dynamic value of the performance baseline is calculated based on a certain amount of historical result data. A combination of fixed and dynamic values may also be used.
Fig. 3 is a schematic flowchart of an embodiment of creating a performance baseline according to a dynamic value, where creating a performance baseline of a target interface according to historical test data of the target interface includes:
s302, reading a historical test record of the target interface, wherein the historical test record comprises a white list item and a black list item, the white list item is a test result when the test result reaches a preset condition, and the black list item is a test result when the test result does not reach the preset condition;
s304, establishing a performance baseline of the target interface according to the quantity of the white list entries.
In one embodiment, the performance baseline of the target interface created according to the number of white list entries may be, but is not limited to:
setting the performance baseline to be positive infinity when the number of white list entries is 0;
when the number of white list entries is less than 10, calculating the performance baseline according to the formula P = historical lowest performance value + preset tolerance × historical lowest performance value;
and when the number of the white list entries is greater than or equal to 10, performing normal distribution mean calculation on the performance values in a preset range in the historical test records to obtain the performance baseline, wherein the preset range is the historical test records with the performance values lower than (performance highest value + performance lowest value)/2.
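These three rules can be sketched in code as follows. The function name, the 20% default tolerance, and the use of a plain arithmetic mean for the "normal distribution mean calculation" are assumptions made for illustration.

```python
from statistics import mean

def dynamic_baseline(whitelist_times, tolerance=0.2):
    """Dynamic performance baseline from white-list response times (ms)."""
    if not whitelist_times:
        return float("inf")                  # 0 entries: positive infinity
    lowest = min(whitelist_times)
    if len(whitelist_times) < 10:
        # P = historical lowest value + tolerance * historical lowest value
        return lowest + tolerance * lowest
    # 10 or more entries: average the records below (highest + lowest) / 2
    cutoff = (max(whitelist_times) + lowest) / 2
    vals = [t for t in whitelist_times if t < cutoff] or [lowest]
    return mean(vals)
```

For example, with three white-list times of 100, 120, and 150 ms and the assumed 20% tolerance, the baseline is 100 + 0.2 × 100 = 120 ms.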
In the scheme of this embodiment, if a severe performance problem at level L3 or above occurs during an execution of an interface, the execution result is added to the blacklist. The historical data of the interface with the blacklist data removed constitutes the white-list data.
The historical data is the accumulation of the results of each interface test; each record is, for example, the interface response time measured during the test. The historical data of this scheme may also include data from the functional testing of the software interface; generally, a functional test records whether the interface implements the function, and the result is simply yes or no regardless of how well it performs.
The lowest and highest values are calculated from the result (interface response time) of each interface test; they are the minimum and maximum values in the historical records.
The tolerance in this scheme can be customized, for example 20%, 50%, 80%, or 100%, and is set according to the user's requirements for the baseline. The smaller the tolerance, the stricter the baseline requirement and the stricter the performance test.
In this embodiment, when the total amount of historical data, or the number of white-list or black-list entries in it, is below a certain amount, the dynamic value has limited reference value. In that case the method can fall back to the fixed value for judgment: if the result exceeds the fixed value, an early warning is generated; otherwise no warning is issued and historical data continues to accumulate.
When performing a performance test, the target interface may be selected by the user, or may be an interface screened as having passed the functional test. An interface can be used normally only if it passes both the functional test and the performance test; an interface that fails the functional test will certainly fail the performance test.
Taking response time as an example, the test procedure of the performance test is as follows: fill the performance baseline into a test template to generate a test case, trigger the test script to execute the performance test, calculate the performance data of the interface, and compare it with the performance baseline to obtain the test result. In an HTTP request test of the interface, the request is sent over the network to the web server for processing; if database access is needed, it is forwarded over the network to the database, which processes it and returns a value to the web server; finally, the web server returns the result data to the client over the network. The performance data is calculated as: response time = (N1 + N2 + N3 + N4) + (A1 + A2 + A3), that is, total network time N plus total application processing time A. FIG. 4 is a schematic diagram of test response time according to an embodiment of the present invention.
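The response-time formula above can be written as a small helper; the sample timings used below are invented for illustration, not taken from the embodiment.

```python
def response_time(network_ms, app_ms):
    """response time = (N1 + N2 + N3 + N4) + (A1 + A2 + A3):
    total network time plus total application processing time, in ms."""
    return sum(network_ms) + sum(app_ms)

# four network hops (client->web, web->db, db->web, web->client)
# plus three processing stages (web, db, web)
total = response_time([10, 5, 5, 10], [20, 30, 10])  # 90 ms
```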
Response time is the main parameter of the performance test, but other parameters such as throughput (e.g., concurrency, transactions processed per second) may also be used. Taking concurrency as an example, the test procedure of the performance test is: fill the performance baseline into a test template to generate a test case, trigger the test script to execute the performance test, calculate the performance data of the interface, and compare it with the performance baseline to obtain the test result. The interface receives many HTTP request tests within the same period; after successful processing, the interface generates response messages, which are fed back over the network to the web server for processing. The performance data is calculated by counting the number of response messages received within a predetermined time.
Optionally, generating the corresponding type of early warning information according to the test result includes: calculating the early warning level to which the test result belongs, where each early warning level corresponds to a numerical range based on the performance baseline; and generating early warning information corresponding to the early warning level, where the early warning information carries label information indicating the early warning level.
In one example, the early warning levels include four levels, L1 to L4, where L1: negligible; L2: optimization suggested; L3: severe performance problem; L4: may cause downtime. The numerical ranges corresponding to the four levels are: less than or equal to 150% of the performance baseline, less than or equal to 250% of the performance baseline, less than or equal to 400% of the performance baseline, and greater than 400% of the performance baseline.
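The mapping from a measured value to the L1–L4 levels above can be sketched as follows; the function name and string labels are assumptions made for this illustration.

```python
# Map a measured performance value to its warning level relative to the
# baseline, per the percentage thresholds described above.
def warning_level(measured, baseline):
    ratio = measured / baseline
    if ratio <= 1.5:
        return "L1"  # negligible
    if ratio <= 2.5:
        return "L2"  # optimization suggested
    if ratio <= 4.0:
        return "L3"  # severe performance problem
    return "L4"      # may cause downtime
```

Note that with a baseline of positive infinity (the zero-white-list case), any measurement maps to L1, so warnings only begin once history accumulates.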
And meanwhile, adding the current test result into a historical white list or a historical black list so as to iteratively calculate the next performance baseline.
Optionally, after generating the corresponding type of early warning information according to the test result, the method further includes: judging the early warning type of the early warning information, wherein the early warning type includes a first type and a second type, the first type indicating that the interface performance has no defect and the second type indicating that the interface performance has a defect; when the early warning type is the first type, recording the test result as a white list entry in the test record; when the early warning type is the second type, recording the test result as a blacklist entry in the test record; and counting all the white list entries and blacklist entries of the target interface, and generating a visual report on a user interface. Test results at level L3 or above are recorded in the blacklist; otherwise, the result is recorded in the white list. The interface's historical blacklist and white list data are counted together and output as a visual report, in which the blacklist data are marked. For a single interface, the visual report is generated along a time axis of the test times.
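The recording rule — L3 and above to the blacklist, everything else to the white list — might look like the following sketch. The function name and the dictionary layout of the history are assumptions for illustration:

```python
def record_result(history, level, result):
    """Append a test result to the black or white list:
    levels L3 and L4 (defect) go to the blacklist, lower levels to the white list."""
    key = "black" if level in ("L3", "L4") else "white"
    history.setdefault(key, []).append(result)
    return history

history = {}
record_result(history, "L2", 180)  # no defect -> white list
record_result(history, "L4", 950)  # defect -> blacklist
print(history)  # {'white': [180], 'black': [950]}
```

A report generator would then read both lists and mark the blacklist data along the test-time axis.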
In this embodiment, performance baselines are constructed according to requirements, test scenarios, test progress, and the like. A basic interface performance test is carried out while the interface tests are executed automatically; interfaces with performance test requirements, or interfaces that have passed functional tests, can be screened as target interfaces; and the performance baseline is adjusted dynamically and iteratively, making the test result more accurate. The visual chart allows performance problems in an interface's history to be checked at any time.
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
Example 2
In this embodiment, a performance testing apparatus for a software interface is further provided, where the apparatus is used to implement the foregoing embodiments and preferred embodiments, and details of the description already given are not repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware or a combination of software and hardware is also possible and contemplated.
Fig. 5 is a block diagram of a performance testing apparatus for a software interface according to an embodiment of the present invention, as shown in fig. 5, the apparatus including:
a creation module 50, configured to create a performance baseline of a target interface according to historical test data of the target interface;
the test module 52 is configured to perform a performance test on the target interface according to the performance baseline to obtain a test result;
and the generating module 54 is configured to generate the early warning information of the corresponding type according to the test result.
Optionally, the creating module includes: the first reading unit is used for reading a historical test record of the target interface, wherein the historical test record comprises a white list item and a black list item, the white list item is a test result when the test result reaches a preset condition, and the black list item is a test result when the test result does not reach the preset condition; a first creating unit, configured to create a performance baseline of the target interface according to the number of the white list entries.
Optionally, the first creating unit includes one of: a first creating subunit, configured to set the performance baseline to positive infinity when the number of white list entries is 0; a second creating subunit, configured to calculate, when the number of white list entries is less than 10, the performance baseline according to the formula P = historical minimum performance value + preset tolerance × historical minimum performance value; and a third creating subunit, configured to, when the number of white list entries is greater than or equal to 10, perform normal distribution mean calculation on the performance values in a predetermined range in the historical test records to obtain the performance baseline, where the predetermined range covers the historical test records with a performance value lower than (performance maximum + performance minimum)/2.
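The three cases handled by the creating subunits can be sketched in one function. The names are assumptions, `statistics.mean` over the retained range stands in for the "normal distribution mean calculation", and the 20% tolerance is an assumed preset value:

```python
import statistics

def performance_baseline(white_values, tolerance=0.2):
    """Derive the baseline from white-list performance values:
    no history -> positive infinity; fewer than 10 entries -> minimum plus
    tolerance; otherwise the mean of values below the midpoint of the range."""
    n = len(white_values)
    if n == 0:
        return float("inf")
    if n < 10:
        lowest = min(white_values)
        return lowest + tolerance * lowest  # P = min + tolerance * min
    midpoint = (max(white_values) + min(white_values)) / 2
    return statistics.mean(v for v in white_values if v < midpoint)

print(performance_baseline([100], tolerance=0.2))  # -> 120.0
```

Filtering out values above the midpoint before averaging discards outliers from the tail of the distribution, which is the apparent intent of restricting the mean to the predetermined range.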
Optionally, the first reading unit includes at least one of: the first reading subunit is used for reading a first historical test record generated by the target interface executing the functional test; and the second reading subunit is used for reading a second historical test record generated by the target interface executing the performance test.
Optionally, the creating module includes: the second reading unit is used for reading a historical test record of the target interface, wherein the historical test record comprises a white list item and a black list item, the white list item is a test result when the test result reaches a preset condition, and the black list item is a test result when the test result does not reach the preset condition; the judging unit is used for judging whether the total number of the items of the historical test record is greater than a first preset number and/or judging whether the white list items in the historical test record are greater than a second preset number; a second creating unit, configured to create a performance baseline of the target interface according to all test records before the current test when a total number of entries in the historical test record is greater than a first preset number and/or white list entries in the historical test record are greater than a second preset number; and when the total number of the entries in the historical test record is less than or equal to a first preset number, and/or when the white list entries in the historical test record are less than or equal to a second preset number, setting the performance baseline of the target interface as a fixed value.
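The threshold decision in this variant might be sketched as follows. All names and threshold values are assumptions; `min(...)` stands in for the full history-based baseline calculation, and this sketch takes the "and" branch of the text's "and/or":

```python
def choose_baseline(records, min_total=5, min_white=3, fixed=500.0):
    """Use a history-derived baseline when there is enough history,
    otherwise fall back to a fixed default value."""
    total = len(records)
    white = sum(1 for r in records if r["white"])
    if total > min_total and white > min_white:
        # Stand-in for the full history-based baseline calculation.
        return min(r["value"] for r in records)
    return fixed
```

With too little history (few entries or few white-list entries), a fixed value avoids deriving a baseline from unrepresentative data.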
Optionally, the generating module includes: the calculation unit is used for calculating the early warning levels to which the test results belong, wherein each early warning level corresponds to a numerical range based on a performance baseline; and the generating unit is used for generating early warning information corresponding to the early warning level, wherein the early warning information carries label information used for indicating the early warning level.
Optionally, the apparatus further comprises: the judging module is used for judging the early warning type of the early warning information after the generating module generates the early warning information of the corresponding type according to the test result, wherein the early warning type comprises a first type and a second type, the first type is used for indicating that the interface performance has no defects, and the second type is used for indicating that the interface performance has defects; the recording module is used for recording the test result as a white list item in a test record when the early warning type is the first type; when the early warning type is the second type, recording the test result as a blacklist item in a test record; and the display module is used for counting all the white list entries and the black list entries of the target interface and generating a visual report on a user interface.
It should be noted that, the above modules may be implemented by software or hardware, and for the latter, the following may be implemented, but not limited to: the modules are all positioned in the same processor; alternatively, the modules are located in different processors in any combination.
Example 3
In the embodiments provided in the present invention, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one type of logical functional division, and other divisions may be realized in practice, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a Processor (Processor) to execute some steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program codes.
Embodiments of the present invention also provide a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
Alternatively, in the present embodiment, the storage medium may be configured to store a computer program for executing the steps of:
s1, establishing a performance baseline of a target interface according to historical test data of the target interface;
s2, performing performance test on the target interface according to the performance baseline to obtain a test result;
and S3, generating early warning information of corresponding types according to the test result.
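Taken together, steps S1 to S3 form a small pipeline; a minimal sketch under assumed names, defaults, and the single-band defect rule from the L4 example:

```python
def run_performance_test(history, measure, tolerance=0.2, fixed=500.0):
    """S1: derive a baseline from historical white-list values (fixed default
    when empty); S2: run the measurement; S3: classify the result."""
    # S1: baseline from history
    baseline = (min(history) + tolerance * min(history)) if history else fixed
    # S2: execute the performance test and obtain a performance value
    measured = measure()
    # S3: generate warning information of the corresponding type (>400% band)
    level = "defect" if measured > 4 * baseline else "no defect"
    return baseline, measured, level

print(run_performance_test([100, 120], lambda: 130))  # -> (120.0, 130, 'no defect')
```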
Optionally, in this embodiment, the storage medium may include, but is not limited to: various media capable of storing computer programs, such as a usb disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Embodiments of the present invention further provide an electronic device, comprising a memory in which a computer program is stored and a processor configured to execute the computer program to perform the steps in any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
s1, establishing a performance baseline of a target interface according to historical test data of the target interface;
s2, performing performance test on the target interface according to the performance baseline to obtain a test result;
and S3, generating early warning information of corresponding types according to the test result.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and should not be taken as limiting the scope of the present invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (8)

1. A performance testing method of a software interface is characterized by comprising the following steps:
creating a performance baseline for a target interface according to historical test data for the target interface, wherein the historical test records include white list entries and black list entries, including: reading a historical test record of the target interface, wherein the historical test record comprises a white list item, and the white list item is a test result when the test result reaches a preset condition; creating a performance baseline for the target interface based on the number of whitelist entries, including one of: setting the performance baseline to be positive infinity when the number of white list entries is 0; when the number of white list entries is less than 10, calculating a performance baseline according to the following formula: P = historical lowest performance value + preset tolerance × historical lowest performance value; when the number of the white list entries is larger than or equal to 10, performing normal distribution mean calculation on the performance values in a preset range in the historical test records to obtain the performance baseline, wherein the historical test records with the performance values lower than (the performance highest value + the performance lowest value)/2 are in the preset range;
executing performance test on the target interface according to the performance baseline to obtain a test result;
and generating corresponding types of early warning information according to the test results.
2. The method of claim 1, wherein reading the historical test record of the target interface comprises at least one of:
reading a first historical test record generated by the target interface executing the functional test;
and reading a second historical test record generated by the target interface executing the performance test.
3. The method of claim 1, wherein creating a performance baseline for a target interface from historical test data for the target interface comprises:
reading a historical test record of the target interface, wherein the historical test record comprises a white list item and a black list item, the white list item is a test result when the test result reaches a preset condition, and the black list item is a test result when the test result does not reach the preset condition;
judging whether the total number of the items of the historical test records is greater than a first preset number, and/or judging whether the white list items in the historical test records are greater than a second preset number;
when the total number of the items of the historical test records is greater than a first preset number, and/or when the white list items in the historical test records are greater than a second preset number, establishing a performance baseline of the target interface according to all the test records before the current test; and when the total number of the entries in the historical test record is less than or equal to a first preset number, and/or when the white list entries in the historical test record are less than or equal to a second preset number, setting the performance baseline of the target interface as a fixed value.
4. The method of claim 1, wherein generating corresponding types of early warning information from the test results comprises:
calculating the early warning levels to which the test results belong, wherein each early warning level corresponds to a numerical range based on a performance baseline;
and generating early warning information corresponding to the early warning level, wherein the early warning information carries label information used for indicating the early warning level.
5. The method of claim 1, wherein after generating the corresponding type of early warning information from the test results, the method further comprises:
judging the early warning type of the early warning information, wherein the early warning type comprises a first type and a second type, the first type is used for indicating that the interface performance has no defect, and the second type is used for indicating that the interface performance has defect;
when the early warning type is the first type, recording the test result as a white list item in a test record; when the early warning type is the second type, recording the test result as a blacklist item in a test record;
and counting all white list entries and black list entries of the target interface, and generating a visual report on a user interface.
6. A performance testing apparatus for a software interface, the apparatus comprising:
a creating module, configured to create a performance baseline of a target interface according to historical test data of the target interface, where the historical test record includes a white list entry and a black list entry, and includes: reading a historical test record of the target interface, wherein the historical test record comprises a white list item, and the white list item is a test result when the test result reaches a preset condition; creating a performance baseline for the target interface based on the number of whitelist entries, including one of: setting the performance baseline to be positive infinity when the number of white list entries is 0; when the number of white list entries is less than 10, calculating a performance baseline according to the following formula: p = historical minimum performance value + preset tolerance × historical minimum performance value; when the number of the white list entries is greater than or equal to 10, performing normal distribution mean calculation on the performance values in a preset range in the historical test records to obtain the performance baseline, wherein the preset range is the historical test record with the performance value lower than (performance highest value + performance lowest value)/2;
the test module is used for executing performance test on the target interface according to the performance baseline to obtain a test result;
and the generating module is used for generating the early warning information of the corresponding type according to the test result.
7. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 5 when executing the computer program.
8. A computer storage medium on which a computer program is stored, characterized in that the computer program, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 5.
CN201911283900.0A 2019-12-13 2019-12-13 Software interface performance testing method and device, computer equipment and storage medium Active CN111176985B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911283900.0A CN111176985B (en) 2019-12-13 2019-12-13 Software interface performance testing method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911283900.0A CN111176985B (en) 2019-12-13 2019-12-13 Software interface performance testing method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111176985A CN111176985A (en) 2020-05-19
CN111176985B true CN111176985B (en) 2023-02-03

Family

ID=70650359

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911283900.0A Active CN111176985B (en) 2019-12-13 2019-12-13 Software interface performance testing method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111176985B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113923703A (en) * 2020-07-08 2022-01-11 中国移动通信有限公司研究院 State detection method, device and storage medium
CN115134164B (en) * 2022-07-18 2024-02-23 深信服科技股份有限公司 Uploading behavior detection method, system, equipment and computer storage medium

Citations (3)

Publication number Priority date Publication date Assignee Title
CN103139007A (en) * 2011-12-05 2013-06-05 阿里巴巴集团控股有限公司 Method and system for detecting application server performance
CN109783385A (en) * 2019-01-14 2019-05-21 中国银行股份有限公司 A kind of product test method and apparatus
CN110399302A (en) * 2019-07-26 2019-11-01 中国工商银行股份有限公司 Risk Identification Method, device, electronic equipment and the medium of Software Testing Project

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
US9645916B2 (en) * 2014-05-30 2017-05-09 Apple Inc. Performance testing for blocks of code
US10157116B2 (en) * 2016-11-28 2018-12-18 Google Llc Window deviation analyzer
CN108959042A (en) * 2017-05-19 2018-12-07 北京神州泰岳软件股份有限公司 Base-line data calculation method in a kind of baseline management

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN103139007A (en) * 2011-12-05 2013-06-05 阿里巴巴集团控股有限公司 Method and system for detecting application server performance
CN109783385A (en) * 2019-01-14 2019-05-21 中国银行股份有限公司 A kind of product test method and apparatus
CN110399302A (en) * 2019-07-26 2019-11-01 中国工商银行股份有限公司 Risk Identification Method, device, electronic equipment and the medium of Software Testing Project

Also Published As

Publication number Publication date
CN111176985A (en) 2020-05-19

Similar Documents

Publication Publication Date Title
CN111178760B (en) Risk monitoring method, risk monitoring device, terminal equipment and computer readable storage medium
CN111176985B (en) Software interface performance testing method and device, computer equipment and storage medium
CN111240876A (en) Fault positioning method and device for microservice, storage medium and terminal
CN111181800A (en) Test data processing method and device, electronic equipment and storage medium
CN112835802A (en) Equipment testing method, device, equipment and storage medium
CN112948224A (en) Data processing method, device, terminal and storage medium
CN112437462B (en) Quality determination method and device of WIFI module, storage medium and electronic device
CN111506455B (en) Checking method and device for service release result
CN109710285B (en) Equipment upgrading method and system
CN111884932B (en) Link determining method, device, equipment and computer readable storage medium
CN106294104B (en) Test case execution method and mobile terminal
CN115437903A (en) Interface test method, device, apparatus, storage medium, and program
CN113656046A (en) Application deployment method and device
CN109102083B (en) Quantity configuration method of maintenance equipment and related equipment
CN112527276A (en) Data updating method and device in visual programming tool and terminal equipment
CN105630503A (en) Selecting method for mobile device application and development based on operating records of user
CN112148621A (en) Test method and device and electronic equipment
CN116991545B (en) Virtual machine deployment position determining method and device
CN113452533A (en) Charging self-inspection and self-healing method and device, computer equipment and storage medium
CN103873580A (en) Method and device for analyzing video webpages
EP4336883A1 (en) Modeling method, network element data processing method and apparatus, electronic device, and medium
CN111885159B (en) Data acquisition method and device, electronic equipment and storage medium
CN104065537A (en) Application external measurement method, external measurement device management server and application external measurement system
CN111105059B (en) Attribute conflict discovery method, device and computer-readable storage medium
CN111444110A (en) Data analysis method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220601

Address after: 518000 China Aviation Center 2901, No. 1018, Huafu Road, Huahang community, Huaqiang North Street, Futian District, Shenzhen, Guangdong Province

Applicant after: Shenzhen Ping An medical and Health Technology Service Co.,Ltd.

Address before: Room 12G, Area H, 666 Beijing East Road, Huangpu District, Shanghai 200001

Applicant before: PING AN MEDICAL AND HEALTHCARE MANAGEMENT Co.,Ltd.

GR01 Patent grant