CN110647453A - Application performance comparison test method, system, equipment and computer readable storage medium - Google Patents

Publication number
CN110647453A
Authority
CN
China
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910880849.5A
Other languages
Chinese (zh)
Inventor
杨良志
白琳
汪志新
贾亮
彭明华
李艳晓
Current Assignee
Caixun Technology Co Ltd
Original Assignee
Caixun Technology Co Ltd
Application filed by Caixun Technology Co Ltd
Priority to CN201910880849.5A
Publication of CN110647453A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/30 Monitoring
    • G06F 11/34 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3409 Recording or statistical evaluation of computer activity for performance assessment
    • G06F 11/3466 Performance evaluation by tracing or monitoring
    • G06F 11/3476 Data logging


Abstract

The embodiment of the invention discloses an application performance comparison test method, system, device, and storage medium. The method comprises the following steps: acquiring installation information and detection information of an application to be tested; installing the application to be tested on a test terminal according to the installation information and the detection information; controlling the test terminal to run the application to be tested according to the detection information and obtaining a test result; and generating a comparative analysis report according to the test result. According to the embodiment of the invention, a performance comparison test of the application can be completed automatically once the installation information and detection information are submitted, which avoids the time and labor of manual testing; automatically generating the comparative analysis report from the test result also avoids the data processing of manual analysis, making the whole process more efficient and faster.

Description

Application performance comparison test method, system, equipment and computer readable storage medium
Technical Field
The present invention relates to the field of application testing technologies, and in particular, to a method, a system, a device, and a computer-readable storage medium for comparing and testing application performance.
Background
With the development of mobile communication technology, mobile phone applications now cover a very wide range of needs, and applications of the same type emerge endlessly. Which application a user chooses depends largely on user experience, and good user experience is closely tied to performance, so mobile developers must continually optimize their applications' performance in order to stand out in fierce market competition. This creates a need for a sophisticated performance testing tool. However, current testing systems can only execute one application under test on one terminal at a time. If the background performance data of several competing products running on several terminals must be compared simultaneously, multiple test runs are required, the consistency of the environment is difficult to guarantee, and the performance data must be compared manually after testing. When many terminals or competing products are compared, the whole process costs testers a great deal of time.
Disclosure of Invention
The embodiment of the invention provides an application performance comparison test method, system, and device, and a computer storage medium, which are used to realize background performance comparison testing of multiple competing applications.
In a first aspect, an embodiment of the present invention provides an application performance comparison test method, including:
acquiring installation information and detection information of an application to be tested;
installing the application to be tested to a test terminal according to the installation information and the detection information of the application to be tested;
controlling the test terminal to run the application to be tested according to the detection information and obtaining a test result;
and generating a comparative analysis report according to the test result.
In a second aspect, an embodiment of the present invention further provides an application performance comparison test system, where the system includes:
the acquisition module is used for acquiring the installation information and the detection information of the application to be tested;
the installation module is used for installing the application to be tested to the test terminal according to the installation information and the detection information of the application to be tested;
the test module is used for controlling the test terminal to run the application to be tested according to the detection information and obtaining a test result;
and the analysis module is used for generating a comparative analysis report according to the test result.
In a third aspect, an embodiment of the present invention further provides an application performance comparison testing apparatus, including a memory and a processor, where the memory stores a computer program that can be run by the processor, and the processor executes the computer program to implement the application performance comparison testing method.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium in which a computer program is stored; the computer program includes program instructions, and when the program instructions are executed, the application performance comparison test method described above is implemented.
According to the technical scheme provided by the embodiment of the invention, the application to be tested is installed on the corresponding test terminal according to its installation information and detection information, run to obtain a corresponding test result, and a comparative analysis report is generated from that result. The performance comparison test of the application can thus be completed automatically once the installation information and detection information are submitted, avoiding the time and labor of manual testing; automatically generating the comparative analysis report from the test result also avoids the data processing of manual analysis, making the whole process more efficient and faster.
Drawings
FIG. 1 is a flow chart of a comparative testing method for application performance according to a first embodiment of the present invention;
FIG. 2 is a flowchart of an application performance comparison test method in the second embodiment of the present invention;
FIG. 3 is a sub-flowchart of an application performance comparison test method in the second embodiment of the present invention;
FIG. 4 is a sub-flowchart of an application performance comparison test method in the second embodiment of the present invention;
FIG. 5 is a schematic diagram of a comparative testing system for application performance in the third embodiment of the present invention;
fig. 6 is a schematic diagram of an application performance comparison test device in the fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Furthermore, the terms "first," "second," and the like may be used herein to describe various orientations, actions, steps, elements, or the like, but the orientations, actions, steps, or elements are not limited by these terms. These terms are only used to distinguish one direction, action, step or element from another direction, action, step or element. For example, the first speed difference may be referred to as a second speed difference, and similarly, the second speed difference may be referred to as a first speed difference, without departing from the scope of the present invention. Both the first application and the second application are applications, but they are not the same application. The terms "first", "second", etc. are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise. It should be noted that when a portion is referred to as being "secured to" another portion, it can be directly on the other portion or there can be an intervening portion. When a portion is said to be "connected" to another portion, it may be directly connected to the other portion or intervening portions may be present. The terms "vertical," "horizontal," "left," "right," and the like as used herein are for illustrative purposes only and do not denote a unique embodiment.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the steps as a sequential process, many of the steps can be performed in parallel, concurrently or simultaneously. In addition, the order of the steps may be rearranged. A process may be terminated when its operations are completed, but may have additional steps not included in the figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc.
Example one
Fig. 1 is a flowchart of an application performance comparison testing method according to an embodiment of the present invention. The embodiment is applicable to situations where performance comparison tests are performed on multiple competing applications. The method may be executed by an application performance comparison testing system; the system may be implemented in software and/or hardware and may be configured in a device, typically a server. Specifically, as shown in fig. 1, the method includes the following steps:
and step S110, acquiring installation information and detection information of the application to be detected.
In the embodiment of the present invention, the applications to be tested generally comprise a batch of competing applications, that is, several applications having a competitive relationship. For example, when a newly developed instant-messaging application needs to be put on the market, its competing applications illustratively include WeChat, QQ, etc.; other applications may of course be included, and the specific applications can be set according to actual use requirements.
When a batch of applications to be tested needs a performance test, the corresponding information must be uploaded, specifically to a server. The corresponding information in this embodiment includes installation information and detection information: the installation information comprises the installation package and operating environment parameters of the application to be tested, and the detection information comprises one or more of the application identification code, the test sequence, the application type, the application function to be tested, the test environment parameters, the test duration, and the performance index types to be tested.
In some embodiments, a user needs to test multiple groups of competing applications, such as a group of instant-messaging competitors, a group of shopping competitors, and a group of music competitors; uploading only one group at a time is obviously tedious and time-consuming. In this embodiment, therefore, the installation information of all the groups can be uploaded first, and the detection information is then used to determine which applications belong to the same competing-product group and to set the test requirements. Specifically, the application identification code to be tested is used to conveniently and quickly distinguish the uploaded applications; the test sequence can be used to customize the order of testing when the test terminals cannot detect all the applications simultaneously (for example, if shopping and music competitor applications are uploaded together, the music applications can be detected first and the shopping applications afterwards); the application type to be tested is used to distinguish which competing-product group the application belongs to (the same application may be assigned to different groups depending on which applications it is to be compared with); the application function to be tested selects the functional modules that must keep running during detection; the test environment parameters set the designated test terminal of the application to be tested and the terminal's running environment during detection; the test duration refers to the time the application to be tested runs on the test terminal, which can be set by the user, for example, 30 minutes or 60 minutes; and the performance index type to be tested refers to the types of performance data to be acquired while the application runs on the test terminal, comprising one or more of application start time, CPU occupancy, memory occupation, uplink and downlink traffic, and power consumption. It should be noted that the above detection information can be supplied according to actual requirements rather than in full; that is, in other embodiments only one or more of the above items may be selected to complete the performance comparison test of the application.
In some embodiments, the detection information may further include a data acquisition interval, which sets how often the performance collector samples data and can be set according to actual needs.
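As a concrete illustration of what such detection information might look like in practice, the sketch below models it as a simple Python dataclass. Every field name here is hypothetical, chosen to mirror the items listed above; the patent does not specify a data format.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DetectionInfo:
    """Hypothetical model of the detection information described above."""
    app_ids: List[str]                       # identification codes of the applications to be tested
    test_order: Optional[List[str]] = None   # custom test sequence when terminals are limited
    app_type: str = ""                       # competing-product group, e.g. "instant-messaging"
    functions_under_test: List[str] = field(default_factory=list)
    test_env: dict = field(default_factory=dict)  # designated terminal + runtime environment
    duration_minutes: int = 30               # how long each application runs on the terminal
    metrics: List[str] = field(
        default_factory=lambda: ["start_time", "cpu", "memory", "traffic", "power"])
    sample_interval_s: int = 5               # data acquisition interval for the collector

info = DetectionInfo(app_ids=["app-001", "app-002"], app_type="instant-messaging")
```

A user would submit one such record per competing-product group alongside the installation packages.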
And step S120, installing the application to be tested to a test terminal according to the installation information and the detection information of the application to be tested.
The installation information and detection information of the application to be tested are uploaded to the corresponding server in step S110, but detection must be performed on a corresponding test terminal. The application to be tested therefore needs to be installed on that terminal; which test terminal it is installed on is selected by the user, and the installation package must be sent to the corresponding test terminal to complete the installation.
And S130, controlling the test terminal to run the application to be tested according to the detection information and obtaining a test result.
After the application to be tested is installed on the corresponding test terminal, the test can begin; the specific test content, environment, and so on are determined by the detection information. For example, the test terminals may be mobile phones of several models, each running all applications of the same competing-product group in the background; or they may be several mobile phones of the same model, each running one application of the group in the background. Besides these two examples, other running modes may be set to implement different test environments. Usually the performance collector starts before the application runs on the test terminal, so that the application's start time can be collected; if the start time is not needed, the collector can start after the application does. While the application runs, the collector gathers its performance data, such as start time, CPU occupancy, memory occupation, uplink and downlink traffic, and power consumption; the data collected by the performance collector constitutes the test result. In this embodiment, to obtain background performance data, the application to be tested is set to run in the background of the test terminal.
And S140, generating a comparative analysis report according to the test result.
Because this embodiment is directed at performance comparison testing among competing applications, simply handing the user the raw data gathered by the performance collector would leave the comparison work to the user. This embodiment therefore generates a comparative analysis report automatically from the test results. Depending on user requirements, the report can compare the performance data of the same application on different terminals, or of different applications on the same terminal, presenting a more intuitive and effective analysis result to the user.
This embodiment provides an application performance comparison test method: the application to be tested is installed on the corresponding test terminal according to its installation information and detection information and run to obtain a test result, and a comparative analysis report is generated from that result. The performance comparison test is thus completed automatically once the installation information and detection information are submitted, avoiding the time and labor of manual testing; automatically generating the report also avoids the data processing of manual analysis, making the whole process more efficient and faster.
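The flow from step S110 to step S140 can be sketched end to end as follows. This is a minimal illustrative Python outline with hypothetical function and field names, not the patent's actual server code; the collected metrics here are placeholder values standing in for real measurements.

```python
def run_comparison_test(install_info, detection_info):
    """Hypothetical outline of steps S110 to S140."""
    # S110: the server has received install_info and detection_info as uploads.
    # S120: install each application on the test terminal named in the detection info.
    installed = [(app, detection_info["terminal"]) for app in install_info["apps"]]
    # S130: run each installed application in the background and collect performance data
    # (placeholder values; a real collector would sample the running process).
    results = {app: {"cpu": 0.12, "memory_mb": 150} for app, _ in installed}
    # S140: produce a comparative analysis report from the results.
    return {"type": detection_info["app_type"], "comparison": results}

report = run_comparison_test(
    {"apps": ["chat_a.apk", "chat_b.apk"]},
    {"terminal": "phone-01", "app_type": "instant-messaging"},
)
```

The point of the sketch is the shape of the pipeline: once the two uploads arrive, no further user interaction is needed before the report exists.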
Example two
Fig. 2 is a flowchart of an application performance comparison testing method according to a second embodiment of the present invention, which explains part of the first embodiment in more detail. As shown in fig. 2, in this method, step S140 is followed by step S250 of sending the comparative analysis report to a recipient according to the recipient information:
and step S110, acquiring installation information and detection information of the application to be detected.
The installation information of the application to be tested comprises at least its installation package. After being uploaded, the application is stored on the corresponding server; its relevant information is parsed and then stored in an application information table of a database. Key fields in the table can illustratively be:
app_cnname: the application's Chinese name;
app_package_name: the application's runtime package name;
launchableActivity: the application's launch class name;
apk_filename: the installation package file name;
app_version: the application's current version number;
userId: the Id of the uploading user.
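A minimal sketch of such an application information table, assuming a SQLite backing store purely for illustration (the patent does not name the database used); the inserted row is made-up sample data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE app_info (
        app_cnname         TEXT,  -- application Chinese name
        app_package_name   TEXT,  -- runtime package name
        launchableActivity TEXT,  -- launch class name
        apk_filename       TEXT,  -- installation package file name
        app_version        TEXT,  -- current version number
        userId             TEXT   -- id of the uploading user
    )
""")
conn.execute(
    "INSERT INTO app_info VALUES (?, ?, ?, ?, ?, ?)",
    ("示例应用", "com.example.chat", "com.example.chat.MainActivity",
     "chat-1.0.apk", "1.0", "user-42"),
)
row = conn.execute("SELECT app_package_name, app_version FROM app_info").fetchone()
```

The package name and launch class stored here are exactly what the later adb `am start` invocation needs.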
The detection information can include, in addition to one or more of the application identification code, the test sequence, the application type, the application function to be tested, the test environment parameters, the test duration, and the performance index types to be tested, the recipient information for the comparative analysis report, such as a mailbox account.
And step S120, installing the application to be tested to a test terminal according to the installation information and the detection information of the application to be tested.
Specifically, as shown in fig. 3, step S120 includes:
and step S121, determining a test terminal corresponding to the application to be tested according to the detection information.
Because test requirements differ, the test environment parameters corresponding to different applications to be tested differ, and those parameters must be set on the test terminal. Therefore, for each application to be tested, the corresponding test terminal must be found according to the requirements in the detection information.
And step S122, sending the installation information of the application to be tested to the corresponding test terminal.
After the test terminal corresponding to the application to be tested is determined, the installation information stored on the server can be sent to the corresponding test terminal, so that the subsequent test can be completed on the correct test terminal.
And S123, controlling the corresponding test terminal to install the application to be tested according to the installation information of the application to be tested.
After the installation information of the application to be tested reaches the correct corresponding test terminal, the installation package can be opened to complete the installation; if the information is not sent to the correct terminal, installation cannot proceed.
And S130, controlling the test terminal to run the application to be tested according to the detection information and obtaining a test result.
Specifically, as shown in fig. 4, step S130 includes:
and S131, controlling the background of the test terminal to run the application to be tested.
After the application to be tested is installed, the relevant test can begin. The performance collector that detects the relevant performance data can be started in advance or when the application to be tested starts, as preset according to different requirements. In this embodiment, performance data during background operation must be detected, so the application is started on the test terminal and then switched to background running.
Illustratively, a thread pool containing 10 threads is initialized through newFixedThreadPool(10). When a test is executed, the server puts each test terminal into the thread pool as a separate task according to the number of test terminals; if the pool has an idle thread the task executes immediately, otherwise it waits in a queue. The test content executed by each terminal is independent, but the operation flow is the same: first install the corresponding application to be tested, then start it, and collect its performance data during running through the performance collector. The test terminal can be connected to the server in a wired or wireless manner, and the server calls adb commands from code to realize installation, starting, and other operations of the application to be tested on the test terminal.
Illustratively, when the test terminal is a mobile phone, the installation, starting, and other operations on the application to be tested are realized by calling adb commands from code, including: uploading the application to be tested to the phone's /sdcard/ directory:
adb -s deviceid push apkName /sdcard/
installing the application to be tested:
adb -s deviceid shell "pm install -r /sdcard/apkName"
starting the application to be tested:
adb -s deviceid shell am start -W -n packageName/launchActivity
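Those three adb invocations can be assembled from the device id and the package metadata stored on the server. The helper below only builds the command strings; actually running them requires a connected device, which is outside this sketch, and all argument values are examples.

```python
def build_adb_commands(device_id: str, apk_name: str,
                       package_name: str, launch_activity: str) -> list:
    """Build the push / install / start adb commands shown above."""
    return [
        f"adb -s {device_id} push {apk_name} /sdcard/",
        f'adb -s {device_id} shell "pm install -r /sdcard/{apk_name}"',
        f"adb -s {device_id} shell am start -W -n {package_name}/{launch_activity}",
    ]

cmds = build_adb_commands("emulator-5554", "chat-1.0.apk",
                          "com.example.chat", ".MainActivity")
```

On the server side, each string would be handed to a subprocess runner per terminal; the `-s` flag keeps commands targeted when several devices are attached.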
and S132, acquiring the PID and the UID of the application to be tested during the operation of the test terminal.
Through information such as the application identification code, the PID and UID of the application to be tested can be found while it runs on the test terminal.
And S133, acquiring one or more of starting time data, CPU occupancy rate data, memory occupancy data, uplink and downlink flow data and power consumption data of the application to be tested according to the PID and the UID of the application to be tested.
The corresponding relevant performance data of the application in operation can be accurately inquired through the PID and the UID, and the specific collected data can be selected by setting test information according to different actual requirements.
What kind of performance data the performance collector collects is determined by the performance index types to be tested in the test information. Taking a mobile phone as the test terminal as an example, the obtained performance data includes:
Start time: the phone's logcat log is captured synchronously while the application to be tested starts; the ThisTime value in the log is recorded as the application start time.
CPU occupancy: according to the PID, the cpu time slices occupied by the application to be tested at that moment are read from the phone's /proc/PID/stat file, and the total cpu time slices of the phone are read from the /proc/stat file; both values are stored, and after a few seconds the cpu time slice data of the two files are queried again, so the current cpu occupancy can be calculated: cpu occupancy of the application = (application cpu slices at the second query - application cpu slices at the first query) / (total cpu slices at the second query - total cpu slices at the first query).
Memory occupation: according to the PID, the memory occupied by the application to be tested can be obtained directly from the getProcessMemoryInfo method of the ActivityManager of the phone system.
Uplink and downlink traffic: after the application to be tested starts, according to the UID, its current uplink traffic is read from the phone's /proc/uid_stat/UID/tcp_snd file as the uplink initial value, and its current downlink traffic from the /proc/uid_stat/UID/tcp_rcv file as the downlink initial value; after a few seconds the two files are queried again, and the current traffic is calculated: uplink traffic = uplink value at the later query - uplink initial value; downlink traffic = downlink value at the later query - downlink initial value.
Power consumption: the product of the cpu time slices occupied by the application to be tested and the phone's power consumption per unit time slice is the power consumption.
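The delta-based metrics above (cpu occupancy, uplink and downlink traffic, power consumption) reduce to simple arithmetic over two samples. The functions below restate those formulas; the input numbers are made-up sample readings, not real /proc data, and the power unit is a hypothetical per-slice figure.

```python
def cpu_occupancy(app1, total1, app2, total2):
    # (app slices at 2nd query - at 1st query) / (total slices at 2nd - at 1st)
    return (app2 - app1) / (total2 - total1)

def traffic_delta(initial_bytes, later_bytes):
    # later query value minus the initial value recorded at application start
    return later_bytes - initial_bytes

def power_consumption(app_cpu_slices, power_per_slice):
    # cpu slices occupied by the application times power per unit time slice
    return app_cpu_slices * power_per_slice

occ = cpu_occupancy(app1=100, total1=1000, app2=150, total2=1400)  # 50 / 400
up = traffic_delta(initial_bytes=2048, later_bytes=10240)
pw = power_consumption(app_cpu_slices=50, power_per_slice=2)
```

Sampling both files as close together as possible matters: the occupancy ratio assumes the two reads describe the same interval.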
And S140, generating a comparative analysis report according to the test result.
Because the data report required by this embodiment is not a simple data report but a comparative analysis report for a batch of auction product applications, the comparative analysis of the application to be tested needs to be performed according to the application type of the auction product to which the application to be tested belongs, which specifically includes:
the method comprises the steps of obtaining test results of different applications to be tested under the same test environment parameter, and ensuring that the test results of the applications of the same type of competitive products are based on the same test environment in order to reflect the reasonability of comparative analysis.
And performing comparative analysis according to the test results of the different applications to be tested under the same test environment parameter to generate a comparative analysis report. The comparative analysis report reflects the performance data comparison of all the applications to be tested that belong to the same competing-product application type, from which the information behind the different comparison results can be analyzed.
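The grouping step described above can be sketched as follows: results are keyed by competing-product application type and test environment parameter so that only results obtained under the same environment are compared. The field names here are illustrative, not from the patent:

```python
from collections import defaultdict

def group_results(test_results):
    # Group test results by (competing-product application type, test
    # environment parameter) so that each comparative analysis report only
    # compares applications tested under the same environment.
    groups = defaultdict(list)
    for result in test_results:
        groups[(result["app_type"], result["environment"])].append(result)
    return dict(groups)

results = [
    {"app": "A", "app_type": "chat",  "environment": "phone-1", "cpu": 12.0},
    {"app": "B", "app_type": "chat",  "environment": "phone-1", "cpu": 9.5},
    {"app": "C", "app_type": "music", "environment": "phone-2", "cpu": 7.0},
]
groups = group_results(results)
print(sorted(groups))   # [('chat', 'phone-1'), ('music', 'phone-2')]
```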
And step S250, sending the comparative analysis report to a receiver according to the receiver information.
When the corresponding comparative analysis report is generated, it needs to be sent to the corresponding receiver. The receiver may be preset or determined from the detection information, and the report may be sent through a mailbox or in other ways.
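As one possible realization of the mailbox delivery mentioned above, the report can be wrapped in an e-mail message and handed to an SMTP server. The sender address and host name below are placeholders, not values from the patent:

```python
import smtplib
from email.message import EmailMessage

def build_report_message(report_text, recipient):
    # Wrap the comparative analysis report in an e-mail message.
    msg = EmailMessage()
    msg["Subject"] = "Application performance comparative analysis report"
    msg["From"] = "perf-test@example.com"   # placeholder sender address
    msg["To"] = recipient
    msg.set_content(report_text)
    return msg

def send_report(report_text, recipient, smtp_host):
    # Deliver the report to the receiver through the given SMTP server.
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(build_report_message(report_text, recipient))
```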
In this embodiment, the application to be tested is installed on the corresponding test terminal and run to acquire the corresponding test result according to its installation information and detection information, and the comparative analysis report of the application is obtained from the test result. The performance comparison test can thus be completed automatically once the installation information and detection information are submitted, which avoids time-consuming and labor-intensive manual testing; the comparative analysis report is generated automatically from the test result, which avoids the data processing of manual analysis and makes the whole process more efficient and rapid. A report-sending function is further provided, so that a user can leave directly after submitting the corresponding test task and collect the comparative analysis report at leisure.
EXAMPLE III
Fig. 5 is a schematic structural diagram of an application performance comparison test system 300 according to a third embodiment of the present invention, where the specific structure of the application performance comparison test system is as follows:
the obtaining module 310 is used for obtaining installation information and detection information of the application to be tested.
In the embodiment of the present invention, the applications to be tested generally comprise a batch of competing-product applications, that is, several applications having a competitive relationship. For example, when a newly developed instant chat application needs to be put on the market, its competing applications illustratively include WeChat, QQ, etc.; other applications may of course be included, and the specific applications can be set according to actual use requirements.
When a batch of applications to be tested need to be subjected to performance test, corresponding information needs to be uploaded, and specifically, the corresponding information can be uploaded to a server. The corresponding information referred to in this embodiment includes installation information and detection information, the installation information includes an installation package and an operating environment parameter of the application to be tested, and the detection information includes one or more of an application identification code to be tested, an application testing sequence to be tested, an application type to be tested, an application function to be tested, a testing environment parameter, a testing duration, and a performance index type to be tested.
In some embodiments, a user needs to test multiple groups of competing-product applications, such as an instant-chat group, a shopping group and a music group, and uploading only one group at a time is obviously tedious and time-consuming. In this embodiment, therefore, all the installation information of the multiple groups can be uploaded first, and the detection information is then used to determine which applications belong to the same competing-product application type and to set the test requirements. Specifically, the application identification code to be tested is used to distinguish the uploaded applications conveniently and quickly; the test sequence of the applications to be tested allows the user to customize the testing order when the test terminals cannot detect all the applications simultaneously (for example, if a shopping group and a music group are uploaded at the same time, the music group can be detected first and the shopping group afterwards); the application type to be tested distinguishes the competing-product application type to which each application belongs (the same application can be assigned to different competing-product application types depending on which applications it is to be compared with); the application function to be tested selects the functional modules that need to keep running while the application is detected; the test environment parameters set the designated test terminal of the application to be tested and the running environment of that terminal during detection; the test duration indicates the running time of the application at the test terminal and can be set by the user, such as 30 minutes, 60 minutes or 1 hour; the performance index type to be tested refers to the performance data types to be acquired while the application runs at the test terminal, and includes one or more of application starting time, CPU occupancy rate, memory occupation, uplink and downlink flow and power consumption. It should be noted that the above detection information is provided according to actual requirements, and not all of it needs to be supplied; that is, in other embodiments only one or more of the above items may be selected to complete the performance comparison test of the application.
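The detection information fields described above can be pictured as a simple record in which every field is optional. The field names below are illustrative stand-ins, not identifiers from the patent:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DetectionInfo:
    # Every field is optional; any subset may be supplied, as described above.
    app_identifiers: List[str] = field(default_factory=list)   # identification codes
    test_order: List[str] = field(default_factory=list)        # custom testing sequence
    app_type: Optional[str] = None            # competing-product application type
    functions_under_test: List[str] = field(default_factory=list)
    environment: Optional[str] = None         # designated test terminal / environment
    duration_minutes: Optional[int] = None    # e.g. 30 or 60
    metrics: List[str] = field(default_factory=list)  # e.g. ["start_time", "cpu", "memory"]

info = DetectionInfo(app_type="chat", duration_minutes=30, metrics=["cpu", "memory"])
print(info.app_type, info.duration_minutes)   # chat 30
```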
In some embodiments, the detection information may further include a data acquisition interval, which is used to set how often the performance collector acquires data and may be set according to actual needs.
And the installation module 320 is used for installing the application to be tested to the test terminal according to the installation information and the detection information of the application to be tested.
The installation information and detection information of the application to be tested are uploaded to the corresponding server through the obtaining module 310, but the detection of the application must be performed on the corresponding test terminal. The application to be tested therefore needs to be installed on that terminal; which test terminal it is installed on is selected by the user, and the installation package of the application is sent to the corresponding test terminal to complete the installation.
Specifically, the installation module 320 includes:
and the test terminal determining unit is used for determining the test terminal corresponding to the application to be tested according to the detection information.
And the installation information sending unit is used for sending the installation information of the application to be tested to the corresponding test terminal.
And the to-be-tested application installation unit is used for controlling the corresponding test terminal to install the to-be-tested application according to the installation information of the to-be-tested application.
And the test module 330 is configured to control the test terminal to run the application to be tested according to the detection information and obtain a test result.
Specifically, the test module 330 includes:
and the running unit is used for controlling the background of the test terminal to run the application to be tested.
And the PID and UID acquisition unit is used for acquiring the PID and the UID of the application to be tested during the operation of the test terminal.
And the data acquisition unit is used for acquiring one or more of starting time data, CPU occupancy rate data, memory occupation data, uplink and downlink flow data and power consumption data of the application to be tested according to the PID and the UID of the application to be tested.
After the application to be tested is installed on the corresponding test terminal, it can be tested; the specific test content, environment and so on are determined by the detection information. For example, the test terminals may be mobile phones of several models, each running in the background all the applications of the same competing-product application type, or several mobile phones of the same model, each running one application of that type in the background; besides these two examples, other running modes may be set to implement different test environments. Usually the performance collector starts to operate before the application to be tested runs on the test terminal, so that it can collect the starting time of the application; if the starting time does not need to be collected, the collector can start after the application is running. While the application runs, the collector gathers its performance data, such as starting time data, CPU occupancy rate data, memory occupation data, uplink and downlink flow data and power consumption data; the performance data collected by the performance collector constitute the test result. In this embodiment, in order to obtain the background running performance data of the application to be tested, the application is set to run in the background of the test terminal.
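A minimal sketch of such a collector loop follows: it samples at a fixed interval for the configured duration and returns the samples as the test result. The sampling function, clock and interval here are stand-ins for the real terminal-side collector; none of these names come from the patent. The example injects a fake clock so the sketch runs instantly:

```python
import time

def collect(sample_fn, duration_s, interval_s, clock=time.monotonic, sleep=time.sleep):
    # Run the performance collector: call sample_fn() every interval_s seconds
    # for duration_s seconds and return the collected samples. The collector
    # is started before the application under test so the first sample can
    # capture the starting time.
    samples = []
    end = clock() + duration_s
    while clock() < end:
        samples.append(sample_fn())
        sleep(interval_s)
    return samples

# Example with a fake sampler and a fake clock (one tick per call).
ticks = iter(range(100))
data = collect(lambda: {"cpu": 5.0}, duration_s=3, interval_s=1,
               clock=lambda: next(ticks), sleep=lambda s: None)
print(len(data))   # 2
```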
And the analysis module 340 is used for generating a comparative analysis report according to the test result.
It is specifically used for: obtaining the test results of different applications to be tested under the same test environment parameter; and performing comparative analysis on the test results of the different applications to be tested under the same test environment parameter to generate the comparative analysis report.
Because this embodiment is aimed at the performance comparison test among competing-product applications, rather than merely providing the performance data collected by the performance collector for the user to compare and check by himself, this embodiment generates the comparative analysis report automatically according to the test result. According to the user's requirements, the report can compare the performance data of the same application on different terminals, or of different applications on the same terminal, and thus presents a more intuitive and effective analysis result to the user.
Further, the application performance comparison test system provided by this embodiment further includes:
and the sending module is used for sending the comparative analysis report to a receiver according to the receiver information.
This embodiment further provides an application performance comparison test system. The application to be tested is installed on the corresponding test terminal and run to obtain the corresponding test result according to its installation information and detection information, and the comparative analysis report of the application is obtained from the test result. The performance comparison test can thus be completed automatically once the installation information and detection information are submitted, which avoids time-consuming and labor-intensive manual testing; the comparative analysis report is generated automatically from the test result, which avoids the data processing of manual analysis and makes the whole process more efficient and rapid.
The application performance comparison test system provided by the embodiment of the invention can execute any application performance comparison test method provided by the embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
Example four
Fig. 6 is a schematic structural diagram of an application performance comparison testing apparatus 400 according to a fourth embodiment of the present invention, as shown in fig. 6, the application performance comparison testing apparatus includes a memory 410 and a processor 420, the number of the processors 420 in the application performance comparison testing apparatus may be one or more, and one processor 420 is taken as an example in fig. 6; the memory 410 and the processor 420 in the application performance comparison test device may be connected by a bus or other means, and fig. 6 illustrates the connection by the bus as an example.
The memory 410 is a computer-readable storage medium, and can be used to store software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the application performance comparison test method in the embodiment of the present invention (for example, the obtaining module 310, the installing module 320, the testing module 330, and the analyzing module 340 in the application performance comparison test system). The processor 420 executes various functional applications and data processing of the application performance comparison test apparatus by executing software programs, instructions and modules stored in the memory 410, so as to implement the application performance comparison test method described above.
Wherein the processor 420 is configured to run the computer executable program stored in the memory 410 to implement the following steps: step 110, acquiring installation information and detection information of an application to be detected; step 120, installing the application to be tested to a test terminal according to the installation information and the detection information of the application to be tested; step 130, controlling the test terminal to run the application to be tested according to the detection information and obtaining a test result; and 140, generating a comparative analysis report according to the test result.
Of course, the application performance comparison test device provided in the embodiment of the present invention is not limited to the above-described method operations, and may also perform related operations in the application performance comparison test method provided in any embodiment of the present invention.
The memory 410 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal, and the like. Further, the memory 410 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, memory 410 may further include memory located remotely from processor 420, and such remote memory may be connected to the application performance comparison test equipment via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The device can execute the method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
EXAMPLE five
An embodiment of the present invention further provides a storage medium containing computer-executable instructions, where the computer-executable instructions are executed by a computer processor to perform an application performance comparison test method, where the application performance comparison test method includes:
acquiring installation information and detection information of an application to be detected;
installing the application to be tested to a test terminal according to the installation information and the detection information of the application to be tested;
controlling the test terminal to run the application to be tested according to the detection information and obtaining a test result;
and generating a comparative analysis report according to the test result.
Of course, the storage medium provided by the embodiment of the present invention contains computer-executable instructions, and the computer-executable instructions are not limited to the method operations described above, and may also perform related operations in the application performance comparison test method provided by any embodiment of the present invention.
From the above description of the embodiments, it is obvious for those skilled in the art that the present invention can be implemented by software and necessary general hardware, and certainly, can also be implemented by hardware, but the former is a better embodiment in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a FLASH Memory (FLASH), a hard disk or an optical disk of a computer, and includes several instructions for enabling a computer device (which may be a personal computer, an application performance comparison testing device, or a network device) to execute the methods according to the embodiments of the present invention.
It should be noted that, in the embodiment of the application performance comparison test system, each unit and each module included in the embodiment are only divided according to functional logic, but are not limited to the above division, as long as the corresponding functions can be implemented; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. An application performance comparison test method is characterized by comprising the following steps:
acquiring installation information and detection information of an application to be detected;
installing the application to be tested to a test terminal according to the installation information and the detection information of the application to be tested;
controlling the test terminal to run the application to be tested according to the detection information and obtaining a test result;
and generating a comparative analysis report according to the test result.
2. The application performance comparison testing method of claim 1, wherein the installation information includes an installation package and operating environment parameters of the application to be tested, and the detection information includes one or more of an identification code of the application to be tested, a testing sequence of the application to be tested, a type of the application to be tested, an application function to be tested, a testing environment parameter, a testing duration, and a type of a performance index to be tested.
3. The method for the comparative testing of the application performance of claim 2, wherein the performance index types to be tested include one or more of start time, CPU occupancy, memory occupancy, uplink and downlink traffic and power consumption, and correspondingly, the test results include one or more of start time data, CPU occupancy data, memory occupancy data, uplink and downlink traffic data and power consumption data.
4. The method for the application performance comparison test of claim 1, wherein the installing the application to be tested to a test terminal according to the installation information and the detection information of the application to be tested comprises:
determining a test terminal corresponding to the application to be tested according to the detection information;
sending the installation information of the application to be tested to the corresponding test terminal;
and controlling the corresponding test terminal to install the application to be tested according to the installation information of the application to be tested.
5. The method for contrast testing of application performance of claim 3, wherein the controlling the test terminal to run the application to be tested and obtain the test result according to the detection information comprises:
controlling the background of the test terminal to run the application to be tested;
acquiring a PID and a UID of the application to be tested during the operation of the test terminal;
and acquiring one or more of starting time data, CPU occupancy rate data, memory occupancy data, uplink and downlink flow data and power consumption data of the application to be tested according to the PID and the UID of the application to be tested.
6. The method for the comparative testing of the application performance of claim 2, wherein the generating of the comparative analysis report according to the testing result comprises:
obtaining test results of different applications to be tested under the same test environment parameter;
and performing comparative analysis according to the test results of the different applications to be tested under the same test environment parameter to generate comparative analysis reports.
7. The application performance comparison test method of claim 1, wherein the detection information further comprises recipient information;
after generating a comparative analysis report according to the test result, the method further comprises the following step: sending the comparative analysis report to a receiver according to the receiver information.
8. An application performance comparison testing system, the system comprising:
the acquisition module is used for acquiring the installation information and the detection information of the application to be detected;
the installation module is used for installing the application to be tested to the test terminal according to the installation information and the detection information of the application to be tested;
the test module is used for controlling the test terminal to run the application to be tested according to the detection information and obtaining a test result;
and the analysis module is used for generating a comparative analysis report according to the test result.
9. An application performance comparison testing device, comprising a memory and a processor, the memory having stored thereon a computer program executable on the processor, the processor implementing the application performance comparison testing method according to any one of claims 1-7 when executing the computer program.
10. A computer-readable storage medium, characterized in that the storage medium stores a computer program comprising program instructions which, when executed, implement the application performance comparison test method according to any one of claims 1-7.
CN201910880849.5A 2019-09-18 2019-09-18 Application performance comparison test method, system, equipment and computer readable storage medium Pending CN110647453A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910880849.5A CN110647453A (en) 2019-09-18 2019-09-18 Application performance comparison test method, system, equipment and computer readable storage medium


Publications (1)

Publication Number Publication Date
CN110647453A true CN110647453A (en) 2020-01-03

Family

ID=69010732

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910880849.5A Pending CN110647453A (en) 2019-09-18 2019-09-18 Application performance comparison test method, system, equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN110647453A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113407452A (en) * 2021-06-24 2021-09-17 上海中通吉网络技术有限公司 APP power consumption testing method
CN113468074A (en) * 2021-08-09 2021-10-01 北京映客芝士网络科技有限公司 Application program version comparison monitoring method, device, medium and electronic equipment
CN113505053A (en) * 2021-07-27 2021-10-15 云账户技术(天津)有限公司 Application performance detection method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107357727A (en) * 2017-07-04 2017-11-17 广州君海网络科技有限公司 APP testing results method, apparatus, readable storage medium storing program for executing and computer equipment
CN108287790A (en) * 2017-12-29 2018-07-17 五八有限公司 Application program capacity test method, device and the electronic equipment of terminal
CN109491883A (en) * 2018-09-13 2019-03-19 武汉灯塔之光科技有限公司 Performance detects control methods and system simultaneously between applying under a kind of Android platform more




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200103