CN110362483B - Performance data acquisition method, device, equipment and storage medium - Google Patents

Performance data acquisition method, device, equipment and storage medium

Info

Publication number
CN110362483B
CN110362483B
Authority
CN
China
Prior art keywords
performance data
tested
application program
data acquisition
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910541820.4A
Other languages
Chinese (zh)
Other versions
CN110362483A (en)
Inventor
杨小彦 (Yang Xiaoyan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Puhui Enterprise Management Co Ltd
Original Assignee
Ping An Puhui Enterprise Management Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Puhui Enterprise Management Co Ltd filed Critical Ping An Puhui Enterprise Management Co Ltd
Priority to CN201910541820.4A priority Critical patent/CN110362483B/en
Publication of CN110362483A publication Critical patent/CN110362483A/en
Application granted granted Critical
Publication of CN110362483B publication Critical patent/CN110362483B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3684 Test management for test design, e.g. generating new test cases
    • G06F11/3692 Test management for test results analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention relates to data acquisition, and discloses a performance data acquisition method, device, equipment and storage medium. The method comprises the following steps: when an automated test instruction is received, determining the application program to be tested according to field information contained in the instruction; reading the corresponding test case script from a test case library based on the field information and executing it; acquiring the data acquisition commands generated while the test case script executes and sending them to the application program to be tested; and preprocessing the collected performance data and writing it into a preset database for storage. Because the data is collected automatically by executing the test case script, compared with the existing approach of integrating a third-party software development kit into the program code, the method ensures the security of the program code while collecting application performance data at low cost and with high efficiency.

Description

Performance data acquisition method, device, equipment and storage medium
Technical Field
The invention relates to the technical field of automatic testing, in particular to a performance data acquisition method, a performance data acquisition device, performance data acquisition equipment and a storage medium.
Background
At present, in the field of computer software, the most important metric in mobile terminal development is the crash rate; once the crash rate has stabilized, the focus of mobile terminal work shifts to performance optimization.
At present, many mature third-party performance monitoring tools exist, such as GT and WeTest (a testing platform). Such third-party software must be integrated into the Application (App) code to monitor performance, which raises the problems of insecure code data and high licensing fees. Meanwhile, the integrated third-party Software Development Kit (SDK) increases the package size of the test App even though only a few of its functions are actually used, and it slows the App's response. Therefore, how to collect application performance data at low cost and with high efficiency, while ensuring the security of the application's code data, has become a problem to be solved urgently.
The above is only for the purpose of assisting understanding of the technical aspects of the present invention, and does not represent an admission that the above is prior art.
Disclosure of Invention
The invention mainly aims to provide a performance data acquisition method, device, equipment and storage medium, with the goal of solving the technical problems in the prior art of low security, high cost and insufficient accuracy of acquisition results when collecting performance data from an application program.
In order to achieve the above object, the present invention provides a performance data acquisition method, comprising the steps of:
when an automatic test instruction is received, determining an application program to be tested according to field information contained in the automatic test instruction;
reading a corresponding test case script from a test case library based on the field information, and executing the test case script;
acquiring a data acquisition command generated in the test case script execution process, and sending the data acquisition command to the application program to be tested;
and acquiring performance data, preprocessing the performance data, and writing the preprocessed performance data into a preset database for storage, wherein the performance data is fed back when the application program to be tested responds to the data acquisition command.
Preferably, the field information includes an application version number;
the step of determining the application program to be tested according to the field information contained in the automatic test instruction comprises the following steps:
reading the application version number contained in the automatic test instruction, and extracting an application identifier carried in the application version number and version information of a system to be tested;
and searching for a corresponding application program according to the application identifier, and screening out the application program to be tested from the searched application program according to the version information of the system to be tested.
Preferably, the step of reading the corresponding test case script from the test case library based on the field information includes:
reading a system version field contained in the version information of the system to be tested, and searching a target tested function point corresponding to the system version field in a mapping relation between the pre-established system version field and the tested function point;
and searching the test case script covering all the target tested function points from the test case library.
Preferably, the data obtaining command includes a jank ratio acquisition command, and the step of acquiring the performance data, preprocessing the performance data, and writing the preprocessed performance data into a preset database for storage includes:
collecting a data matrix fed back by the application program to be tested when responding to the jank ratio acquisition command;
adding the matrix elements of each row in the data matrix to obtain a plurality of frame time values, and obtaining the target frame time values exceeding a preset threshold value among the frame time values;
and counting the number of the target frame time values, obtaining the jank ratio of the application program to be tested according to the counted number of the target frame time values, and writing the jank ratio into a preset database for storage.
Preferably, the data obtaining command further includes a fluency obtaining command, and the step of acquiring the performance data, preprocessing the performance data, and writing the preprocessed performance data into a preset database for storage includes:
collecting the fluency log fed back by the application program to be tested when responding to the fluency obtaining command;
reading a total frame number and a lost frame number in a preset time period from the fluency log, calculating fluency corresponding to the application program to be tested according to a preset formula, and writing the fluency into a preset database for storage;
the preset formula is fluency = frame rate (total frame number-frame loss number)/total frame number.
Preferably, the data obtaining command further includes a response time obtaining command, and the step of acquiring the performance data, preprocessing the performance data, and writing the preprocessed performance data into a preset database for storage includes:
collecting a time log generated when the application program to be tested responds to the response time acquisition command;
and extracting the function execution time of a preset function from the time log, acquiring the response time according to the function execution time, and writing the response time into a preset database for storage.
Preferably, the preset functions include a View click function and a View drawing function. The step of extracting the function execution time of a preset function from the time log and obtaining the response time according to the function execution time includes:
extracting, from the time log, the event click time corresponding to the execution of the View click function and the drawing completion time corresponding to the execution of the View drawing function;
and calculating the time difference between the event click time and the drawing completion time, and taking the time difference as the response time.
In addition, in order to achieve the above object, the present invention further provides a performance data acquisition apparatus, including:
the program determining module is used for determining the application program to be tested according to field information contained in the automatic test instruction when the automatic test instruction is received;
the script determining module is used for reading a corresponding test case script from the test case library based on the field information and executing the test case script;
the script execution module is used for acquiring a data acquisition command generated in the test case script execution process and sending the data acquisition command to the application program to be tested;
and the data acquisition module is used for acquiring performance data, preprocessing the performance data and writing the preprocessed performance data into a preset database for storage, wherein the performance data is feedback data when the application program to be tested responds to the data acquisition command.
In addition, in order to achieve the above object, the present invention further provides a performance data acquisition device, including: a memory, a processor and a performance data acquisition program stored on the memory and executable on the processor, the performance data acquisition program being configured to implement the steps of the performance data acquisition method as described above.
In addition, to achieve the above object, the present invention further provides a storage medium having a performance data acquisition program stored thereon, wherein the performance data acquisition program, when executed by a processor, implements the steps of the performance data acquisition method as described above.
According to the invention, when an automated test instruction is received, the application program to be tested is determined according to field information contained in the instruction; the corresponding test case script is read from the test case library based on the field information and executed; the data acquisition commands generated during script execution are acquired and sent to the application program to be tested; and the performance data is preprocessed and written into a preset database for storage. Because the invention requires no third-party software development kit to be integrated into the application code, and data collection is completed entirely by the pre-written test case script, it ensures the security of the application code while collecting application performance data at low cost and with high efficiency.
Drawings
Fig. 1 is a schematic structural diagram of a performance data acquisition device of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart diagram of a first embodiment of a performance data collection method of the present invention;
FIG. 3 is a schematic flow chart diagram of a second embodiment of a performance data collection method of the present invention;
FIG. 4 is a schematic flow chart diagram of a performance data collection method according to a third embodiment of the present invention;
fig. 5 is a block diagram of a first embodiment of the performance data acquisition apparatus according to the present invention.
The implementation, functional features and advantages of the present invention will be further described with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a performance data acquisition device in a hardware operating environment according to an embodiment of the present invention.
As shown in fig. 1, the performance data acquisition apparatus may include: a processor 1001, such as a Central Processing Unit (CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used to enable connective communication between these components. The user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard), and optionally may also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a Wireless Fidelity (Wi-Fi) interface). The memory 1005 may be a Random Access Memory (RAM) or a Non-Volatile Memory (NVM) such as disk storage. The memory 1005 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the configuration shown in FIG. 1 does not constitute a limitation of the performance data collection apparatus and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, the memory 1005, which is a storage medium, may include therein an operating system, a data storage module, a network communication module, a user interface module, and a performance data acquisition program.
In the performance data acquisition apparatus shown in fig. 1, the network interface 1004 is mainly used for data communication with a network server, and the user interface 1003 is mainly used for data interaction with a user. The processor 1001 and the memory 1005 may be disposed in the performance data acquisition device, which calls the performance data acquisition program stored in the memory 1005 through the processor 1001 and executes the performance data acquisition method provided by the embodiment of the present invention.
An embodiment of the present invention provides a performance data acquisition method, and referring to fig. 2, fig. 2 is a schematic flow chart of a first embodiment of the performance data acquisition method of the present invention.
In this embodiment, the performance data acquisition method includes the following steps:
step S10: when an automatic test instruction is received, determining an application program to be tested according to field information contained in the automatic test instruction;
it should be noted that the main execution body of the method of this embodiment may be a User Interface (UI) automated test case script written by a developer in advance using python language (a computer programming language), or may be a test host or a device (hereinafter referred to as a test host) capable of running the test case script, such as a personal computer, a notebook computer, and the like. In this embodiment, when a developer writes the test case script, the developer may integrate the android SDK adb command and the monkey command in the corresponding script code to ensure the automatic acquisition of subsequent performance data.
In this embodiment, the field information may include an application version number corresponding to the application program to be tested, and the application version number may be composed of an application identifier (e.g., an application name or a custom number/code of the application program) and system version information (e.g., a system version number) of a current system of the mobile terminal (e.g., a mobile phone, a tablet computer, and a personal computer) where the application program to be tested is located.
In specific implementation, when receiving an automated test instruction, a test host can analyze the instruction so as to read field information contained in the instruction according to an analysis result; and then determining the application program to be tested for the performance data acquisition according to the application version number contained in the field information.
It can be understood that, owing to hardware and software differences between mobile terminals, even for the same application program a developer must adapt the program code to each terminal so that the application runs smoothly on all of them. Therefore, in this embodiment, developers write separate test case scripts for different brands/models/system versions of mobile terminals, and then associate each written script, via its tested function points, with the corresponding system version field (in the system version information to be tested), so that scripts can later be looked up quickly by system version field.
Further, in this embodiment, the test host may read the application version number contained in the automated test instruction and extract the application identifier and the version information of the system to be tested carried in it; it then searches for the corresponding application program according to the application identifier and screens out the application program to be tested according to the version information of the system to be tested. For example, the test host may read the application version number "pinganyizhangtong.android.7.0.0" from the automated test instruction, extract the application identifier "pinganyizhangtong" and the system version information to be tested "android.7.0.0", search for the different versions of the application according to the identifier "pinganyizhangtong", and screen out the application program to be tested according to "android.7.0.0".
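For illustration only, a minimal Python sketch of this parsing step follows; the helper name parse_app_version and the assumption that the system segment always begins with "android." are hypothetical, not prescribed by the embodiment:

    def parse_app_version(version_number: str):
        """Split an application version number such as
        'pinganyizhangtong.android.7.0.0' into the application identifier
        and the version information of the system to be tested."""
        app_id, sep, rest = version_number.partition(".android.")
        if not sep:
            raise ValueError(f"unrecognized application version number: {version_number!r}")
        return app_id, "android." + rest

    # Usage: parse_app_version("pinganyizhangtong.android.7.0.0")
    # -> ("pinganyizhangtong", "android.7.0.0")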
Step S20: reading a corresponding test case script from a test case library based on the field information, and executing the test case script;
it should be understood that, in general, test case scripts for different test phases are stored in the test case library, for example, the test case script corresponding to the pressure test phase, the test case script corresponding to the performance test phase, and the like. The different test dimensions targeted by the test case scripts are different, and with the upgrade of the system version of the mobile terminal, the implementation functions of the application program are changed and different accordingly, so that in order to ensure that a test host can comprehensively test various function points of the application program in the automatic test process, developers in the embodiment can correlate the system version field with the tested function points targeted by the test case scripts in advance and establish a mapping relationship between the system version field and the tested function points targeted by the test case scripts.
In a specific implementation, the test host can read a system version field contained in the version information of the system to be tested, and search a target tested function point corresponding to the system version field in a mapping relation between the pre-established system version field and the tested function point; and then searching the test case script covering all the target tested function points from the test case library.
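A minimal Python sketch of this lookup is given below, assuming the pre-established mapping and the case library are simple in-memory dictionaries; the function point names and data structures are invented for illustration:

    # Pre-established mapping between system version fields and tested
    # function points (entries invented for illustration).
    VERSION_TO_FUNCTION_POINTS = {
        "android.7.0.0": {"login", "payment", "feed_scroll"},
    }

    def find_case_script(system_version_field: str, case_library: dict):
        """Return the first script in `case_library` covering all target
        tested function points for the given system version field.
        `case_library` maps script path -> set of covered function points."""
        targets = VERSION_TO_FUNCTION_POINTS[system_version_field]
        for path, covered in case_library.items():
            if targets <= covered:  # script covers every target point
                return path
        return None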
Step S30: acquiring a data acquisition command generated in the test case script execution process, and sending the data acquisition command to the application program to be tested;
it should be noted that, in order to implement automatic acquisition of performance data of an application App, data acquisition commands of various types of performance data are integrated in the test case script, for example, a memory acquisition command "adb shell dumpsys meminfo com. Page", a CPU acquisition command "adb shell top-n1| grep com. Page", a stuck-to-stopped ratio acquisition command "adb shell dumpsys surface flag-latency", a fluency acquisition command "adb location", and the like.
In a specific implementation, after the test case script is obtained, the test host executes it to acquire the data acquisition commands integrated therein and sends them to the application program to be tested, so that the application responds to the received commands while performance data is collected during the command responses.
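As a sketch of how a test host might dispatch these commands (the embodiment does not prescribe an implementation), the following Python fragment shells out through adb; the package name is a hypothetical placeholder:

    import subprocess

    PACKAGE = "com.example.app"  # hypothetical package under test

    # Data acquisition commands as integrated into the test case script.
    COMMANDS = {
        "memory":  f"adb shell dumpsys meminfo {PACKAGE}",
        "cpu":     f"adb shell top -n 1 | grep {PACKAGE}",
        "jank":    "adb shell dumpsys SurfaceFlinger --latency",
        "fluency": "adb logcat -d",
    }

    def acquire(metric: str) -> str:
        """Run one acquisition command and return its raw output for
        preprocessing before it is written to the preset database."""
        result = subprocess.run(COMMANDS[metric], shell=True,
                                capture_output=True, text=True, check=True)
        return result.stdout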
Step S40: and acquiring performance data, preprocessing the performance data, and writing the preprocessed performance data into a preset database for storage, wherein the performance data is fed back when the application program to be tested responds to the data acquisition command.
It should be understood that performance testing plays an important role in assuring the quality of a software application. Client-side performance testing examines the performance of the client application, with the client as the test entry point; it mainly comprises concurrency performance testing, fatigue strength testing, large-data-volume testing, speed testing and the like. Concurrency performance testing is particularly important and proceeds through load testing and stress testing: load testing (Load Testing) determines the performance of the system under various workloads, the goal being to measure the output metrics of system components (such as throughput, response time, CPU load and memory usage) as the load is gradually increased.
In a specific implementation, the test host may invoke the various data acquisition commands integrated in the script by executing the test case script, and then collect the performance data fed back by the application under test in response, such as memory usage, CPU usage, Frames Per Second (FPS), fluency, and response time.
In the embodiment, when the automatic test instruction is received, the application program to be tested is determined according to the field information contained in the automatic test instruction; reading a corresponding test case script from the test case library based on the field information, and executing the test case script; acquiring a data acquisition command generated in the test case script execution process, and sending the data acquisition command to an application program to be tested; the performance data is preprocessed and then written into the preset database for storage, and because the method does not need to integrate a third-party software development kit in the application program code, the method can acquire the performance data of the application program with low cost and high efficiency while ensuring the safety of the application program code.
Referring to fig. 3, fig. 3 is a schematic flow chart of a performance data acquisition method according to a second embodiment of the present invention.
Based on the first embodiment, in this embodiment the data obtaining command includes a jank ratio acquisition command, and accordingly, the step S40 may include:
Step S401: collecting the data matrix fed back by the application program to be tested when responding to the jank ratio acquisition command;
it should be understood that FPS (also known as the katton ratio) is a definition in the field of images, and refers to the number of frames transmitted per second of a picture, and colloquially to the number of pictures in an animation or video. FPS measures the amount of information used to store and display dynamic video, and the greater the number of frames per second, the smoother the displayed motion will be.
In a specific implementation, the test host may call the "adb shell dumpsys SurfaceFlinger --latency" command pre-integrated in the test case script, so that the application program to be tested feeds back the corresponding FPS information when it receives the command. In general, the SurfaceFlinger service on Android 4.0 and above records the information of the last 128 frames, so the test host can read the FPS information (generally embodied as a data matrix) from the system through the jank ratio acquisition command and then calculate the jank ratio.
Step S402: adding the matrix elements of each row in the data matrix to obtain a plurality of frame time values, and obtaining the target frame time values exceeding a preset threshold value among the frame time values;
it should be noted that the data matrix generally includes four dimensions of data: draw, premate, process, execute, while in fact a complete frame (frame rate value) = Draw + Premate + Process + Execute, when this time is less than 16ms, 60 frames per second can be guaranteed. For example, if the data in a row of the data matrix is Draw =4.08, prepare =3.31, process =4.24, and Execute =1.83, the frame rate value is equal to 13.46ms < 16ms, and thus 60 frames per second can be achieved.
In a specific implementation, the test host may add the matrix elements of each row in the data matrix to obtain the frame time corresponding to each row, and then compare all the obtained frame times with the preset threshold (16 ms) to obtain the target frame time values greater than 16 ms.
Step S403: counting the number of the target frame time values, obtaining the jank ratio of the application program to be tested according to the counted number of the target frame time values, and writing the jank ratio into a preset database for storage.
In this embodiment, after obtaining the target frame time values, the test host may count them, calculate the proportion of target frame times among the frame times of the entire data matrix (this proportion is the jank ratio of the current system), and write the obtained jank ratio into a preset database for storage.
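A minimal Python sketch of this computation, assuming the collected matrix has already been parsed into rows of [Draw, Prepare, Process, Execute] times in milliseconds (the parsing itself is not shown):

    THRESHOLD_MS = 16.0  # per-frame budget for 60 frames per second

    def jank_ratio(frame_matrix):
        """Sum each row to get the frame time, count frames whose total
        exceeds the 16 ms threshold, and return their proportion."""
        frame_times = [sum(row) for row in frame_matrix]
        if not frame_times:
            return 0.0
        janky = sum(1 for t in frame_times if t > THRESHOLD_MS)
        return janky / len(frame_times)

    # e.g. jank_ratio([[4.08, 3.31, 4.24, 1.83], [9.0, 5.5, 6.2, 2.1]])
    # -> 0.5, since the second frame takes 22.8 ms > 16 ms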
In this embodiment, the data matrix fed back by the application under test in response to the jank ratio acquisition command is collected; the matrix elements of each row are added to obtain the frame time values, and the target frame time values exceeding the preset threshold are obtained; the number of target frame time values is then counted, the jank ratio of the application under test is obtained from that count, and the jank ratio is written into a preset database for storage. Accurate acquisition of the application's jank ratio is thereby realized, ensuring the accuracy of the performance test.
Referring to fig. 4, fig. 4 is a schematic flow chart of a performance data acquisition method according to a third embodiment of the present invention.
Based on the foregoing embodiments, in this embodiment, the data obtaining command further includes a fluency obtaining command, and accordingly, the step S40 may include:
step S401': collecting fluency logs fed back by the application program to be tested when the fluency obtaining command is responded;
it should be understood that, today, the mobile phone App is more and more competitive, the performances, particularly the fluency, of the Android App is still different from that of the IOS, and the Android runs based on a java virtual machine, and the delay and the seizure of the touch response are much more serious than that of the IOS system. For example, operations such as sliding down and sliding up, zooming with two fingers and fast typing and the like are poor in smoothness of android, but for whether the App is smooth in use process, no reliable index is available at present to enable objective feeling of a user to correspond to data one by one. For such a situation, the fluency index of the application program to be tested is obtained by continuously obtaining the fluency log of the application program to be tested during running and analyzing the obtained fluency log according to the performance data acquisition method of the embodiment.
In a specific implementation, the test host may call the fluency obtaining command "adb logcat", pre-integrated in the test case script, to read the locally stored fluency log, and then analyze fluency from the log.
Step S402': reading a total frame number and a lost frame number in a preset time period from the fluency log, calculating the fluency corresponding to the application program to be tested according to a preset formula, and writing the fluency into a preset database for storage.
It should be appreciated that in the Android system, the Vertical Synchronization (VSync) mechanism, a technology long and widely used on PCs, can simply be regarded as a timed interrupt: the Android system emits a VSync signal every 16 ms, triggering the rendering of the UI. That is, the VSync mechanism works like an engine running at a fixed rate (60 revolutions per second), each revolution driving the system to perform some UI-related operations. Therefore, when facing complex work, the increased workload pushes the per-frame time beyond 16.6 ms and the effective rate drops. Based on this principle, this embodiment judges the fluency of the application program by measuring that effective rate.
In a specific implementation, the test host may read the total frame number and the lost frame number within a preset time period (e.g., 1 second) from the obtained fluency log, calculate the fluency of the application under test according to the preset formula fluency = frame rate × (total frame number - lost frame number) / total frame number, and write the fluency into a preset database for storage.
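A minimal sketch of this calculation under the stated formula, with the nominal frame rate assumed to be the 60 frames per second implied by the VSync mechanism:

    FRAME_RATE = 60  # nominal VSync-driven frame rate assumed here

    def fluency(total_frames: int, lost_frames: int) -> float:
        """fluency = frame rate × (total frames - lost frames) / total frames"""
        if total_frames == 0:
            return 0.0
        return FRAME_RATE * (total_frames - lost_frames) / total_frames

    # e.g. fluency(60, 6) -> 54.0 effective frames per second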
Further, the performance data collection method provided in this embodiment may also perform the performance data collection operation during the monkey stability test.
It should be understood that Monkey is a command-line tool, run through the Android Debug Bridge (adb), that simulates a random stream of user events such as screen touches, trackball slides and key presses to stress-test an application on a device and detect how long it runs before an exception occurs. Monkey stability is mainly reflected in the crash rate, which is usually measured during the stress test performed before the performance test.
Specifically, the test host can execute the test case script so that the script sends a preset monkey command to the App under test; on receiving the monkey command, the App under test collects the corresponding stress test information generated while the command executes and stores it to a log file; the test host then reads the program start count and the crash count contained in the log file and calculates the corresponding crash rate with the formula crash rate = crash count / start count.
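A minimal sketch of this crash rate calculation; the log markers used to count starts and crashes are assumptions (the start marker depends on the App's own logging, while monkey itself prints "// CRASH:" lines when a crash occurs):

    import re

    def crash_rate(log_path: str) -> float:
        """Crash rate = crash count / start count, both read from the
        monkey stress test log file."""
        with open(log_path, encoding="utf-8") as f:
            text = f.read()
        starts = len(re.findall(r"app_start", text))   # hypothetical start marker
        crashes = len(re.findall(r"// CRASH", text))   # monkey crash lines
        return crashes / starts if starts else 0.0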
In this embodiment, the fluency log fed back by the application under test in response to the fluency obtaining command is collected; the total frame number and the lost frame number within a preset time period are read from the log, the fluency of the application under test is calculated according to the preset formula fluency = frame rate × (total frame number - lost frame number) / total frame number, and the fluency is written into a preset database for storage, accurately obtaining the application's fluency.
Based on the above embodiments, a fourth embodiment of the performance data acquisition method of the present invention is provided.
In this embodiment, the data obtaining command further includes a response time obtaining command, and the step S40 may specifically include the following steps:
the method comprises the following steps: collecting a time log generated when the application program to be tested responds to the response time acquisition command;
it should be understood that whether an App can be popular or not is generally dependent on the intuitive experience of the user when using the App. The time consumed by the process from clicking a button, link or issuing an instruction to the user until the system presents the result in the form of a user perception is the user's intuitive impression of the performance of the software, the so-called response time, and the user experience is relatively high when the corresponding time is small.
It can be understood that the Android system generally has a View control complete the handling of a user's click event. In this embodiment, pressing an application page element on the terminal interface is treated as the system receiving the touch/click event, and the menu page corresponding to that element being fully popped up is treated as the application completing the menu drawing. Therefore, the test host may hook the preset View click function through the Xposed framework, print a time log before the click function executes, and take the time in that log as the event click time. In addition, because Android control drawing performs measure, layout, draw and similar passes on each View in turn, a time log can likewise be printed, via an Xposed hook, after the View drawing function executes; the time in that log is taken as the drawing completion time. Finally, the difference between the two times gives the response time.
In a specific implementation, when the test host receives the response time acquisition command, the test host may first collect a time log generated when the application to be tested responds to the response time acquisition command.
Step S402'': extracting the function execution time of a preset function from the time log, obtaining the response time according to the function execution time, and writing the response time into a preset database for storage.
It should be understood that the preset functions in this step include the View click function and the View drawing function described above.
In a specific implementation, the test host can extract from the time log the event click time corresponding to the execution of the View click function and the drawing completion time corresponding to the execution of the View drawing function, then calculate the time difference between the event click time and the drawing completion time and take that difference as the response time, thereby acquiring the response time accurately.
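A minimal sketch of this extraction, assuming logcat-style timestamps and hypothetical "ClickStart"/"DrawFinished" markers printed by the Xposed hooks (real hook logs will differ):

    import re
    from datetime import datetime

    CLICK_RE = re.compile(r"^(\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}).*ClickStart")
    DRAW_RE = re.compile(r"^(\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}).*DrawFinished")

    def response_time_ms(log_lines):
        """Return drawing completion time minus event click time, in ms."""
        fmt = "%m-%d %H:%M:%S.%f"  # logcat-style timestamp, year omitted
        click_time = draw_time = None
        for line in log_lines:
            m = CLICK_RE.match(line)
            if m:
                click_time = datetime.strptime(m.group(1), fmt)
            m = DRAW_RE.match(line)
            if m:
                draw_time = datetime.strptime(m.group(1), fmt)
        if click_time is None or draw_time is None:
            return None
        return (draw_time - click_time).total_seconds() * 1000.0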
In this embodiment, the time log generated when the application under test responds to the response time acquisition command is collected; the function execution times of the preset functions are extracted from the time log, the response time is obtained from them, and the response time is written into a preset database for storage, so that the application's response time is obtained accurately and with high reliability.
In addition, an embodiment of the present invention further provides a storage medium, where a performance data acquisition program is stored on the storage medium, and the performance data acquisition program, when executed by a processor, implements the steps of the performance data acquisition method described above.
Referring to fig. 5, fig. 5 is a block diagram of a first embodiment of the performance data acquisition apparatus according to the present invention.
As shown in fig. 5, the performance data acquisition apparatus provided in the embodiment of the present invention includes:
the program determining module 501 is configured to determine, when an automatic test instruction is received, an application program to be tested according to field information included in the automatic test instruction;
a script determining module 502, configured to read a corresponding test case script from a test case library based on the field information, and execute the test case script;
the script execution module 503 is configured to acquire a data acquisition command generated in the test case script execution process, and send the data acquisition command to the application program to be tested;
and a data acquisition module 504, configured to acquire performance data, preprocess the performance data, and write the preprocessed performance data into a preset database for storage, where the performance data is data fed back by the application program to be tested when responding to the data acquisition command.
In the embodiment, when the automatic test instruction is received, the application program to be tested is determined according to the field information contained in the automatic test instruction; reading a corresponding test case script from the test case library based on the field information, and executing the test case script; acquiring a data acquisition command generated in the test case script execution process, and sending the data acquisition command to an application program to be tested; the method does not need to integrate a third-party software development kit in the application program code, so that the method can collect the performance data of the application program at low cost and high efficiency while ensuring the safety of the application program code.
Based on the first embodiment of the performance data acquisition device of the present invention, a second embodiment of the performance data acquisition device of the present invention is provided.
In this embodiment, the script determining module 502 is further configured to read the application version number included in the automatic test instruction, and extract the application identifier and version information of the system to be tested, which are carried in the application version number; and searching for a corresponding application program according to the application identifier, and screening out the application program to be tested from the searched application program according to the version information of the system to be tested.
Further, the script determining module 502 is further configured to read a system version field included in the version information of the system to be tested, and search for a target function point to be tested corresponding to the system version field in a mapping relationship between the system version field and the function point to be tested, where the mapping relationship is established in advance; and searching a test case script covering all the target tested function points from the test case library.
Further, the data acquisition module 504 is further configured to collect the data matrix fed back by the application program to be tested when responding to the jank ratio acquisition command; add the matrix elements of each row in the data matrix to obtain a plurality of frame time values, and obtain the target frame time values exceeding a preset threshold value; and count the number of the target frame time values, obtain the jank ratio of the application program to be tested according to the counted number, and write the jank ratio into a preset database for storage.
Further, the data acquisition module 504 is further configured to collect the fluency log fed back by the application program to be tested when responding to the fluency obtaining command; read the total frame number and the lost frame number within a preset time period from the fluency log, calculate the fluency of the application program to be tested according to a preset formula, and write the fluency into a preset database for storage; wherein the preset formula is fluency = frame rate × (total frame number - lost frame number) / total frame number.
Further, the data acquisition module 504 is further configured to acquire a time log generated when the application program to be tested responds to the response time acquisition command; and extracting the function execution time of a preset function from the time log, acquiring the response time according to the function execution time, and writing the response time into a preset database for storage.
Further, the data collection module 504 is further configured to extract, from the time log, the event click time corresponding to the execution of the View click function and the drawing completion time corresponding to the execution of the View drawing function, calculate the time difference between the event click time and the drawing completion time, and take the time difference as the response time.
Other embodiments or specific implementation manners of the performance data acquisition device of the present invention may refer to the above method embodiments, and are not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the methods of the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly also by hardware, but in many cases the former is the better implementation. Based on such understanding, the technical solution of the present invention, or the portion contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk or an optical disc), including several instructions for enabling a terminal device (such as a mobile phone, computer, server, air conditioner or network device) to execute the methods of the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (9)

1. A method of performance data acquisition, the method comprising:
when an automatic test instruction is received, determining an application program to be tested according to field information contained in the automatic test instruction;
reading a corresponding test case script from a test case library based on the field information, and executing the test case script, wherein the field information comprises an application version number corresponding to the application program to be tested, and the application version number is composed of an application identifier and system version information of a current system of a mobile terminal where the application program to be tested is located;
acquiring a data acquisition command generated in the test case script execution process, and sending the data acquisition command to the application program to be tested;
acquiring performance data, preprocessing the performance data, and writing the preprocessed performance data into a preset database for storage, wherein the performance data is data fed back when the application program to be tested responds to the data acquisition command;
the data acquisition command comprises a jank ratio acquisition command;
the steps of collecting performance data, preprocessing the performance data, and writing the preprocessed performance data into a preset database for storage comprise:
collecting a data matrix fed back by the application program to be tested when responding to the jank ratio acquisition command;
adding the matrix elements of each row in the data matrix to obtain a plurality of frame time values, and obtaining the target frame time values exceeding a preset threshold value among the frame time values;
counting the number of the target frame time values, obtaining the jank ratio of the application program to be tested according to the counted number of the target frame time values, and writing the jank ratio into a preset database for storage;
the data acquisition command further comprises a preset monkey command;
the step of collecting the performance data, preprocessing the performance data and writing the preprocessed performance data into a preset database for storage further comprises the following steps:
collecting stress test information corresponding to the application program to be tested while the preset monkey command is executed;
storing the stress test information to a log file;
and reading the program start count and the crash count contained in the log file, calculating the corresponding crash rate according to the start count and the crash count, and writing the crash rate into a preset database for storage.
2. The method of claim 1, wherein the field information includes an application version number;
the step of determining the application program to be tested according to the field information contained in the automatic test instruction comprises the following steps:
reading the application version number contained in the automatic test instruction, and extracting an application identifier carried in the application version number and version information of a system to be tested;
and searching for a corresponding application program according to the application identifier, and screening out the application program to be tested from the searched application program according to the version information of the system to be tested.
3. The method of claim 2, wherein the step of reading the corresponding test case script from the test case library based on the field information comprises:
reading a system version field contained in the version information of the system to be tested, and searching a target tested function point corresponding to the system version field in a mapping relation between the pre-established system version field and the tested function point;
and searching the test case script covering all the target tested function points from the test case library.
4. The method of claim 1, wherein the data fetch commands further comprise a fluency fetch command;
the steps of collecting performance data, preprocessing the performance data, and writing the preprocessed performance data into a preset database for storage comprise:
collecting the fluency log fed back by the application program to be tested when responding to the fluency obtaining command;
reading a total frame number and a lost frame number in a preset time period from the fluency log, calculating fluency corresponding to the application program to be tested according to a preset formula, and writing the fluency into a preset database for storage;
wherein the preset formula is: fluency = frame rate × (total frame number - lost frame number) / total frame number.
5. The method of claim 1, wherein the data fetch command further comprises a response time fetch command;
the steps of collecting performance data, preprocessing the performance data, and writing the preprocessed performance data into a preset database for storage comprise:
collecting a time log generated when the application program to be tested responds to the response time acquisition command;
and extracting the function execution time of a preset function from the time log, acquiring the response time according to the function execution time, and writing the response time into a preset database for storage.
6. The method of claim 5, wherein the preset functions comprise a View click function and a View drawing function;
the step of extracting the function execution time of a preset function from the time log and obtaining the response time according to the function execution time comprises:
extracting, from the time log, the event click time corresponding to the execution of the View click function and the drawing completion time corresponding to the execution of the View drawing function;
and calculating the time difference between the event click time and the drawing completion time, and taking the time difference as the response time.
7. A performance data collection device, the device comprising:
the program determining module is used for determining an application program to be tested according to field information contained in an automatic test instruction when the automatic test instruction is received;
the script determining module is used for reading a corresponding test case script from a test case library based on the field information and executing the test case script, wherein the field information comprises an application version number corresponding to the application program to be tested, and the application version number consists of an application identifier and system version information of a current system of a mobile terminal where the application program to be tested is located;
the script execution module is used for acquiring a data acquisition command generated in the test case script execution process and sending the data acquisition command to the application program to be tested;
the data acquisition module is used for acquiring performance data, preprocessing the performance data and writing the preprocessed performance data into a preset database for storage, wherein the performance data is feedback data when the application program to be tested responds to the data acquisition command;
wherein the data acquisition command comprises a jank ratio acquisition command;
the steps of collecting performance data, preprocessing the performance data, and writing the preprocessed performance data into a preset database for storage comprise:
collecting a data matrix fed back by the application program to be tested when responding to the jank ratio acquisition command;
adding the matrix elements of each row in the data matrix to obtain a plurality of frame time values, and obtaining the target frame time values exceeding a preset threshold value among the frame time values;
counting the number of the target frame time values, obtaining the jank ratio of the application program to be tested according to the counted number of the target frame time values, and writing the jank ratio into a preset database for storage;
the data acquisition command further comprises a preset monkey command;
the step of collecting the performance data, preprocessing the performance data and writing the preprocessed performance data into a preset database for storage further comprises the following steps:
collecting stress test information corresponding to the application program to be tested while the preset monkey command is executed;
storing the stress test information to a log file;
and reading the program start count and the crash count contained in the log file, calculating the corresponding crash rate according to the start count and the crash count, and writing the crash rate into a preset database for storage.
8. A performance data collection device, the device comprising: memory, a processor and a performance data acquisition program stored on the memory and executable on the processor, the performance data acquisition program being configured to implement the steps of the performance data acquisition method as claimed in any one of claims 1 to 6.
9. A storage medium having stored thereon a performance data acquisition program which when executed by a processor implements the steps of the performance data acquisition method of any one of claims 1 to 6.
CN201910541820.4A 2019-06-19 2019-06-19 Performance data acquisition method, device, equipment and storage medium Active CN110362483B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910541820.4A CN110362483B (en) 2019-06-19 2019-06-19 Performance data acquisition method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910541820.4A CN110362483B (en) 2019-06-19 2019-06-19 Performance data acquisition method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110362483A (en) 2019-10-22
CN110362483B (en) 2022-11-15

Family

ID=68216511

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910541820.4A Active CN110362483B (en) 2019-06-19 2019-06-19 Performance data acquisition method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110362483B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110830796B (en) * 2019-11-01 2021-09-03 深圳创维-Rgb电子有限公司 Television application testing method, television application testing device and readable storage medium
CN111061647A (en) * 2019-12-26 2020-04-24 行吟信息科技(上海)有限公司 Software performance automatic testing method and device and electronic equipment
CN111208989A (en) * 2019-12-27 2020-05-29 江苏南高智能装备创新中心有限公司 Data acquisition terminal capable of executing script
CN111198800B (en) * 2020-01-03 2023-08-04 北京小米移动软件有限公司 CPU occupancy rate detection method, CPU occupancy rate detection device and electronic equipment
CN111654691B (en) * 2020-05-21 2022-04-15 Oppo(重庆)智能科技有限公司 Performance test method, device, computer storage medium and system
CN112306870A (en) * 2020-10-28 2021-02-02 广州博冠信息科技有限公司 Data processing method and device based on live APP
CN112560035B (en) * 2020-12-15 2024-04-02 深圳市和讯华谷信息技术有限公司 Application detection method, device, equipment and storage medium
CN112527584A (en) * 2020-12-18 2021-03-19 上海万向区块链股份公司 Software efficiency improving method and system based on script compiling and data acquisition
CN114356771A (en) * 2021-12-31 2022-04-15 龙芯中科(武汉)技术有限公司 Operation method, device and equipment of data processing entity
CN116303101B (en) * 2023-05-19 2023-08-15 建信金融科技有限责任公司 Test case generation method, device and equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102025555B (en) * 2009-09-22 2014-07-16 中兴通讯股份有限公司 Method and device for testing IP multimedia sub-system performance
CN106528389B (en) * 2016-10-27 2021-03-09 北京小米移动软件有限公司 Performance evaluation method and device for system fluency and terminal
CN107273300A (en) * 2017-07-31 2017-10-20 北京云测信息技术有限公司 A kind of applied program testing method and device
CN108595312A (en) * 2017-12-29 2018-09-28 瑞庭网络技术(上海)有限公司 A kind of automatic performance method and device of modelling customer behavior
CN108733568A (en) * 2018-05-25 2018-11-02 平安科技(深圳)有限公司 Application testing method, device, equipment and readable storage medium

Also Published As

Publication number Publication date
CN110362483A (en) 2019-10-22

Similar Documents

Publication Publication Date Title
CN110362483B (en) Performance data acquisition method, device, equipment and storage medium
EP3149590B1 (en) Performance optimization tip presentation during debugging
US9448908B2 (en) System and method for model based session management
US7721268B2 (en) Method and system for a call stack capture
US9799037B2 (en) Service management using user experience metrics
US8850403B2 (en) Profiling data snapshots for software profilers
CN110457211B (en) Script performance test method, device and equipment and computer storage medium
US10657036B2 (en) Determining visual testing coverages
CN108874268B (en) User behavior data acquisition method and device
CN103345347B (en) A kind of method and apparatus that content of pages is commented on
US10169853B2 (en) Score weights for user interface (UI) elements
US20140074452A1 (en) System and method for automatic modeling of an application
WO2015077261A1 (en) Validating software characteristics
JPH0689200A (en) Debug system and method
CN111176960A (en) User operation behavior tracking method, device, equipment and storage medium
US10459835B1 (en) System and method for controlling quality of performance of digital applications
CN108595343A (en) The test method and device of application program
US20060188174A1 (en) Quantitative measure of a video interface
EP2472393A1 (en) Enablement of culture-based gestures
CN105808257B (en) Application popup identification method and device
CN112052073A (en) Script performance analysis method and device, readable storage medium and electronic equipment
US20180146157A1 (en) Correlating ui with cpu stacks for profiling sessions
CN112181853A (en) Program debugging method, device and system
CN110825649A (en) Application testing method, device, equipment and storage medium
CN110688602A (en) Method, device and system for testing webpage loading speed

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant