CN113886162A - Computing equipment performance test method, computing equipment and storage medium - Google Patents

Computing equipment performance test method, computing equipment and storage medium

Info

Publication number
CN113886162A
CN113886162A
Authority
CN
China
Prior art keywords
test
task
score
computing device
performance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111226702.8A
Other languages
Chinese (zh)
Other versions
CN113886162B (en)
Inventor
曹现胜
孙建民
张伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Uniontech Software Technology Co Ltd
Original Assignee
Uniontech Software Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Uniontech Software Technology Co Ltd filed Critical Uniontech Software Technology Co Ltd
Priority to CN202111226702.8A priority Critical patent/CN113886162B/en
Priority claimed from CN202111226702.8A external-priority patent/CN113886162B/en
Publication of CN113886162A publication Critical patent/CN113886162A/en
Application granted Critical
Publication of CN113886162B publication Critical patent/CN113886162B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/22Detection or location of defective computer hardware by testing during standby operation or during idle time, e.g. start-up testing
    • G06F11/2273Test methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention discloses a computing device performance testing method, a computing device, and a storage medium. The method is suitable for execution in a computing device and comprises the following steps: determining a plurality of test items for performing a performance test on the computing device; configuring test tasks according to the determined test items to obtain a plurality of test tasks; executing each test task to obtain a test item score corresponding to that test task; setting a corresponding score weight for each test item score; and calculating a performance score of the computing device from the test item scores and the corresponding score weights. The invention can run a variety of tests on a computing device and verify the compatibility of the operating system on that device, thereby automatically verifying the compatibility, scalability, and usability of the operating system.

Description

Computing equipment performance test method, computing equipment and storage medium
Technical Field
The invention relates to the field of cloud computing, in particular to a server performance testing method, computing equipment and a storage medium.
Background
With the development of computer technology, computers and their systems need to be tested in order to further improve the reliability of computer equipment and systems. Existing testing methods are inefficient: a tester must test each machine individually every day and then retrieve the test data for analysis after the test is finished. This approach is time-consuming, and because the testing time and environment differ from machine to machine, the results cannot be summarized and compared statistically, no uniform testing standard can be formed, and it is difficult to evaluate computer equipment and systems comprehensively and objectively.
For this reason, a new computing device performance testing method is needed.
Disclosure of Invention
To this end, the present invention provides a computing device performance testing method that seeks to solve, or at least alleviate, the above-identified problems.
According to an aspect of the present invention, there is provided a computing device performance testing method, adapted to be executed in a computing device, the method comprising the steps of: determining a plurality of test items for performing a performance test on a computing device; configuring a test task according to the determined test items to obtain a plurality of test tasks; executing each test task to obtain a test item score corresponding to the test task; setting corresponding score weight for each test item score; and calculating to obtain the performance score of the computing equipment according to the test item score and the corresponding score weight.
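By way of illustration only, the overall flow of these steps can be sketched as follows. The function names, the placeholder bodies, and the use of Python are assumptions made for readability; the patent does not define a concrete implementation.

```python
# High-level sketch of the claimed steps; all names and bodies are illustrative
# assumptions, not APIs defined by the patent.
def test_device(test_items, item_weights):
    tasks = [configure_test_task(item) for item in test_items]    # configure test tasks
    item_scores = [execute_test_task(task) for task in tasks]     # run tasks -> test item scores
    weights = [item_weights[item] for item in test_items]         # score weight per test item
    return sum(s * w for s, w in zip(item_scores, weights))       # device performance score

def configure_test_task(item):
    return {"item": item, "subtasks": []}    # subtask construction elided in this sketch

def execute_test_task(task):
    return 100.0                             # a real run would execute and score each subtask
```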
Optionally, in the method according to the present invention, configuring the test task according to the determined test item includes the steps of: setting the execution times and the execution parameters of the subtasks; constructing subtasks according to the execution times and the execution parameters of each subtask to obtain a plurality of subtasks; and aggregating the plurality of subtasks as the test tasks.
Optionally, in the method according to the present invention, executing each test task to obtain a test item score corresponding to the test task includes: executing each subtask in the test task to obtain a plurality of subtask scores; and setting corresponding scoring weight for each subtask score, and calculating to obtain the score of the test item according to the subtask score and the corresponding scoring weight.
Optionally, in the method according to the present invention, the test task includes a processor test, and when the test task is configured according to the determined test item, the constructed plurality of subtasks includes: a running times test, a million floating point operation times test, a data throughput times test, a decoding times test, and a processor response time test.
Optionally, in the method according to the present invention, the test task further includes a kernel test, and when the test task is configured according to the determined test item, the constructed plurality of subtasks includes: a compiling times test, a communication times test, a square root times test, a calling times test, a running times test, a transmission byte number test, and a kernel response time test.
Optionally, in the method according to the present invention, the test task further includes an external memory test, and when the test task is configured according to the determined test item, the constructed plurality of subtasks includes: a transmission byte number test, a read/write throughput test, a running times test, a read byte number test, a write byte number test, and an external memory response time test.
Optionally, in the method according to the present invention, the test task further includes a network test, and when the test task is configured according to the determined test item, the constructed multiple sub-tasks include: a transmission byte number test and a network response time test.
Optionally, in the method according to the present invention, the test task further includes a display test, and when the test task is configured according to the determined test item, the constructed multiple subtasks include: drawing times test and transmission frame number test.
Optionally, in the method according to the present invention, the test task further includes a running test, and when the test task is configured according to the determined test item, the constructed plurality of subtasks includes: a runs-per-second test, a runs-per-minute test, and a runs-per-hour test.
Optionally, in the method according to the present invention, when each subtask in the test task is executed, the method further includes the steps of: executing the subtasks according to the execution times of the subtasks to obtain a plurality of performance parameters; and determining the peak value score and the valley value score of the subtask according to the performance parameters.
Optionally, in a method according to the present invention, a plurality of historical peak scores and a plurality of historical valley scores of the subtask are further stored in the computing device, and the method further comprises the step of: drawing a peak-valley trend graph according to the peak score, the valley score, the historical peak scores, and the historical valley scores of the subtask.
According to another aspect of the present invention, there is provided a computing device comprising: one or more processors; a memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for performing a computer performance testing method according to the present invention.
According to yet another aspect of the present invention, there is provided a computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a computing device, cause the computing device to perform a computer performance testing method according to the present invention.
The invention relates to a computing device performance testing method suitable for execution in a computing device, comprising the following steps: determining, according to test requirements, a plurality of test items for performing a performance test on the computing device, and configuring test tasks according to the determined test items to obtain a plurality of test tasks. Executing the plurality of test tasks in the computing device enables automated testing of multiple aspects of the computing device's performance. Each test task is then executed to obtain a test item score corresponding to that test task, and a corresponding score weight is set for each test item score, so that a performance score of the computing device is calculated from the test item scores and the corresponding score weights. In this way the parts of the computing device that need to be tested can be tested quickly, and the performance of the computing device can be understood intuitively through the performance score. By running the various tests on the computing device, the compatibility of the operating system on the computing device is verified, thereby automatically verifying the compatibility, scalability, and usability of the operating system.
When the computing device is tested, both its hardware and its software are tested, for example by testing the kernel, so as to evaluate the performance of the operating system and of the applications running on it and to understand how those applications behave. The performance of the computing device is therefore evaluated comprehensively: a high performance score indicates both that the hardware of the computing device performs well and that the operating system running on it has good compatibility.
Drawings
To the accomplishment of the foregoing and related ends, certain illustrative aspects are described herein in connection with the following description and the annexed drawings, which are indicative of various ways in which the principles disclosed herein may be practiced, and all aspects and equivalents thereof are intended to be within the scope of the claimed subject matter. The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description when read in conjunction with the accompanying drawings. Throughout this disclosure, like reference numerals generally refer to like parts or elements.
FIG. 1 shows a schematic diagram of a computing device testing system according to an exemplary embodiment of the present invention;
FIG. 2 illustrates a block diagram of a computing device 200, according to an exemplary embodiment of the invention; and
FIG. 3 shows a flowchart of a computing device performance testing method 300 according to an exemplary embodiment of the invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. Like reference numerals generally refer to like parts or elements.
FIG. 1 shows a schematic diagram of a computing device testing system according to an exemplary embodiment of the present invention. As shown in FIG. 1, the computing device testing system includes a control server 140, a task server 150 communicatively coupled to the control server 140, a data server 160, and computing devices 110-130. The connection manner between the computing devices 110-130 and the control server 140 in FIG. 1 is only exemplary; the present invention does not limit the number of computing devices connected to the control server 140, and the control server 140 may be connected to hundreds or thousands of computing devices, so that a large batch of computing devices can be tested simultaneously and testing efficiency improved.
According to an embodiment of the present invention, when a plurality of computing devices need to undergo batch performance testing, the task server 150 may determine a plurality of test items for performing the performance test on the computing devices, configure a test task according to the determined test items, and send the test task to the control server 140, so that the control server 140 generates a test installation package according to the test task. The task server 150 also generates a test command and sends it to the control server 140. The test command instructs the computing devices 110-130 that have installed the test installation package to execute the test items in the test task and to generate test data.
The control server 140 receives the test task and the test command sent by the task server 150 and generates a test installation package according to the test task. The control server 140 then issues the test installation package to the connected computing devices 110-130, forwards the test command to the computing devices 110-130, and receives the test data returned by each computing device. The test data includes the performance scores of the computing devices 110-130. The control server 140 aggregates these performance scores and forwards them to the task server 150, which can learn the performance of all connected computing devices from the performance scores.
According to an embodiment of the present invention, the control server 140 may be provided with a plurality of backup servers (not shown) to which it is communicatively connected. When the control server 140 fails, it promptly transfers its communication connections with the computing devices 110-130 to a backup server, which then serves as the control server: it receives the test task and the test command from the task server 150, creates a test installation package, sends the test installation package to the computing devices 110-130, and forwards the test command. The backup server also receives the test data of each computing device to obtain a plurality of items of test data and compiles a test data table from them, so that the task server 150 can evaluate the performance of each computing device according to the test data table.
Each of the computing devices 110-130 receives and installs the test installation package, receives the test command, executes the test tasks in the test installation package according to the test command to obtain test data, and transmits the test data back to the control server 140.
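The interaction between the task server, the control server, and the computing devices described above can be illustrated with the following simplified, single-process sketch. The class and method names are assumptions; in practice the servers and devices communicate over a network and the installation package contains real test programs.

```python
# Simplified sketch of the dispatch flow: task server -> control server -> devices.
# All names are illustrative assumptions, not identifiers from the patent.
class TaskServer:
    def build_test_task(self, test_items):
        return {"items": list(test_items)}

class ComputingDevice:
    def __init__(self, name):
        self.name = name
        self.package = None

    def install(self, package):
        self.package = package               # install the test installation package

    def execute(self, command):
        # A real device would run the installed test items here and score them.
        return {"device": self.name, "performance_score": 0.0}

class ControlServer:
    def __init__(self, devices):
        self.devices = devices

    def dispatch(self, test_task, test_command):
        package = {"task": test_task}        # stands in for the test installation package
        results = []
        for device in self.devices:
            device.install(package)
            results.append(device.execute(test_command))
        return results                       # aggregated test data for the task server

control = ControlServer([ComputingDevice("device-110"), ComputingDevice("device-130")])
task = TaskServer().build_test_task(["processor", "kernel"])
print(control.dispatch(task, {"action": "run all test items"}))
```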
The control server 140, the task server 150 communicatively connected to the control server 140, the data server 160, and the computing devices 110-130 in FIG. 1 may each be implemented as a computing device such as the one shown in FIG. 2. FIG. 2 illustrates a block diagram of a computing device 200, according to an exemplary embodiment of the invention. As shown in FIG. 2, in a basic configuration 202, a computing device 200 typically includes a system memory 206 and one or more processors 204. A memory bus 208 may be used for communication between the processor 204 and the system memory 206.
Depending on the desired configuration, the processor 204 may be any type of processor, including but not limited to: a microprocessor (μP), a microcontroller (μC), a digital signal processor (DSP), or any combination thereof. The processor 204 may include one or more levels of cache, such as a level one cache 210 and a level two cache 212, a processor core 214, and registers 216. An example processor core 214 may include an arithmetic logic unit (ALU), a floating point unit (FPU), a digital signal processing core (DSP core), or any combination thereof. An example memory controller 218 may be used with the processor 204, or in some implementations the memory controller 218 may be an internal part of the processor 204.
Depending on the desired configuration, system memory 206 may be any type of memory, including but not limited to: volatile memory (such as RAM), non-volatile memory (such as ROM, flash memory, etc.), or any combination thereof. System memory 206 may include an operating system 220, one or more programs 222, and program data 228. In some embodiments, the program 222 may be arranged to execute the instructions 223 of the method 300 according to the invention on an operating system by one or more processors 204 using the program data 228.
Computing device 200 may also include a storage interface bus 234. The storage interface bus 234 enables communication from the storage devices 232 (e.g., removable storage 236 and non-removable storage 238) to the basic configuration 202 via the bus/interface controller 230. Operating system 220, programs 222, and at least a portion of data 224 can be stored on removable storage 236 and/or non-removable storage 238, and loaded into system memory 206 via storage interface bus 234 and executed by one or more processors 204 when computing device 200 is powered on or programs 222 are to be executed.
Computing device 200 may also include an interface bus 240 that facilitates communication from various interface devices (e.g., output devices 242, peripheral interfaces 244, and communication devices 246) to the basic configuration 202 via the bus/interface controller 230. The example output device 242 includes a graphics processing unit 248 and an audio processing unit 250. They may be configured to facilitate communication with various external devices, such as a display or speakers, via one or more a/V ports 252. Example peripheral interfaces 244 can include a serial interface controller 254 and a parallel interface controller 256, which can be configured to facilitate communications with external devices such as input devices (e.g., keyboard, mouse, pen, voice input device, touch input device) or other peripherals (e.g., printer, scanner, etc.) via one or more I/O ports 258. An example communication device 246 may include a network controller 260, which may be arranged to communicate with one or more other computing devices 262 over a network communication link via one or more communication ports 264.
A network communication link may be one example of a communication medium. Communication media may typically be embodied by computer readable instructions, data structures, program modules, and may include any information delivery media, such as carrier waves or other transport mechanisms, in a modulated data signal. A "modulated data signal" may be a signal that has one or more of its data set or its changes made in such a manner as to encode information in the signal. By way of non-limiting example, communication media may include wired media such as a wired network or private-wired network, and various wireless media such as acoustic, Radio Frequency (RF), microwave, Infrared (IR), or other wireless media. The term computer readable media as used herein may include both storage media and communication media.
In the computing device 200 according to the present invention, the program 222 comprises a plurality of program instructions that instruct the processor 204 to perform the steps of the computing device performance testing method 300 executed in the computing device 200, so that components of the computing device 200 implement the testing of the computing device's performance by executing the computing device performance testing method 300 of the present invention.
Computing device 200 may be implemented as a server, e.g., a file server, a database server, or an application server, or as a device such as a personal digital assistant (PDA), a wireless web-browsing device, an application-specific device, or a hybrid device that includes any of the above functions. It may also be implemented as a personal computer including both desktop and notebook computer configurations. In some embodiments, the computing device 200 is configured to perform the computing device performance testing method 300.
FIG. 3 shows a flowchart of a computing device performance testing method 300 according to an exemplary embodiment of the invention. The computing device performance testing method 300 of the present invention is suitable for execution in a computing device and is further suitable for execution in the computing device 200 shown in FIG. 2 for testing the performance of the computing device.
As shown in FIG. 3, the computing device performance testing method 300 begins with step S310, in which a plurality of test items for performing a performance test on the computing device are determined. The test items for which the computing device needs to be tested include the processor, internal memory, kernel, external memory, network, display, running, and application. When the computing device is tested, both its hardware and its software are tested, so that its performance is evaluated comprehensively and the compatibility of the running operating system is examined. Each test task evaluates the performance of one aspect of the computing device.
When determining the test items for which the computing device is to be tested, the test items may be set according to the intended purpose of the evaluation. If only a particular piece of hardware or a single aspect of the computing device's performance needs to be evaluated, the computing device may be configured to evaluate a single test item, and when the test task is configured for that test item, a variety of subtasks are constructed so that the test item is evaluated comprehensively. If the hardware performance of the computing device is to be tested with emphasis, the test items may be set to include the processor, internal memory, external memory, and display. If the software performance of the computing device is to be tested with emphasis, the test items may be set to include the kernel, network, running, and application. If both the software and the hardware of the computing device need to be tested comprehensively, the computing device may be set to test all of the test items. After the computing device has been tested comprehensively, the resulting performance score depends not only on the hardware but also on the operating system, development libraries, and compiler.
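The selection of test items by evaluation purpose described above might be expressed as a simple configuration table, for example as follows; the dictionary layout and key names are assumptions for illustration and do not appear in the patent.

```python
# Illustrative mapping from evaluation purpose to the test items it enables,
# following the groupings described in the text above.
TEST_ITEMS_BY_PURPOSE = {
    "hardware_focus": ["processor", "internal_memory", "external_memory", "display"],
    "software_focus": ["kernel", "network", "running", "application"],
    "full": ["processor", "internal_memory", "kernel", "external_memory",
             "network", "display", "running", "application"],
}

def select_test_items(purpose: str) -> list[str]:
    """Return the test items to configure for the given evaluation purpose."""
    return TEST_ITEMS_BY_PURPOSE[purpose]

print(select_test_items("hardware_focus"))
```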
Subsequently, step S320 is executed: test tasks are configured according to the determined test items to obtain a plurality of test tasks. When a test task is configured, a plurality of subtasks is constructed under that test task. The test items can be further subdivided, for example into system kernel scoring, encryption and decryption, compression and decompression, audio and video, image processing, 2D and 3D image rendering, scientific computing, and development and compilation, and the corresponding test task can likewise be divided into a plurality of subtasks. Specifically, the execution count and execution parameters of each subtask are set, the subtasks are constructed according to each subtask's execution count and execution parameters to obtain a plurality of subtasks, and finally the subtasks are aggregated to form the test task. Each subtask constructed under the test task tests one performance parameter and yields a corresponding subtask score, and all of the subtask scores are combined to obtain the test item score of that test task.
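One possible way to represent this configuration, namely a test task aggregated from subtasks that each carry an execution count, execution parameters, and a scoring weight, is sketched below; the data classes and field names are assumptions, not structures defined by the patent.

```python
# Sketch of a test-task configuration; field names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class SubTask:
    name: str
    executions: int                               # how many times the subtask is run
    params: dict = field(default_factory=dict)    # execution parameters (initial values)
    weight: float = 1.0                           # scoring weight within the test task

@dataclass
class TestTask:
    test_item: str                                # e.g. "processor" or "kernel"
    subtasks: list = field(default_factory=list)
    weight: float = 1.0                           # score weight within the device score

# Example: a hypothetical test task aggregated from two subtasks.
task = TestTask("processor", subtasks=[
    SubTask("running_times", executions=10, params={"run_duration_s": 60}, weight=0.6),
    SubTask("decoding_times", executions=10, params={"instruction_file": "stream.bin"}, weight=0.4),
])
```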
To avoid the uncertainty of a single test, an execution count is set for each subtask and the subtask is executed multiple times; the subtask is executed according to its execution count. The execution parameters of a subtask are the parameters substituted into the subtask when it is executed, so setting the execution parameters of a subtask means setting the initial values for its execution. Each subtask can also obtain its performance parameters through different test methods, and each test method corresponds to one test program.
According to one embodiment of the invention, when the test item is the processor, the correspondingly constructed test task is a processor test, which tests various aspects of the processor's performance. When the processor test is configured, the constructed subtasks include: a running times test, a million floating point operation times test, a data throughput times test, a decoding times test, and a processor response time test.
The running times test tests the fixed-point computing capability of the processor and obtains corresponding performance parameters, such as the number of runs per second, per minute, or per day. The execution parameter of the running times test is the run duration, which specifies how long the subtask executes.
The million floating point operation times test tests the floating point operation capability of the processor and obtains the corresponding performance parameter: the number of millions of floating point operations per second. The execution parameter of the million floating point operation times test is the data size, which specifies the data range over which the subtask executes.
The data throughput times test tests the cache performance of the processor and obtains the corresponding performance parameter: the data throughput per second. The execution parameters of the data throughput times test are the memory size and the degree of parallelism, which specify the cache-hit scenario for the subtask.
The decoding times test tests the fixed-point computing capability of the processor and obtains the corresponding performance parameter: the number of decodes per second. The execution parameter of the decoding times test is an instruction file, which sets the standard stream that the subtask decodes.
The processor response time test tests the response delay of the processor and obtains the corresponding performance parameter: the processor response time. The execution parameter of the processor response time test is a computation range, which specifies the data range over which the subtask executes.
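Collecting the five subtask descriptions above into one configuration might look like the following sketch; every parameter name and example value is an assumption chosen only to mirror the text, not a value specified by the patent.

```python
# Hypothetical configuration of the five processor-test subtasks; all parameter
# names and values below are illustrative assumptions.
processor_subtasks = [
    {"name": "running_times",      "executions": 10, "params": {"run_duration_s": 60}},
    {"name": "mflops",             "executions": 10, "params": {"data_size_mb": 512}},
    {"name": "data_throughput",    "executions": 10, "params": {"memory_mb": 1024, "parallelism": 4}},
    {"name": "decoding_times",     "executions": 10, "params": {"instruction_file": "standard_stream.bin"}},
    {"name": "processor_response", "executions": 10, "params": {"compute_range": (1, 1_000_000)}},
]
```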
Table 1 shows a list of subtasks of the processor test according to one embodiment of the invention; each subtask can be tested by a plurality of test methods, and each test method corresponds to a test program.
TABLE 1
According to one embodiment of the invention, when the test item is the internal memory, the correspondingly constructed test task is an internal memory test, which tests various aspects of the internal memory's performance. When the internal memory test is configured, the constructed subtasks include: a transmission data quantity test, a transmission times test, an access record number test, a data throughput test, and an internal memory response time test.
The transmission data quantity test tests the read-write bandwidth of the internal memory to obtain corresponding performance parameters: the amount of data transmitted per second. The execution parameters of the transmission data quantity test are a reading address and a storage address, and a data source address and a data target address for executing the subtask are specified.
The transmission times test tests the read-write bandwidth of the internal memory to obtain corresponding performance parameters: number of transmissions per second. The execution parameter of the transmission times test is the size of the specified memory block, and the size of data transmitted each time when the subtask is executed is specified.
The access record number test tests the data writing capability of the internal memory and obtains the corresponding performance parameter: the number of records accessed per second. The execution parameter of the access record number test is the number of threads, which specifies the number of connections used when executing the subtask.
The data throughput test tests the data reading capacity of the internal memory to obtain corresponding performance parameters: data throughput per second. The execution parameter of the data throughput test is the memory size, which specifies the memory block used when executing the subtask.
The internal memory response time test tests the response delay of the internal memory and obtains the corresponding performance parameter: the internal memory response time. The execution parameter of the internal memory response time test is the memory size, which specifies the memory block used when executing the subtask.
Table 2 shows a list of subtasks of the internal memory test according to an exemplary embodiment of the present invention; each subtask can be tested by a plurality of test methods, and each test method corresponds to a test program.
TABLE 2
According to one embodiment of the invention, when the test item is the kernel, the correspondingly constructed test task is a kernel test, which tests the kernel of the operating system running on the computing device. When the kernel test is configured, the constructed subtasks include: a compiling times test, a communication times test, a square root times test, a calling times test, a running times test, a transmission byte number test, and a kernel response time test.
The compiler of the operating system is tested by the compiling frequency test to obtain corresponding test parameters: compile times per minute. The execution parameter of the compiling time test is the running time, and the test time for executing the subtask is specified.
The communication frequency test tests the communication capability of the operating system to obtain corresponding test parameters: number of communications per second. The execution parameter of the communication frequency test is running time, and the test time for executing the subtask is specified.
The square root times test tests the computing capability of the operating system and obtains the corresponding test parameter: the number of square root operations per minute. The execution parameter of the square root times test is the execution time, which specifies the test duration for executing the subtask.
The calling times test tests the system calling capability of the operating system to obtain corresponding test parameters: number of calls per second. The execution parameter of the calling frequency test is execution time, and the test duration for executing the subtask is specified.
The running times test tests the thread of the operating system to obtain corresponding test parameters: number of runs per second. The execution parameter of the running time test is running time, and the test time length for executing the subtask is specified.
The byte number transmission test tests the process of the operating system to obtain corresponding test parameters: the number of bytes transferred per second. The execution parameter of the transmission byte number test is the size of transmission data, and the data transmission quantity of each execution of the subtask is specified.
The kernel response time test tests the response delay of the operating system to obtain corresponding test parameters: kernel response time. The execution parameter of the kernel response time is an execution command, and specifies the execution command used for executing the subtask each time.
Table 3 shows a list of subtasks for kernel testing, each subtask being testable by a plurality of test methods, each test method corresponding to a test program, according to an embodiment of the invention.
TABLE 3
According to one embodiment of the invention, when the test item is the external memory, the correspondingly constructed test task is an external memory test, which tests various aspects of the external memory's performance. When the external memory test is configured, the constructed subtasks include: a transmission byte number test, a read/write throughput test, a running times test, a read byte number test, a write byte number test, and an external memory response time test.
The transmission byte number test tests the write-in capability of the external memory to obtain corresponding performance parameters: the number of bytes transferred per second. The operation parameters of the transmission byte number test comprise operation time, file path, buffer size and maximum block number.
The read-write throughput test tests the transaction processing capability of the external memory to obtain corresponding performance parameters: geometric mean of transaction speed and read-write throughput. The operating parameters of the read-write throughput test include: maximum number of files, maximum number of directories, minimum size of file, maximum size of file, write size, read size, and thread number.
The running times test tests the write capability of the external memory and obtains the corresponding performance parameter: the number of runs per second. The operation parameters of the running times test include: the file path and the number of runs.
The byte number reading test tests the reading capability of the external memory to obtain corresponding performance parameters: the number of bytes read per second. The operation parameters of the byte reading test comprise: file path, test block size, and test file size.
The write byte number test tests the write capability of the external memory and obtains the corresponding performance parameter: the number of bytes written per second. The operation parameters of the write byte number test include: the test block size and the test file size.
The response time test of the external memory tests the response delay of the external memory, and obtains corresponding performance parameters: response time of the external memory. The operating parameters of the external memory response time test include: a file path.
Table 4 shows a list of subtasks for external memory testing, each subtask being testable by a plurality of test methods, each test method corresponding to a test program, according to one embodiment of the invention.
TABLE 4
According to an embodiment of the present invention, when the test item is the network, the correspondingly constructed test task is a network test, which tests the network subsystem of the computing device. When the network test is configured, the constructed subtasks include: a transmission byte number test and a network response time test.
The transmission byte number test tests the bandwidth of the network subsystem and obtains the corresponding test parameter: the number of bytes transmitted per second. The execution parameters of the transmission byte number test include the number of bytes transmitted, the total number of bytes transmitted, and the server IP.
The network response time test tests the delay of the network subsystem to obtain corresponding test parameters: network subsystem response time. The execution parameters of the network response time test include the amount of data transferred.
Table 5 shows a list of subtasks of the network test according to an embodiment of the invention; each subtask can be tested by a plurality of test methods, and each test method corresponds to a test program.
TABLE 5
According to one embodiment of the invention, when the test item is the display, the correspondingly constructed test task is a display test, which tests the graphics processing performance of the computing device. When the display test is configured, the constructed subtasks include: a drawing times test and a transmission frame number test.
The drawing times test tests the processing performance of the plane graph to obtain corresponding performance parameters: number of plots per second. The operation parameters of the drawing number test include the number of operations and the operation time.
The transmission frame number test tests the three-dimensional graphic processing performance to obtain corresponding performance parameters: the number of frames transmitted per second. The operation parameters of the transmission frame number test comprise a scene, a model and a data position.
Table 6 shows a list of subtasks of the display test according to one embodiment of the present invention; each subtask can be tested by a plurality of test methods, and each test method corresponds to a test program.
TABLE 6
According to one embodiment of the invention, when the test item is running, the correspondingly constructed test task is a running test, which tests the operating system's ability to run applications. When the running test is configured, the constructed subtasks include: a runs-per-second test, a runs-per-minute test, and a runs-per-hour test.
The runs-per-second test tests the ability to run a specified data set and to compile, and obtains the corresponding performance parameter: the number of runs per second. The operation parameters of the runs-per-second test include the data size or the number of runs.
The runs-per-minute test tests capabilities such as running functions, and obtains the corresponding performance parameter: the number of runs per minute. The operation parameter of the runs-per-minute test is the number of runs.
The runs-per-hour test tests compiling and similar capabilities, and obtains the corresponding performance parameter: the number of runs per hour. The operation parameter of the runs-per-hour test is the data size.
Table 7 shows a list of subtasks of the running test according to an embodiment of the invention; each subtask can be tested by a plurality of test methods, and each test method corresponds to a test program.
TABLE 7
According to one embodiment of the invention, when the test item is the application, the correspondingly constructed test task is an application test, which tests the performance of the operating system's application services. When the application test is configured, the constructed subtasks include: a runs-per-hour test and a runs-per-month test.
The runs-per-hour test tests the performance of executing files and obtains the corresponding performance parameter: the number of runs per hour. The operation parameter of the runs-per-hour test is the number of runs.
The runs-per-month test tests the performance of operating on files and obtains the corresponding performance parameter: the number of runs per month. The operation parameters of the runs-per-month test include a specified directory and a specified file.
Table 8 shows a list of subtasks of the application test according to one embodiment of the invention; each subtask can be tested by a plurality of test methods, and each test method corresponds to a test program.
TABLE 8
Subsequently, step S330 is executed to execute each test task, and obtain a test item score corresponding to the test task. When each test task is executed, each subtask in the test task is executed to obtain a plurality of subtask scores, then a corresponding score weight is set for each subtask score, and a test item score is calculated according to the subtask scores and the corresponding score weights.
According to an embodiment of the present invention, when each subtask in the test task is executed, the first performance parameter set is obtained according to the execution parameters set by the subtask and the set execution times. The number of performance parameters in the first set of performance parameters is the same as the number of executions.
Because minor disturbances in the system, such as background processes waking up, may slow down several runs of the test, those results are discarded: the worst 1/3 of the performance parameters in the first performance parameter set are removed to obtain a second performance parameter set.
The second performance parameter set is then sorted by result and the middle 1/3 of the performance parameters is averaged to obtain the subtask score of the subtask; a plurality of subtask scores is obtained in this way.
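A minimal sketch of one reading of this rule is given below: the runs are sorted and the middle third of the results is averaged, so both the worst and the best extremes are excluded from the subtask score. It assumes larger performance parameters are better; for latency-type parameters the sort order would be reversed. The function name and example numbers are assumptions for illustration.

```python
# Sketch of the trimming rule: sort the runs and average the middle third.
# Assumes larger values are better.
def subtask_score(performance_params: list[float]) -> float:
    ordered = sorted(performance_params, reverse=True)       # best result first
    third = max(len(ordered) // 3, 1)
    middle = ordered[third:len(ordered) - third] or ordered  # middle portion of the runs
    return sum(middle) / len(middle)

# Example: nine runs; the best three and worst three are excluded, leaving 96, 95, 94.
print(subtask_score([100, 98, 97, 96, 95, 94, 70, 65, 60]))  # 95.0
```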
According to an embodiment of the invention, the first performance parameter set or the second performance parameter set may also be averaged directly as a subtask score for the subtask.
According to one embodiment of the invention, when the corresponding scoring weight is set for the subtask scoring, the corresponding scoring weight can be set according to the importance of the subtask in the test task; if the importance of the subtask is high, a higher scoring weight is set, and if the importance of the subtask is low, a lower scoring weight is set.
And then, calculating to obtain the score of the test item according to the subtask score and the corresponding score weight. According to one embodiment of the invention, the test item score may be calculated according to the following formula:
CaseScore = Σ_{i=1}^{N} ResultVal_i × Dim_i
wherein CaseScore is the score of the test item, N is the number of subtasks configured in the test task corresponding to the test item, ResultVal_i is the subtask score of the i-th subtask, and Dim_i is the scoring weight corresponding to each subtask.
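Assuming the weighted-sum form given above (the original formula is published as an image, so this form is inferred from the variable definitions), the test item score can be computed as follows; the function and variable names simply mirror the symbols in the text.

```python
# Weighted sum of subtask scores; mirrors CaseScore = sum(ResultVal_i * Dim_i).
def case_score(result_vals: list[float], dims: list[float]) -> float:
    assert len(result_vals) == len(dims), "one weight per subtask score"
    return sum(r * d for r, d in zip(result_vals, dims))

# Example: three subtask scores with weights 0.5, 0.3, 0.2.
print(case_score([95.0, 80.0, 120.0], [0.5, 0.3, 0.2]))  # 95.5
```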
Subsequently, step S340 is performed, and a corresponding score weight is set for each test item score. When the corresponding score weight is set for the score of the test item, the score can be set according to the importance of the test item in the performance test of the computing equipment; if the importance of the test item is high, a higher score weight is set, and if the importance of the test item is low, a lower score weight is set.
Finally, step S350 is executed, and a performance score of the computing device is calculated according to the test item score and the corresponding score weight. According to one embodiment of the invention, the performance score of a computing device may be calculated according to the following formula:
TotalScore = Σ_{i=1}^{N} CaseScore_i × Dim_i
wherein TotalScore is the performance score of the computing device, N is the number of test items used to perform the performance test on the computing device, CaseScore_i is the test item score of the i-th test item, and Dim_i is the score weight corresponding to each test item.
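Putting both weighted sums together, with subtask scores rolling up into test item scores and test item scores rolling up into the device performance score, gives the following end-to-end sketch; the nested-dictionary layout and the example numbers are illustrative assumptions.

```python
# End-to-end roll-up: subtask scores -> test item scores -> device performance score.
def weighted_sum(scores, weights):
    return sum(s * w for s, w in zip(scores, weights))

device_config = {
    "processor": {"subtask_scores": [95.0, 80.0], "subtask_weights": [0.6, 0.4], "item_weight": 0.5},
    "kernel":    {"subtask_scores": [70.0, 90.0], "subtask_weights": [0.5, 0.5], "item_weight": 0.5},
}

case_scores = {item: weighted_sum(cfg["subtask_scores"], cfg["subtask_weights"])
               for item, cfg in device_config.items()}
total_score = weighted_sum(case_scores.values(),
                           [cfg["item_weight"] for cfg in device_config.values()])
print(case_scores)   # {'processor': 89.0, 'kernel': 80.0}
print(total_score)   # 84.5
```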
According to an embodiment of the present invention, when each subtask in the test task is executed, the subtask is executed according to its execution count to obtain a plurality of performance parameters, and the current peak score and current valley score of the subtask are determined from those performance parameters. To determine them, the best 1/3 of the performance parameters in the second performance parameter set are selected and averaged to obtain the current peak score, and the worst 1/3 of the performance parameters in the second performance parameter set are selected and averaged to obtain the current valley score.
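The peak and valley rule can be sketched as follows, again assuming larger performance parameters are better; the function name and example values are assumptions.

```python
# Best third averaged -> peak score; worst third averaged -> valley score.
def peak_and_valley(second_set: list[float]) -> tuple[float, float]:
    ordered = sorted(second_set, reverse=True)   # best result first
    third = max(len(ordered) // 3, 1)
    peak = sum(ordered[:third]) / third          # average of the best 1/3
    valley = sum(ordered[-third:]) / third       # average of the worst 1/3
    return peak, valley

print(peak_and_valley([100, 98, 97, 96, 95, 94]))  # (99.0, 94.5)
```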
The computing device also stores a plurality of historical peak scores and a plurality of historical valley scores for the subtask; the historical peak and valley scores are the scores obtained from executing this subtask before the current test. The method further comprises drawing a peak-valley trend graph from the peak score, the valley score, the historical peak scores, and the historical valley scores of the subtask, ordered by the time at which they were generated. Each time node in the peak-valley trend graph corresponds to a set of data comprising one valley score and one peak score. The historical performance scores and the score trend for one aspect of the computing device can be read from the peak-valley trend graph.
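A peak-valley trend graph of this kind could be drawn, for example, with matplotlib as sketched below; the time labels and score values are invented purely for illustration.

```python
# Minimal peak-valley trend chart: one valley score and one peak score per time node.
import matplotlib.pyplot as plt

timestamps = ["2021-07", "2021-08", "2021-09", "2021-10"]   # historical runs + current run
peak_scores = [99.0, 101.5, 100.2, 103.0]
valley_scores = [94.5, 95.0, 93.8, 96.1]

x = range(len(timestamps))
plt.plot(x, peak_scores, marker="o", label="peak score")
plt.plot(x, valley_scores, marker="o", label="valley score")
plt.fill_between(x, valley_scores, peak_scores, alpha=0.2)  # band between valley and peak
plt.xticks(x, timestamps)
plt.xlabel("test time")
plt.ylabel("subtask score")
plt.legend()
plt.savefig("peak_valley_trend.png")
```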
The invention relates to a computing device performance testing method suitable for execution in a computing device, comprising the following steps: determining, according to test requirements, a plurality of test items for performing a performance test on the computing device, and configuring test tasks according to the determined test items to obtain a plurality of test tasks. Executing the plurality of test tasks in the computing device enables automated testing of multiple aspects of the computing device's performance. Each test task is then executed to obtain a test item score corresponding to that test task, and a corresponding score weight is set for each test item score, so that a performance score of the computing device is calculated from the test item scores and the corresponding score weights. In this way the parts of the computing device that need to be tested can be tested quickly, and the performance of the computing device can be understood intuitively through the performance score. By running the various tests on the computing device, the compatibility of the operating system on the computing device is verified, thereby automatically verifying the compatibility, scalability, and usability of the operating system.
When the computing device is tested, both its hardware and its software are tested, for example by testing the kernel, so as to evaluate the performance of the operating system and of the applications running on it and to understand how those applications behave. The performance of the computing device is therefore evaluated comprehensively: a high performance score indicates both that the hardware of the computing device performs well and that the operating system running on it has good compatibility.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that the invention as claimed requires more features than are expressly recited in each claim.
Those skilled in the art will appreciate that the modules or units or groups of devices in the examples disclosed herein may be arranged in a device as described in this embodiment, or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. Modules or units or groups in embodiments may be combined into one module or unit or group and may furthermore be divided into sub-modules or sub-units or sub-groups. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments.
Furthermore, some of the described embodiments are described herein as a method or combination of method elements that can be performed by a processor of a computer system or by other means of performing the described functions. A processor having the necessary instructions for carrying out the method or method elements thus forms a means for carrying out the method or method elements. Further, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is used to implement the functions performed by the elements for the purpose of carrying out the invention.
The various techniques described herein may be implemented in connection with hardware or software or, alternatively, with a combination of both. Thus, the methods and apparatus of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Wherein the memory is configured to store program code; the processor is configured to perform the computing device performance testing method of the present invention according to instructions in the program code stored in the memory.
By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media store information such as computer readable instructions, data structures, program modules or other data. Communication media typically embody computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and include any information delivery media. Combinations of any of the above are also included within the scope of computer readable media.
As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.

Claims (10)

1. A computing device performance testing method, adapted to be executed in a computing device, the method comprising the steps of:
determining a plurality of test items for performing a performance test on a computing device;
configuring a test task according to the determined test items to obtain a plurality of test tasks;
executing each test task to obtain a test item score corresponding to the test task;
setting corresponding score weight for each test item score;
and calculating to obtain the performance score of the computing equipment according to the test item score and the corresponding score weight.
2. The method of claim 1, wherein said configuring a test task according to said determined test items comprises the steps of:
setting the execution times and the execution parameters of the subtasks;
constructing subtasks according to the execution times and the execution parameters of each subtask to obtain a plurality of subtasks;
and aggregating the plurality of subtasks as the test tasks.
3. The method of claim 2, wherein said performing each test task to obtain a test item score corresponding to the test task comprises the steps of:
executing each subtask in the test task to obtain a plurality of subtask scores;
and setting a corresponding score weight for each subtask score, and calculating the test item score from the subtask scores and the corresponding score weights.
4. The method of claim 3, wherein the test task comprises a processor test, and the plurality of subtasks constructed when the test task is configured according to the determined test items comprises:
a running time test, a million floating point operations time test, a data throughput time test, a decoding time test, and a processor response time test.
5. The method of claim 4, wherein the test task further comprises an internal memory test, and the plurality of subtasks constructed when the test task is configured according to the determined test items comprises:
a data transmission volume test, a transmission frequency test, an access record count test, a data throughput test, and an internal memory response time test.
6. The method of claim 5, wherein the test task further comprises a kernel test, and the plurality of subtasks constructed when the test task is configured according to the determined test items comprises:
a compilation count test, a communication count test, a square root operation count test, a call count test, a run count test, a transmitted byte count test, and a kernel response time test.
7. The method of claim 6, wherein the test task further comprises an external memory test, and the plurality of subtasks constructed when the test task is configured according to the determined test items comprises:
a transmitted byte count test, a read/write throughput test, a running time test, a read byte count test, a write byte count test, and an external memory response time test.
8. The method of claim 7, wherein the test task further comprises a network test, and the plurality of subtasks constructed when the test task is configured according to the determined test items comprises:
a transmitted byte count test and a network response time test.
9. A computing device, comprising:
one or more processors;
a memory; and
one or more apparatuses comprising instructions for performing the method of any of claims 1-8.
10. A computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a computing device, cause the computing device to perform the method of any of claims 1-8.
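Claims 2 and 3 above describe configuring each test task from subtasks, each with a set number of executions and execution parameters, and rolling the subtask scores up into the test item score by score weight. A minimal sketch of that two-level aggregation follows; every subtask name, parameter, measurement function, and weight in it is assumed for the example and is not taken from the claims.

# Illustrative two-level scoring sketch for claims 2 and 3; all concrete values
# below (names, runs, parameters, weights, measurements) are hypothetical.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class SubTask:
    name: str
    runs: int                                   # configured number of executions
    params: Dict[str, float]                    # configured execution parameters
    measure: Callable[[Dict[str, float]], float]
    weight: float                               # score weight of this subtask

    def score(self) -> float:
        # Execute the subtask the configured number of times and average the results.
        results = [self.measure(self.params) for _ in range(self.runs)]
        return sum(results) / len(results)

@dataclass
class TestTask:
    name: str
    subtasks: List[SubTask] = field(default_factory=list)

    def item_score(self) -> float:
        # The weighted sum of the subtask scores gives the test item score.
        return sum(st.score() * st.weight for st in self.subtasks)

# A hypothetical processor test task built from two of the subtasks named in claim 4.
cpu_task = TestTask("processor", [
    SubTask("running_time", runs=3, params={"load": 1.0},
            measure=lambda p: 90.0 / p["load"], weight=0.5),
    SubTask("floating_point_time", runs=3, params={"size": 2048.0},
            measure=lambda p: 80.0, weight=0.5),
])

if __name__ == "__main__":
    print(f"{cpu_task.name} test item score: {cpu_task.item_score():.2f}")

Averaging over the configured runs is one reasonable way to turn repeated executions into a single subtask score; the claims only require that the subtask scores be combined with their corresponding score weights to obtain the test item score.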
CN202111226702.8A 2021-10-21 Computing device performance test method, computing device and storage medium Active CN113886162B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111226702.8A CN113886162B (en) 2021-10-21 Computing device performance test method, computing device and storage medium

Publications (2)

Publication Number Publication Date
CN113886162A (en) 2022-01-04
CN113886162B (en) 2024-05-31

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5305235A (en) * 1991-07-10 1994-04-19 Mitsubishi Denki Kabushiki Kaisha Monitoring diagnosis device for electrical appliance
CN108519933A (en) * 2018-02-08 2018-09-11 广州视源电子科技股份有限公司 Board test method and device, readable storage medium, and computer equipment
CN109800138A (en) * 2018-12-18 2019-05-24 平安科技(深圳)有限公司 CPU test method, electronic device and storage medium
CN111338924A (en) * 2020-03-09 2020-06-26 苏州浪潮智能科技有限公司 Server performance test method, system, equipment and medium
CN113219287A (en) * 2021-05-21 2021-08-06 山东电工电气集团有限公司 Method for performing capacitance compatibility rating and weak link positioning on a State Grid core FTU

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114721922A (en) * 2022-05-16 2022-07-08 北京并行科技股份有限公司 Performance evaluation method of server cluster, computing equipment and storage medium
CN114721922B (en) * 2022-05-16 2022-10-04 北京并行科技股份有限公司 Performance evaluation method of server cluster, computing equipment and storage medium
CN116302756A (en) * 2023-03-22 2023-06-23 无锡市软测认证有限公司 Performance test system and method based on FPGA (Field Programmable Gate Array) accelerator card
CN116302756B (en) * 2023-03-22 2023-10-31 无锡市软测认证有限公司 Performance test system and method based on FPGA (Field Programmable Gate Array) accelerator card
CN116340070A (en) * 2023-03-25 2023-06-27 郑州航空工业管理学院 Test method of bioinformatics high-performance computing platform

Similar Documents

Publication Publication Date Title
US11762690B2 (en) Data processing method and related products
US11847554B2 (en) Data processing method and related products
US20070271207A1 (en) Determining Compliance Rates for Probabilistic Requests
US20230126597A1 (en) Container orchestration framework
US8681166B1 (en) System and method for efficient resource management of a signal flow programmed digital signal processor code
GB2506122A (en) Integrating data transform test with data transform tool
US20130013283A1 (en) Distributed multi-pass microarchitecture simulation
US20210158201A1 (en) Dynamically predict optimal parallel apply algorithms
US10528691B1 (en) Method and system for automated selection of a subset of plurality of validation tests
WO2019104844A1 (en) Automatic performance testing method, apparatus and device for monetary fund system, and storage medium
CN114721922B (en) Performance evaluation method of server cluster, computing equipment and storage medium
US10445218B2 (en) Execution of graphic workloads on a simulated hardware environment
CN113886162B (en) Computing device performance test method, computing device and storage medium
CN113886162A (en) Computing equipment performance test method, computing equipment and storage medium
CN115576711A (en) Method and system for simulating returned data and computing equipment
CN115576766A (en) Flash memory management algorithm debugging method, system, device and readable storage medium
CN115048255A (en) Automatic test method, device, host and storage medium
US11520961B2 (en) Heterogeneous-computing based emulator
CN110309038B (en) Performance test method and device, electronic equipment and computer readable storage medium
CN112579169B (en) Method and device for generating processor trace stream
US20200127679A1 (en) Reducing a size of multiple data sets
CN112799924A (en) Simulation test system and method for cloud storage system storing training data
CN115808612A (en) Chip physical IP test system, method and electronic equipment
CN115269353A (en) Energy consumption pressure measurement method and device of server, electronic equipment and storage medium
CN116701175A (en) GDS system read-write performance test method and device of server and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant