CN112416691A - Performance test method and system based on benchmark test tool - Google Patents

Performance test method and system based on benchmark test tool

Info

Publication number
CN112416691A
CN112416691A (application CN202011431585.4A)
Authority
CN
China
Prior art keywords
test
parameter
benchmark
performance
tool
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011431585.4A
Other languages
Chinese (zh)
Other versions
CN112416691B (en)
Inventor
刘伟锋
丛日本
张鸿
刘清
谭娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Haiguang Information Technology Co Ltd
Original Assignee
Haiguang Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Haiguang Information Technology Co Ltd filed Critical Haiguang Information Technology Co Ltd
Priority to CN202011431585.4A priority Critical patent/CN112416691B/en
Publication of CN112416691A publication Critical patent/CN112416691A/en
Application granted granted Critical
Publication of CN112416691B publication Critical patent/CN112416691B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/22 Detection or location of defective computer hardware by testing during standby operation or during idle time, e.g. start-up testing
    • G06F11/2273 Test methods
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Test And Diagnosis Of Digital Computers (AREA)

Abstract

The invention provides a performance test method and a system based on a benchmark test tool, wherein the performance test method based on the benchmark test tool comprises the following steps: acquiring test information of a device to be tested, wherein the test information comprises: all parameter ranges and corresponding parameter names of the device to be tested; generating a parameter range table according to the test information so as to enable all parameter ranges to correspond to corresponding parameter names one by one; acquiring a test requirement of a parameter range according to the parameter name and the type of the selected benchmark test tool; converting the parameter range table into a parameter set according to the test requirements, wherein the parameter set comprises a data table which can be identified by the benchmark test tool; and performing benchmark test on all the parameter sets through the benchmark test tool to obtain a test result. The invention can improve the efficiency of performance measurement.

Description

Performance test method and system based on benchmark test tool
Technical Field
The invention relates to the technical field of computational performance testing, in particular to a performance testing method and system based on a benchmark testing tool.
Background
Benchmark testing refers to the act of running a computer program to evaluate the performance of hardware and software, while benchmark testing tools allow system testers and users to objectively and independently evaluate the performance of hardware.
However, when an existing benchmark testing tool is used to test the performance of a target device, for example the computing performance of an acceleration processor, the tool typically involves a large number of parameters, many parameter combinations, wide parameter value ranges, and dependencies among the parameters. Testers therefore need to design each parameter range carefully before testing and spend a large amount of time entering the parameters according to that design during the test, which is inefficient and error-prone; the result analysis after the test is likewise cumbersome and prone to errors.
Disclosure of Invention
In order to solve the above problems, the performance testing method and system based on a benchmark testing tool provided by the invention convert the test parameters into corresponding parameter sets according to the test requirements and send the parameter sets to the benchmark testing tool for benchmark testing. This improves the degree of automation of the performance benchmark testing of the device to be tested, reduces manual input, and improves the efficiency of the benchmark testing.
In a first aspect, the present invention provides a performance testing method based on a benchmark testing tool, including:
acquiring test information of a device to be tested, wherein the test information comprises: all parameter ranges and corresponding parameter names of the device to be tested;
generating a parameter range table according to the test information so as to enable all parameter ranges to correspond to corresponding parameter names one by one;
acquiring a test requirement of a parameter range according to the parameter name and the type of the selected benchmark test tool;
converting the parameter range table into a parameter set according to the test requirements, wherein the parameter set comprises a data table which can be identified by the benchmark test tool;
and performing benchmark test on all the parameter sets through the benchmark test tool to obtain a test result.
Optionally, different parameter names correspond to different test performances, and the test result includes key bytes corresponding to the different test performances;
the method further comprises the following steps: and analyzing the test result according to the key byte, and outputting an actual measurement performance value of the test performance corresponding to the key byte.
Optionally, the acquiring test information of the device under test includes:
obtaining the model of the device to be tested, and generating the test information according to the model; or/and
test information input by a tester is received.
Optionally, the method further comprises: monitoring an error information reporting module to obtain a monitoring result, and outputting the monitoring result;
wherein the error information reporting module is used for detecting error information generated during execution of the method.
Optionally, the method further comprises:
acquiring attribute information of the device to be tested;
calculating a theoretical performance peak value corresponding to the test performance according to the attribute information;
and comparing the theoretical performance peak value with the actually measured performance value, and judging whether the test of the test performance is successful according to the comparison result.
Optionally, before the converting the parameter range table into a parameter set according to the test requirement, the method further includes:
and adding parameter names and corresponding parameter ranges necessary for the benchmark test tool to the parameter range table according to the test requirements.
Optionally, the converting the parameter range table into a parameter set according to the test requirement includes:
and according to the test requirement, modifying the format of the parameter name in the parameter range table to obtain the parameter set.
In a second aspect, the present invention provides a performance batch test system based on a benchmark test tool, including:
a first obtaining module configured to obtain test information of a device under test, the test information including: all parameter ranges and corresponding parameter names of the device to be tested;
the generating module is configured to generate a parameter range table according to the test information so as to enable all parameter ranges to correspond to corresponding parameter names one by one;
the second acquisition module is configured to acquire the test requirements of the parameter range according to the parameter name and the type of the selected benchmark test tool;
a conversion module configured to convert the parameter range table into a parameter set according to the test requirements, the parameter set including a data table recognizable by the benchmark test tool;
a test module configured to benchmark all the parameter sets by the benchmark test tool to obtain a test result.
Optionally, different parameter names correspond to different test performances, and the test result includes key bytes corresponding to the different test performances;
the system further comprises: and the analysis module is configured to analyze the test result according to the key byte and output a measured performance value of the test performance corresponding to the key byte.
Optionally, the first obtaining module includes:
the obtaining submodule is configured to obtain the model of the device to be tested and generate the test information according to the model; or/and
the test system comprises a first acquisition module configured to receive test information input by a tester.
Optionally, the system further comprises:
an error information reporting module configured to detect error information generated during operation of the system;
and a monitoring module configured to monitor the error information reporting module to obtain a monitoring result and output the monitoring result.
Optionally, the system further comprises:
a third obtaining module configured to obtain attribute information of the device under test, where the attribute information includes the model of the device to be tested;
The calculation module is configured to calculate a theoretical performance peak value corresponding to the test performance according to the attribute information;
and the comparison module is configured to compare the theoretical performance peak value with the actually-measured performance value and judge whether the test of the test performance is successful according to a comparison result.
Optionally, the system further comprises:
a supplementary module configured to add parameter names and corresponding parameter ranges necessary for the benchmark test tool to the parameter range table according to the test requirements before converting the parameter range table into parameter sets according to the test requirements.
Optionally, the conversion module is further configured to modify a format of a parameter name in the parameter range table according to the test requirement, so as to obtain the parameter set.
According to the performance testing method and system based on the benchmark testing tool provided by the embodiments of the invention, generating the parameter range table makes it convenient to benchmark the performance of the device to be tested in batches through the benchmark testing tool. Meanwhile, the test parameters are converted into corresponding parameter sets according to the test requirements and submitted to the benchmark testing tool for benchmark testing, so the degree of automation of the performance benchmark testing of the device to be tested can be improved, manual input is reduced, and the efficiency of the benchmark testing is improved.
Drawings
FIG. 1 is a schematic flow chart of a benchmark tool based performance testing method according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a performance testing apparatus based on a benchmark testing tool according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
First, the proper names to which the present invention relates are explained as follows:
CSV (Comma-Separated Values) files, sometimes also called character-separated values files because the separator is not necessarily a comma, store tabular data (numbers and text) in plain text; this means the file is a sequence of characters and contains no data that has to be interpreted as binary digits;
the rocblas-bench test tool is a benchmark test tool for the computing performance of a GPU or an acceleration processor under the rocBLAS math library on the ROCm software platform.
General Matrix Multiplication (GEMM) can be expressed as C = alpha·op(A)·op(B) + beta·C, where op(X) denotes either matrix X or its transpose, alpha and beta are scalars, and A, B and C are matrices; op(A) is a matrix of m rows and k columns, op(B) is a matrix of k rows and n columns, and C is a matrix of m rows and n columns.
Example one
With reference to fig. 1, the present embodiment provides a performance testing method based on a benchmark testing tool, and in the present embodiment, the method is applied in a high performance testing Linux operating system environment, and the method includes steps S101 to S105 as follows:
step S101: acquiring test information of a device to be tested, wherein the test information comprises: all parameter ranges and corresponding parameter names of the device under test.
The method can obtain a plurality of test performances of the device to be tested for testing; the parameter range may be a specific value, or a parameter indication or content corresponding to a parameter name. In this embodiment, the device under test is exemplified by an acceleration processor. In the process of benchmarking the computing performance of the acceleration processor, the parameter names may be the number of rows of a matrix, the number of columns of a matrix, and the calculation precision of the acceleration processor, such as single-precision floating point, double-precision floating point, half-precision floating point, and so on.
In an optional embodiment, the obtaining test information of the device under test includes:
obtaining the model of the device to be tested, and generating the test information according to the model; or/and
test information input by a tester is received.
Step S102: and generating a parameter range table according to the test information so as to enable all parameter ranges to correspond to corresponding parameter names one by one.
In this embodiment, the tester may use the common Microsoft Excel spreadsheet tool to process all the test parameter ranges. Preferably, the parameter range table is saved in CSV file format, which reduces the complexity of subsequent parameter parsing. For the different calculation precisions of the acceleration processor, such as single-precision, double-precision and half-precision floating point, the parameter range table uses the parameter names as the table header and the parameter ranges as the cell contents.
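As an illustration, a hypothetical parameter range table saved as a CSV file could be read with Python's standard csv module as in the sketch below; the file name "param_ranges.csv" and the column headers are assumptions for illustration, not values taken from this patent.

```python
# Minimal sketch: load a parameter range table saved as CSV, with parameter names
# in the header row and parameter ranges as cell contents.
# NOTE: the file name and the column headers below are illustrative assumptions.
import csv

def load_parameter_range_table(path):
    """Return the table as a list of {parameter name: parameter range} dictionaries."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

# Hypothetical header row: function,transposeA,transposeB,m,n,k,alpha,beta,precision
rows = load_parameter_range_table("param_ranges.csv")
for row in rows:
    print(row["m"], row["precision"])
```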
Step S103: and acquiring the test requirements of the parameter range according to the parameter name and the type of the selected benchmark test tool.
Step S104: and converting the parameter range table into a parameter set according to the test requirement so as to generate a corresponding benchmark test case, wherein the parameter set comprises a data table which can be identified by the benchmark test tool.
In an alternative embodiment, the converting the parameter range table into a parameter set according to the test requirement includes:
and according to the test requirement, modifying the format of the parameter name in the parameter range table to obtain the parameter set.
Furthermore, the parameter range table is parsed by software, and the parameters are combined according to the requirements of the benchmark testing tool for tests of different precisions. Considering the Linux operating system environment used for high-performance testing and the maintainability of the test tool, the widely used Python programming language is preferred for this processing. Meanwhile, in order to support extension to tests of different precisions, an object-oriented programming approach is used.
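A minimal object-oriented sketch of such an extensible converter is shown below; the class names, method names and precision flag values are assumptions for illustration and are not taken from the patent.

```python
# Minimal sketch: one converter subclass per calculation precision, so that new
# precisions can be added without modifying existing code.
# NOTE: class names and the precision flag values ("f64_r", "f32_r") are assumptions.
class PrecisionConverter:
    precision_flag = None  # value passed to the benchmark tool's precision option

    def to_parameter_set(self, row):
        """Convert one parameter-range-table row into a benchmark-tool parameter dict."""
        params = dict(row)
        params["precision"] = self.precision_flag
        return params

class DoublePrecisionConverter(PrecisionConverter):
    precision_flag = "f64_r"   # assumed double-precision flag value

class SinglePrecisionConverter(PrecisionConverter):
    precision_flag = "f32_r"   # assumed single-precision flag value

CONVERTERS = {"d": DoublePrecisionConverter(), "s": SinglePrecisionConverter()}
```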
Step S105: and performing benchmark test on all the parameter sets through the benchmark test tool to obtain a test result.
According to the method, generating the parameter range table makes it convenient to benchmark the performance of the device to be tested in batches through the benchmark testing tool. Meanwhile, the test parameters are converted into the corresponding parameter sets according to the test requirements, and the parameter sets are benchmarked by the benchmark testing tool, so the degree of automation of the performance benchmark testing of the device to be tested can be improved, manual input is reduced, and the efficiency of the benchmark testing is improved.
Further, the benchmark tool rocblas-bench is taken as an example to benchmark the computing capability of the acceleration processor. Taking the parameter m, which represents one dimension of the matrix, as an example, the following test ranges can be specified in the parameter range table:
If a parameter range is specified as a single value, such as 10240, the benchmark tool rocblas-bench tests only 10240.
If the parameter range is specified as a value range, e.g., 10240 to 10480, with the default step size of 1, the benchmark tool rocblas-bench tests 10240, 10241, 10242, 10243, ..., 10480 in turn.
If the parameter range is specified as a value range, such as 10240 to 10480, with a step size of 32, the corresponding parameter range can be written as 10240~10480:32, and the benchmark tool rocblas-bench tests 10240, 10272, ..., 10464 in turn.
If the parameter range specifies a number of specific values, such as 10240, 10241 and 10248, the benchmark tool rocblas-bench tests 10240, 10241 and 10248 respectively.
For the benchmark testing tool rocblas-bench, there are other parameters representing matrix dimensions in the benchmark test of the acceleration processor's computing capability, such as n and k; their testing process is similar to that of parameter m and is not repeated here.
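The range notations above could be expanded into concrete test values with a small helper such as the sketch below; the cell syntax assumed here ("start~end:step" for a stepped range, a comma-separated list for explicit values, a bare number for a single value) is an illustrative convention, not necessarily the exact notation used in the parameter range table.

```python
# Minimal sketch of expanding one parameter-range cell into the list of values to test.
def expand_range(cell):
    cell = cell.strip()
    if "~" in cell:                        # range, e.g. "10240~10480" or "10240~10480:32"
        span, _, step = cell.partition(":")
        start, end = (int(x) for x in span.split("~"))
        return list(range(start, end + 1, int(step) if step else 1))
    if "," in cell:                        # explicit values, e.g. "10240,10241,10248"
        return [int(x) for x in cell.split(",")]
    return [int(cell)]                     # single value, e.g. "10240"

# expand_range("10240")          -> [10240]
# expand_range("10240~10480:32") -> [10240, 10272, ..., 10464]
```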
In an optional embodiment, different parameter names correspond to different test performances, and the test result includes key bytes corresponding to the different test performances;
the method further comprises the following steps: and analyzing the test result according to the key byte, and outputting an actual measurement performance value of the test performance corresponding to the key byte.
When the benchmark testing tool is invoked to start a test, the test result is obtained according to keywords output by the benchmark testing tool. Taking rocblas-bench as an example, a complete test output result is located using the "rocblas-Gflops" keyword. Again considering the Linux operating system environment used for high-performance testing and the maintainability of the test tool, the widely used Python programming language is preferably used to output the test results.
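A small sketch of locating the measured value by that keyword is shown below; it assumes the tool prints a comma-separated header line containing "rocblas-Gflops" followed by a line of values, which may differ between tool versions.

```python
# Minimal sketch: parse one rocblas-bench run's output and return the measured Gflops.
# NOTE: the assumed output layout (comma-separated header line, then a value line)
# is an illustration and may not match every version of the tool.
def extract_gflops(output, keyword="rocblas-Gflops"):
    lines = output.splitlines()
    for i, line in enumerate(lines):
        if keyword in line:
            headers = [h.strip() for h in line.split(",")]
            values = [v.strip() for v in lines[i + 1].split(",")]
            return float(values[headers.index(keyword)])
    raise ValueError("keyword %r not found in benchmark output" % keyword)
```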
In an optional embodiment, the method further comprises: monitoring an error information reporting module to obtain a monitoring result, and outputting the monitoring result;
wherein the error information reporting module is used for detecting error information generated during execution of the method.
In an optional embodiment, the method further comprises:
acquiring attribute information of the device to be tested;
calculating a theoretical performance peak value corresponding to the test performance according to the attribute information;
and comparing the theoretical performance peak value with the actually measured performance value, and judging whether the test of the test performance is successful according to the comparison result.
Further, in testing the computing performance of the acceleration processor, the model of the tested acceleration processor, its compute-unit parameters and other attribute information are first obtained, and the theoretical precision performance is generated from this attribute information. The theoretical performance peak value is the product of the number of chips of the acceleration processor, the Boost clock frequency of the acceleration processor, the number of cores for the specified precision, and the number of floating-point calculations that can be processed in a single clock cycle of the acceleration processor. The measured efficiency is then obtained from the theoretical performance peak value and the previously obtained measured performance value, where measured efficiency = measured performance value / theoretical performance peak value. If the measured efficiency is higher than the expected value, the test passes; if the measured efficiency is lower than the expected value, the test fails.
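A minimal sketch of this pass/fail check is given below; the variable names and the default expected-efficiency threshold are illustrative assumptions.

```python
# Minimal sketch: theoretical peak = chips x Boost frequency (GHz) x cores for the
# tested precision x floating-point operations per core per clock cycle (result in Gflops);
# measured efficiency = measured performance / theoretical peak.
def check_performance(measured_gflops, num_chips, boost_ghz,
                      cores_per_chip, flops_per_cycle, expected_efficiency=0.9):
    theoretical_peak_gflops = num_chips * boost_ghz * cores_per_chip * flops_per_cycle
    efficiency = measured_gflops / theoretical_peak_gflops
    return theoretical_peak_gflops, efficiency, efficiency >= expected_efficiency
```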
In an optional embodiment, before the converting the parameter range table into a parameter set according to the test requirement, the method further comprises:
and adding parameter names and corresponding parameter ranges necessary for the benchmark test tool to the parameter range table according to the test requirements.
According to the performance testing method based on the benchmark testing tool described above, test information of different test types is automatically converted into corresponding benchmark test cases. On the one hand, the tester can specify the test coverage of all parameters of the benchmark testing tool; on the other hand, the tester can focus on designing the test coverage data instead of spending a great deal of time and energy on manually combining and entering parameters. In addition, by automatically acquiring the test information, automatically generating the corresponding parameter sets and automatically generating the comparison of test results, the method saves testing time and human resources and improves test quality.
Example two
The embodiment provides a performance testing method based on a benchmark testing tool, which comprises the following steps:
firstly, according to the precision type supported by the device to be tested, such as double-precision floating point, single-precision floating point and the like, and the specification of the internal memory size and the like of the device to be tested, the following parameter ranges and corresponding parameter names are obtained:
the parameter ranges of m and k of the matrix op (A) and the parameter ranges of k and n of the matrix op (B) are selected from the range of 10000 to 13000 for testing, for example, the size of an internal memory of a device to be tested is 16 GB; the matrix calculation precision parameter ranges are double-precision floating point (abbreviated as d), single-precision floating point (abbreviated as s), integer INT8 and the like; the values of alpha and beta are scalar quantities, and can be values of 1, 2 and the like.
Then, according to the parameter ranges and test requirements supported by the device to be tested, a parameter range table selected in the actual test is generated, such as table one:
watch 1
Figure BDA0002826444280000091
The parameter name transposeA with the corresponding parameter range N indicates that matrix A is not transposed, and the parameter name transposeB with the corresponding parameter range T indicates that matrix B is transposed.
Then, according to the test requirements of the selected benchmark testing tool, such as rocblas-bench, the parameter range table is supplemented with the necessary parameters, such as the leading dimension parameters of the matrices (lda, ldb, ldc, etc.), and an updated parameter range table is obtained, as shown in Table 2:
watch two
Figure BDA0002826444280000092
Then, according to the test requirements of the selected benchmark testing tool, the data in the updated parameter range table are converted into the parameter set that the testing tool can recognize. Specifically, the necessary format conversion is performed on the individual parameter names so that the benchmark testing tool can accept them.
A parameter name consisting of a single letter is prefixed with a single dash "-", and a parameter name of two or more letters is prefixed with a double dash "--"; special parameter names are renamed, for example the function parameter is converted to "-f" and the matrix precision parameter is converted to "-r"; parameters required by the benchmark testing tool itself, such as the number of test iterations ("-i"), are added. The resulting parameter set is shown in Table 3 (a minimal sketch of this conversion follows Table 3):
watch III
Figure BDA0002826444280000101
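A minimal sketch of the name conversion described before Table 3 is given below; the special-name mapping and the default iteration count are assumptions for illustration.

```python
# Minimal sketch: prefix single-letter parameter names with "-", longer names with "--",
# rename special parameters, and add parameters required by the benchmark tool itself.
# NOTE: the special-name mapping and the default iteration count are assumptions.
SPECIAL_NAMES = {"function": "-f", "precision": "-r"}

def to_cli_name(name):
    if name in SPECIAL_NAMES:
        return SPECIAL_NAMES[name]
    return ("-" if len(name) == 1 else "--") + name

def convert_row(row, iterations=10):
    converted = {to_cli_name(k): v for k, v in row.items()}
    converted.setdefault("-i", str(iterations))   # iteration count required by the tool
    return converted
```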
Then, a benchmark test is started according to the test parameter set, and the test results are extracted.
The parameter values within the parameter ranges corresponding to the parameter names in Table 3 are enumerated one by one, and the corresponding parameter values are combined for all function types. Taking the combination of the lda, m and r parameters in Table 3 as an example, and assuming that the lda parameter value equals the m parameter value, a test parameter table is obtained, as shown in Table 4.
Table 4
(the table content is shown as an image in the original publication)
Then, a final parameter list that the benchmark testing tool can execute is obtained in the format "parameter name 1, parameter value 1, parameter name 2, parameter value 2, ...", as shown in Table 5:
watch five
Figure BDA0002826444280000103
The order of the entries in Table 5 is the order in which the tests are executed.
Finally, the benchmark testing tool is started and the test results for the cases in Table 5 are extracted.
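As an illustration, assembling and running one test case in the "parameter name, parameter value, ..." order of Table 5 could look like the sketch below; the executable name and the exact flag spellings are assumptions.

```python
# Minimal sketch: build the command line for one test case and run the benchmark tool,
# returning its raw output so it can later be parsed with the "rocblas-Gflops" keyword.
import subprocess

def run_case(params, executable="rocblas-bench"):
    # params example (assumed): {"-f": "gemm", "-r": "f64_r", "--transposeA": "N", "-m": "10240"}
    cmd = [executable]
    for name, value in params.items():
        cmd.extend([name, str(value)])
    result = subprocess.run(cmd, capture_output=True, text=True, check=False)
    return result.stdout
```

The cases are executed one by one in the order listed in Table 5, and each output is parsed to obtain the measured performance value.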
Example three
Referring to fig. 2, the present embodiment provides a performance batch test system 200 based on a benchmark test tool, which includes:
a first obtaining module 201 configured to obtain test information of a device under test, where the test information includes: all parameter ranges and corresponding parameter names of the device to be tested;
a generating module 202 configured to generate a parameter range table according to the test information, so as to correspond all parameter ranges to corresponding parameter names one to one;
the second obtaining module 203 is configured to obtain the test requirements of the parameter range according to the parameter name and the type of the selected benchmark test tool;
a conversion module 204 configured to convert the parameter range table into a parameter set according to the test requirement, the parameter set including a data table recognizable by the benchmark test tool;
a testing module 205 configured to benchmark all parameter sets by the benchmark testing tool to obtain a testing result.
In an optional embodiment, different parameter names correspond to different test performances, and the test result includes key bytes corresponding to the different test performances;
the system further comprises: and the analysis module is configured to analyze the test result according to the key byte and output a measured performance value of the test performance corresponding to the key byte.
In an alternative embodiment, the first obtaining module includes:
the obtaining submodule is configured to obtain the model of the device to be tested and generate the test information according to the model; or/and
the test system comprises a first acquisition module configured to receive test information input by a tester.
In an alternative embodiment, the system further comprises:
an error information reporting module configured to detect error information generated during operation of the system;
and a monitoring module configured to monitor the error information reporting module to obtain a monitoring result and output the monitoring result.
In an alternative embodiment, the system further comprises:
a third obtaining module configured to obtain attribute information of the device under test, where the attribute information includes the model of the device to be tested;
The calculation module is configured to calculate a theoretical performance peak value corresponding to the test performance according to the attribute information;
and the comparison module is configured to compare the theoretical performance peak value with the actually-measured performance value and judge whether the test of the test performance is successful according to a comparison result.
In an alternative embodiment, the system further comprises:
a supplementary module configured to add parameter names and corresponding parameter ranges necessary for the benchmark test tool to the parameter range table according to the test requirements before converting the parameter range table into parameter sets according to the test requirements.
In an optional embodiment, the conversion module is further configured to modify a format of a parameter name in the parameter range table according to the test requirement to obtain the parameter set.
The above description is only for the specific embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (14)

1. A performance testing method based on a benchmark testing tool is characterized by comprising the following steps:
acquiring test information of a device to be tested, wherein the test information comprises: all parameter ranges and corresponding parameter names of the device to be tested;
generating a parameter range table according to the test information so as to enable all parameter ranges to correspond to corresponding parameter names one by one;
acquiring a test requirement of a parameter range according to the parameter name and the type of the selected benchmark test tool;
converting the parameter range table into a parameter set according to the test requirements, wherein the parameter set comprises a data table which can be identified by the benchmark test tool;
and performing benchmark test on all the parameter sets through the benchmark test tool to obtain a test result.
2. The benchmark-based performance testing method of claim 1, wherein different parameter names correspond to different testing performances, and the testing result includes key bytes corresponding to the different testing performances;
the method further comprises the following steps: and analyzing the test result according to the key byte, and outputting an actual measurement performance value of the test performance corresponding to the key byte.
3. The benchmark-based performance testing method of claim 1, wherein the obtaining of the testing information of the device under test comprises:
obtaining the model of the device to be tested, and generating the test information according to the model; or/and
test information input by a tester is received.
4. The benchmark-based performance testing method of claim 1, further comprising: monitoring an error information reporting module to obtain a monitoring result, and outputting the monitoring result;
wherein the error information reporting module is used for detecting error information generated during execution of the method.
5. The benchmark-based performance testing method of claim 2, further comprising:
acquiring attribute information of the device to be tested;
calculating a theoretical performance peak value corresponding to the test performance according to the attribute information;
and comparing the theoretical performance peak value with the actually measured performance value, and judging whether the test of the test performance is successful according to the comparison result.
6. The benchmarking tool-based performance testing method of any one of claims 1-5, wherein prior to converting the parameter range table into parameter sets according to the testing requirements, the method further comprises:
and adding parameter names and corresponding parameter ranges necessary for the benchmark test tool to the parameter range table according to the test requirements.
7. The benchmark tool-based performance testing method of claim 1, wherein converting the parameter range table into a parameter set according to the testing requirements comprises:
and according to the test requirement, modifying the format of the parameter name in the parameter range table to obtain the parameter set.
8. A benchmark tool based performance batch testing system, comprising:
a first obtaining module configured to obtain test information of a device under test, the test information including: all parameter ranges and corresponding parameter names of the device to be tested;
the generating module is configured to generate a parameter range table according to the test information so as to enable all parameter ranges to correspond to corresponding parameter names one by one;
the second acquisition module is configured to acquire the test requirements of the parameter range according to the parameter name and the type of the selected benchmark test tool;
a conversion module configured to convert the parameter range table into a parameter set according to the test requirements, the parameter set including a data table recognizable by the benchmark test tool;
a test module configured to benchmark all the parameter sets by the benchmark test tool to obtain a test result.
9. The benchmark tool-based performance batch test system of claim 8, wherein different parameter names correspond to different test performances, and the test result comprises key bytes corresponding to different test performances;
the system further comprises: and the analysis module is configured to analyze the test result according to the key byte and output a measured performance value of the test performance corresponding to the key byte.
10. The benchmark tool-based performance batch test system of claim 8, wherein the first acquisition module comprises:
the obtaining submodule is configured to obtain the model of the device to be tested and generate the test information according to the model; or/and
the test system comprises a first acquisition module configured to receive test information input by a tester.
11. The benchmark tool-based performance batch test system of claim 8, further comprising:
an error information reporting module configured to detect error information generated during operation of the system;
and a monitoring module configured to monitor the error information reporting module to obtain a monitoring result and output the monitoring result.
12. The benchmark tool-based performance batch test system of claim 9, further comprising:
a third obtaining module configured to obtain attribute information of the device under test, where the attribute information includes the model of the device to be tested;
The calculation module is configured to calculate a theoretical performance peak value corresponding to the test performance according to the attribute information;
and the comparison module is configured to compare the theoretical performance peak value with the actually-measured performance value and judge whether the test of the test performance is successful according to a comparison result.
13. The benchmark tool-based performance batch testing system according to any of claims 8 to 12, further comprising:
a supplementary module configured to add parameter names and corresponding parameter ranges necessary for the benchmark test tool to the parameter range table according to the test requirements before converting the parameter range table into parameter sets according to the test requirements.
14. The benchmark tool-based performance batch testing system of claim 8, wherein the transformation module is further configured to modify a format of parameter names in the parameter range table to obtain the parameter set according to the testing requirements.
CN202011431585.4A 2020-12-09 2020-12-09 Performance test method and system based on benchmark test tool Active CN112416691B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011431585.4A CN112416691B (en) 2020-12-09 2020-12-09 Performance test method and system based on benchmark test tool

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011431585.4A CN112416691B (en) 2020-12-09 2020-12-09 Performance test method and system based on benchmark test tool

Publications (2)

Publication Number Publication Date
CN112416691A (en) 2021-02-26
CN112416691B (en) 2023-07-21

Family

ID=74774991

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011431585.4A Active CN112416691B (en) 2020-12-09 2020-12-09 Performance test method and system based on benchmark test tool

Country Status (1)

Country Link
CN (1) CN112416691B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013018376A1 (en) * 2011-08-04 2013-02-07 日本電気株式会社 System parameter settings assist system, data processing method for system parameter settings assist device, and program
CN106708676A (en) * 2016-12-01 2017-05-24 广州酷狗计算机科技有限公司 Interface test method and apparatus
CN107943689A (en) * 2017-11-16 2018-04-20 北京卫星信息工程研究所 Automated testing method and test system based on parametrization test script
CN109917268A (en) * 2019-01-23 2019-06-21 成都芯源系统有限公司 Test system and test method of voltage stabilizer
CN111400186A (en) * 2020-03-19 2020-07-10 时时同云科技(成都)有限责任公司 Performance test method and system
CN111475402A (en) * 2020-03-17 2020-07-31 中国平安人寿保险股份有限公司 Program function testing method and related device
CN111752834A (en) * 2020-06-24 2020-10-09 京东数字科技控股有限公司 Automatic testing method and device

Also Published As

Publication number Publication date
CN112416691B (en) 2023-07-21

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant