CN111124853B - Cloud desktop scale evaluation system and method based on CPU performance - Google Patents


Info

Publication number
CN111124853B
Authority
CN
China
Prior art keywords
cloud desktop
cloud
performance
test
cpu
Prior art date
Legal status
Active
Application number
CN201911155091.5A
Other languages
Chinese (zh)
Other versions
CN111124853A (en)
Inventor
张辉
鲍豹
Current Assignee
Fujian Centerm Information Co Ltd
Original Assignee
Fujian Centerm Information Co Ltd
Priority date
Filing date
Publication date
Application filed by Fujian Centerm Information Co Ltd
Priority to CN201911155091.5A
Publication of CN111124853A
Application granted
Publication of CN111124853B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/30: Monitoring
    • G06F 11/34: Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3409: Recording or statistical evaluation for performance assessment
    • G06F 11/3452: Performance evaluation by statistical analysis
    • G06F 2201/00: Indexing scheme relating to error detection, to error correction, and to monitoring
    • G06F 2201/815: Virtual

Abstract

The invention provides a cloud desktop scale evaluation system based on CPU performance, in the technical field of cloud desktops. The system comprises a server side, a cloud desktop management platform, and a plurality of cloud desktops, the server side being used to control the cloud desktop management platform to create the cloud desktops and to analyze their concurrent performance data. The server side comprises a concurrency controller, which schedules the cloud desktop management platform to create the cloud desktops and schedules the cloud desktops to perform a CPU benchmark test, and an analyzer, which analyzes the data uploaded by the concurrency controller. One end of the concurrency controller is connected to the analyzer; the other end is connected to the cloud desktop management platform and the cloud desktops respectively. The invention further provides a cloud desktop scale evaluation method based on CPU performance. The advantage of the invention is that the accuracy of cloud desktop scale evaluation is greatly improved.

Description

Cloud desktop scale evaluation system and method based on CPU performance
Technical Field
The invention relates to the technical field of cloud desktops, in particular to a cloud desktop scale evaluation system and method based on CPU performance.
Background
The cloud desktop, also called desktop virtualization or cloud computer, is a new model that replaces the traditional computer. With a cloud desktop, a user no longer needs to purchase a computer host; all the components a host contains, such as the CPU (central processing unit), memory, hard disk, and network card, are virtualized on a back-end server. Evaluating the scale of a cloud desktop deployment helps people plan and design the cloud desktop configuration.
Traditionally, cloud desktop scale evaluation uses one of two methods. Method one: estimate from experience the maximum number of cloud desktops (office users) the host can bear; for example, if experience suggests that 1 vCPU (virtual processor) can bear 1.2 cloud desktops, the maximum number of cloud desktops the host can bear is extrapolated from that ratio. Method two: evaluate the cloud desktop scale from performance indexes, i.e., simulate user load by invoking application programs, measure the trend of application response time as the concurrency count grows, and take the maximum concurrency count as the maximum number of cloud desktops the host can bear.
However, the conventional methods have the following disadvantages. Method one evaluates from experience, so the cloud desktop scale evaluation result is not accurate enough. Method two evaluates from performance indexes, but the application programs available for simulation are limited: LoginVSI, for example, only simulates Notepad and a zip compression program by default, and supporting a new application type requires writing custom simulation scripts, which demands some programming ability from the user. Moreover, simulating user load by invoking applications can only cover operations such as starting the application and opening a file; it cannot simulate the user's activity after the application has started, so the scale evaluation result is again not accurate enough.
Disclosure of Invention
One of the technical problems to be solved by the invention is to provide a cloud desktop scale evaluation system based on CPU performance, so that the accuracy of cloud desktop scale evaluation is improved.
The invention solves the first technical problem as follows: a cloud desktop scale evaluation system based on CPU performance comprises a server side, a cloud desktop management platform, and a plurality of cloud desktops, the server side being used to control the cloud desktop management platform to create the cloud desktops and to analyze their concurrent performance data; the server side comprises a concurrency controller, which schedules the cloud desktop management platform to create the cloud desktops and schedules the cloud desktops to perform the CPU benchmark test, and an analyzer, which analyzes the data uploaded by the concurrency controller;
one end of the concurrency controller is connected with the analyzer, and the other end of the concurrency controller is connected with the cloud desktop management platform and the cloud desktop respectively.
Further, each cloud desktop is provided with a load simulator, which schedules a CPU benchmark test program to test that cloud desktop.
The second technical problem to be solved by the present invention is to provide a cloud desktop scale evaluation method based on CPU performance, so as to improve the accuracy of cloud desktop scale evaluation.
The invention solves the second technical problem as follows: a cloud desktop scale evaluation method based on CPU performance, using the above evaluation system, comprises the following steps:
step S10, the concurrency controller controls the cloud desktop management platform to create a cloud desktop;
step S20, the concurrency controller controls each cloud desktop to execute a CPU benchmark test program and generate a test result;
step S30, each cloud desktop sends the test result to an analyzer through a concurrency controller;
and step S40, the analyzer performs cloud desktop scale evaluation according to the received test result.
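The loop formed by steps S10 to S40 can be sketched as follows. This is an illustrative sketch only; the three callables (`create_desktops`, `run_benchmark_on_all`, `analyze`) are hypothetical placeholders standing in for the concurrency controller, the load simulators, and the analyzer, and are not an API defined by the patent:

```python
# Illustrative sketch of the S10-S40 loop; the three callables are
# hypothetical placeholders, not an API defined by the patent.

def evaluate_scale(create_desktops, run_benchmark_on_all, analyze,
                   start=1, step=1, max_desktops=100):
    """Grow the desktop count until the analyzer finds a performance knee.

    create_desktops(n):      S10, have the platform create n desktops.
    run_benchmark_on_all(n): S20/S30, run the benchmark, return results.
    analyze(results):        S40, return the knee concurrency, or None.
    """
    n = start
    while n <= max_desktops:
        create_desktops(n)
        results = run_benchmark_on_all(n)
        knee = analyze(results)
        if knee is not None:
            return knee  # maximum cloud desktop concurrency scale
        n += step
    return None  # no knee found within the tested range
```

The driver simply grows the concurrency count until the analyzer reports an inflection point, mirroring the "increase the creation number and return to S20" branch of step S46.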
Further, the step S10 specifically includes:
step S11, the concurrency controller imports the cloud desktop template deployed on the load simulator into the cloud desktop management platform;
Step S12, the concurrency controller sends a command to create a cloud desktop to the cloud desktop management platform, and after receiving the command, the cloud desktop management platform creates the cloud desktop according to the imported cloud desktop template.
Further, the step S20 specifically includes:
step S21, the cloud desktop management platform logs in each cloud desktop, and after each cloud desktop has logged in, its load simulator sends the login result to the concurrency controller;
step S22, the concurrency controller judges whether the number of received login results equals the number of created cloud desktops; if so, step S23 is executed; if not, return to step S21;
and step S23, the concurrency controller concurrently directs all the load simulators to execute the CPU benchmark test program the set number of times, records the end time and the score of each test, and generates the test result from the end times and scores.
Further, the step S30 is specifically:
and each load simulator sends the test result to a concurrency controller, and the concurrency controller sends the received test result to an analyzer.
Further, the step S40 specifically includes:
step S41, the analyzer receives the test result;
step S42, the analyzer screens the valid data in the test result;
step S43, the analyzer calculates a first performance score of each cloud desktop from the valid data;
step S44, the analyzer calculates a second performance score of the host from the first performance scores of the cloud desktops;
step S45, plotting a graph of the second performance score;
step S46, judging whether the graph has a performance degradation inflection point; if not, increase the number of cloud desktops to create and return to step S20; if so, the cloud desktop concurrency count corresponding to the performance degradation inflection point is the maximum cloud desktop concurrency scale of the host.
Further, the step S42 is specifically:
the analyzer selects the cloud desktop which finishes the test earliest as a reference, and the end time of the last test of the cloud desktop which finishes the test earliest is recorded as t 0 And all test end time in other cloud desktops exceeds t 0 The data in the data list are invalid data, and the invalid data are removed to obtain valid data.
Further, in step S43, the first performance score is specifically:
C = (x_1 · x_2 · … · x_n)^(1/n)

where C denotes the first performance score, n the number of CPU benchmark test runs, and x_i (i = 1, …, n) the score of the i-th CPU benchmark test in the valid data.
Further, in step S44, the second performance score is specifically:
HC = (C_1 · C_2 · … · C_m)^(1/m)

where HC denotes the second performance score, m the number of cloud desktops, and C_j (j = 1, …, m) the first performance score of the j-th cloud desktop.
The invention has the advantages that:
the concurrent controller controls each cloud desktop to execute the CPU benchmark test program for testing, the CPU benchmark test program covers various work tasks reflecting real application scenes, and the cloud desktop scale evaluation is closer to the actual application scenes of users than the cloud desktop scale evaluation based on performance indexes, compared with the traditional experience evaluation method and performance index evaluation method, the accuracy of cloud desktop scale evaluation is greatly improved, and the users can easily evaluate the cloud desktop scale without programming capacity, so that the evaluation difficulty is reduced; by means of a huge database of a CPU benchmark test program, the computing capability of the cloud desktop can be aligned with that of a mainstream PC platform, the computing capability of the cloud desktop is expressed more intuitively, and scheme designers are helped to rapidly plan and design the configuration of the cloud desktop.
Drawings
The invention will be further described with reference to the following examples and figures.
Fig. 1 is a logic architecture diagram of a cloud desktop scale evaluation system based on CPU performance according to the present invention.
FIG. 2 is a flowchart of a cloud desktop scale assessment method based on CPU performance according to the present invention.
Detailed Description
Referring to fig. 1 to 2, a preferred embodiment of the cloud desktop scale evaluation system based on CPU performance according to the present invention includes a server side for controlling a cloud desktop management platform to create cloud desktops and analyzing their concurrent performance data, the cloud desktop management platform for creating the cloud desktops, and a plurality of cloud desktops; the server side comprises a concurrency controller, which schedules the cloud desktop management platform to create the cloud desktops and schedules the cloud desktops to perform the CPU benchmark test, and an analyzer, which analyzes the data uploaded by the concurrency controller;
the concurrency controller sequentially and incrementally creates the cloud desktops and starts the cloud desktops, concurrently schedules all the load simulators in the created cloud desktops to execute the CPU benchmark test programs, and sends the test results to the analyzer;
the analyzer receives the test result, performs statistical analysis and outputs the concurrent performance data of the cloud desktop, and performs benchmarking with the mainstream PC platform to output the performance grade of the cloud desktop;
one end of the concurrency controller is connected with the analyzer, and the other end of the concurrency controller is connected with the cloud desktop management platform and the cloud desktop respectively.
Each cloud desktop is provided with a load simulator, which schedules a CPU benchmark test program to test that cloud desktop.
The invention discloses a preferred embodiment of a cloud desktop scale evaluation method based on CPU performance, which comprises the following steps:
step S10, the concurrency controller controls the cloud desktop management platform to create a cloud desktop;
step S20, the concurrency controller controls each cloud desktop to execute a CPU benchmark test program and generate a test result;
the CPU Benchmark test program (Benchmark) is used for measuring the highest actual running performance of hardware of a machine and the performance improvement effect of software optimization and can be divided into a micro Benchmark test program and a macro Benchmark test program; micro-benchmarking programs are used to measure a particular aspect of a computer system, such as CPU fixed/floating point performance, memory speed, I/O speed, network speed, or system software performance (e.g., synchronization performance); the macro-benchmark test program is used for measuring the overall performance of a computer system or the universality of the optimization method, and different applications such as a Web service program, a data processing program and a scientific and engineering calculation program can be selected;
step S30, each load simulator sends the test result to a concurrency controller, and the concurrency controller sends the received test result to an analyzer;
and step S40, the analyzer performs cloud desktop scale evaluation according to the received test result.
The step S10 specifically includes:
step S11, the concurrency controller imports the cloud desktop template deployed on the load simulator into the cloud desktop management platform; the cloud desktop template allows the cloud desktop management platform to create identically configured cloud desktops in batches;
step S12, the concurrency controller sends a command to create a cloud desktop to the cloud desktop management platform, and after receiving the command, the cloud desktop management platform creates the cloud desktop according to the imported cloud desktop template.
The step S20 specifically includes:
step S21, the cloud desktop management platform logs in each cloud desktop, and after each cloud desktop has logged in, its load simulator actively sends the login result to the concurrency controller;
step S22, the concurrency controller judges whether the number of received login results equals the number of created cloud desktops; if so, all created cloud desktops have logged in, and step S23 is executed; if not, some created cloud desktops have not yet logged in, and the flow returns to step S21;
step S23, the concurrency controller concurrently directs all the load simulators to execute the CPU benchmark test program the set number of times, records the end time and the score of each test, and generates the test result from the end times and scores; the CPU benchmark test program is preferably Geekbench.
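The concurrent scheduling of step S23 can be sketched as below. Assumptions: each load simulator is modeled as a callable returning a benchmark score, and thread-based concurrency stands in for driving many remote desktops at once; in the patent the simulators run inside the cloud desktops themselves:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_simulators(simulators, runs=3):
    """Run each load simulator's benchmark `runs` times concurrently,
    recording (end_time, score) pairs per desktop, as in step S23.
    `simulators` maps a desktop id to a score-returning callable."""
    def run_one(desktop_id, bench):
        results = []
        for _ in range(runs):
            score = bench()
            results.append((time.time(), score))  # end time and score
        return desktop_id, results

    with ThreadPoolExecutor(max_workers=len(simulators)) as pool:
        futures = [pool.submit(run_one, d, b) for d, b in simulators.items()]
        return dict(f.result() for f in futures)
```

The returned mapping is the "test result" that the concurrency controller forwards to the analyzer in step S30.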
The step S40 specifically includes:
step S41, the analyzer receives the test result;
step S42, the analyzer screens the valid data in the test result;
step S43, the analyzer calculates a first performance score of each cloud desktop from the valid data;
step S44, the analyzer calculates a second performance score of the host from the first performance scores of the cloud desktops;
step S45, plotting a graph of the second performance score against the cloud desktop concurrency count;
step S46, judging whether the graph has a performance degradation inflection point; if not, increase the number of cloud desktops to create and return to step S20; if so, the cloud desktop concurrency count corresponding to the inflection point is the host's maximum cloud desktop concurrency scale, which can then be benchmarked against mainstream PC platforms using the database of the CPU benchmark test program, helping cloud desktop solution designers rapidly plan the optimal number of concurrent desktops for each user type.
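One simple way to detect the performance degradation inflection point of step S46 might look like the following; the 10% relative-drop threshold is an assumed parameter, not specified by the patent:

```python
def find_knee(hc_by_count, drop=0.10):
    """Return the first concurrency count whose host score HC falls more
    than `drop` (10% here, an assumed threshold) below the previous
    count's score; return None if no inflection point exists.
    `hc_by_count` maps concurrency count -> second performance score HC."""
    counts = sorted(hc_by_count)
    for prev, cur in zip(counts, counts[1:]):
        if hc_by_count[cur] < hc_by_count[prev] * (1.0 - drop):
            return cur  # concurrency at which performance degrades
    return None
```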
The step S42 specifically includes:
the analyzer selects the cloud desktop which finishes the test earliest as a reference, and the end time of the last test of the cloud desktop which finishes the test earliest is recorded as t 0 Other cloud tablesAll test end times in a plane exceed t 0 The data in the data list are invalid data, and the invalid data are removed to obtain valid data.
In step S43, the first performance score is specifically:
C = (x_1 · x_2 · … · x_n)^(1/n)

where C denotes the first performance score, n the number of CPU benchmark tests, and x_i (i = 1, …, n) the score of the i-th CPU benchmark test in the valid data; that is, the benchmark scores are geometrically averaged, which reflects the performance of the cloud desktop more objectively.
In step S44, the second performance score specifically includes:
HC = (C_1 · C_2 · … · C_m)^(1/m)

where HC denotes the second performance score, m the number of cloud desktops, and C_j (j = 1, …, m) the first performance score of the j-th cloud desktop; that is, the per-desktop performance scores are geometrically averaged, which reflects the performance of the host more objectively.
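The two geometric-mean scores of steps S43 and S44 can be computed directly (a minimal sketch; `math.prod` requires Python 3.8+):

```python
import math

def desktop_score(scores):
    """First performance score C (step S43): geometric mean of a
    desktop's n benchmark scores from the valid data."""
    n = len(scores)
    return math.prod(scores) ** (1.0 / n)

def host_score(desktop_scores):
    """Second performance score HC (step S44): geometric mean of the
    m per-desktop first performance scores."""
    m = len(desktop_scores)
    return math.prod(desktop_scores) ** (1.0 / m)
```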
In summary, the invention has the advantages that:
the concurrent controller controls each cloud desktop to execute the CPU benchmark test program for testing, the CPU benchmark test program covers various work tasks reflecting real application scenes, and the cloud desktop scale evaluation is closer to the actual application scenes of users than the cloud desktop scale evaluation based on performance indexes, compared with the traditional experience evaluation method and performance index evaluation method, the accuracy of cloud desktop scale evaluation is greatly improved, and the users can easily evaluate the cloud desktop scale without programming capacity, so that the evaluation difficulty is reduced; by means of a huge database of a CPU benchmark test program, the computing capability of the cloud desktop can be aligned with that of a mainstream PC platform, the computing capability of the cloud desktop is expressed more intuitively, and scheme designers are helped to rapidly plan and design the configuration of the cloud desktop.
Although specific embodiments of the invention have been described above, it will be understood by those skilled in the art that the specific embodiments described are illustrative only and are not limiting upon the scope of the invention, and that equivalent modifications and variations can be made by those skilled in the art without departing from the spirit of the invention, which is to be limited only by the appended claims.

Claims (5)

1. A cloud desktop scale evaluation method based on CPU performance, characterized in that: the method uses a cloud desktop scale evaluation system based on CPU performance, the system comprising a server side, a cloud desktop management platform, and a plurality of cloud desktops, the server side being used to control the cloud desktop management platform to create the cloud desktops and to analyze their concurrent performance data; the server side comprises a concurrency controller, which schedules the cloud desktop management platform to create the cloud desktops and schedules the cloud desktops to perform the CPU benchmark test, and an analyzer, which analyzes the data uploaded by the concurrency controller;
one end of the concurrency controller is connected with the analyzer, and the other end of the concurrency controller is respectively connected with the cloud desktop management platform and the cloud desktop;
the cloud desktop is provided with a load simulator for scheduling a CPU benchmark test program to test the cloud desktop;
the method comprises the following steps:
step S10, the concurrency controller controls the cloud desktop management platform to create a cloud desktop;
step S20, the concurrency controller controls each cloud desktop to execute a CPU benchmark test program and generate a test result;
step S30, each cloud desktop sends the test result to an analyzer through a concurrency controller;
step S40, the analyzer performs cloud desktop scale evaluation according to the received test result;
the step S20 specifically includes:
step S21, the cloud desktop management platform logs in each cloud desktop, and after each cloud desktop has logged in, its load simulator sends the login result to the concurrency controller;
step S22, the concurrency controller judges whether the number of the received login results is equal to the number of the created cloud desktops, if so, the step S23 is executed; if not, go to step S21;
step S23, the concurrency controller concurrently directs all the load simulators to execute the CPU benchmark test program the set number of times, records the end time and the score of each test, and generates the test result from the end times and scores;
the step S40 specifically includes:
step S41, the analyzer receives the test result;
step S42, the analyzer screens the valid data in the test result;
step S43, the analyzer calculates a first performance score of each cloud desktop from the valid data;
the first performance score is specifically:
C = (x_1 · x_2 · … · x_n)^(1/n)

where C denotes the first performance score, n the number of CPU benchmark test runs, and x_i (i = 1, …, n) the score of the i-th CPU benchmark test in the valid data;
step S44, the analyzer calculates a second performance score of the host from the first performance scores of the cloud desktops;
step S45, plotting a graph of the second performance score;
step S46, judging whether the graph has a performance degradation inflection point; if not, increase the number of cloud desktops to create and return to step S20; if so, the cloud desktop concurrency count corresponding to the performance degradation inflection point is the maximum cloud desktop concurrency scale of the host.
2. The cloud desktop scale assessment method based on CPU performance according to claim 1, characterized in that: the step S10 specifically includes:
step S11, the concurrency controller leads a cloud desktop template deployed on the load simulator into a cloud desktop management platform;
step S12, the concurrence controller sends a command for creating a cloud desktop to the cloud desktop management platform, and after receiving the command, the cloud desktop management platform creates the cloud desktop according to the imported cloud desktop template.
3. The cloud desktop scale assessment method based on CPU performance according to claim 1, characterized in that: the step S30 specifically includes:
and each load simulator sends the test result to a concurrency controller, and the concurrency controller sends the received test result to an analyzer.
4. The cloud desktop scale assessment method based on CPU performance according to claim 1, characterized in that: the step S42 specifically includes:
the analyzer selects the cloud desktop which finishes the test earliest as a reference, and the end time of the last test of the cloud desktop which finishes the test earliest is recorded as t 0 And all test ending time in other cloud desktops exceeds t 0 The data in the data list are invalid data, and the invalid data are removed to obtain valid data.
5. The cloud desktop size evaluation method based on CPU performance of claim 1, wherein: in step S44, the second performance score specifically includes:
HC = (C_1 · C_2 · … · C_m)^(1/m)

where HC denotes the second performance score, m the number of cloud desktops, and C_j (j = 1, …, m) the first performance score of the j-th cloud desktop.
CN201911155091.5A 2019-11-22 2019-11-22 Cloud desktop scale evaluation system and method based on CPU performance Active CN111124853B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911155091.5A CN111124853B (en) 2019-11-22 2019-11-22 Cloud desktop scale evaluation system and method based on CPU performance

Publications (2)

Publication Number Publication Date
CN111124853A CN111124853A (en) 2020-05-08
CN111124853B (en) 2022-08-23

Family

ID=70496397

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911155091.5A Active CN111124853B (en) 2019-11-22 2019-11-22 Cloud desktop scale evaluation system and method based on CPU performance

Country Status (1)

Country Link
CN (1) CN111124853B (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8819171B2 (en) * 2011-09-07 2014-08-26 Microsoft Corporation Monitoring and benchmarking client performance from the server-side
CN105446846B (en) * 2015-11-30 2018-07-10 中电科华云信息技术有限公司 Performance test methods based on cloud desktop
CN106850330B (en) * 2016-12-09 2021-02-09 中电科华云信息技术有限公司 Intelligent cloud desktop performance test system and method
CN107547261B (en) * 2017-07-24 2020-10-27 华为技术有限公司 Cloud platform performance test method and device

Also Published As

Publication number Publication date
CN111124853A (en) 2020-05-08

Similar Documents

Publication Publication Date Title
US10740208B2 (en) Cloud infrastructure optimization
US9785454B2 (en) Virtual session benchmarking tool for measuring performance and/or scalability of centralized desktop environments
US8145456B2 (en) Optimizing a prediction of resource usage of an application in a virtual environment
CN109460348B (en) Pressure measurement method and device of game server
US20060288149A1 (en) Generating static performance modeling factors in a deployed system
CN105630575A (en) Performance evaluation method aiming at KVM virtualization server
Verma et al. Profiling and evaluating hardware choices for MapReduce environments: An application-aware approach
Manotas et al. Investigating the impacts of web servers on web application energy usage
US20180314774A1 (en) System Performance Measurement of Stochastic Workloads
CN111124853B (en) Cloud desktop scale evaluation system and method based on CPU performance
US9081605B2 (en) Conflicting sub-process identification method, apparatus and computer program
CN114282686A (en) Method and system for constructing machine learning modeling process
Chen et al. A methodology for understanding mapreduce performance under diverse workloads
JP4843379B2 (en) Computer system development program
Vedam et al. Demystifying cloud benchmarking paradigm-an in depth view
Hauck et al. Ginpex: deriving performance-relevant infrastructure properties through goal-oriented experiments
Gao et al. An exploratory study on assessing the impact of environment variations on the results of load tests
Nambiar et al. Model driven software performance engineering: Current challenges and way ahead
US11526828B2 (en) Calculating developer time during development process
Tarvo et al. Using computer simulation to predict the performance of multithreaded programs
Mitchell et al. On-the-fly capacity planning
Bell Objectives and problems in simulating computers
Hamed et al. Performance Prediction of Web Based Application Architectures Case Study: .NET vs. Java EE
Bögelsack et al. Performance Overhead of Paravirtualization on an exemplary ERP system
CN116737554B (en) Intelligent analysis processing system and method based on big data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant