CN115373675A - Method, apparatus, device, storage medium and program product for assisting performance optimization

Method, apparatus, device, storage medium and program product for assisting performance optimization

Info

Publication number
CN115373675A
CN115373675A
Authority
CN
China
Prior art keywords
job
target
information
jobs
optimization
Prior art date
Legal status
Pending
Application number
CN202110552714.3A
Other languages
Chinese (zh)
Inventor
彭大成
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202110552714.3A
Publication of CN115373675A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00: Arrangements for software engineering
    • G06F 8/40: Transformation of program code
    • G06F 8/41: Compilation
    • G06F 8/44: Encoding
    • G06F 8/443: Optimisation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/30: Monitoring
    • G06F 11/34: Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3409: Recording or statistical evaluation of computer activity for performance assessment
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/362: Software debugging
    • G06F 11/3628: Software debugging of optimised code
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/448: Execution paradigms, e.g. implementations of programming paradigms
    • G06F 9/4488: Object-oriented
    • G06F 9/449: Object-oriented method invocation or resolution

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Debugging And Monitoring (AREA)

Abstract

Embodiments of the present disclosure provide a method, an apparatus, a device, a storage medium, and a program product for assisting performance optimization, which relate to the field of computer technology. In the auxiliary performance optimization method, first description information is acquired based on the running of a target job in a system. Such first description information may include system level information associated with the system, job level information associated with the target job, and/or code level information associated with the code of the target job. Further, a target reference job is determined based on a difference between the first description information and second description information of a reference job among a plurality of reference jobs. The determined target reference job has a corresponding optimization strategy, which can be retrieved for optimizing the target job, for example, to be presented to a user. In this way, embodiments of the present disclosure can determine, from a reference job library, a reference job similar to the target job to assist in optimizing the target job, thereby reducing the learning cost for developers performing job performance optimization.

Description

Method, apparatus, device, storage medium and program product for assisting performance optimization
Technical Field
Embodiments of the present disclosure relate generally to the field of computer technology. More particularly, embodiments of the present disclosure relate to methods, apparatuses, devices, computer-readable storage media and computer program products for assisting performance optimization.
Background
Performance optimization technology aims to make a system run faster and complete specific functions in less time without affecting the correctness of its operation. Performance optimization has become increasingly important in current IT application scenarios and is widely applied in service scenarios such as content delivery networks (CDN), big data, distributed storage, and video transmission.
Conventional performance optimization approaches typically require the developer to model performance based on performance data and propose a solution, which requires the developer to have sufficient knowledge of performance optimization. However, in a typical development process, developers often lack such knowledge, which makes it difficult for them to perform job performance optimization efficiently. Therefore, a solution is needed to effectively assist developers in performing performance optimization of jobs.
Disclosure of Invention
Embodiments of the present disclosure provide a scheme for assisting performance optimization.
In a first aspect of the disclosure, a method for assisting performance optimization is provided. The method comprises the following steps: acquiring first description information based on the running of a target job in a system, wherein the first description information comprises at least one of the following items: system level information associated with the system, job level information associated with the target job, and code level information associated with the code of the target job; and determining a target reference job from a plurality of reference jobs based on a difference between the first description information and second description information of a reference job from the plurality of reference jobs, wherein the target reference job has an associated optimization policy, the optimization policy being retrievable for optimizing the target job.
In this manner, embodiments of the present disclosure are able to compare a target job to be optimized with a plurality of reference jobs, thereby determining a target reference job whose characteristics are close to those of the target job. Further, the optimization strategy used to optimize the target reference job can be obtained by the developer as a reference for optimizing the target job. In this way, embodiments of the present disclosure can determine one or more reference jobs that the developer can draw on, such that the developer can optimize the target job accordingly based on the optimization strategies of these reference jobs. This can greatly reduce the learning cost required for developers to perform job performance optimization.
In some embodiments of the first aspect, the system level information indicates at least one of: static configuration information of the system; and dynamic performance information of the system. In this manner, embodiments of the present disclosure can utilize different types of system level information to screen out reference jobs that are similar to the execution environment of the target job, which may provide assistance in optimizing the target job from the system level.
In some embodiments of the first aspect, the job-level information indicates at least one of: process information associated with the target job; resource scheduling information associated with the target job; and utilization information of the target job for a particular computing unit. In this manner, embodiments of the present disclosure can utilize different types of job-level information to screen out reference jobs that are similar to the job-level characteristics of the target job, which may assist in optimizing the target job from the job level.
In some embodiments of the first aspect, the code level information indicates at least one of: function call information associated with a plurality of functions in code; lock and wait information associated with the code; and memory access information associated with the code. In this manner, embodiments of the present disclosure can utilize different types of code level information to screen out reference jobs that are similar to the code characteristics of the target job, which may provide assistance in optimizing the target job from the code level.
In some embodiments of the first aspect, the method further comprises: acquiring service information of the target job, wherein the service information indicates the type of service related to the target job; and determining a plurality of reference jobs matching the service information from a reference job library. In this way, embodiments of the present disclosure can effectively filter out reference jobs whose service types do not match, thereby reducing the amount of computation required for screening the reference jobs.
In some embodiments of the first aspect, determining the target reference job comprises: determining a first set of reference jobs from the plurality of reference jobs based on the system level information; and determining a target reference job based on the first set of reference jobs. In this manner, embodiments of the present disclosure may first recommend reference jobs with close system characteristics from the system level, thereby helping the user attempt to optimize the target job from the system level.
In some embodiments of the first aspect, determining the first set of reference jobs comprises: acquiring development information associated with a development environment of a target job; and determining a first set of reference jobs from the plurality of reference jobs based on the system level information and the development information. In this way, embodiments of the present disclosure can account for differences between optimization strategies for different development environments.
In some embodiments of the first aspect, determining the target reference job based on the first set of reference jobs comprises: determining a second set of reference jobs from the first set of reference jobs based on the job-level information; and determining a target reference job based on the second set of reference jobs. In this manner, when the system-level suggestions cannot effectively optimize the job, embodiments of the present disclosure can further acquire job-level information to screen reference jobs with matching job characteristics and provide job-level optimization suggestions.
In some embodiments of the first aspect, the method further comprises: in response to determining that the number of reference jobs in the first set is greater than a threshold, providing a first indication for directing generation of the job-level information. In this way, embodiments of the present disclosure can effectively guide the user to collect the next stage of performance information in order to obtain further optimization suggestions.
In some embodiments of the first aspect, determining the target reference job based on the second set of reference jobs comprises: determining a third set of reference jobs from the first set of reference jobs based on the code level information; and determining a target reference job based on the third set of reference jobs. In this way, embodiments of the present disclosure may further obtain code level information to filter reference jobs with matching code characteristics to provide code level optimization suggestions in the event that the job level suggestions are not effective to optimize the jobs.
In some embodiments of the first aspect, the method further comprises: in response to determining that the number of reference jobs in the second set is greater than a threshold, providing a second indication for directing generation of the code level information. In this way, embodiments of the present disclosure can effectively guide the user to collect the next stage of performance information in order to obtain further optimization suggestions.
In some embodiments of the first aspect, the method further comprises: an optimization strategy associated with the target reference job is provided as a first optimization suggestion for the target job.
In some embodiments of the first aspect, the method further comprises: determining a first feature representation corresponding to the first description information and a second feature representation corresponding to the second description information; and determining a difference between the first descriptive information and the second descriptive information based on the first feature representation and the second feature representation. In this way, embodiments of the present disclosure can convert different types of data into a unified encoded representation, thereby enabling automatic screening of reference jobs.
In some embodiments of the first aspect, the first description information comprises a numerical parameter, the first feature representation comprises a first value corresponding to the numerical parameter, and the first value is determined based on the value of the numerical parameter and a weight associated with the numerical parameter. In some embodiments of the first aspect, the first description information comprises a non-numerical parameter, and the first feature representation comprises a second value corresponding to the non-numerical parameter, the second value being determined based on encoding the non-numerical parameter.
In some embodiments of the first aspect, the plurality of reference jobs are from a library of audited reference jobs, and the method further comprises: determining a first additional reference job from a library of unaudited reference jobs based on the first description information; and providing the optimization strategy associated with the first additional reference job as a second optimization suggestion for the target job. In this manner, embodiments of the present disclosure can expand the screening range of reference jobs when the current reference jobs may not effectively assist optimization, thereby improving the probability of successfully optimizing the target job.
In some embodiments of the first aspect, the method further comprises: sending the first description information to the remote device; acquiring, from the remote device, job information associated with a second additional reference job, the second additional reference job being determined by the remote device based on the first description information; and providing an optimization strategy associated with the second additional reference job as a third optimization suggestion for the target job based on the job information. In this way, embodiments of the present disclosure can further obtain a reference job from a remote device in the event that the current reference job may not be able to effectively assist in optimization, thereby increasing the probability of successfully optimizing the target job.
In some embodiments of the first aspect, sending the first description information to the remote device comprises: in response to receiving a request to obtain an additional reference job, sending the first description information to the remote device. In this manner, retrieval of remote reference jobs can be initiated at the user's request, thereby improving interaction friendliness.
In some embodiments of the first aspect, the method further comprises: in response to determining that the optimization strategy is applied to the target job, providing comparison information indicating an impact of the optimization strategy on the operational performance of the target job. In this manner, the user can more intuitively determine whether the optimization strategy achieves the expected effect.
In a second aspect of the disclosure, an apparatus for assisting performance optimization is provided. The device comprises: an acquisition unit configured to acquire first description information based on execution of a target job in a system, the first description information including at least one of: system level information associated with the system, job level information associated with the target job, and code level information associated with code of the target job; and a determining unit configured to determine a target reference job from a plurality of reference jobs based on a difference between the first description information and second description information of a reference job from the plurality of reference jobs, wherein the target reference job has an associated optimization policy, the optimization policy being retrievable for optimizing the target job.
In this manner, embodiments of the present disclosure are able to compare a target job to be optimized with a plurality of reference jobs, thereby determining a target reference job whose characteristics are close to those of the target job. Further, the optimization strategy used to optimize the target reference job can be obtained by the developer as a reference for optimizing the target job. In this way, embodiments of the present disclosure can determine one or more reference jobs that the developer can draw on, such that the developer can optimize the target job accordingly based on the optimization strategies of these reference jobs. This can greatly reduce the learning cost required for developers to perform job performance optimization.
In some embodiments of the second aspect, the system level information indicates at least one of: static configuration information of the system; and dynamic performance information of the system. In this manner, embodiments of the present disclosure can utilize different types of system level information to screen out reference jobs that are similar to the operating environment of the target job, which may provide assistance in optimizing the target job from the system level.
In some embodiments of the second aspect, the job-level information indicates at least one of: process information associated with the target job; resource scheduling information associated with the target job; and utilization information of the target job for the particular computing unit. In this manner, embodiments of the present disclosure can utilize different types of job-level information to screen out reference jobs that are similar to the job-level characteristics of the target job, which may provide assistance in optimizing the target job from the job-level.
In some embodiments of the second aspect, the code level information indicates at least one of: function call information associated with a plurality of functions in code; lock and wait information associated with the code; and memory access information associated with the code. In this manner, embodiments of the present disclosure can utilize different types of code level information to screen out reference jobs that are similar to the code characteristics of the target job, which may provide assistance in optimizing the target job from the code level.
In some embodiments of the second aspect, the apparatus is further configured to: acquire service information of the target job, wherein the service information indicates the type of service related to the target job; and determine a plurality of reference jobs matching the service information from a reference job library. In this way, embodiments of the present disclosure can effectively filter out reference jobs whose service types do not match, thereby reducing the amount of computation required for screening the reference jobs.
In some embodiments of the second aspect, the apparatus is further configured to: determining a first set of reference jobs from the plurality of reference jobs based on the system level information; and determining a target reference job based on the first set of reference jobs. In this manner, embodiments of the present disclosure may first recommend reference jobs with close system characteristics from the system level, thereby helping the user attempt to optimize the target job from the system level.
In some embodiments of the second aspect, the apparatus is further configured to: acquiring development information associated with a development environment of a target job; and determining a first set of reference jobs from the plurality of reference jobs based on the system level information and the development information. In this way, embodiments of the present disclosure can account for differences between optimization strategies for different development environments.
In some embodiments of the second aspect, the apparatus is further configured to: determining a second set of reference jobs from the first set of reference jobs based on the job-level information; and determining a target reference job based on the second set of reference jobs. In this way, embodiments of the present disclosure may further obtain job-level information to filter reference jobs with matching job characteristics to provide job-level optimization suggestions in the case that the system-level suggestions cannot effectively optimize the jobs.
In some embodiments of the second aspect, the apparatus is further configured to: in response to determining that the number of reference jobs in the first set is greater than a threshold, provide a first indication for directing generation of the job-level information. In this way, embodiments of the present disclosure can effectively guide the user to collect the next stage of performance information in order to obtain further optimization suggestions.
In some embodiments of the second aspect, the apparatus is further configured to: determining a third set of reference jobs from the first set of reference jobs based on the code level information; and determining a target reference job based on the third set of reference jobs. In this way, embodiments of the present disclosure may further obtain code level information to filter reference jobs with matching code characteristics to provide code level optimization suggestions in the event that the job level suggestions are not effective to optimize the jobs.
In some embodiments of the second aspect, the apparatus is further configured to: in response to determining that the number of reference jobs in the second set is greater than a threshold, provide a second indication for directing generation of the code level information. In this way, embodiments of the present disclosure can effectively guide the user to collect the next stage of performance information in order to obtain further optimization suggestions.
In some embodiments of the second aspect, the apparatus is further configured to: an optimization strategy associated with the target reference job is provided as a first optimization suggestion for the target job.
In some embodiments of the second aspect, the apparatus is further configured to: determine a first feature representation corresponding to the first description information and a second feature representation corresponding to the second description information; and determine a difference between the first description information and the second description information based on the first feature representation and the second feature representation. In this manner, embodiments of the present disclosure can convert different types of data into a unified encoded representation, thereby enabling automatic screening of reference jobs.
In some embodiments of the second aspect, the first description information includes a numerical parameter, the first feature representation includes a first value corresponding to the numerical parameter, the first value is determined based on the value of the numerical parameter and a weight associated with the numerical parameter. In some embodiments of the second aspect, the first description information comprises a non-numerical parameter, and the first feature representation comprises a second value corresponding to the non-numerical parameter, the second value being determined based on encoding the non-numerical parameter.
In some embodiments of the second aspect, the plurality of reference jobs are from a library of audited reference jobs, and the apparatus is further configured to: determine a first additional reference job from a library of unaudited reference jobs based on the first description information; and provide the optimization strategy associated with the first additional reference job as a second optimization suggestion for the target job. In this manner, embodiments of the present disclosure can expand the screening range of reference jobs when the current reference jobs may not effectively assist optimization, thereby improving the probability of successfully optimizing the target job.
In some embodiments of the second aspect, the apparatus is further configured to: sending the first description information to the remote device; acquiring, from the remote device, job information associated with a second additional reference job, the second additional reference job being determined by the remote device based on the first description information; and providing an optimization strategy associated with the second additional reference job as a third optimization suggestion for the target job based on the job information. In this way, embodiments of the present disclosure can further obtain a reference job from a remote device in the event that the current reference job may not be able to effectively assist in optimization, thereby increasing the probability of successfully optimizing the target job.
In some embodiments of the second aspect, the apparatus is further configured to: in response to receiving a request to obtain an additional reference job, send the first description information to the remote device. In this manner, retrieval of remote reference jobs can be initiated at the user's request, thereby improving interaction friendliness.
In some embodiments of the second aspect, the apparatus is further configured to: in response to determining that the optimization strategy is applied to the target job, provide comparison information indicating an impact of the optimization strategy on the operational performance of the target job. In this manner, the user can more intuitively determine whether the optimization strategy achieves the expected effect.
In a third aspect of the present disclosure, there is provided an electronic device comprising: at least one computing unit; and at least one memory coupled to the at least one computing unit and storing instructions for execution by the at least one computing unit, the instructions, when executed by the at least one computing unit, causing the device to perform the method of the first aspect or of any one implementation of the first aspect.
In a fourth aspect of the present disclosure, a computer-readable storage medium is provided, on which one or more computer instructions are stored, wherein the one or more computer instructions are executed by a processor to implement the first aspect or the method in any one of the implementations of the first aspect.
In a fifth aspect of the present disclosure, there is provided a computer program product for causing a computer to perform some or all of the steps of the method of the first aspect or any one of the implementations of the first aspect when the computer program product runs on a computer.
It will be appreciated that the electronic device of the third aspect, the computer storage medium of the fourth aspect, and the computer program product of the fifth aspect provided above are all adapted to perform the method provided by the first aspect. Therefore, the explanations and descriptions regarding the first aspect are equally applicable to the third, fourth, and fifth aspects. In addition, for the beneficial effects achieved by the third, fourth, and fifth aspects, reference may be made to the beneficial effects of the corresponding methods, which are not described again here.
In this manner, embodiments of the present disclosure can determine one or more reference jobs that the developer can draw on, such that the developer can optimize the target job accordingly based on the optimization strategies of these reference jobs. This can greatly reduce the learning cost required for developers to perform job performance optimization.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. The same or similar reference numbers in the drawings identify the same or similar elements, in which:
FIG. 1 illustrates a schematic diagram of an example environment in which embodiments of the present disclosure can be implemented;
FIG. 2 illustrates a flow diagram of an example process of assisting performance optimization in accordance with some embodiments of the present disclosure;
FIG. 3 illustrates a schematic diagram of determining a target reference job in accordance with some embodiments of the present disclosure;
FIG. 4 illustrates a schematic diagram of an example performance tuning system, in accordance with some embodiments of the present disclosure;
FIG. 5 illustrates a schematic block diagram of an apparatus that facilitates performance optimization in accordance with some embodiments of the present disclosure; and
FIG. 6 illustrates a block diagram of a computing device capable of implementing various embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and the embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
In describing embodiments of the present disclosure, the term "include" and its derivatives should be interpreted as being inclusive, i.e., "including but not limited to". The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The terms "first," "second," and the like may refer to different or the same objects. Other explicit and implicit definitions are also possible below.
As discussed above, conventional performance optimization approaches typically require developers to have sufficient performance optimization knowledge. For example, some solutions provide developers with specific performance data as a reference for performing optimizations. However, developers need sufficient knowledge of performance optimization to be able to formulate a corresponding optimization strategy based on such data. This raises the threshold of performance optimization and, in turn, makes it difficult for the performance of some jobs to be optimized efficiently.
Example Environment
According to an embodiment of the present disclosure, a scheme for assisting performance optimization is provided. In the scheme, first description information is acquired based on the running of a target job in a system, wherein the first description information comprises at least one of the following items: system level information associated with the system, job level information associated with the target job, and code level information associated with the code of the target job. Further, a target reference job is determined from the plurality of reference jobs based on a difference between the first description information and second description information of a reference job of the plurality of reference jobs, wherein the target reference job has an associated optimization policy that can be obtained for optimizing the target job.
In this manner, embodiments of the present disclosure are able to compare a target job to be optimized with a plurality of reference jobs, thereby determining a target reference job whose characteristics are close to those of the target job. Further, the optimization strategy used to optimize the target reference job can be obtained by the developer as a reference for optimizing the target job. In this way, embodiments of the present disclosure can determine one or more reference jobs that the developer can draw on, such that the developer can optimize the target job accordingly based on the optimization strategies of these reference jobs. This can greatly reduce the learning cost required for developers to perform job performance optimization.
Fig. 1 illustrates a schematic diagram of an example environment 100 in which various embodiments of the present disclosure can be implemented. As shown in FIG. 1, in the environment 100, a target job 110 may be deployed to run in a system 120. In some embodiments, the target job 110 may include any suitable project for running in the system 120. Examples of the target job 110 include, but are not limited to: high-performance computing (HPC) projects, video processing projects, big data analytics projects, distributed storage projects, and the like.
For example, the target job 110 may be deployed into the system 120 by a code developer or architect in order to debug or analyze the target job 110. As another example, the target job 110 may also be a formal project that normally runs in the system 120 and accomplishes a corresponding function.
In some embodiments, the system 120 may be any suitable hardware environment and/or software platform for running the target job 110. Such a hardware environment may include a single node or a cluster of nodes. Such a software environment may include, for example, an operating system, a code compiler, a BIOS, and the like installed on the node.
In some embodiments, a performance tuning tool may also be run by at least one node in the system 120 to obtain performance information related to the operation of the target job 110 in the system 120. Taking an HPC project as an example, it may be deployed to run on a single node, and a performance tuning tool may also be installed on that node to obtain performance information. Taking a distributed storage project as an example, it may be deployed on a cluster of multiple nodes, and a performance tuning tool may be installed on one of the nodes to obtain performance information for the cluster.
It should be understood that any suitable performance tuning tool known in the art or developed in the future may be employed to obtain corresponding performance information, and the present disclosure is not intended to be limited to a specific model or function of a performance tuning tool.
In one example, the performance tuning tool may provide the code developer or architect with relevant performance data for the target job 110 to assist the developer or architect in optimizing or adjusting the operation of the target job 110 in the system 120. As another example, the target job 110 may be a formal project in operation, and the performance tuning tool may provide the operational performance data of the target job to operation and maintenance personnel to help them optimize the operation of the target job 110. For example, a prompt regarding a decline in the operational performance of the target job 110 may be provided to the operation and maintenance personnel to alert them that the running speed of the target job 110 in the system 120 should be optimized.
As shown in FIG. 1, the environment 100 also includes an auxiliary optimization device 140 that is capable of obtaining information 130 (also referred to as first description information 130) generated based on the execution of the target job 110 in the system 120. For example, the auxiliary optimization device 140 may communicate with a performance tuning tool in the system 120 to receive the first description information 130.
In some embodiments, the first descriptive information 130 may include system level information associated with the system 120. The system level information is intended to describe configuration information of the system 120 and/or operational performance of the system 120 from the system level. In this manner, embodiments of the present disclosure can utilize different types of system level information to screen out reference jobs that are similar to the execution environment of the target job, which may provide assistance in optimizing the target job from the system level.
In some embodiments, the system level information may include static configuration information for the system 120, such static configuration information being usable to indicate the execution environment of the target job 110. Examples of static configuration information may include, but are not limited to: processor information (e.g., model, number of cores, maximum frequency, cache size, etc.) for one or more nodes in system 120, memory information (e.g., total memory size, number of memories, number of free memory slots, memory type, maximum rate of memories, etc.), storage device information (e.g., total amount of storage, number of storage devices, type of storage devices, RAID information, etc.), network device information (e.g., number of network cards, number of ports, maximum transmission rate of network cards), software environment information (e.g., BIOS information, operating system information, system core information, virtual machine information, storage management system information), etc. It should be understood that the above specific information is merely exemplary, and that other suitable static configuration information may also be collected as desired. In some embodiments, the system level information may also include dynamic performance information of the system 120, such dynamic performance information being able to be used to indicate the real-time status of the system 120 when running the target job 110. Examples of dynamic performance information may include, but are not limited to: CPU usage information, load information, memory usage information, storage usage information, network information, or energy consumption information for system 120, etc.
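As an illustration of the kinds of system level information listed above, the following sketch shows how a node-side collector might gather a few static configuration fields and dynamic performance metrics. It is only a minimal example assuming a Python collector built on the psutil library; the actual performance tuning tool, field names, and collection method are not specified by this disclosure.

import platform
import psutil  # assumed third-party dependency for this illustration

def collect_system_level_info() -> dict:
    """Gather example static configuration and dynamic performance data for the local node."""
    static_config = {
        "cpu_model": platform.processor(),                    # processor information
        "cpu_cores": psutil.cpu_count(logical=False),         # number of physical cores
        "memory_total_bytes": psutil.virtual_memory().total,  # total memory size
        "os": platform.platform(),                            # software environment information
    }
    dynamic_performance = {
        "cpu_usage_percent": psutil.cpu_percent(interval=1),      # CPU usage information
        "memory_usage_percent": psutil.virtual_memory().percent,  # memory usage information
        "load_average": psutil.getloadavg(),                      # 1/5/15-minute load information
        "disk_usage_percent": psutil.disk_usage("/").percent,     # storage usage information
    }
    return {"static": static_config, "dynamic": dynamic_performance}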
In some embodiments, the first description information 130 may include job-level information associated with the target job 110. The job-level information is intended to describe the running state of the target job 110 in the system 120 from the job-level. In this manner, embodiments of the present disclosure can utilize different types of job-level information to screen out reference jobs that are similar to the job-level characteristics of the target job, which may help optimize the target job from the job-level.
In some embodiments, the job-level information may include process information associated with the target job 110 that can be used to indicate a particular running state of a process/thread. Examples of process information may include, but are not limited to: usage information of the process/thread on the CPU (e.g., percentage of the task occupying CPU in user space, percentage of the task occupying CPU in kernel space, percentage of the task occupying CPU in IO waiting, percentage of the task occupying CPU, etc.), usage information of the process/thread on the memory (e.g., virtual memory size used by the task, physical memory size used by the task, percentage of the task occupying memory, etc.), usage information of the process/thread on the storage (e.g., amount of data read from the hard disk per second by the task or amount of data written to the hard disk per second by the task), context switching information of the process/thread (e.g., number of active task context switching times per second or number of passive task context switching times per second, etc.), system call information of the process/thread (e.g., average system CPU time per call, number of system calls in the entire acquisition process, or number of system call failures in the entire acquisition process, etc.), etc.
In some embodiments, the job-level information may include resource scheduling information associated with the target job 110, which can be used to indicate system resource scheduling of the target job. Examples of resource scheduling information may include, but are not limited to: process/thread switching information (e.g., number of switches, average scheduling delay, maximum scheduling delay time, maximum delay time point, etc.), or NUMA (Non-uniform Memory Architecture) node switching information (e.g., number of NUMA node switches, etc.).
In some embodiments, the job-level information may include utilization information of the target job for a particular computing unit; such information is also referred to as micro-architecture information and can indicate how the target job executes on a particular computing unit (e.g., a particular CPU chip). Examples of micro-architecture information may include, but are not limited to: clock cycle count, instruction count, number of instructions executed in a single clock cycle, percentage of clock cycles occupied by the current call stack, or percentage of instructions occupied by the current call stack, etc.
In some embodiments, the first description information 130 may also include code level information associated with the code of the target job 110. The code level information is intended to describe the impact of different code portions of the target job 110 on the performance of the run from the code hierarchy.
In some embodiments, the code level information may include function call information associated with a plurality of functions in the code, which can indicate an execution state of the plurality of functions. Examples of function call information may include, but are not limited to: the running time of a function, the execution clock cycles of a function, the clock cycle percentage of a function, the instruction count of a function, the instruction percentage of a function, the number of instructions executed in a single clock cycle of a function, the top-ten hotspot functions by number of calls, and the like.
In some embodiments, the code level information may include lock and wait information associated with the code that can describe information about the code that invoked the lock and wait task. Examples of lock and wait information may include, but are not limited to: the corresponding function name, the calling times of the corresponding function, the timestamp, the corresponding source code file name, the source code line number corresponding to the calling point and the like.
In some embodiments, the code level information may include memory access information associated with the code, which can describe the memory access state of different portions of the code. Examples of the memory access information may include, but are not limited to: function-related information for memory access miss events (e.g., function name, name of the file where the function resides, source code line number, assembly instruction address, assembly code line number, etc.) or function-related information for false-sharing accesses (e.g., function name, name of the file where the function resides, source code line number, etc.).
It should be appreciated that existing or future performance tuning tools may be utilized to scan the operation of the target job 110 in the system 120 for a predetermined period of time to generate the first description information 130 discussed above. The present disclosure is not intended to be limited to a specific manner of generating the first descriptive information 130.
As shown in FIG. 1, the auxiliary optimization apparatus 140 may determine a target reference job 160 from the plurality of reference jobs 150 based on the received first description information 130. In some implementations, such reference jobs 150 may be built from actual historical jobs whose execution has been optimized. For example, different developers may upload information about successfully tuned jobs together with the corresponding optimization strategies, and after being audited, such jobs may be added as reference jobs 150. In some embodiments, a corresponding reference job library may be maintained for the auxiliary optimization device 140, such that the auxiliary optimization device 140 can retrieve the plurality of reference jobs 150 from the reference job library.
In some implementations, the auxiliary optimization device 140 can obtain information related to the plurality of reference jobs 150 from a local storage device. Alternatively or additionally, information about at least some of the plurality of reference jobs 150 may also be received by the auxiliary optimization device 140 via a network.
In some embodiments, unlike the target job 110 deployed in the system 120, the auxiliary optimization apparatus 140 may not acquire the source code of the reference job 150, but may acquire description information of the reference job 150 (for convenience of description, such description information is also referred to as second description information). Similar to the first descriptive information 130 discussed above, the second descriptive information may likewise include one or more of system level information, job level information, and code level information.
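By way of illustration only, the second description information and the associated optimization strategy of each reference job could be kept in a structure such as the one sketched below. The field names and types are assumptions made for this example and are not a schema defined by this disclosure.

from dataclasses import dataclass, field

@dataclass
class ReferenceJob:
    """Illustrative entry in a reference job library (no source code is stored)."""
    job_name: str
    business_type: str                                  # e.g. "big data", "CDN", "HPC"
    system_level: dict = field(default_factory=dict)    # static configuration and dynamic performance information
    job_level: dict = field(default_factory=dict)       # process, resource scheduling and micro-architecture information
    code_level: dict = field(default_factory=dict)      # function call, lock-and-wait and memory access information
    optimization_strategy: str = ""                     # audited strategy associated with this reference job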
The auxiliary optimization apparatus 140 may determine a target reference job 160 matching the target job 110 from the plurality of reference jobs 150 based on the received first description information 130 and the second description information of the plurality of reference jobs 150. The specific process for determining the target reference job 160 will be described in detail below, and will not be described in detail herein for the time being.
As shown in FIG. 1, the target reference job 160 has a corresponding optimization strategy 170. In some embodiments, examples of optimization policies 170 may include, but are not limited to: adjusting hardware configuration (e.g., increasing memory number, increasing node number, upgrading network card), adjusting system settings (e.g., turning on system memory management unit SMMU), adjusting software environment (e.g., updating compiler kernel version, upgrading network card driver), optimizing code, etc.
It should be appreciated that such optimization strategies 170 may be manually formulated by a developer for historical jobs, automatically generated by a performance tuning tool, or generated using a machine learning algorithm. Such optimization strategies 170 may be further reviewed, for example, to determine that they are applicable to the corresponding reference job. It should be appreciated that any suitable manner of generating an optimization strategy may be suitable, and the present disclosure is not intended to be limiting on the generation process of the optimization strategy.
In some embodiments, as shown in FIG. 1, the terminal device 180 may also obtain the optimization strategy 170 of the target reference job 160 for presentation, for example, by way of a graphical user interface. For example, the auxiliary optimization device 140 may send the determined optimization strategy 170 of the target reference job 160 to the terminal device 180 as a suggestion for optimizing the operation of the target job 110 on the system 120.
It should be understood that although in the example environment 100 of fig. 1, the nodes in the system 120, the secondary optimization device 140, and the terminal device 180 are shown as separate devices, two or more of them may be integrated in the same device, depending on the needs of the actual scenario.
In one example, the auxiliary optimization device 140 can be, for example, a cloud server device that can obtain the first description information from the system 120 through remote communication and send the optimization policy to the terminal device, for example, through network communication.
In another example, the secondary optimization device 140 can also be a node in the system 120 to run a corresponding secondary optimization process and send the optimization strategy to the terminal device 180, for example, by way of network communication.
In yet another example, the secondary optimization device 140 may also be a user-oriented terminal device that acquires the first description information 130 via network communication and determines the target reference job 160 based on local calculations and may provide the optimization policy 170, for example, via a coupled display.
In yet another example, the system 120 can be, for example, a user's terminal device, and both the performance tuning tool and the auxiliary optimization tool can be deployed on the terminal device. The terminal device may generate the first description information 130 from the execution of the target job 110 and determine the target reference job 160 accordingly. Further, the optimization strategy 170 may be presented to the user, for example, through a display of the terminal device, as a suggestion for optimizing the target job 110.
Example procedure
The process by which the auxiliary optimization apparatus 140 determines the target reference job 160 will be described in detail below with reference to FIG. 2. FIG. 2 illustrates a flow diagram of an example process 200 of assisting performance optimization in accordance with some embodiments of the present disclosure. The process 200 may be implemented, for example, at the auxiliary optimization device 140 of FIG. 1, and for ease of description, the process 200 will be described below with reference to FIG. 1.
As shown in FIG. 2, at block 210, the auxiliary optimization device obtains first description information based on the running of the target job in the system, wherein the first description information includes at least one of: system level information associated with the system, job level information associated with the target job, and code level information associated with the code of the target job.
As discussed above with reference to fig. 1, the auxiliary optimization device 140 may obtain the first description information 130, for example, through a performance tuning tool deployed in the system 120.
While possible implementations of the first description information 130 have been discussed above in connection with fig. 1, it should be understood that the specific examples of the first description information 130 listed above are merely illustrative and that other suitable system level information, job level information, or code level information are possible.
At block 220, the auxiliary optimization device determines a target reference job from the plurality of reference jobs based on a difference between the first description information and second description information of a reference job from the plurality of reference jobs, wherein the target reference job has an associated optimization policy that can be obtained for optimizing the target job. In this way, embodiments of the present disclosure can acquire, from a plurality of reference jobs having respective optimization policies, a reference job similar to the current job as a reference for optimizing the target job. This reduces the level of professional expertise required of the user for job optimization and can help more users perform job optimization effectively.
In some embodiments, the auxiliary optimization device 140 may convert the first description information 130 and the second description information into corresponding feature representations, e.g., feature vectors. Further, the auxiliary optimization device 140 may determine a difference between the first description information and the second description information based on the converted feature representations. In this manner, embodiments of the present disclosure can convert different types of data into a unified encoded representation, thereby enabling automatic screening of reference jobs.
In particular, the secondary optimization device 140 may convert the first description information 130 into a first feature representation, e.g. a first feature vector. As discussed above, the first descriptive information 130 may include information relating to different types of parameters, and such parameters may include numerical parameters or non-numerical parameters.
In some embodiments, the first description information may include a numerical parameter, and the auxiliary optimization device 140 may determine a first value of the first feature representation corresponding to the numerical parameter based on a value of the numerical parameter and a weight corresponding to the numerical parameter. For example, the number of CPU cores is a numerical parameter included in the first description information, and its value is, for example, 8. The secondary optimization device 140 may determine that the first value corresponding to the parameter of the number of CPU cores is 4 based on the value (8) and the corresponding weight (e.g., 0.5) when generating the first feature representation. By introducing weights, the secondary optimization device 140 may adjust the impact of different parameters on determining the target reference job 160. It should be understood that the above specific values and specific values of the weights are merely exemplary.
In some embodiments, the first description information may include a non-numerical parameter, which the auxiliary optimization device 140 may encode to determine a second value in the first feature representation corresponding to the non-numerical parameter. For example, the BIOS version of the system 120 is encoded according to the following rule:
[Table (Figure BDA0003075787550000101) not reproduced here: an encoding rule mapping BIOS version strings to numeric codes, e.g., "2280V1 CS V1" to the value 1.]
That is, if the BIOS version number of the system 120 is "2280V1 CS V1", the auxiliary optimization device 140 may determine, based on the encoding rule, that its second value in the first feature representation is "1", for example. It should be understood that the above encoding rules and specific models or values are merely illustrative and are not intended to limit the present disclosure.
In this way, the auxiliary optimization apparatus 140 may convert the plurality of parameters included in the first description information 130 into a feature representation expressed by numerical values. In some embodiments, the second feature representation corresponding to the second description information may be generated by the auxiliary optimization device 140 in the same manner, for example.
Alternatively, the second feature representation may also have been determined in advance by another device and stored in association with the plurality of reference jobs 150. In that case, the secondary optimization device 140 may obtain the second feature representation directly, without performing additional conversion.
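Purely by way of illustration, the conversion of description information into a feature representation described above might resemble the following sketch; the parameter names, weights, and encoding table used here are assumptions for illustration and do not form part of the disclosed implementation.

```python
# A minimal sketch (not the patented implementation) of converting mixed
# description information into a numerical feature vector. The parameter
# names, weights, and BIOS-version code table below are hypothetical.

# Hypothetical weights for numerical parameters.
NUMERICAL_WEIGHTS = {
    "cpu_cores": 0.5,
    "memory_gb": 0.25,
}

# Hypothetical encoding rule for a non-numerical parameter (BIOS version).
BIOS_VERSION_CODES = {
    "2280V1 CS V1": 1,
    "2280V2 CS V2": 2,
}


def build_feature_vector(description: dict) -> list:
    """Convert description information into a numerical feature vector."""
    vector = []
    # Numerical parameters: value multiplied by the corresponding weight.
    for name, weight in NUMERICAL_WEIGHTS.items():
        vector.append(description.get(name, 0) * weight)
    # Non-numerical parameters: mapped to codes by a fixed rule (0 if unknown).
    vector.append(BIOS_VERSION_CODES.get(description.get("bios_version"), 0))
    return vector


# Example: 8 CPU cores with weight 0.5 contribute the value 4.
print(build_feature_vector(
    {"cpu_cores": 8, "memory_gb": 32, "bios_version": "2280V1 CS V1"}))
# -> [4.0, 8.0, 1]
```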
In some embodiments, the secondary optimization device 140 can determine a difference between the first description information and the second description information based on the first feature representation and the second feature representation. Illustratively, the secondary optimization device 140 may determine the difference based on, for example, the distance between the vectors. Alternatively, the auxiliary optimization device 140 may measure the difference between the first description information and the second description information based on the cosine similarity between the first feature representation and the second feature representation.
In some embodiments, the secondary optimization device 140 may determine, for example, one or more reference jobs from the plurality of reference jobs 150 that differ by less than a threshold (or that have similarities greater than a threshold) as the target reference job 160. Alternatively, the auxiliary optimization apparatus 140 may also determine a predetermined number of reference jobs with the smallest difference from the plurality of reference jobs 150 as the target reference job 160. For example, the auxiliary optimization apparatus 140 may determine 3 reference jobs having the smallest difference as the target reference jobs 160.
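As an illustrative sketch only (not the disclosed implementation), the difference measurement and selection described above might be realized as follows, assuming that feature representations are plain numerical vectors and that cosine similarity is used as the measure:

```python
# A minimal sketch of measuring the difference between feature representations
# and selecting target reference jobs; vectors are plain Python lists and the
# top-k rule mirrors the "3 smallest differences" example above.
import math


def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def select_target_reference_jobs(target_vec, reference_jobs, top_k=3):
    """reference_jobs: list of (job_id, feature_vector) pairs."""
    scored = [(job_id, cosine_similarity(target_vec, vec))
              for job_id, vec in reference_jobs]
    # Keep the top_k most similar jobs (smallest differences); a similarity
    # threshold could be applied instead.
    scored.sort(key=lambda item: item[1], reverse=True)
    return scored[:top_k]
```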
In some embodiments, after determining the target reference job 160, its corresponding optimization policy 170 may be provided to the user, for example, as a suggestion to optimize the target job 110.
For example, the terminal device 180 may acquire the optimization policy 170 associated with the target reference job 160 from the auxiliary optimization device 140 and present the optimization policy 170 by way of a graphical user interface. Alternatively, the terminal device 180 may first obtain only the description information (e.g., job name, job summary, etc.) of the target reference job 160, and further obtain and present the optimization policy 170 of the target reference job 160 in response to a viewing request from the user.
In some embodiments, such optimization strategies 170 may also be obtained, for example, by a performance tuning tool to automatically perform optimization of the target job 110. In this manner, embodiments of the present disclosure can further enable automatic optimization of the job.
Based on the processes discussed above, embodiments of the present disclosure are able to determine a target reference job similar to the target job from a plurality of reference jobs with corresponding optimization policies, and utilize the target reference job to assist a user (e.g., developer/architect/operation and maintenance personnel, etc.) in optimizing the target job.
Multi-level assisted optimization process
In some embodiments, the secondary optimization device 140 may also determine the target reference job 160 based on a multi-level secondary optimization process. Such a process will be described below with reference to fig. 3, which illustrates a schematic diagram 300 of determining a target reference job in accordance with some embodiments of the present disclosure.
As shown in fig. 3, in some embodiments, the auxiliary optimization device 140 may obtain business information 310 of the target job 110 and determine a plurality of reference jobs 150 from the reference job library 305 that match the business information 310. In some embodiments, the business information 310 may indicate, for example, the category of service to which the target job 110 relates.
In some embodiments, the user may fill out the business information 310 for the target job 110, for example, through an interface provided by the performance tuning tool. For example, the user may select a corresponding category from a plurality of service categories provided by the performance tuning tool as the business information 310. Examples of service categories may include, but are not limited to: video services, CDN, big data, databases, web services, etc. It should be understood that any other suitable service categories may be set according to actual needs, and the present disclosure is not limited in this regard.
In some embodiments, after obtaining the business information 310, the secondary optimization device 140 may, for example, screen out from the reference job library 305 a plurality of reference jobs 150 whose service categories match. For example, if the user indicates that the service category is video services, the auxiliary optimization device 140 may screen out from the reference job library 305 a plurality of reference jobs 150 that also belong to video services. In this way, embodiments of the present disclosure can effectively exclude reference jobs with mismatched service categories, thereby reducing the amount of computation required to screen for the target reference job.
Alternatively, if the user does not fill out the business category, or the secondary optimization device 140 fails to acquire the business information 310 for other reasons, the secondary optimization device 140 may treat all of the reference jobs in the reference job library 305 as the plurality of reference jobs 150.
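By way of illustration only, the screening by service category, including the fallback to the entire library when no category is available, might resemble the following sketch; the record layout and the "category" field name are assumptions:

```python
# A minimal sketch of screening the reference job library by service category,
# assuming each reference job record carries a hypothetical "category" field.
def screen_by_category(reference_library, business_info):
    if not business_info:  # category not provided or not acquired
        return list(reference_library)
    return [job for job in reference_library
            if job.get("category") == business_info]


library = [
    {"id": "job-a", "category": "video"},
    {"id": "job-b", "category": "big-data"},
    {"id": "job-c", "category": "video"},
]
print(screen_by_category(library, "video"))  # keeps job-a and job-c
print(screen_by_category(library, None))     # keeps all reference jobs
```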
Further, the secondary optimization device 140 may first obtain system level information 315 as discussed above and determine a first set of reference jobs 320 from the plurality of reference jobs 150 based on the system level information 315. In this manner, embodiments of the present disclosure may first recommend reference jobs with close system characteristics from the system level, thereby helping the user attempt to optimize the target job from the system level.
Specifically, the secondary optimization device 140 may, for example, convert the system level information 315 into a feature representation and determine the first set of reference jobs 320 based on differences between that feature representation and the corresponding feature representations of the plurality of reference jobs 150. In some embodiments, the auxiliary optimization device 140 may also obtain development information associated with the development environment of the target job 110 and determine the first set of reference jobs 320 from the plurality of reference jobs 150 based on the system level information 315 and the development information. In this way, embodiments of the present disclosure can account for differences between optimization strategies for different development environments.
Illustratively, such development information may indicate a development language of the target job 110, for example, the C language, the Java language, the Python language, and the like. Alternatively or additionally, such development information may also indicate, for example, a development environment of the target job 110, such as Nginx or Docker. Such development information may be encoded and added as part of the feature representation used for comparison in determining the first set of reference jobs 320. For example, the first set of reference jobs 320 may include a plurality of reference jobs whose feature representations differ from that of the target job by less than a predetermined threshold.
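A possible sketch of encoding such development information and appending it to the system level feature representation is given below; the code tables and example values are assumptions for illustration only:

```python
# A minimal sketch of encoding development information and appending it to the
# system level feature representation; the code tables are hypothetical.
LANGUAGE_CODES = {"C": 1, "Java": 2, "Python": 3}
ENVIRONMENT_CODES = {"nginx": 1, "docker": 2}


def extend_with_development_info(system_vector, language, environment):
    return system_vector + [
        LANGUAGE_CODES.get(language, 0),
        ENVIRONMENT_CODES.get(environment, 0),
    ]


# e.g., a system level vector for a Java job running under Docker
print(extend_with_development_info([4.0, 8.0, 1], "Java", "docker"))
# -> [4.0, 8.0, 1, 2, 2]
```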
The secondary optimization device 140 may further determine the target reference job 160 based on the first set of reference jobs 320. In some embodiments, the secondary optimization device 140 may directly take one or more of the first set of reference jobs 320 (e.g., the 3 reference jobs with the smallest difference) as the target reference jobs 160 and provide their corresponding optimization strategies, for example, to the user.
In some embodiments, because matching based on the system level information 315 is relatively coarse-grained, the number of reference jobs in the first set 320 may be large. In this case, directly providing the user with the optimization strategies of all of the first set of reference jobs 320 may place an additional burden on the user.
In some embodiments, if it is determined that the number of reference jobs in the first set 320 is greater than a predetermined threshold, the secondary optimization device 140 may determine to continue to perform the matching.
In another embodiment, if one or more of the first set of reference jobs 320 have already been determined to be the target reference job 160, but the user is still initiating a request to obtain a new optimization strategy, the secondary optimization device 140 may determine to continue to perform the matching. For example, a user may initiate a request to obtain a new optimization strategy via a user interface.
In yet another embodiment, the secondary optimization device 140 may automatically determine to continue to perform the matching if one or more of the first set of reference jobs 320 have been determined to be the target reference job 160 and their corresponding optimization strategies have been applied to the target job, but the secondary optimization device 140 determines that the performance of the target job 110 has not improved. For example, the secondary optimization device 140 may determine a comparison of performance before and after the optimization strategy is applied and automatically determine to continue to perform the matching if the performance is not improved or the degree of improvement is less than a predetermined degree.
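Purely as an illustrative sketch, the conditions for continuing the matching described in the preceding paragraphs might be combined as follows; the candidate-count threshold and the minimum improvement value are assumed for illustration:

```python
# A minimal sketch combining the conditions for continuing the matching; the
# candidate-count threshold and minimum improvement are assumed values.
def should_continue_matching(candidate_jobs, user_requested_more,
                             perf_before=None, perf_after=None,
                             max_candidates=10, min_improvement=0.05):
    # Too many candidates: the current level of matching was too coarse.
    if len(candidate_jobs) > max_candidates:
        return True
    # The user explicitly requested a new optimization strategy.
    if user_requested_more:
        return True
    # A strategy was applied but performance did not improve enough.
    if perf_before is not None and perf_after is not None:
        improvement = (perf_after - perf_before) / perf_before
        if improvement < min_improvement:
            return True
    return False
```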
As shown in fig. 3, in a further matching process, the secondary optimization device 140 may obtain job-level information 325 and determine a second set of reference jobs 330 based on the job-level information 325. In this way, embodiments of the present disclosure may further obtain job-level information to screen for reference jobs with matching job characteristics, so as to provide job-level optimization suggestions when the system-level suggestions cannot effectively optimize the job.
In some embodiments, the performance tuning tool may generate both the system level information 315 and the job level information 325 at the initial stage. In other embodiments, considering that an effective optimization strategy might already be obtained using only the system level information 315, the job level information 325 may instead be generated by the performance tuning tool only after it is determined that the matching should continue.
Illustratively, the secondary optimization device 140 may provide a first indication for directing generation of the job-level information 325 upon determining to continue to perform the matching, e.g., because the number of reference jobs in the first set 320 is greater than a threshold. For example, the auxiliary optimization device 140 may generate an indication to remind the user, through the terminal device 180, to use the performance tuning tool to generate the job level information 325. For example, the terminal device 180 may present a reminder to guide the user in performing process performance analysis, resource scheduling analysis, or micro-architecture analysis, so that the performance tuning tool generates the job level information 325. In this way, embodiments of the present disclosure can effectively guide the user to perform the next stage of performance information collection so as to obtain further optimization suggestions.
In some embodiments, the secondary optimization device 140 may construct corresponding feature representations based on the acquired job-level information 325 to determine the second set of reference jobs 330. In one example, the secondary optimization device 140 may, for example, convert only the job-level information 325 into corresponding feature vectors and screen out from the first set of reference jobs 320 a second set of reference jobs 330 that match those feature vectors. In another example, the secondary optimization device 140 may generate corresponding feature vectors based on a combination of the system level information 315 and the job level information 325, and determine the second set of reference jobs 330 from the first set of reference jobs 320 or from the plurality of reference jobs 150. The second set of reference jobs 330 may include, for example, one or more reference jobs whose feature representations differ by less than a threshold.
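As an illustrative sketch only, the second-level screening based on a combination of system level and job level feature vectors might resemble the following, using Euclidean distance as the difference measure; the field names and the threshold are assumptions:

```python
# A minimal sketch of the second-level screening: the system level and job
# level feature vectors are concatenated and compared by Euclidean distance.
# The field names and the distance threshold are hypothetical.
import math


def screen_second_set(target_system_vec, target_job_vec, first_set,
                      threshold=1.0):
    combined_target = target_system_vec + target_job_vec
    second_set = []
    for job in first_set:
        combined_ref = job["system_vec"] + job["job_vec"]
        if math.dist(combined_target, combined_ref) < threshold:
            second_set.append(job)
    return second_set
```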
The secondary optimization device 140 may further determine the target reference job 160 based on the second set of reference jobs 330. In some embodiments, the secondary optimization device 140 may directly take one or more of the second set of reference jobs 330 (e.g., the 3 reference jobs with the smallest difference) as the target reference jobs 160 and provide their corresponding optimization strategies, for example, to the user.
In some embodiments, if one or more of the second set of reference jobs 330 have been determined to be the target reference jobs 160, but the user is still initiating a request to acquire a new optimization strategy, the secondary optimization device 140 may determine to continue to perform the matching.
In other embodiments, the secondary optimization device 140 may automatically determine to continue to perform the matching if one or more of the second set of reference jobs 330 have been determined to be the target reference job 160 and their corresponding optimization strategies have been applied to the target job, but the secondary optimization device 140 determines that the performance of the target job 110 has not improved.
As shown in fig. 3, in a further matching process, the secondary optimization device 140 may obtain code level information 335 and determine a third set of reference jobs 340 based on the code level information 335.
In some embodiments, the performance tuning tool may generate the system level information 315, the job level information 325, and the code level information 335 simultaneously at an initial stage. In other embodiments, considering that an effective optimization strategy might already be obtained using only the system level information 315 and/or the job level information 325, the code level information 335 may instead be generated by the performance tuning tool only after it is determined that further matching should be performed.
Illustratively, the secondary optimization device 140 may provide a second indication for directing generation of the code level information 335 when it is determined that further optimization is to be performed, e.g., because the performance has not improved. For example, the auxiliary optimization device 140 may generate an indication to remind the user, through the terminal device 180, to use the performance tuning tool to generate the code level information 335. For example, the terminal device 180 may present a reminder to guide the user in performing function call performance analysis, lock and wait performance analysis, or memory analysis, so that the performance tuning tool generates the code level information 335. In this way, embodiments of the present disclosure can effectively guide the user to perform the next stage of performance information collection so as to obtain further optimization suggestions.
In some embodiments, the secondary optimization device 140 may construct a corresponding feature representation based on the obtained code level information 335 to determine the third set of reference jobs 340. In one example, the secondary optimization device 140 may, for example, convert only the code level information 335 into corresponding feature vectors and screen out from the second set of reference jobs 330 a third set of reference jobs 340 that match those feature vectors. In another example, the secondary optimization device 140 may generate corresponding feature vectors based on a combination of the system level information 315, the job level information 325, and the code level information 335, and determine the third set of reference jobs 340 from the second set of reference jobs 330 or from the plurality of reference jobs 150. The third set of reference jobs 340 may, for example, include one or more reference jobs whose feature representations differ by less than a threshold. In this way, embodiments of the present disclosure may further obtain code level information to screen for reference jobs with matching code characteristics, so as to provide code-level optimization suggestions when the job-level suggestions cannot effectively optimize the job.
The secondary optimization device 140 may further determine the target reference job 160 based on the third set of reference jobs 340. In some embodiments, the secondary optimization device 140 may directly take one or more of the third set of reference jobs 340 (e.g., the 3 reference jobs with the least difference) as the target reference jobs 160 and provide their corresponding optimization strategies, for example, to the user.
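Taken together, the hierarchical matching described above might, purely as an illustrative sketch, be organized as follows; the per-level feature vectors, the distance threshold, and the stopping rule (at most three remaining candidates) are assumptions rather than the disclosed implementation:

```python
# A minimal sketch of the hierarchical matching flow as a whole; per-level
# vectors, the distance threshold, and the stopping rule are hypothetical.
import math


def screen_at_level(target, candidates, level, threshold=1.0):
    """Keep candidates whose feature vector at this level is close to the target's."""
    key = f"{level}_vec"
    return [job for job in candidates
            if math.dist(target[key], job[key]) < threshold]


def multi_level_match(target, reference_jobs, max_targets=3):
    candidates = list(reference_jobs)
    last_level = "system"
    for level in ("system", "job", "code"):
        candidates = screen_at_level(target, candidates, level)
        last_level = level
        if len(candidates) <= max_targets:
            break  # few enough candidates; no further information is needed
    # Rank the remaining jobs by difference at the last screened level; the
    # closest ones become the target reference jobs.
    candidates.sort(key=lambda job: math.dist(target[f"{last_level}_vec"],
                                              job[f"{last_level}_vec"]))
    return candidates[:max_targets]
```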
Through the hierarchical auxiliary optimization process discussed above, embodiments of the present disclosure can sequentially acquire reference jobs according to description information of different hierarchies for assisting optimization of a target job. In this way, the embodiment of the disclosure can guide the user to complete the collection of different performance data at different stages, thereby reducing the learning cost of the user. Moreover, such a process can also help users accumulate performance optimization knowledge.
In some embodiments, the auxiliary optimization device 140 may also obtain other additional reference jobs for providing suggestions for optimizing the target job 110.
In some embodiments, the plurality of reference jobs 150 may come from an audited reference job library 305. In the event that the optimization strategy determined based on the above process fails to effectively optimize the target job, or the user expects to obtain more optimization suggestions, the auxiliary optimization device 140 may also determine a first additional reference job from an unaudited reference job library based on the first description information. In this manner, embodiments of the present disclosure can expand the screening range of reference jobs when the current reference jobs may not effectively assist the optimization, thereby improving the probability of successfully optimizing the target job.
Illustratively, such an unaudited reference job library may include, for example, jobs that are uploaded autonomously by developers and have not been audited. For example, where a developer freely uploads a job to a forum, the optimization strategy or optimization result may not have been audited, and its effectiveness may not be guaranteed.
Further, the secondary optimization device 140 may determine the first additional reference job from the unaudited reference job library based on a process similar to that described above for determining the target reference job 160, and provide an optimization strategy associated with the first additional reference job as an optimization suggestion for the target job 110.
In some embodiments, the secondary optimization device 140 may also provide a prompt indicating that such optimization suggestions are unaudited, so that the user knows that there may be a risk in adopting them.
In some embodiments, the secondary optimization device 140 may also receive optimization suggestions from a remote device, for example. Specifically, the secondary optimization device 140 may send the first description information 130 to the remote device and obtain, from the remote device, job information associated with a second additional reference job, where the second additional reference job is determined by the remote device based on the first description information 130. In this manner, embodiments of the present disclosure can further acquire a reference job from a remote device when the current reference jobs may not effectively assist the optimization, thereby improving the probability of successfully optimizing the target job.
For example, the number of reference jobs in the library local to the secondary optimization device 140 may be limited, and the secondary optimization device 140 may send the first description information to a remote device (e.g., a remote server) if it determines that the local reference job library cannot provide an effective optimization strategy. The remote device may, for example, screen out a second additional reference job from a richer set of reference jobs based on the process discussed above and send corresponding job information to the secondary optimization device 140. Further, the auxiliary optimization device 140 may, based on the job information, provide an optimization strategy associated with the second additional reference job as a third optimization suggestion for the target job 110.
In some embodiments, the secondary optimization device 140 may send the first description information to the remote device upon receiving a request to obtain an additional reference job. For example, a user may send such a request through the performance tuning tool. In this manner, acquisition of the remote reference job can be initiated according to the user's request, thereby improving interaction friendliness.
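By way of illustration only, sending the first description information to a remote device and obtaining the job information of an additional reference job might resemble the following sketch; the endpoint URL, payload layout, and response fields are assumptions:

```python
# A minimal sketch of requesting an additional reference job from a remote
# device; the endpoint URL, payload layout, and response fields are
# hypothetical and chosen only for illustration.
import json
import urllib.request


def fetch_remote_reference_job(first_description, remote_url):
    payload = json.dumps({"description": first_description}).encode("utf-8")
    request = urllib.request.Request(
        remote_url, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request) as response:
        job_info = json.load(response)
    # job_info is expected to describe the additional reference job, including
    # its associated optimization strategy, as returned by the remote device.
    return job_info
```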
In this manner, embodiments of the present disclosure may utilize reference jobs of different sources to assist in optimization of a target job.
In some embodiments, upon determining that the optimization strategy has been applied to the target job, the auxiliary optimization device 140 may also provide comparison information to indicate the impact of the optimization strategy on the running performance of the target job. For example, such comparison information may indicate a comparison between the performance before the optimization strategy is applied and the performance after it is applied. In this manner, the user can more intuitively know whether the optimization strategy achieves the expected effect.
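Purely as an illustrative sketch, such comparison information might be computed as the relative change of a performance metric before and after the optimization strategy is applied (assuming a higher metric value indicates better performance):

```python
# A minimal sketch of the comparison information: the relative change of a
# performance metric before and after the optimization strategy is applied
# (assuming a higher value means better performance).
def comparison_info(perf_before, perf_after):
    change = (perf_after - perf_before) / perf_before
    return {
        "before": perf_before,
        "after": perf_after,
        "improvement_pct": round(change * 100, 2),
    }


print(comparison_info(perf_before=250.0, perf_after=300.0))
# -> {'before': 250.0, 'after': 300.0, 'improvement_pct': 20.0}
```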
Example System
FIG. 4 illustrates a schematic diagram of an example performance tuning system 400, according to some embodiments of the present disclosure. As shown in FIG. 4, the performance tuning system 400 can include a performance tuning tool 410 and a performance tuning assistance module 420.
In some embodiments, the performance tuning tool 410 may be configured to collect the first description information as discussed above and send the first description information to the performance tuning assistance module 420.
As shown in fig. 4, the performance tuning assistance module 420 may include a job matching module 430, a feature extraction module 440, a guidance module 450, and an inference engine 460. The feature extraction module 440 may receive the first description information from the performance tuning tool 410, for example.
Inference engine 460 may receive the first descriptive information and convert it into a corresponding feature representation. Inference engine 460 may also receive a characterization representation corresponding to a reference job in reference job repository 470. It should be understood that while reference job library 470 is shown as being included in performance tuning assistance module 420, reference job library 470 may also be, for example, a remote library coupled to performance tuning assistance module 420 via a network.
Inference engine 460 may be used to perform the detailed process of determining a target reference job discussed above with reference to FIG. 2, which will not be described in detail again here. Illustratively, if too many reference jobs are determined based on the system level information, the inference engine may send an indication to the performance tuning tool 410 through the guidance module 450 to guide the user to perform the next operation to obtain further job level information. Alternatively, the inference engine may also utilize the job matching module 430 to determine a target reference job from the reference jobs and send information related to the target reference job to the performance tuning tool 410, e.g., to present the user with the name, summary, detailed optimization strategy, etc. of the target reference job. Illustratively, if the performance tuning tool 410 and the performance tuning assistance module 420 are located on different devices, the job matching module 430 may send the information about the target reference job to the performance tuning tool 410 via a communication unit.
Example apparatus and devices
Fig. 5 further illustrates a block diagram of an apparatus 500 for assisting performance optimization according to an embodiment of the disclosure. The apparatus 500 may include a plurality of modules for performing corresponding steps in the process 200 as discussed in fig. 2. As shown in fig. 5, the apparatus 500 includes an information obtaining unit 510 configured to obtain first description information based on execution of a target job in a system, the first description information including at least one of: system level information associated with the system, job level information associated with the target job, and code level information associated with the code of the target job. The apparatus 500 further comprises a job determination unit 520 configured to determine a target reference job from a plurality of reference jobs based on a difference between the first description information and second description information of a reference job of the plurality of reference jobs, wherein the target reference job has an associated optimization strategy that can be obtained for optimizing the target job.
Fig. 6 illustrates a schematic block diagram of an example device 600 that can be used to implement embodiments of the present disclosure. The apparatus 600 may be used to implement the secondary optimization apparatus 140. As shown, device 600 includes a computing unit 601 that may perform various suitable actions and processes according to computer program instructions stored in a Random Access Memory (RAM) 603 and/or a Read Only Memory (ROM) 602 or loaded into RAM 603 and/or ROM 602 from a storage unit 608. In the RAM 603 and/or the ROM 602, various programs and data required for the operation of the device 600 can also be stored. The computing unit 601 and the RAM 603 and/or the ROM 602 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
A number of components in the device 600 are connected to the I/O interface 605, including: an input unit 606 such as a keyboard, a mouse, or the like; an output unit 607 such as various types of displays, speakers, and the like; a storage unit 608, such as a magnetic disk, optical disk, or the like; and a communication unit 609 such as a network card, modem, wireless communication transceiver, etc. The communication unit 609 allows the device 600 to exchange information/data with other devices via a computer network, such as the Internet, and/or various telecommunication networks.
The computing unit 601 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 601 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The computing unit 601 performs the various methods and processes described above, such as the process 200. For example, in some embodiments, process 200 may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as storage unit 608. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 600 via RAM and/or ROM and/or the communication unit 609. When the computer program is loaded into RAM and/or ROM and executed by computing unit 601, one or more steps of process 200 described above may be performed. Alternatively, in other embodiments, computing unit 601 may be configured to perform process 200 in any other suitable manner (e.g., by way of firmware).
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/acts specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation can also be implemented in multiple implementations separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (18)

1. A method for assisting performance optimization, comprising:
acquiring first description information based on the running of a target operation in a system, wherein the first description information comprises at least one of the following items: system level information associated with the system, job level information associated with the target job, and code level information associated with code of the target job; and
determining a target reference job from a plurality of reference jobs based on a difference between the first description information and second description information of a reference job from the plurality of reference jobs, wherein the target reference job has an associated optimization policy that can be obtained for optimizing the target job.
2. The method of claim 1, further comprising:
acquiring service information of the target operation, wherein the service information indicates the type of a service related to the target operation; and
determining the plurality of reference jobs matching the business information from a reference job library.
3. The method of claim 1, wherein determining the target reference job comprises:
determining a first set of reference jobs from the plurality of reference jobs based on the system level information; and
determining the target reference job based on the first set of reference jobs.
4. The method of claim 3, wherein determining the first set of reference jobs comprises:
acquiring development information associated with a development environment of the target job; and
determining the first set of reference jobs from the plurality of reference jobs based on the system level information and the development information.
5. The method of claim 3, wherein determining the target reference job based on the first set of reference jobs comprises:
determining a second set of reference jobs from the first set of reference jobs based on the job-level information; and
determining the target reference job based on the second set of reference jobs.
6. The method of claim 5, further comprising:
in response to determining that the number of the first set of reference jobs is greater than a threshold, providing a first indication for directing generation of the job-level information.
7. The method of claim 5, wherein determining the target reference job based on the second set of reference jobs comprises:
determining a third set of reference jobs from the first set of reference jobs based on the code level information; and
determining the target reference job based on the third set of reference jobs.
8. The method of claim 7, further comprising:
in response to determining that the number of second set of reference jobs is greater than a threshold, providing a second indication for directing generation of the code level information.
9. The method of claim 1, further comprising:
providing the optimization strategy associated with the target reference job as a first optimization suggestion for the target job.
10. The method of claim 1, further comprising:
determining a first feature representation corresponding to the first description information and a second feature representation corresponding to the second description information; and
determining the difference between the first descriptive information and the second descriptive information based on the first feature representation and the second feature representation.
11. The method of claim 1, wherein the plurality of reference jobs are from an audited library of reference jobs, and further comprising:
determining a first additional reference job from a library of unchecked reference jobs based on the first description information; and
providing an optimization strategy associated with the first additional reference job as a second optimization suggestion for the target job.
12. The method of claim 1, further comprising:
sending the first description information to a remote device;
obtaining, from the remote device, job information associated with a second additional reference job, the second additional reference job determined by the remote device based on the first description information; and
based on the job information, providing an optimization strategy associated with the second additional reference job as a third optimization suggestion for the target job.
13. The method of claim 12, wherein sending the first description information to a remote device comprises:
in response to receiving a request to obtain an additional reference job, sending the first description information to the remote device.
14. The method of claim 1, further comprising:
in response to determining that the optimization strategy is applied to the target job, providing comparison information indicative of an impact of the optimization strategy on the operational performance of the target job.
15. An apparatus for assisting performance optimization, comprising:
an information acquisition unit configured to acquire first description information based on execution of a target job in a system, the first description information including at least one of: system level information associated with the system, job level information associated with the target job, and code level information associated with code of the target job; and
a job determination unit configured to determine a target reference job from a plurality of reference jobs based on a difference between the first description information and second description information of a reference job from the plurality of reference jobs, wherein the target reference job has an associated optimization policy that is retrievable for optimizing the target job.
16. An electronic device, comprising:
at least one computing unit;
at least one memory coupled to the at least one computing unit and storing instructions for execution by the at least one computing unit, the instructions, when executed by the at least one computing unit, causing the electronic device to perform the method of any of claims 1 to 14.
17. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 14.
18. A computer program product comprising computer executable instructions, wherein the computer executable instructions, when executed by a processor, implement the method of any one of claims 1 to 14.
CN202110552714.3A 2021-05-20 2021-05-20 Method, apparatus, device, storage medium and program product for assisting performance optimization Pending CN115373675A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110552714.3A CN115373675A (en) 2021-05-20 2021-05-20 Method, apparatus, device, storage medium and program product for assisting performance optimization

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110552714.3A CN115373675A (en) 2021-05-20 2021-05-20 Method, apparatus, device, storage medium and program product for assisting performance optimization

Publications (1)

Publication Number Publication Date
CN115373675A true CN115373675A (en) 2022-11-22

Family

ID=84059301

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110552714.3A Pending CN115373675A (en) 2021-05-20 2021-05-20 Method, apparatus, device, storage medium and program product for assisting performance optimization

Country Status (1)

Country Link
CN (1) CN115373675A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118567733A (en) * 2024-07-31 2024-08-30 超云数字技术集团有限公司 Method, device and system for optimizing performance of server network card

Similar Documents

Publication Publication Date Title
US8713526B2 (en) Assigning runtime artifacts to software components
CN109886859B (en) Data processing method, system, electronic device and computer readable storage medium
US10459704B2 (en) Code relatives detection
WO2011071010A1 (en) Load characteristics estimation system, load characteristics estimation method, and program
CN115373835A (en) Task resource adjusting method and device for Flink cluster and electronic equipment
CN109558248B (en) Method and system for determining resource allocation parameters for ocean mode calculation
US20210019456A1 (en) Accelerated simulation setup process using prior knowledge extraction for problem matching
CN112783614A (en) Object processing method, device, equipment, storage medium and program product
CN115373675A (en) Method, apparatus, device, storage medium and program product for assisting performance optimization
CN109582528A (en) State monitoring method, device, electronic equipment and computer readable storage medium
CN117725594A (en) Multiple composite detection method, device, equipment and storage medium of intelligent contract
US20240103853A1 (en) Code maintenance system
Schmidt et al. Load-balanced parallel constraint-based causal structure learning on multi-core systems for high-dimensional data
CN112766470A (en) Feature data processing method, instruction sequence generation method, device and equipment
US11392356B1 (en) Online machine learning based compilation
CN113220463B (en) Binding strategy inference method and device, electronic equipment and storage medium
CN107818501B (en) Actuarial method and device
CN111435356A (en) Data feature extraction method and device, computer equipment and storage medium
CN115495256A (en) Service calling method and device, electronic equipment and storage medium
CN113448822B (en) Test method, test device, computer readable medium and electronic equipment
CN111611167B (en) Embedded software testing method and system based on DSP
CN114398178A (en) Task execution method and device and electronic equipment
CN113360182A (en) Method and apparatus for system performance diagnostics
CN111949281A (en) Database installation method based on AI configuration, user equipment and storage medium
CN114356513B (en) Task processing method and device for cluster mode

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination