WO2024104535A1 - A method for enabling an electronic device to test a program in a performance testing environment - Google Patents

A method for enabling an electronic device to test a program in a performance testing environment

Info

Publication number
WO2024104535A1
Authority
WO
WIPO (PCT)
Prior art keywords
load
production
performance
program
electronic device
Prior art date
Application number
PCT/DK2023/050260
Other languages
French (fr)
Inventor
O Sandeep KUMAR
Original Assignee
Maersk A/S
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Maersk A/S filed Critical Maersk A/S
Publication of WO2024104535A1 publication Critical patent/WO2024104535A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software

Definitions

  • the present disclosure pertains to the field of information technology.
  • the present disclosure relates to a method for enabling an electronic device to test a program in a performance testing environment and related electronic device.
  • Success of a software product depends on its ability to be reliable (e.g., functionally behaving as per expectations, fault tolerant, performant and/or scalable).
  • software testing plays an important role in both achieving and evaluating software (e.g., code) capable of delivering a high quality and performant software product.
  • the testing is mostly static and not performed in conditions of production.
  • Approval of the program under test is given based on how the application running in a pre-production environment behaves under a constant or random load, but not on how the application performs in production. There is a need to improve how a performance test is used to approve a program for production deployment.
  • the method comprises testing a program in a performance testing environment. Testing the program comprises obtaining performance data indicative of one or more performance parameters associated with the program in a production environment, and production load information indicative of a load pattern. The method comprises providing, based on the performance data and the production load information, an output. Further, an electronic device is disclosed. The electronic device comprises a memory, a processor, and an interface. The electronic device is configured to perform any of the methods disclosed herein.
  • the disclosed electronic device and method may allow identifying issues, such as bottlenecks, of the program under test, that may only appear under production conditions. Further, the disclosed technique allows an evaluation of the program’s stability under production conditions.
  • the production load information allows a load generator to simulate and/or emulate exact production conditions for testing the program. For example, the testing provides a replica of production behaviour. This may help discover early an unexpected issue that potentially can happen in production.
  • the disclosed technique supports optimising a cost of test environments by not running a load test at constant peak volume and instead running with actual production pattern resulting in utilising just the right amount of resources required to incur the load.
  • the cost saved is, for example, directly proportional to the number of tests run.
  • the disclosed technique is cost effective since performance issues can be detected without going into actual production; unexpected peaks and valleys that only happen in production are not accounted for during a performance test, due to the limitations of the load generator and the inability to predict real-world behaviour.
  • the disclosed technique permits reducing efforts in workload modelling for a performance test, thereby saving time and cost.
  • the performance baseline provided in the production load information is for example from a past performance of a real production system.
  • Fig. 1 A is a diagram illustrating schematically an example process for performing a performance test of a program according to this disclosure
  • Fig. 1 B is a diagram illustrating an example representation of production load information according to this disclosure
  • Fig. 2 is a signalling diagram illustrating schematically an example process for performing a performance test of a program according to this disclosure
  • Fig. 3 is a flow-chart illustrating an exemplary method performed by an electronic device, for testing a program in a performance testing environment according to this disclosure
  • Fig. 4 is a block diagram illustrating an exemplary electronic device according to this disclosure.
  • a load generator is presently limited to running two scenarios: Peak and Non-Peak load.
  • the program under test (e.g., an application) can be measured in terms of key performance indicators (KPIs) under peak and non-peak load and approved if the KPIs meet a pre-defined service level agreement; during the test, these volumes are maintained at a steady state in terms of load and/or Transactions Per Second (TPS).
  • the ramp-up and/or ramp-down refers to how users and/or threads are created and/or deleted in the system within a given time, and pacing refers to the time between two transactions. These factors can either be configured as a constant number or be configured randomly in the load generator (a configuration sketch follows).
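  • As an illustration of the two configuration options mentioned above, the sketch below models a hypothetical load-generator profile in Python; the class and field names are assumptions, not part of the disclosure.

```python
import random
from dataclasses import dataclass

@dataclass
class LoadProfile:
    """Hypothetical load-generator settings (names are illustrative only)."""
    virtual_users: int       # threads that mimic real users and exert load
    ramp_up_s: float         # window in which users/threads are created
    ramp_down_s: float       # window in which users/threads are deleted
    pacing_s: float          # time between two consecutive transactions
    randomise_pacing: bool = False

    def next_pacing(self) -> float:
        # Either a constant value or a value picked at random around the
        # configured pacing, mirroring the two options a conventional
        # load generator offers.
        if self.randomise_pacing:
            return random.uniform(0.5 * self.pacing_s, 1.5 * self.pacing_s)
        return self.pacing_s

profile = LoadProfile(virtual_users=50, ramp_up_s=60, ramp_down_s=30, pacing_s=2.0)
```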
  • a performance test conventionally approves a program under test based on a peak load that is incurred into the system at a steady state, whereas by mimicking the production load pattern and following its peaks and valleys as disclosed herein, the testing can evaluate the performance of the program under test based on how the system peaks to a high volume and then comes down to a lower load. For example, it can be detected whether connections and/or objects are reclaimed back to the pool, and a leak can also be detected. This can help avoid server crashes and outages.
  • the way the production system works can be quite different in terms of load exerted on the system by clients with multiple peaks and valleys during any interval.
  • the capability of the load generator is limited to an amount of load that is either constant or random; the load generator is not capable of mimicking exact production behaviour with respect to load and/or TPS. This limits the ability of any performance test to identify real-world performance issues caused by a sudden peak or valley that may happen in production but is not simulated in the performance test due to limitations of the load generator tool.
  • the present disclosure may enable testing a program in a production situation (e.g. staging environment) which is an approximate or exact replica of a production environment.
  • the present disclosure may allow testing a program in a staging environment capable of mimicking a production environment.
  • the present disclosure may enable preparation of a staging environment with actual production data, such as with one or more performance parameters associated with the program in the production environment and production load information (e.g., associated with the production environment).
  • the present disclosure may enable to derive realistic load behaviour from a production environment (e.g., a real environment), such as to generate production load information from a production environment.
  • the production load information allows simulating production load.
  • the simulated production load may be seen as production load that resembles a production environment.
  • a performance testing environment disclosed herein refers to an environment where a program can be tested for performance.
  • a performance testing environment may comprise software and hardware components that enable users (e.g., developers) to test a program.
  • a performance testing environment may comprise a staging environment (e.g., a pre-production environment).
  • a staging environment may refer to an environment for testing a program which exactly resembles a production environment (e.g., in terms of infrastructure).
  • a staging environment may allow testing a program (e.g., an application) before its deployment.
  • a performance test is driven by requirements (also known as Non Functional Requirement, NFR) which are translated and implemented using a load generation tool to push load to a pre-production or staging environment.
  • This pre-production or staging environment can be an exact replica of the production conditions in terms of infrastructure and data to have a test as close as possible to real life.
  • a production environment disclosed herein refers to an environment where the program (e.g., an application) is deployed and is available to end users (e.g., users intending to use the program).
  • the program may be deployed for internal use and/or for external use.
  • a production environment may be seen as an environment where a version (e.g., a most current version) of a program is deployed to be accessed by end users.
  • a production environment may be seen as an environment where end users can experience and/or interact and/or use a most current version of a program.
  • Performance data disclosed herein may be seen as data for measuring the performance of a program, e.g. under simulated and/or emulated production conditions in the performance testing environment.
  • the performance data may comprise one or more performance parameters associated with the program in the production environment (e.g. in a simulated and/or emulated production environment, such as a staging environment).
  • One or more performance parameters associated with a program in a production environment disclosed herein may be seen as one or more performance metrics and/or one or more key performance indicators associated with a program.
  • the one or more performance parameters may be seen as quantitative indicators (e.g., measurements) of performance of a program.
  • the one or more performance parameters associated with a program in a production environment may be one or more performance parameters obtained from a repository (e.g., a database) which stores one or more performance parameters from a production environment.
  • the one or more performance parameters may be one or more performance parameters resulting from a previous release (e.g., launching) of a product associated with a program.
  • Production load information disclosed herein may be seen as load information associated with a production environment experienced and/or observed by an electronic device when executing the program, e.g. during a certain period of time.
  • Production load information indicative of a load pattern disclosed herein may be seen as load information associated with a production environment.
  • Production load information may comprise a load pattern for a specific time (e.g., a time stamp) and/or specific time range (e.g., a statistical representation of a time range).
  • a load pattern may be seen as a load behaviour occurring at a specific time and/or during specific time range.
  • Fig. 1 A is a diagram illustrating schematically an example process 1 for performing a test of a program according to this disclosure.
  • Fig.1 A shows an electronic device 300 according to this disclosure.
  • the electronic device includes optionally a plugin 320 configured to run the testing of the program, optionally an internal load generator 360, and optionally an internal Application Performance Monitoring (APM) tool 380.
  • the electronic device 300 is configured to obtain the performance data from the APM tool 380.
  • the electronic device 300 is configured to provide in 16 an output that is based on the production load information and the performance data and can be represented as illustrated by representation 20.
  • Fig. 1 A shows an Application Performance Monitoring (APM) device 400 external to the electronic device 300.
  • the electronic device 300 is configured to obtain in 14 the performance data and obtain in 12 production load information (illustrated by a representation 10) from the external APM device 400.
  • Fig. 1 B is a diagram illustrating an example representation 2 of production load information 30 according to this disclosure.
  • the production load information 30 obtained includes a time parameter 22 and a corresponding Transactions Per Second 24 at each time parameter for a period of time starting 25 December 2021 at midnight.
  • the date as 25 December 2021 is obtained in the electronic device (e.g., plugin), e.g. by user input.
  • the user may trigger the testing disclosed e.g. by clicking on user interface object.
  • the electronic device obtains the performance data and the production load information as illustrated in representation 2, e.g. via an internal APM tool and/or external APM device, e.g. by querying for the load pattern for 25 December 2021.
  • the production load information may be provided in any format accepted by the electronic device, e.g. a CSV format, XML, JSON, etc.
  • the electronic device pushes the load parameters to the load generator, based on which the load can be simulated by the load generator (a minimal sketch of reading the production load information is given below).
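  • A minimal sketch of how the production load information might be read before being handed to a load generator, assuming a CSV export with one time stamp and one TPS value per row (the file name and column names are illustrative):

```python
import csv
from datetime import datetime

def read_production_load(path: str) -> list[tuple[datetime, float]]:
    """Read production load information exported as CSV rows of 'timestamp,tps'.

    Any APM export with a time stamp and a transactions-per-second value per
    row would work the same way; the column names here are assumptions."""
    pattern = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            pattern.append((datetime.fromisoformat(row["timestamp"]),
                            float(row["tps"])))
    return pattern

# e.g. the load pattern for 25 December 2021 queried from the APM tool/device
load_pattern = read_production_load("production_load_2021-12-25.csv")
```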
  • Fig. 2 is a signalling diagram illustrating schematically an example process 500 for performing a performance test of a program according to this disclosure.
  • the electronic device 300 obtains, from an internal APM tool 380 and/or an external APM device 400, performance data 502 and a production load information 504 (e.g., production load information 30 of Fig. 1 B).
  • the production load data 504 may be associated with a load pattern (e.g., the load pattern indicated in the production load information 30 of Fig. 1 B).
  • the performance data may comprise one or more performance parameters associated with the program in a production environment (e.g., an emulated and/or simulated staging environment).
  • the production load data 504 can comprise, for example, a number of transactions per second (e.g., per time unit), such as illustrated in Fig. 1 B.
  • the number of transactions may be seen as a volume of transactions and/or requests processed by the program (e.g., an application).
  • the electronic device 300 may determine, based on the production load information 504, one or more load parameters.
  • the one or more load parameters are determined by the electronic device 300 (e.g., plugin 320) in association with a time parameter (e.g., time parameter 22 of Fig. 1 B).
  • the time parameter can, for example, be one or more of: a time and/or a date, a time and/or date range, a statistical representation of a time and/or date range.
  • the time parameter is associated with a duration of the load pattern, as illustrated in Fig. 1 B.
  • the time parameter may comprise one or more time values, e.g. in form of a time vector.
  • the electronic device 300 may determine the one or more load parameters for each time value of the one or more time values.
  • the one or more load parameters can, for example, be a number of virtual users that matches the load pattern (e.g., which is associated with the production load information).
  • the one or more load parameters may comprise one or more of: a number of emulated users per time unit, a number of threads per time unit, a number of transactions per time unit, a response time, and a throughput.
  • the electronic device 300 may provide, to a load generator 360, the one or more load parameters 506.
  • the number of virtual users determined by the plugin can be seen as an input to the load generator 360. It may be envisaged that the load generator 360 is external to the electronic device 300.
  • the load generator 360 can, for example, simulate and/or emulate, based on the one or more load parameters 506, load data 508 for testing the program in the performance testing environment and/or staging environment under production conditions.
  • the load data 508 may be a time series (e.g. data including time series) indicating a production load (e.g., a simulated load) associated with the program at one or more instances of time.
  • the load generator 360 generates (e.g., simulates, emulates) the load data 508 associated with each time value comprised in the time parameter.
  • the load generator 360 generates (e.g., simulates, emulates), based on existing computational resources in the production environment, the load data 508 for use in the staging or performance testing environment.
  • the load data 508 may be a replica (e.g. an exact or approximate replica) of the production load information 504.
  • the load generator 360 may generate load data reflecting the production load information 504; a sketch of replaying such a pattern step by step is given below.
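  • The following sketch illustrates one way a load generator could replay a production load pattern step by step instead of holding a constant peak; the `fire_transaction` callable and the one-second step are assumptions made for this illustration.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def replay_load(load_pattern, fire_transaction, step_s: float = 1.0) -> None:
    """Replay a production load pattern (sequence of (timestamp, tps) pairs)
    step by step. `fire_transaction` is whatever callable drives one request
    against the program under test; both arguments are assumptions."""
    with ThreadPoolExecutor(max_workers=256) as pool:
        for _, tps in load_pattern:
            started = time.monotonic()
            # Submit the number of transactions observed in production for
            # this step, rather than a constant peak volume.
            for _ in range(round(tps * step_s)):
                pool.submit(fire_transaction)
            # Sleep out the remainder of the step so the peaks and valleys of
            # the replayed load line up with the production pattern.
            time.sleep(max(0.0, step_s - (time.monotonic() - started)))
```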
  • the electronic device 300 (e.g., plugin 320) determines, based on the load data 508 and the performance data 502, production performance data 510 indicative of a performance of the program under production load, such as emulated and/or simulated production load.
  • the production performance data 510 can be seen as data which measures the performance of the program under simulated and/or emulated production conditions.
  • the production performance data 510 (e.g., associated with a time parameter) may include performance parameters that approximate or are close to (e.g. resemble) future performance parameters of the program when the program will be deployed in the production environment.
  • the production performance data 510 may comprise performance parameters that measure the computational and/or hardware and/or software resources used by the program under the production load.
  • the load generator 360 providing the load data 508 associated with each time value provides the advantage of testing the performance of the program on the basis of the production load at each time instance rather than just the peak values.
  • Fig. 3 shows a flow chart of an exemplary method 100, performed by an electronic device, for testing a program in a performance testing environment according to the disclosure.
  • the electronic device is the electronic device disclosed herein, such as electronic device 300 of Figs. 1A, 2, and 4.
  • the method 100 comprises testing S102 a program in a performance testing environment.
  • Testing S102 the program comprises obtaining S102A performance data indicative of one or more performance parameters associated with the program in a production environment, and production load information indicative of a load pattern.
  • obtaining the performance data comprises obtaining the performance data from a memory of the electronic device, and/or an internal APM tool.
  • the electronic device may test the program by obtaining the performance data from a memory of the electronic device.
  • obtaining the performance data comprises obtaining the performance data from an external APM device, such as external APM device 400 of Fig. 1 A.
  • the electronic device may test the program by obtaining the performance data from an external APM device and/or an internal APM tool.
  • the APM device and/or APM tool can be seen as a monitoring and analytics tool which can determine (e.g., track) performance data and monitor, based on the performance data, the performance of a program.
  • the external APM device and/or the internal APM tool may be configured for performance monitoring.
  • the performance data comprises the one or more performance parameters associated with the program in a production environment.
  • the performance data shows which types of performance parameters to measure, and a nominal (e.g. average) value for a given performance parameter.
  • the performance data may be associated with the program deployed in the production environment.
  • the one or more performance parameters can be seen as one or more performance metrics and/or one or more key performance indicators, KPIs.
  • the one or more performance parameters may be seen as quantitative indicators (e.g., measurements) of performance of the program.
  • the performance data comprises the one or more performance parameters including a trend parameter showing that a specific parameter follows a trend (e.g. an increase or decrease over a recent period of time).
  • the production load information can be load information associated with a production environment which is used by the electronic device for testing performance of the program under production conditions.
  • the production load information can comprise information indicative of a load pattern, such as information associated with an amount of load (e.g., characterized by, for example, a load parameter) the program imposes on existing computing resources.
  • the production load information can comprise a load pattern for a specific time and/or a time period (e.g., a statistical representation of a time range).
  • the production load information can comprise a load pattern observed in the past.
  • a load pattern represents a load behaviour occurring at a specific time and/or during a time period, such as load pattern illustrated by representation 10 of Fig.1 A.
  • a load pattern may comprise a load parameter for each time value of the time period.
  • the load parameter may be seen as a volume of transactions processed by the program (e.g., an application) in the production environment.
  • the disclosed technique may allow determining performance of the program under varying loads (e.g., varying user loads), such as from non-peak loads to peak loads, including one or more pauses.
  • the electronic device can replicate a response time by injecting “pause” times so that the API's performance is a replica of what is observed in the production system at each time instance (a pacing sketch follows).
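  • A minimal pacing sketch, assuming the target spacing between transactions comes from the production load information; the function and parameter names are illustrative:

```python
import time

def paced_call(call, target_interval_s: float):
    """Issue one transaction and then pause so that consecutive transactions
    are spaced as observed in production. `call` is the transaction callable
    and `target_interval_s` would come from the production load information;
    both are assumptions for this sketch."""
    started = time.monotonic()
    result = call()
    elapsed = time.monotonic() - started
    # Inject a "pause" covering the gap between the measured duration and the
    # spacing observed in the production system at that time instance.
    time.sleep(max(0.0, target_interval_s - elapsed))
    return result
```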
  • the performance data and the production load information are obtained, based on an input time parameter, by the electronic device.
  • the input time parameter may be one or more of: a time and/or a date, a time and/or date range, a statistical representation of a time and/or date range.
  • the input time parameter can be provided by user input (such as 25 December 2021 of Fig. 1 B).
  • the electronic device can obtain (e.g., fetch) the performance data and the production load information over a network from the external APM device.
  • the electronic device (e.g., the plugin) performs a handshake with the internal APM tool and/or external APM device, e.g. for initialization, before obtaining the performance data and the production load information for testing the program.
  • the electronic device (e.g., the plugin) may communicate with the internal APM tool and/or the external APM device using a communication protocol (e.g., HTTP, TCP, UDP).
  • the electronic device can obtain the performance data and/or the production load information in a data file format (e.g., XML, XLX, CSV, JSON).
  • the method 100 comprises providing S106, based on the performance data and the production load information, an output.
  • the output is for assisting a user (e.g., a programmer and/or a developer and/or a performance tester) in testing the program.
  • testing S102 the program comprises determining S102B one or more load parameters based on the production load information.
  • the performance data and the production load information may be stored in a memory of the electronic device (e.g., a memory associated with the plugin).
  • the electronic device can, for example, determine, based on the input time parameter and the production load information, one or more load parameters.
  • the input time parameter is associated with a duration of the load pattern (e.g., which is associated with the production load information) and/or indicates the time period of interest.
  • the electronic device may determine one or more load parameters for each time value of the time period of interest (e.g., corresponding to the input time parameter, e.g. a time and/or date range).
  • the electronic device may determine, based on the production load information, one or more load parameters for a time value comprised in the time parameter (e.g., the time parameter comprises a single time value).
  • the one or more load parameters may be seen as load testing performance metrics (e.g., number of transactions per second, TPS, and/or number of requests per second, RPS).
  • Fig. 1 B shows an example of how to determine a load parameter, such as a number of users, based on the TPS; a minimal calculation sketch is given below.
  • the one or more load parameters may be determined by the electronic device (e.g., the plugin) based on the production load information.
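  • For illustration, the number of virtual users can be estimated from the TPS and the response time using Little's law; the sketch below is a simplification that treats both values as constants within a time step.

```python
import math

def virtual_users_for(tps: float, response_time_s: float, pacing_s: float = 0.0) -> int:
    """Estimate the number of virtual users needed to reach a target TPS using
    Little's law (users = throughput x time per transaction). The response time
    would come from the performance data and the pacing from the load profile;
    treating them as constants within a time step is a simplification."""
    return max(1, math.ceil(tps * (response_time_s + pacing_s)))

# e.g. 120 TPS with a 0.4 s response time and 0.1 s pacing -> 60 virtual users
users = virtual_users_for(tps=120, response_time_s=0.4, pacing_s=0.1)
```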
  • testing S102 the program comprises providing S102C, to a load generator, the one or more load parameters.
  • the load generator can be seen as a load generation tool and/or a load generation device.
  • the load generator can be an internal load generator of the electronic device.
  • the load generator is external to the electronic device and the one or more load parameters are transmitted to the load generator.
  • testing S102 the program comprises obtaining S102D, from the load generator, load data indicative of a production load.
  • the load generator simulates, based on the one or more load parameters, load (e.g., volume of transactions processed by the program) to test a program (e.g., in terms of performance and/or scalability).
  • the load generator determines the load data for the one or more load parameters.
  • the load generator is external to the electronic device and the load data is received by the electronic device from the load generator.
  • the load data may be a time series data indicating the production load (e.g., a simulated load) associated with the program at one or more instances of time.
  • the load generator may determine (e.g., simulate, emulate) load data associated with each time value of the time period of interest.
  • the electronic device guides the load generator to replicate, based on the load parameter, the load pattern associated with the production environment for each time value of the time period (e.g., for each instance of time).
  • the load data may be a replica of the production pattern information obtained (e.g., from a past production behaviour).
  • the load generator simulates the load data by making an application programming interface (API) request (e.g., an API call).
  • the load generator simulates the load data by performing, based on the one or more load parameters, an API request to the program (e.g., application) under test in the staging environment (e.g., pre-production environment).
  • an outcome of the API call can be to obtain TPS, response time, errors and/or failures.
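  • A sketch of such an API call against the program under test in the staging environment, recording response time and errors and deriving TPS over an observation window; the URL and the returned fields are assumptions:

```python
import time
import urllib.error
import urllib.request

def call_api(url: str) -> dict:
    """Perform one API request against the program under test in the staging
    environment and record the outcome; the URL and the returned fields are
    illustrative, not part of the disclosure."""
    started = time.monotonic()
    outcome = {"url": url, "status": None, "error": None}
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            outcome["status"] = response.status
    except urllib.error.URLError as exc:
        outcome["error"] = str(exc)  # failures count towards the error rate
    outcome["response_time_s"] = time.monotonic() - started
    return outcome

def tps(outcomes: list, window_s: float) -> float:
    # Transactions per second over the observation window.
    return len(outcomes) / window_s
```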
  • the load pattern associated with the production load information is extracted, by the electronic device (e.g., the plugin).
  • the one or more load parameters may be determined by the plugin based on the production load information.
  • the load generator may simulate the load data based on the one or more load parameters.
  • the load generator may reproduce a load pattern based on the load pattern associated with the production load information.
  • the load data may reflect the production load information associated with the production environment, which conveys past production conditions provided in the production load information.
  • testing S102 the program comprises determining S102E, based on the load data and the performance data, production performance data indicative of a performance of the program under the production load.
  • the production performance data shows the performance of the program under the production load provided by the production load information (e.g., load pattern) to reflect the production environment, such as from a real production system (e.g., past production behaviour associated with the program).
  • the production performance data includes performance parameters, each given a value based on the production load provided by the production load information.
  • the production performance data includes the response time of the program under a production load emulated based on the number of emulated users per time unit provided in the load parameters. For example, when the performance data provides the average response time for a program, the production performance data provides an updated response time reflecting the response time under production condition (which may be the same, higher, or lower than the average response time).
  • the method 100 comprises determining S104 the output based on the production performance data.
  • the output is indicative of the performance of the program under the production load.
  • the output can include the production performance data (e.g., a simulated load behaviour).
  • the production performance data may be determined by the electronic device based on the load data.
  • the performance of the program is measured by the electronic device under the production load (e.g., associated with the production performance data) and provided in the production performance data.
  • the production performance data may be associated with a time period and provided for the time instances of the time period (e.g., production performance data 20 of Fig. 1 A).
  • the production performance data may comprise performance parameters that approximate the performance of the program in a real production environment.
  • the output indicates the performance of the program under the production load e.g. by providing a representation of the production performance data.
  • the output may be the approval of the program under test based on the production performance data being satisfactory (e.g. above a threshold).
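  • A minimal sketch of such an approval check, assuming the production performance data and the baseline from the performance data are available as simple dictionaries and that a 10 % tolerance on response time is acceptable (both are assumptions made for illustration):

```python
def approve_for_production(production_performance: dict, baseline: dict,
                           tolerance: float = 0.10) -> bool:
    """Approve the program under test if the production performance data stays
    within a tolerance of the baseline taken from the performance data.
    The dictionary keys and the 10 % tolerance are assumptions."""
    within_response_time = (production_performance["response_time_s"]
                            <= baseline["response_time_s"] * (1 + tolerance))
    within_error_rate = production_performance["error_rate"] <= baseline["error_rate"]
    return within_response_time and within_error_rate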
  • the production performance data may be based on the computational, hardware and/or software resource used by the program under the production load.
  • the production performance data may indicate whether there is adequate capacity (e.g., scalability) to support the program under the production load.
  • the production performance data may identify performance bottlenecks associated with the program.
  • production performance data includes a parameter indicative of buffer overflow and/or a parameter indicative of leakage.
  • the production performance data may be useful to predict future changes in the performance of the program.
  • the disclosed methods may allow releasing a robust version of the program (e.g., an application) with a reduced likelihood of issues under production.
  • the production performance data can be used, by a user, in an environment.
  • the environment may be one or more of: a staging environment, a pre-production environment, a development environment, a performance testing environment, a quality control environment, a monitoring environment.
  • the disclosed electronic device may be used by a user in an environment where a performance test is being performed on a program.
  • the output of the disclosed method is used for updating the program under test, for controlling a machine and/or a process for quality control, for development of the program, for testing the program, for monitoring and/or for controlling a load balancing.
  • obtaining S102A the performance data comprises generating S102AA the performance data by querying a repository.
  • the electronic device (e.g., the plugin) queries a repository (e.g., a database) for performance data (e.g., requesting performance data and/or collecting performance data).
  • the electronic device and/or the external APM device may comprise the repository.
  • the repository may be an external database.
  • the repository comprises historical performance parameters.
  • the one or more historical performance parameters are one or more performance parameters associated with the program that have been measured previously.
  • the one or more performance parameters comprise one or more of: a response time, a throughput, and a resource utilisation parameter.
  • the one or more performance parameters can be seen as one or more performance metrics and/or one or more key performance indicators, KPIs.
  • the response time is, for example, the time for a first system node (e.g., an application server and/or an application programming interface, API) to respond to a request from a second system node (e.g., a client and/or a user).
  • the throughput can be, for example, the number of tasks (e.g., tasks associated with the request from a second system node) processed per time unit by the first system node (e.g., an application server and/or an application programming interface, API).
  • the resource utilisation parameter can be seen as a parameter indicating how much a resource (e.g., hardware resource and/or software resource) is utilised by an operation of a program.
  • the resource utilisation parameter can include a hardware resource utilization parameter and/or a software resource utilization parameter.
  • the hardware resources comprise central processing unit, CPU, utilisation, memory utilisation, hardware accelerators and/or hardware clocks.
  • the CPU utilisation (e.g., CPU usage) can be, for example, a percentage of time the CPU spends in handling an operation and/or a task (e.g., a task associated with the request from the second system node).
  • the memory utilisation (e.g., memory usage) can be, for example, an amount of memory required to process the request.
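  • A sketch of sampling CPU and memory utilisation while the program handles the simulated production load, assuming the psutil package (or any monitoring agent exposing equivalent counters) is available:

```python
import psutil  # assumed available; any agent exposing the same counters would do

def resource_utilisation() -> dict:
    """Sample hardware resource utilisation while the program handles the
    simulated production load (a sketch of what an APM agent might report)."""
    return {
        "cpu_percent": psutil.cpu_percent(interval=1.0),    # % of CPU time spent on work
        "memory_percent": psutil.virtual_memory().percent,  # % of system memory in use
    }
```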
  • the one or more load parameters are one or more actual load parameters that are used at the load generator for emulating and/or simulating the load data.
  • the one or more load parameters are one or more parameters that match the production load information (e.g., a load pattern) associated with a program in a production environment.
  • the one or more load parameters can be determined, based on the production load information, by the electronic device.
  • the electronic device may guide the load generator to replicate a load pattern similar to the load pattern associated with the production load information.
  • the load data may be seen as the replicated load pattern by the load generator.
  • the electronic device may guide the load generator by (e.g., permanently and/or when necessary) providing the load generator with the load parameter(s) allowing to simulate and/or emulate the production load without having to deploy the program.
  • the one or more load parameters comprise one or more of: a number of emulated users per time unit, a number of threads per time unit, a number of transactions per time unit, a response time, and a throughput.
  • the electronic device determines, based on an average response associated with a time parameter (as production load information), a throughput in transactions per time unit (as load parameter).
  • the one or more load parameters can, for example, be a number of virtual users (e.g., threads) to reflect the production load information obtained.
  • the load data can, for example, be associated with a production environment where a past version of the program was deployed by, for example, internal and/or external users.
  • the one or more load parameters can be a volume of transactions processed by the program (e.g., transactions per time unit and/or requests per time unit).
  • the one or more load parameters may be a time required for a program to respond to a user accessing the program in the production environment, under a load (e.g., a load value comprised in the production load information).
  • determining S102B the one or more load parameters based on the production load information comprises applying S102BA a load modelling technique to the production load information, and optionally to the performance data.
  • a load modelling technique enables determining (e.g., calculating as illustrated in Fig. 1 B) the one or more load parameters.
  • the load modelling technique may allow estimating, based on the production load information (e.g., the load pattern), the one or more load parameters.
  • determining the one or more load parameters may comprise determining the one or more load parameters based on a mathematical representation of relationship between the production load information and the performance data. For example, the number of users and/or threads can be calculated, by the electronic device, based on production load. For example, the number of users and/or threads are used as an input to the load generator.
  • the electronic device comprises a display device.
  • providing S106 the output comprises displaying S106A, on the display device, a user interface object representative of the output.
  • the user interface object representative of the output can be an icon, a message and/or a notification (e.g., to a developer and/or tester).
  • the icon, the message and/or the notification may comprise the production performance data in form of a graph and/or a table and/or a vector and/or a distribution and/or a function.
  • the production performance data in form of a graph is illustrated in Fig. 1 A.
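  • For illustration, the production performance data could be rendered as a graph along the lines of representation 20 of Fig. 1 A; the sketch below assumes matplotlib is available and that the data is a simple time series of response times.

```python
import matplotlib.pyplot as plt  # assumed available for this illustration

def plot_production_performance(times, response_times_s):
    """Render production performance data as a graph, e.g. the response time
    of the program under the simulated production load over time."""
    plt.plot(times, response_times_s)
    plt.xlabel("time")
    plt.ylabel("response time (s)")
    plt.title("Program under simulated production load")
    plt.show()
```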
  • providing S106 the output comprises transmitting S106B the output to an external device.
  • transmitting the output may comprise providing the production performance data (e.g., in the form of a graph and/or a table and/or a vector and/or a distribution and/or a function) to the external device.
  • providing the production performance data may comprise informing a user about a load to be used when deploying a new version of the program in a production environment. The user may identify, based on the production performance data, computing bottlenecks associated with the program under test and investigate the stability of the program under test by analysing non-peak and peak traffic events.
  • Fig. 4 shows a block diagram of an exemplary electronic device 300 according to the disclosure.
  • the electronic device 300 comprises a memory 301, a processor 302, and an interface 303.
  • the electronic device 300 is configured to perform any of the methods disclosed in Fig. 3.
  • the electronic device 300 may comprise a plugin and optionally a load generator (e.g., a load generator device or a load generator tool) and optionally an internal APM tool.
  • the electronic device 300 may comprise a plugin which is configured to perform the method disclosed herein.
  • the load generator is part of a device external to the electronic device, where the device is configured to communicate with the electronic device 300.
  • the electronic device is configured to communicate with an external APM device (such as external APM device 400 of Fig. 1 A and 2).
  • the electronic device 300 may be seen as a program testing device.
  • the electronic device 300 is configured to test (e.g., using the processor 302) a program in a performance testing environment.
  • the electronic device 300 is configured to test the program by obtaining (e.g., via the interface 303 and/or using the memory 301) performance data indicative of one or more performance parameters associated with the program in a production environment, and a production load information indicative of a load pattern.
  • the electronic device 300 is configured to provide (e.g., using the processor 302 and/or the interface 303), based on the performance data and the production load information, an output.
  • the electronic device 300 is optionally configured to perform any of the operations disclosed in Fig. 3 (such as any one or more of: S102, S102A, S102AA, S102B, S102BA, S102C, S102D, S102E, S104, S106, S106A, S106B).
  • the operations of the electronic device 300 may be embodied in the form of executable logic routines (e.g., lines of code, software programs, etc.) that are stored on a non-transitory computer readable medium (e.g., the memory 301) and are executed by the processor 302.
  • the operations of the electronic device 300 may be considered a method that the electronic device 300 is configured to carry out. Also, while the described functions and operations may be implemented in software, such functionality may as well be carried out via dedicated hardware or firmware, or some combination of hardware, firmware and/or software.
  • the memory circuitry 301 may be one or more of: a buffer, a flash memory, a hard drive, a removable media, a volatile memory, a non-volatile memory, a random access memory (RAM), and any other suitable device.
  • the memory 301 may include a non-volatile memory for long term data storage and a volatile memory that functions as system memory for the processor 302.
  • the memory 301 may exchange data with the processor 302 over a data bus. Control lines and an address bus between the memory 301 and the processor 302 also may be present (not shown in Fig. 4).
  • the memory 301 is considered a non-transitory computer readable medium.
  • the memory 301 may be configured to store performance data, one or more performance parameters, production load information, one or more load parameters, load data, production performance data in a part of the memory.
  • Embodiments of methods and products (electronic device) according to the disclosure are set out in the following items:
  • Item 1 A method, performed by an electronic device, the method comprising:
  • testing the program comprises obtaining performance data indicative of one or more performance parameters associated with the program in a production environment, and production load information indicative of a load pattern;
  • testing the program comprises:
  • testing the program comprises: determining, based on the load data and the performance data, production performance data indicative of a performance of the program under the production load.
  • Item 6 The method according to item 5, wherein the repository comprises historical performance parameters.
  • Item 7 The method according to any of the previous items, wherein the one or more performance parameters comprise one or more of: a response time, a throughput, and a resource utilisation parameter.
  • Item 8 The method according to any of items 2-7, wherein the one or more load parameters are one or more actual load parameters that are used at the load generator for emulating and/or simulating the load data.
  • Item 9 The method according to any of items 2-8, wherein the one or more load parameters comprise one or more of: a number of emulated users per time unit, a number of threads per time unit, a number of transactions per time unit, a response time, and a throughput.
  • Item 10 The method according to any of items 2-9, wherein determining the one or more load parameters based on the production load information comprises: applying a load modelling technique to the production load information.
  • Item 11 The method according to any of the previous items, wherein the electronic device comprises a display device; wherein providing the output comprises displaying, on the display device, a user interface object representative of the output.
  • Item 12 The method according to any of the previous items, wherein providing the output comprises transmitting the output to an external device.
  • Item 13 An electronic device comprising a memory, a processor, and an interface, wherein the electronic device is configured to perform any of the methods according to any of items 1-12.
  • Item 14 A computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by an electronic device cause the electronic device to perform any of the methods of items 1-12.
  • The terms “first”, “second”, “third”, “fourth”, “primary”, “secondary”, “tertiary”, etc. do not imply any particular order, but are included to identify individual elements.
  • the use of the terms “first”, “second”, “third” and “fourth”, “primary”, “secondary”, “tertiary” etc. does not denote any order or importance, but rather the terms “first”, “second”, “third” and “fourth”, “primary”, “secondary”, “tertiary” etc. are used to distinguish one element from another.
  • the words “first”, “second”, “third” and “fourth”, “primary”, “secondary”, “tertiary” etc. are used here and elsewhere for labelling purposes only and are not intended to denote any specific spatial or temporal ordering.
  • the labelling of a first element does not imply the presence of a second element and vice versa.
  • The figures comprise some circuitries or operations which are illustrated with a solid line and some circuitries or operations which are illustrated with a dashed line.
  • the circuitries or operations which are comprised in a solid line are circuitries or operations which are comprised in the broadest example embodiment.
  • the circuitries or operations which are comprised in a dashed line are example embodiments which may be comprised in, or be a part of, or are further circuitries or operations which may be taken in addition to the circuitries or operations of the solid line example embodiments. It should be appreciated that these operations need not be performed in the order presented. Furthermore, it should be appreciated that not all of the operations need to be performed.
  • the exemplary operations may be performed in any order and in any combination.
  • a computer-readable medium may include removable and non-removable storage devices including, but not limited to, Read Only Memory (ROM), Random Access Memory (RAM), compact discs (CDs), digital versatile discs (DVD), etc.
  • program circuitries may include routines, programs, objects, components, data structures, etc. that perform specified tasks or implement specific abstract data types.
  • Computer-executable instructions, associated data structures, and program circuitries represent examples of program code for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps or processes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

Disclosed is a method performed by an electronic device. The method comprises testing a program in a performance testing environment. Testing the program comprises obtaining performance data indicative of one or more performance parameters associated with the program in a production environment, and production load information indicative of a load pattern. The method comprises providing, based on the performance data and the production load information, an output.

Description

A METHOD FOR ENABLING AN ELECTRONIC DEVICE TO TEST A PROGRAM IN A
PERFORMANCE TESTING ENVIRONMENT
The present disclosure pertains to the field of information technology. The present disclosure relates to a method for enabling an electronic device to test a program in a performance testing environment and related electronic device.
BACKGROUND
Success of a software product depends on its ability to be reliable (e.g., functionally behaving as per expectations, fault tolerant, performant and/or scalable). For example, software testing plays an important role in both achieving and evaluating software (e.g., code) capable of delivering a high quality and performant software product. However, the testing is mostly static and not performed in conditions of production.
SUMMARY
Approval of the program under test is given based on how the application running in a pre-production environment behaves under a constant or random load, but not on how the application performs in production. There is a need to improve how a performance test is used to approve a program for production deployment.
Accordingly, there is a need for an electronic device and a method, which may mitigate, alleviate, or address the shortcomings existing and may provide testing of a program in conditions of a production environment.
Disclosed is a method performed by an electronic device. The method comprises testing a program in a performance testing environment. Testing the program comprises obtaining performance data indicative of one or more performance parameters associated with the program in a production environment, and production load information indicative of a load pattern. The method comprises providing, based on the performance data and the production load information, an output. Further, an electronic device is disclosed. The electronic device comprises a memory, a processor, and an interface. The electronic device is configured to perform any of the methods disclosed herein.
Disclosed is a computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by an electronic device cause the electronic device to perform any of the methods disclosed herein.
It is an advantage of the present disclosure that the disclosed electronic device and method may allow identifying issues, such as bottlenecks, of the program under test, that may only appear under production conditions. Further, the disclosed technique allows an evaluation of the program’s stability under production conditions. The production load information allows a load generator to simulate and/or emulate exact production conditions for testing the program. For example, the testing provides a replica of production behaviour. This may help discover early an unexpected issue that potentially can happen in production.
The disclosed technique supports optimising a cost of test environments by not running a load test at constant peak volume and instead running with the actual production pattern, resulting in utilising just the right amount of resources required to incur the load. The cost saved is, for example, directly proportional to the number of tests run.
The disclosed technique is cost effective since performance issues can be detected without going into actual production; unexpected peaks and valleys that only happen in production are not accounted for during a performance test, due to the limitations of the load generator and the inability to predict real-world behaviour.
The disclosed technique permits reducing efforts in workload modelling for a performance test, thereby saving time and cost. The performance baseline provided in the production load information is for example from a past performance of a real production system.
The testing is thereby improved and allows an approval of the program under test that reduces the likelihood of issues during actual deployment of the program.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other features and advantages of the present disclosure will become readily apparent to those skilled in the art by the following detailed description of exemplary embodiments thereof with reference to the attached drawings, in which:
Fig. 1 A is a diagram illustrating schematically an example process for performing a performance test of a program according to this disclosure,
Fig. 1 B is a diagram illustrating an example representation of production load information according to this disclosure,
Fig. 2 is a signalling diagram illustrating schematically an example process for performing a performance test of a program according to this disclosure,
Fig. 3 is a flow-chart illustrating an exemplary method performed by an electronic device, for testing a program in a performance testing environment according to this disclosure, and
Fig. 4 is a block diagram illustrating an exemplary electronic device according to this disclosure.
DETAILED DESCRIPTION
Various exemplary embodiments and details are described hereinafter, with reference to the figures when relevant. It should be noted that the figures may or may not be drawn to scale and that elements of similar structures or functions are represented by like reference numerals throughout the figures. It should also be noted that the figures are only intended to facilitate the description of the embodiments. They are not intended as an exhaustive description of the disclosure or as a limitation on the scope of the disclosure. In addition, an illustrated embodiment need not have all the aspects or advantages shown. An aspect or an advantage described in conjunction with a particular embodiment is not necessarily limited to that embodiment and can be practiced in any other embodiments even if not so illustrated, or if not so explicitly described.
The figures are schematic and simplified for clarity, and they merely show details which aid understanding of the disclosure, while other details have been left out. Throughout, the same reference numerals are used for identical or corresponding parts.
A load generator is presently limited to running two scenarios: Peak and Non-Peak load. The program under test (e.g., an application) can be measured in terms of key performance indicators (KPIs) under peak and non-peak load and given an approval if the KPIs meet a pre-defined service level agreement. During the test, these peak/non-peak volumes are maintained at a steady state in terms of load and/or Transactions Per Second (TPS) and then the system running the program under test is observed for any glitch or error during the steady state.
As part of a performance test, there are a couple of other factors to keep in mind: the number of virtual users and/or threads that mimic real users and exert load on the system, and the ramp-up and/or ramp-down and pacing. The ramp-up and/or ramp-down refers to how users and/or threads are created and/or deleted in the system within a given time, and pacing refers to the time between two transactions. These factors can either be configured as a constant number or be configured randomly in the load generator.
A performance test conventionally approves a program under test based on a peak load that is incurred into the system at a steady state, whereas by mimicking the production load pattern and following its peaks and valleys as disclosed herein, the testing can evaluate the performance of the program under test based on how the system peaks to a high volume and then comes down to a lower load. For example, it can be detected whether connections and/or objects are reclaimed back to the pool, and a leak can also be detected. This can help avoid server crashes and outages.
However, in real deployment, the way the production system works can be quite different in terms of the load exerted on the system by clients, with multiple peaks and valleys during any interval. The capability of the load generator is limited to an amount of load that is either constant or random; the load generator is not capable of mimicking exact production behaviour with respect to load and/or TPS. This limits the ability of any performance test to identify real-world performance issues caused by a sudden peak or valley that may happen in production but is not simulated in the performance test due to limitations of the load generator tool.
The present disclosure may enable testing a program in a production situation (e.g. staging environment) which is an approximate or exact replica of a production environment. In other words, the present disclosure may allow testing a program in a staging environment capable of mimicking a production environment. Put differently, the present disclosure may enable preparation of a staging environment with actual production data, such as with one or more performance parameters associated with the program in the production environment and production load information (e.g., associated with the production environment). The present disclosure may enable to derive realistic load behaviour from a production environment (e.g., a real environment), such as to generate production load information from a production environment. The production load information allows simulating production load. The simulated production load may be seen as production load that resembles a production environment.
A performance testing environment disclosed herein refers to an environment where a program can be tested for performance. A performance testing environment may comprise software and hardware components that enable users (e.g., developers) to test a program. A performance testing environment may comprise a staging environment (e.g., a pre-production environment). A staging environment may refer to an environment for testing a program which exactly resembles a production environment (e.g., in terms of infrastructure). A staging environment may allow testing a program (e.g., an application) before its deployment. For example, a performance test is driven by requirements (also known as Non Functional Requirement, NFR) which are translated and implemented using a load generation tool to push load to a pre-production or staging environment. This pre-production or staging environment can be an exact replica of the production conditions in terms of infrastructure and data to have a test as close as possible to real life.
A production environment disclosed herein refers to an environment where the program (e.g., an application) is deployed and is available to end users (e.g., users intending to use the program). The program may be deployed for internal use and/or for external use. In other words, a production environment may be seen as an environment where a version (e.g., a most current version) of a program is deployed to be accessed by end users. Put differently, a production environment may be seen as an environment where end users can experience and/or interact and/or use a most current version of a program.
Performance data disclosed herein may be seen as data for measuring the performance of a program, e.g. under simulated and/or emulated production conditions in the performance testing environment. The performance data may comprise one or more performance parameters associated with the program in the production environment (e.g. in a simulated and/or emulated production environment, such as a staging environment). One or more performance parameters associated with a program in a production environment disclosed herein may be seen as one or more performance metrics and/or one or more key performance indicators associated with a program. In other words, the one or more performance parameters may be seen as quantitative indicators (e.g., measurements) of performance of a program. The one or more performance parameters associated with the program in a production environment may be one or more performance parameters obtained from a repository (e.g., a database) which stores one or more performance parameters from a production environment. Put differently, the one or more performance parameters may be one or more performance parameters resulting from a previous release (e.g., launching) of a product associated with a program.
Production load information disclosed herein may be seen as load information associated with a production environment experienced and/or observed by an electronic device when executing the program, e.g. during a certain period of time. Production load information indicative of a load pattern disclosed herein may be seen as load information associated with a production environment. Production load information may comprise a load pattern for a specific time (e.g., a time stamp) and/or specific time range (e.g., a statistical representation of a time range). In other words, a load pattern may be seen as a load behaviour occurring at a specific time and/or during specific time range.
Fig. 1A is a diagram illustrating schematically an example process 1 for performing a test of a program according to this disclosure. Fig. 1A shows an electronic device 300 according to this disclosure. The electronic device optionally includes a plugin 320 configured to run the testing of the program, optionally an internal load generator 360, and optionally an internal Application Performance Monitoring (APM) tool 380. In some examples, the electronic device 300 is configured to obtain the performance data from the APM tool 380.
The electronic device 300 is configured to provide in 16 an output that is based on the production load information and the performance data and can be represented as illustrated by representation 20. Fig. 1A shows an Application Performance Monitoring (APM) device 400 external to the electronic device 300. In some examples, the electronic device 300 is configured to obtain in 14 the performance data and obtain in 12 the production load information (illustrated by a representation 10) from the external APM device 400.
Fig. 1B is a diagram illustrating an example representation 2 of production load information 30 according to this disclosure. In this example, the obtained production load information 30 includes a time parameter 22 and a corresponding Transactions Per Second (TPS) value 24 at each time parameter for a period of time starting 25 December 2021 at midnight.
For example, the date 25 December 2021 is obtained by the electronic device (e.g., the plugin), e.g. by user input. The user may trigger the testing disclosed herein, e.g. by clicking on a user interface object. The electronic device obtains the performance data and the production load information as illustrated in 2, e.g. via an internal APM tool and/or an external APM device, e.g. by querying for the load pattern for 25 December 2021.
The production load information may be provided in any format accepted by the electronic device, e.g. a CSV format, XML, JSON, etc.
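For illustration only, the following is a minimal Python sketch of how production load information provided in a CSV format could be parsed into (time value, TPS) pairs; the column names are assumptions made for this example, while the two TPS values are taken from the example of Fig. 1B.

```python
import csv
import io
from datetime import datetime

# Hypothetical CSV export of production load information: one row per time
# value with the observed transactions per second (TPS).
CSV_EXPORT = """timestamp,tps
2021-12-25T00:00:00,2000
2021-12-25T01:00:00,128689
"""

def parse_load_pattern(csv_text: str):
    """Parse the load pattern into (time value, TPS) pairs."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [(datetime.fromisoformat(row["timestamp"]), float(row["tps"]))
            for row in reader]

print(parse_load_pattern(CSV_EXPORT)[0])
# (datetime.datetime(2021, 12, 25, 0, 0), 2000.0)
```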
For example, the electronic device determines the load parameters, e.g. how many users are required to achieve the corresponding TPS, e.g. by: Number of users = TPS * RT, where RT denotes the average response time of the program under test (in seconds). For example, for 25/12/2021 00:00:00, the required TPS is 2000 and the average response time is 10 ms, which means 20 users are required to generate the desired load: 2000 * (10 / 1000) = 20. Similarly, to achieve 128689 TPS, 1287 users are required.
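The calculation above can be expressed as a short Python sketch (for illustration only; the function name is arbitrary):

```python
import math

def required_users(tps: float, avg_response_time_s: float) -> int:
    """Number of users = TPS * RT, with RT expressed in seconds."""
    return math.ceil(tps * avg_response_time_s)

rt = 10 / 1000  # average response time of 10 ms, as in the example above
print(required_users(2000, rt))    # -> 20 users
print(required_users(128689, rt))  # -> 1287 users
```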
For example, the electronic device pushes the load parameters to the load generator, based on which the load can be simulated by the load generator.
Fig. 2 is a signalling diagram illustrating schematically an example process 500 for performing a performance test of a program according to this disclosure.
The electronic device 300 (e.g. plugin 320) obtains, from an internal APM tool 380 and/or an external APM device 400, performance data 502 and production load information 504 (e.g., production load information 30 of Fig. 1B). The production load data 504 may be associated with a load pattern (e.g., the load pattern indicated in the production load information 30 of Fig. 1B).
The performance data may comprise one or more performance parameters associated with the program in a production environment (e.g., an emulated and/or simulated staging environment). The production load data 504 can comprise, for example, a number of transactions per second (e.g., per time unit), such as illustrated in Fig. 1B. The number of transactions may be seen as a volume of transactions and/or requests processed by the program (e.g., an application).
The electronic device 300 (e.g. plugin 320) may determine, based on the production load information 504, one or more load parameters. The one or more load parameters may be determined by the electronic device 300 (e.g. plugin 320) in association with a time parameter (e.g., time parameter 22 of Fig. 1B).
The time parameter can, for example, be one or more of: a time and/or a date, a time and/or date range, a statistical representation of a time and/or date range. For example, the time parameter is associated with a duration of the load pattern, as illustrated in Fig. 1B. The time parameter may comprise one or more time values, e.g. in the form of a time vector. The electronic device 300 (e.g. plugin 320) may determine the one or more load parameters for each time value of the one or more time values. The one or more load parameters can, for example, be a number of virtual users that matches the load pattern (e.g., which is associated with the production load information). The one or more load parameters may comprise one or more of: a number of emulated users per time unit, a number of threads per time unit, a number of transactions per time unit, a response time, and a throughput.
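For illustration only, the following Python sketch shows one possible way a plugin could map each time value of a load pattern to a number of virtual users, using the Number of users = TPS * RT relation described with respect to Fig. 1B; the dictionary keys are assumptions made for this example.

```python
from datetime import datetime

def load_parameters_per_time_value(load_pattern, avg_response_time_s):
    """Map each (time value, TPS) pair of the load pattern to a load
    parameter: the number of virtual users needed to reproduce that TPS."""
    return [{"time": t, "tps": tps,
             "users": max(1, round(tps * avg_response_time_s))}
            for t, tps in load_pattern]

pattern = [(datetime(2021, 12, 25, 0, 0), 2000),
           (datetime(2021, 12, 25, 1, 0), 128689)]
for params in load_parameters_per_time_value(pattern, 10 / 1000):
    print(params)  # e.g. {'time': ..., 'tps': 2000, 'users': 20}
```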
The electronic device 300 (e.g. plugin 320) may provide the one or more load parameters 506 to a load generator 360. For example, the number of virtual users determined by the plugin can be seen as an input to the load generator 360. It may be envisaged that the load generator 360 is external to the electronic device 300.
The load generator 360 can, for example, simulate and/or emulate, based on the one or more load parameters 506, load data 508 for testing the program in the performance testing environment and/or staging environment under production conditions. For example, the load data 508 may be a time series (e.g. data including a time series) indicating a production load (e.g., a simulated load) associated with the program at one or more instances of time. For example, the load generator 360 generates (e.g., simulates, emulates) the load data 508 associated with each time value comprised in the time parameter. The load generator 360 may generate (e.g., simulate, emulate), based on existing computational resources in the production environment, the load data 508 for use in the staging or performance testing environment. The load data 508 may be a replica of the production load information 504 (e.g. an exact or approximate replica). In some examples, the load generator 360 may generate load data reflecting the production load information 504. The electronic device 300 (e.g. plugin 320) determines, based on the load data 508 and the performance data 502, production performance data 510 indicative of a performance of the program under production load, such as emulated and/or simulated production load. For example, the production performance data 510 can be seen as data which measures the performance of the program under simulated and/or emulated production conditions. For example, the production performance data 510 (e.g., associated with a time parameter) may include performance parameters that approximate or are close to (e.g. resemble) future performance parameters of the program when the program is deployed in the production environment. In one or more examples, the production performance data 510 may comprise performance parameters that measure the computational and/or hardware and/or software resources used by the program under the production load.
The load generator 360 providing the load data 508 associated with each time value provides the advantage of testing the performance of the program on the basis of the production load at each time instance rather than just the peak values.
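For illustration only, the following Python sketch shows one simplified way load could be replayed per time value and turned into production performance data; the schedule keys and the stand-in send_transaction callable are assumptions, not a definitive implementation of the load generator 360 or plugin 320.

```python
import time
import statistics

def run_replay(schedule, send_transaction):
    """For each time value of the schedule, issue the scheduled number of
    transactions and record response times and errors, yielding a simple
    form of production performance data (one record per time value)."""
    results = []
    for entry in schedule:
        latencies, errors = [], 0
        slot_start = time.perf_counter()
        for _ in range(entry["transactions"]):
            t0 = time.perf_counter()
            try:
                send_transaction()
            except Exception:
                errors += 1
            latencies.append(time.perf_counter() - t0)
        elapsed = time.perf_counter() - slot_start
        results.append({
            "time": entry["time"],
            "achieved_tps": len(latencies) / elapsed if elapsed > 0 else 0.0,
            "avg_response_time_s": statistics.mean(latencies),
            "errors": errors,
        })
    return results

# A no-op stands in for a call to the program under test in the staging environment.
schedule = [{"time": "2021-12-25T00:00:00", "transactions": 2000}]
print(run_replay(schedule, lambda: None)[0])
```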
Fig. 3 shows a flow chart of an exemplary method 100, performed by an electronic device, for testing a program in a performance testing environment according to the disclosure. The electronic device is the electronic device disclosed herein, such as electronic device 300 of Figs. 1A, 2, and 4.
The method 100 comprises testing S102 a program in a performance testing environment. Testing S102 the program comprises obtaining S102A performance data indicative of one or more performance parameters associated with the program in a production environment, and production load information indicative of a load pattern.
In one or more examples, obtaining the performance data comprises obtaining the performance data from a memory of the electronic device and/or an internal APM tool. Put differently, the electronic device may test the program by obtaining the performance data from a memory of the electronic device. In one or more examples, obtaining the performance data comprises obtaining the performance data from an external APM device such as external APM device 400 of Fig. 1A. In other words, the electronic device may test the program by obtaining the performance data from an external APM device and/or an internal APM tool. In one or more examples, the APM device and/or APM tool can be seen as a monitoring and analytics tool which can determine (e.g., track) performance data and monitor, based on the performance data, the performance of a program. In other words, the external APM device and/or the internal APM tool may be configured for performance monitoring.
In one or more examples, the performance data comprises the one or more performance parameters associated with the program in a production environment. In other words, the performance data indicates which types of performance parameters to measure, and a nominal (e.g. average) value for a given performance parameter. For example, the performance data may be associated with the program deployed in the production environment. In one or more examples, the one or more performance parameters can be seen as one or more performance metrics and/or one or more key performance indicators, KPIs. In other words, the one or more performance parameters may be seen as quantitative indicators (e.g., measurements) of performance of the program. The performance data may comprise the one or more performance parameters including a trend parameter showing that a specific parameter follows a trend (e.g. an increase or decrease over a last period of time).
For example, the production load information can be load information associated with a production environment which is used by the electronic device for testing performance of the program under production conditions. For example, the production load information can comprise information indicative of a load pattern, such as information associated with an amount of load (e.g., characterized by, for example, a load parameter) the program imposes on existing computing resources. In one or more examples, the production load information can comprise a load pattern for a specific time and/or a time period (e.g., a statistical representation of a time range). In one or more examples, the production load information can comprise a load pattern observed in the past. Put differently, a load pattern represents a load behaviour occurring at a specific time and/or during a time period, such as the load pattern illustrated by representation 10 of Fig. 1A. For example, a load pattern may comprise a load parameter for each time value of the time period. The load parameter may be seen as a volume of transactions processed by the program (e.g., an application) in the production environment. The disclosed technique may allow determining performance of the program under varying loads (e.g., varying user loads), such as from non-peak loads to peak loads, including one or more pauses. For example, the electronic device can replicate a response time by injecting “pause” times so that the API performance is a replica of what is observed in the production system at each time instance.
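For illustration only, the following Python sketch shows one way pacing could be implemented by injecting pause times between transactions so that the issued rate approximates a target TPS; the no-op transaction is a placeholder for a call to the program under test.

```python
import time

def paced_transactions(target_tps, duration_s, send_transaction):
    """Issue transactions at roughly target_tps for duration_s seconds by
    injecting a pause between consecutive transactions (pacing)."""
    interval = 1.0 / target_tps          # pacing: time between two transactions
    deadline = time.monotonic() + duration_s
    sent = 0
    while time.monotonic() < deadline:
        started = time.monotonic()
        send_transaction()
        sent += 1
        pause = interval - (time.monotonic() - started)
        if pause > 0:
            time.sleep(pause)            # injected "pause" time
    return sent

# Roughly 20 transactions: 10 TPS for 2 seconds against a no-op transaction.
print(paced_transactions(10, 2, lambda: None))
```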
For example, the performance data and the production load information are obtained, based on an input time parameter, by the electronic device. The input time parameter may be one or more of: a time and/or a date, a time and/or date range, a statistical representation of a time and/or date range. The input time parameter can be provided by user input (such as 25 December 2021 of Fig. 1B).
In one or more examples, the electronic device (e.g., the plugin) can obtain (e.g., fetch) the performance data and the production load information over a network from the external APM device. Put differently, the electronic device (e.g., the plugin) may perform a handshake with the internal APM tool and/or the external APM device, e.g. for initialization, before obtaining the performance data and the production load information for testing the program. For example, the electronic device (e.g., the plugin) can perform the handshake with the external APM device by using a communication protocol (e.g., HTTP, TCP, UDP). For example, the electronic device (e.g., the plugin) can obtain the performance data and/or the production load information in a data file format (e.g., XML, XLSX, CSV, JSON). For example, the electronic device (e.g., the plugin) can obtain a graphical and/or a statistical representation of the performance data and/or the production load information. For example, the electronic device (e.g., the plugin) can obtain the performance data and the production load information in the form of a graph and/or a table and/or a vector and/or a distribution and/or a function.

The method 100 comprises providing S106, based on the performance data and the production load information, an output. In one or more examples, the output is for assisting a user (e.g., a programmer and/or a developer and/or a performance tester) in testing the program.
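Returning to the obtaining step: for illustration only, the following Python sketch shows one way the performance data and the production load information could be fetched over HTTP from an external APM device for an input date; the base URL, paths, and parameter names are hypothetical and do not describe any particular APM product.

```python
import requests

APM_BASE_URL = "https://apm.example.com/api"  # hypothetical APM endpoint

def fetch_test_inputs(date: str, program_id: str):
    """Fetch performance data and production load information for an input
    time parameter (here a single date) from an external APM device."""
    params = {"date": date, "program": program_id}
    perf = requests.get(f"{APM_BASE_URL}/performance", params=params, timeout=30)
    load = requests.get(f"{APM_BASE_URL}/load-pattern", params=params, timeout=30)
    perf.raise_for_status()
    load.raise_for_status()
    return perf.json(), load.json()

# performance_data, production_load_info = fetch_test_inputs("2021-12-25", "orders-api")
```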
In one or more example methods, testing S102 the program comprises determining S102B one or more load parameters based on the production load information. Upon obtaining the performance data and the production load information (as illustrated by 30 in Fig. 1B), the performance data and the production load information may be stored in a memory of the electronic device (e.g., a memory associated with the plugin). The electronic device can, for example, determine, based on the input time parameter and the production load information, one or more load parameters. For example, the input time parameter is associated with a duration of the load pattern (e.g., which is associated with the production load information) and/or indicates the time period of interest. The electronic device may determine one or more load parameters for each time value of the time period of interest (e.g., corresponding to the input time parameter, e.g. a time and/or date range). The electronic device may determine, based on the production load information, one or more load parameters for a time value comprised in the time parameter (e.g., the time parameter comprises a single time value). The one or more load parameters may be seen as load testing performance metrics (e.g., a number of transactions per second, TPS, and/or a number of requests per second, RPS). Fig. 1B shows an example of how to determine a load parameter, such as a number of users, based on the TPS. The one or more load parameters determined by the electronic device (e.g., the plugin) may reflect the production load information (e.g., the load pattern) obtained.
In one or more example methods, testing S102 the program comprises providing S102C, to a load generator, the one or more load parameters. In one or more examples, the load generator can be seen as a load generation tool and/or a load generation device. The electronic device (including e.g. plugin 320 in Fig. 1A) may include the load generator (e.g., load generator 360 of Fig. 1A). In other words, the load generator can be an internal load generator of the electronic device. In some examples, the load generator is external to the electronic device and the one or more load parameters are transmitted to the load generator.
In one or more example methods, testing S102 the program comprises obtaining S102D, from the load generator, load data indicative of a production load. In one or more examples, the load generator simulates, based on the one or more load parameters, load (e.g., a volume of transactions processed by the program) to test a program (e.g., in terms of performance and/or scalability). For example, the load generator determines the load data for the one or more load parameters. In some examples, the load generator is external to the electronic device and the load data is received by the electronic device from the load generator. In one or more examples, the load data may be time series data indicating the production load (e.g., a simulated load) associated with the program at one or more instances of time. The load generator may determine (e.g., simulate, emulate) load data associated with each time value of the time period of interest. In one or more examples, the electronic device guides the load generator to replicate, based on the load parameter, the load pattern associated with the production environment for each time value of the time period (e.g., for each instance of time). In other words, the load data may be a replica of the production pattern information obtained (e.g., from a past production behaviour). For example, the load generator simulates the load data by making an application programming interface, API, request (e.g., an API call). In other words, the load generator simulates the load data by performing, based on the one or more load parameters, an API request to the program (e.g., application) under test in the staging environment (e.g., pre-production environment). For example, an outcome of the API call can be to obtain TPS, response time, errors and/or failures.
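For illustration only, the following Python sketch shows one way a load generator could emulate virtual users by issuing concurrent API requests to the program under test and summarising the outcome as TPS, response time, and failures; the endpoint URL in the commented usage is hypothetical.

```python
import time
from concurrent.futures import ThreadPoolExecutor
import requests

def api_call(url: str):
    """One API request to the program under test; returns (response time, ok)."""
    t0 = time.perf_counter()
    try:
        ok = requests.get(url, timeout=10).ok
    except requests.RequestException:
        ok = False
    return time.perf_counter() - t0, ok

def generate_load(url: str, users: int, requests_per_user: int):
    """Emulate virtual users issuing API requests concurrently and summarise
    the outcome as TPS, average response time and failures."""
    started = time.perf_counter()
    with ThreadPoolExecutor(max_workers=users) as pool:
        outcomes = list(pool.map(lambda _: api_call(url),
                                 range(users * requests_per_user)))
    elapsed = time.perf_counter() - started
    times = [rt for rt, _ in outcomes]
    return {"tps": len(outcomes) / elapsed,
            "avg_response_time_s": sum(times) / len(times),
            "failures": sum(1 for _, ok in outcomes if not ok)}

# Example against a hypothetical staging endpoint:
# print(generate_load("https://staging.example.com/api/orders", users=20, requests_per_user=10))
```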
For example, the load pattern associated with the production load information is extracted by the electronic device (e.g., the plugin). The load generator (e.g., comprised in the electronic device) may obtain, from the plugin (e.g., comprised in the electronic device), the one or more load parameters. The one or more load parameters may be determined by the plugin based on the production load information. The load generator may simulate the load data based on the one or more load parameters. The load generator may reproduce a load pattern based on the load pattern associated with the production load information. The load data may reflect the past production conditions provided in the production load information associated with the production environment. It may be envisaged that a past version of the program has been experienced and/or accessed by one or more users in a production environment and that the production load information captures some of the behaviour of the past version of the program.

In one or more example methods, testing S102 the program comprises determining S102E, based on the load data and the performance data, production performance data indicative of a performance of the program under the production load. For example, the production performance data shows the performance of the program under the production load provided by the production load information (e.g., the load pattern) to reflect the production environment, such as from a real production system (e.g., past production behaviour associated with the program). For example, the production performance data includes performance parameters, each given a value based on the production load provided by the production load information. For example, the production performance data includes the response time of the program under a production load emulated based on the number of emulated users per time unit provided in the load parameters. For example, when the performance data provides the average response time for a program, the production performance data provides an updated response time reflecting the response time under production conditions (which may be the same as, higher than, or lower than the average response time).
In one or more example methods, the method 100 comprises determining S104 the output based on the production performance data. In one or more example methods, the output is indicative of the performance of the program under the production load. In one or more examples, the output can include the production performance data (e.g., a simulated load behaviour). The production performance data may be determined by the electronic device based on the load data. For example, the performance of the program is measured by the electronic device under the production load (e.g., associated with the production performance data) and provided in the production performance data. The production performance data may be associated with a time period and provided for the time instances of the time period (e.g., production performance data 20 of Fig. 1A). In other words, the production performance data may comprise performance parameters that approximate the performance of the program in a real production environment. In other words, the output indicates the performance of the program under the production load, e.g. by providing a representation of the production performance data. In some examples, the output may be the approval of the program under test based on the production performance data being satisfactory (e.g. above a threshold). In one or more examples, the production performance data may be based on the computational, hardware and/or software resources used by the program under the production load. The production performance data may indicate whether there is adequate capacity (e.g., scalability) to support the program under the production load. The production performance data may identify performance bottlenecks associated with the program. In some examples, the production performance data includes a parameter indicative of buffer overflow and/or a parameter indicative of leakage. The production performance data may be useful to predict future changes in the performance of the program. The disclosed methods may allow releasing a robust version of the program (e.g., an application) with a reduced likelihood of issues under production.
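For illustration only, the following Python sketch shows one possible way an output (e.g., an approval and a list of potential bottlenecks) could be derived by comparing production performance data against thresholds; the threshold values and record fields are assumptions.

```python
def evaluate_output(production_performance_data,
                    max_avg_rt_s=0.5, max_error_rate=0.01):
    """Derive an output: approve the program under test if every time value
    stays within the (assumed) thresholds, otherwise report the time values
    where a potential bottleneck appears."""
    issues = []
    for record in production_performance_data:
        total = record["transactions"]
        error_rate = record["errors"] / total if total else 0.0
        if record["avg_response_time_s"] > max_avg_rt_s or error_rate > max_error_rate:
            issues.append(record["time"])
    return {"approved": not issues, "bottleneck_time_values": issues}

data = [
    {"time": "00:00", "transactions": 2000, "errors": 0, "avg_response_time_s": 0.010},
    {"time": "01:00", "transactions": 128689, "errors": 3200, "avg_response_time_s": 0.850},
]
print(evaluate_output(data))
# {'approved': False, 'bottleneck_time_values': ['01:00']}
```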
For example, the production performance data can be used, by a user, in an environment. The environment may be one or more of: a staging environment, a pre-production environment, a development environment, a performance testing environment, a quality control environment, a monitoring environment. In other words, the disclosed electronic device may be used by a user in an environment where a performance test is being performed on a program. For example, the output of the disclosed method is used for updating the program under test, for controlling a machine and/or a process for quality control, for development of the program, for testing the program, for monitoring, and/or for controlling a load balancing.
In one or more example methods, obtaining S102A the performance data comprises generating S102AA the performance data by querying a repository. In one or more examples, the electronic device (e.g., the plugin) queries a repository (e.g., a database) to retrieve performance data (e.g., requesting performance data and/or collecting performance data) from an internal APM tool and/or the external APM device. For example, the electronic device and/or the external APM device may comprise the repository. In some examples, the repository may be an external database.
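For illustration only, the following Python sketch shows one way performance data could be generated by querying a repository of historical performance parameters; the table and column names are assumptions, and an in-memory SQLite database stands in for the repository.

```python
import sqlite3

# An in-memory SQLite database stands in for the repository of historical
# performance parameters; table and column names are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE performance_parameters (
                    program TEXT, recorded_at TEXT,
                    response_time_ms REAL, throughput_tps REAL, cpu_pct REAL)""")
conn.execute("INSERT INTO performance_parameters VALUES "
             "('orders-api', '2021-12-25T00:00:00', 10.0, 2000.0, 41.5)")

def query_performance_data(connection, program, date_prefix):
    """Generate performance data by querying the repository for a program
    and a date (matched here by prefix on the recorded_at column)."""
    rows = connection.execute(
        "SELECT recorded_at, response_time_ms, throughput_tps, cpu_pct "
        "FROM performance_parameters WHERE program = ? AND recorded_at LIKE ?",
        (program, date_prefix + "%")).fetchall()
    keys = ("recorded_at", "response_time_ms", "throughput_tps", "cpu_pct")
    return [dict(zip(keys, row)) for row in rows]

print(query_performance_data(conn, "orders-api", "2021-12-25"))
```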
In one or more example methods, the repository comprises historical performance parameters. In one or more examples, the one or more historical performance parameters are one or more performance parameters associated with the program that have been measured previously. In one or more example methods, the one or more performance parameters comprise one or more of: a response time, a throughput, and a resource utilisation parameter. In one or more examples, the one or more performance parameters can be seen as one or more performance metrics and/or one or more key performance indicators, KPIs. The response time is, for example, the time for a first system node (e.g., an application server and/or an application programming interface, API) to respond to a request from a second system node (e.g., a client and/or a user). The throughput can be, for example, the number of tasks (e.g., tasks associated with the request from the second system node) processed per time unit by the first system node (e.g., an application server and/or an application programming interface, API). The resource utilisation parameter can be seen as a parameter indicating how much a resource (e.g., a hardware resource and/or a software resource) is utilized by an operation of a program. The resource utilisation parameter can include a hardware resource utilization parameter and/or a software resource utilization parameter. In one or more examples, the hardware resources comprise central processing unit, CPU, utilisation, memory utilisation, hardware accelerators and/or hardware clocks. The CPU utilisation (e.g., CPU usage) can be, for example, a percentage of time the CPU spends in handling an operation and/or a task (e.g., a task associated with the request from the second system node). The memory utilisation (e.g., memory usage) can be, for example, an amount of memory required to process the request.
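For illustration only, the following Python sketch shows one way a resource utilisation parameter (CPU utilisation and memory utilisation) could be sampled, assuming the third-party psutil package is available.

```python
import psutil  # third-party package, assumed available

def sample_resource_utilisation(interval_s: float = 1.0) -> dict:
    """Sample resource utilisation parameters: CPU utilisation as the
    percentage of time the CPU spends handling work over the interval,
    and memory utilisation as the percentage of memory in use."""
    return {"cpu_pct": psutil.cpu_percent(interval=interval_s),
            "memory_pct": psutil.virtual_memory().percent}

print(sample_resource_utilisation(0.5))
# e.g. {'cpu_pct': 12.3, 'memory_pct': 58.1}
```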
In one or more example methods, the one or more load parameters are one or more actual load parameters that are used at the load generator for emulating and/or simulating the load data. In one or more examples, the one or more load parameters are one or more parameters that match the production load information (e.g., a load pattern) associated with a program in a production environment. The one or more load parameters can be determined, based on the production load information, by the electronic device. The electronic device may guide the load generator to replicate a load pattern similar to the load pattern associated with the production load information. The load data may be seen as the load pattern replicated by the load generator. The electronic device may guide the load generator by (e.g., permanently and/or when necessary) providing the load generator with the load parameter(s) allowing it to simulate and/or emulate the production load without having to deploy the program. In one or more example methods, the one or more load parameters comprise one or more of: a number of emulated users per time unit, a number of threads per time unit, a number of transactions per time unit, a response time, and a throughput. In one or more examples, the electronic device determines, based on an average response time associated with a time parameter (as production load information), a throughput in transactions per time unit (as a load parameter). The one or more load parameters can, for example, be a number of virtual users (e.g., threads) reflecting the production load information obtained. The load data can, for example, be associated with a production environment where a past version of the program was deployed and accessed by, for example, internal and/or external users. In one or more examples, the one or more load parameters can be a volume of transactions processed by the program (e.g., transactions per time unit and/or requests per time unit). The one or more load parameters may be a time required for the program to respond to a user accessing the program in the production environment, under a load (e.g., a load value comprised in the production load information).
In one or more example methods, determining S102B the one or more load parameters based on the production load information comprises applying S102BA a load modelling technique to the production load information, and optionally to the performance data. In one or more examples, a load modelling technique enables determining (e.g., calculating, as illustrated in Fig. 1B) the one or more load parameters. In other words, the load modelling technique may allow estimating, based on the production load information (e.g., the load pattern), the one or more load parameters. For example, determining the one or more load parameters may comprise determining the one or more load parameters based on a mathematical representation of the relationship between the production load information and the performance data. For example, the number of users and/or threads can be calculated, by the electronic device, based on the production load. For example, the number of users and/or threads is used as an input to the load generator.
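For illustration only, the following Python sketch shows one simple load modelling step: the production load information over a time range is summarised statistically and combined with the average response time from the performance data to derive user counts. This is an assumed, simplified model, not the only possible load modelling technique.

```python
import statistics

def model_load(load_pattern_tps, avg_response_time_s):
    """Summarise the production load information over a time range and derive
    virtual-user counts from it together with the average response time."""
    mean_tps = statistics.mean(load_pattern_tps)
    peak_tps = max(load_pattern_tps)
    return {"mean_tps": mean_tps,
            "peak_tps": peak_tps,
            "users_at_mean": round(mean_tps * avg_response_time_s),
            "users_at_peak": round(peak_tps * avg_response_time_s)}

print(model_load([2000, 4500, 128689, 90000, 35000, 12000], 10 / 1000))
# peak_tps 128689 -> 1287 users at peak
```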
In one or more example methods, the electronic device comprises a display device. In one or more example methods, providing S106 the output comprises displaying S106A, on the display device, a user interface object representative of the output. In one or more examples, the user interface object representative of the output can be an icon, a message and/or a notification (e.g., to a developer and/or tester). The icon, the message and/or the notification may comprise the production performance data in the form of a graph and/or a table and/or a vector and/or a distribution and/or a function. For example, the production performance data in the form of a graph is illustrated in Fig. 1A.
In one or more example methods, providing S106 the output comprises transmitting S106B the output to an external device. In one or more examples, providing the production performance data (e.g., in the form of a graph and/or a table and/or a vector and/or a distribution and/or a function) as the output to an external device comprises informing a user about a load to be used when deploying a new version of the program in a production environment. The user may identify, based on the production performance data, computing bottlenecks associated with the program under test and investigate stability of the program under test by analysing non-peak and peak traffic events.
Fig. 4 shows a block diagram of an exemplary electronic device 300 according to the disclosure. The electronic device 300 comprises a memory 301, a processor 302, and an interface 303. The electronic device 300 is configured to perform any of the methods disclosed in Fig. 3. The electronic device 300 may comprise a plugin and optionally a load generator (e.g., a load generator device or a load generator tool) and optionally an internal APM tool. The electronic device 300 may comprise a plugin which is configured to perform the method disclosed herein. In some examples, the load generator is part of a device external to the electronic device, where the device is configured to communicate with the electronic device 300. In some examples, the electronic device is configured to communicate with an external APM device (such as external APM device 400 of Figs. 1A and 2).
The electronic device 300 may be seen as a program testing device.
The electronic device 300 is configured to test (e.g., using the processor 302) a program in a performance testing environment.
The electronic device 300 is configured to test the program by obtaining (e.g., via the interface 303 and/or using the memory 301) performance data indicative of one or more performance parameters associated with the program in a production environment, and production load information indicative of a load pattern. The electronic device 300 is configured to provide (e.g., using the processor 302 and/or the interface 303), based on the performance data and the production load information, an output.
The electronic device 300 is optionally configured to perform any of the operations disclosed in Fig. 3 (such as any one or more of: S102, S102A, S102AA, S102B, S102BA, S102C, S102D, S102E, S104, S106, S106A, S106B). The operations of the electronic device 300 may be embodied in the form of executable logic routines (e.g., lines of code, software programs, etc.) that are stored on a non-transitory computer readable medium (e.g., the memory 301) and are executed by the processor 302.
Furthermore, the operations of the electronic device 300 may be considered a method that the electronic device 300 is configured to carry out. Also, while the described functions and operations may be implemented in software, such functionality may as well be carried out via dedicated hardware or firmware, or some combination of hardware, firmware and/or software.
The memory circuitry 301 may be one or more of: a buffer, a flash memory, a hard drive, a removable media, a volatile memory, a non-volatile memory, a random access memory (RAM), and any other suitable device. In a typical arrangement, the memory 301 may include a non-volatile memory for long term data storage and a volatile memory that functions as system memory for the processor 302. The memory 301 may exchange data with the processor 302 over a data bus. Control lines and an address bus between the memory 301 and the processor 302 also may be present (not shown in Fig. 4). The memory 301 is considered a non-transitory computer readable medium.
The memory 301 may be configured to store performance data, one or more performance parameters, production load information, one or more load parameters, load data, production performance data in a part of the memory.
Embodiments of methods and products (electronic device) according to the disclosure are set out in the following items:
Item 1. A method, performed by an electronic device, the method comprising:
- testing a program in a performance testing environment; wherein testing the program comprises obtaining performance data indicative of one or more performance parameters associated with the program in a production environment, and production load information indicative of a load pattern; and
- providing, based on the performance data and the production load information, an output.
Item 2. The method according to item 1, wherein testing the program comprises:
- determining one or more load parameters based on the production load information;
- providing, to a load generator, the one or more load parameters; and
- obtaining, from the load generator, load data indicative of a production load.
Item 3. The method according to item 2, wherein testing the program comprises: determining, based on the load data and the performance data, production performance data indicative of a performance of the program under the production load.
Item 4. The method according to item 3, the method comprising:
- determining the output based on the production performance data, wherein the output is indicative of the performance of the program under the production load.
Item 5. The method according to any of the previous items, wherein obtaining the performance data comprises:
- generating the performance data by querying a repository.
Item 6. The method according to item 5, wherein the repository comprises historical performance parameters.
Item 7. The method according to any of the previous items, wherein the one or more performance parameters comprise one or more of: a response time, a throughput, and a resource utilisation parameter.
Item 8. The method according to any of items 2-7, wherein the one or more load parameters are one or more actual load parameters that are used at the load generator for emulating and/or simulating the load data.
Item 9. The method according to any of items 2-8, wherein the one or more load parameters comprise one or more of: a number of emulated users per time unit, a number of threads per time unit, a number of transactions per time unit, a response time, and a throughput.
Item 10. The method according to any of items 2-9, wherein determining the one or more load parameters based on the production load information comprises: applying a load modelling technique to the production load information.
Item 11. The method according to any of the previous items, wherein the electronic device comprises a display device; wherein providing the output comprises displaying, on the display device, a user interface object representative of the output.
Item 12. The method according to any of the previous items, wherein providing the output comprises transmitting the output to an external device.
Item 13. An electronic device comprising a memory, a processor, and an interface, wherein the electronic device is configured to perform any of the methods according to any of items 1-12.
Item 14. A computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by an electronic device cause the electronic device to perform any of the methods of items 1-12.
The use of the terms “first”, “second”, “third” and “fourth”, “primary”, “secondary”, “tertiary” etc. does not imply any particular order, but are included to identify individual elements. Moreover, the use of the terms “first”, “second”, “third” and “fourth”, “primary”, “secondary”, “tertiary” etc. does not denote any order or importance, but rather the terms “first”, “second”, “third” and “fourth”, “primary”, “secondary”, “tertiary” etc. are used to distinguish one element from another. Note that the words “first”, “second”, “third” and “fourth”, “primary”, “secondary”, “tertiary” etc. are used here and elsewhere for labelling purposes only and are not intended to denote any specific spatial or temporal ordering. Furthermore, the labelling of a first element does not imply the presence of a second element and vice versa.
It may be appreciated that the Figures comprise some circuitries or operations which are illustrated with a solid line and some circuitries or operations which are illustrated with a dashed line. The circuitries or operations which are comprised in a solid line are circuitries or operations which are comprised in the broadest example embodiment. The circuitries or operations which are comprised in a dashed line are example embodiments which may be comprised in, or a part of, or are further circuitries or operations which may be taken in addition to the circuitries or operations of the solid line example embodiments. It should be appreciated that these operations need not be performed in the order presented. Furthermore, it should be appreciated that not all of the operations need to be performed. The exemplary operations may be performed in any order and in any combination.
It is to be noted that the word "comprising" does not necessarily exclude the presence of other elements or steps than those listed.
It is to be noted that the words "a" or "an" preceding an element do not exclude the presence of a plurality of such elements.
It should further be noted that any reference signs do not limit the scope of the claims, that the exemplary embodiments may be implemented at least in part by means of both hardware and software, and that several "means", "units" or "devices" may be represented by the same item of hardware.
The various exemplary methods, devices, nodes, and systems described herein are described in the general context of method steps or processes, which may be implemented in one aspect by a computer program product, embodied in a computer- readable medium, including computer-executable instructions, such as program code, executed by computers in networked environments. A computer-readable medium may include removable and non-removable storage devices including, but not limited to, Read Only Memory (ROM), Random Access Memory (RAM), compact discs (CDs), digital versatile discs (DVD), etc. Generally, program circuitries may include routines, programs, objects, components, data structures, etc. that perform specified tasks or implement specific abstract data types. Computer-executable instructions, associated data structures, and program circuitries represent examples of program code for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps or processes.
Although features have been shown and described, it will be understood that they are not intended to limit the claimed disclosure, and it will be made obvious to those skilled in the art that various changes and modifications may be made without departing from the scope of the claimed disclosure. The specification and drawings are, accordingly, to be regarded in an illustrative rather than restrictive sense. The claimed disclosure is intended to cover all alternatives, modifications, and equivalents.

Claims

1. A method, performed by an electronic device, the method comprising:
- testing a program in a performance testing environment; wherein testing the program comprises obtaining performance data indicative of one or more performance parameters associated with the program in a production environment, and production load information indicative of a load pattern; and
- providing, based on the performance data and the production load information, an output.
2. The method according to claim 1, wherein testing the program comprises:
- determining one or more load parameters based on the production load information;
- providing, to a load generator, the one or more load parameters; and
- obtaining, from the load generator, load data indicative of a production load.
3. The method according to claim 2, wherein testing the program comprises: determining, based on the load data and the performance data, production performance data indicative of a performance of the program under the production load.
4. The method according to claim 3, the method comprising:
- determining the output based on the production performance data, wherein the output is indicative of the performance of the program under the production load.
5. The method according to any of the previous claims, wherein obtaining the performance data comprises:
- generating the performance data by querying a repository.
6. The method according to claim 5, wherein the repository comprises historical performance parameters.
7. The method according to any of the previous claims, wherein the one or more performance parameters comprise one or more of: a response time, a throughput, and a resource utilisation parameter.
8. The method according to any of claims 2-7, wherein the one or more load parameters are one or more actual load parameters that are used at the load generator for emulating and/or simulating the load data.
9. The method according to any of claims 2-8, wherein the one or more load parameters comprise one or more of: a number of emulated users per time unit, a number of threads per time unit, a number of transactions per time unit, a response time, and a throughput.
10. The method according to any of claims 2-9, wherein determining the one or more load parameters based on the production load information comprises: applying a load modelling technique to the production load information.