WO2014088398A1 - Automated test environment deployment with metric recommender for performance testing on iaas cloud - Google Patents
Automated test environment deployment with metric recommender for performance testing on IaaS cloud
- Publication number
- WO2014088398A1 (PCT/MY2013/000218)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- test
- metric
- performance
- metrics
- agent module
- Prior art date: 2012-12-06
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Prevention of errors by analysis, debugging or testing of software
- G06F11/3698—Environments for analysis, debugging or testing of software
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3409—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
- G06F11/3414—Workload generation, e.g. scripts, playback
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3466—Performance evaluation by tracing or monitoring
- G06F11/3476—Data logging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Prevention of errors by analysis, debugging or testing of software
- G06F11/3668—Testing of software
- G06F11/3672—Test management
- G06F11/3684—Test management for test design, e.g. generating new test cases
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2201/00—Indexing scheme relating to error detection, to error correction, and to monitoring
- G06F2201/815—Virtual
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Quality & Reliability (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Debugging And Monitoring (AREA)
Abstract
Automated test environment deployment for performance testing on IaaS Cloud is provided by utilizing a metric recommender. The system and method of the present invention comprise a user interface (104) for setting up the performance test; a Metric Agent Module (106, 206) for introducing the list of suitable metrics to be captured according to the performance goals; an Interface Agent Module (212) for interfacing with the API (Application Programming Interface) of the Performance Test Tool and the Cloud Controller; a Deployment Agent Module (108, 208) for provisioning the test environment based on test scripts; a Reporting Agent Module (110, 210) for collecting and aggregating test results from the Data Store (220) and displaying them in the portal; a Metric Store (102, 202) for storing the list of metrics; a Metric Logger (216) for monitoring and collecting the required metrics during test execution; and a Data Store (118, 220) for storing test results from the Performance Test Tool (114, 218) and the Metric Logger (216). The present invention utilizes a metric recommender wherein the performance test metrics are automated using the physical resources of virtual machines; the complexity of setting up a test environment is overcome by using cloud computing to set up the virtual test environment. Metrics are aggregated using metric agents to achieve the performance goals by iteration, to validate and verify quantitative and qualitative attributes such as reliability, stability and resource utilization of the applications.
Description
AUTOMATED TEST ENVIRONMENT DEPLOYMENT WITH METRIC
RECOMMENDER FOR PERFORMANCE TESTING ON IAAS CLOUD
FIELD OF INVENTION
The present invention relates to a system and method for automating test environment deployment utilizing a metric recommender for performance testing on IaaS (Infrastructure-as-a-Service) Cloud.
BACKGROUND ART
Performance testing is an assessment which determines the performance of a system under a particular workload. Simulated load is introduced in stages and key metrics are captured for analysis. Improvements are incorporated by developers at each iteration. The test cycle continues until the performance goals are met, validating and verifying quantitative and qualitative attributes such as reliability, stability and resource utilization of the application.
Constructing a performance test involves various activities: setting up a test environment, identifying performance goals, designing test cases, executing and monitoring the tests, and identifying suitable metrics to collect before analyzing them for improvement. A set of performance test tools is required to design the test cases, as well as the right test environment to simulate the load. The complexity of setting up a system under test (SUT), identifying a suitable test tool and identifying suitable test metrics motivates the approach of the present invention. The system and method of the present invention handle the performance test life cycle in a simplified manner.
The present invention proposes a system and method to automate the SUT environment setup using IaaS cloud infrastructure. The present invention also assists the tester in achieving performance goals based on the thresholds set forth, and simplifies and automates the selection of performance metrics based on application type and operating system (OS) flavours and distributions.
The present invention provides a metric recommender wherein the performance test metrics are automated using the physical resources of virtual machines; the complexity of setting up a test environment is overcome by using cloud computing to set up a virtual test environment and by aggregating the correct metrics using metric agents to achieve the performance goals.
The subject matter claimed herein is not limited to embodiments that solve any disadvantages or that operate only in environments such as those described above. Rather, this background is only provided to illustrate one exemplary technology area where some embodiments described herein may be practiced.
SUMMARY OF INVENTION
The present invention provides a system and method to automate test environment deployment utilizing a metric recommender for performance testing on IaaS (Infrastructure-as-a-Service) Cloud. The system comprises at least one user interface (104) for setting up the performance test; at least one Metric Agent Module (106, 206) for introducing the list of suitable metrics to be captured according to the performance goals; at least one Interface Agent Module (212) for interfacing with the API (Application Programming Interface) of the Performance Test Tool and the Cloud Controller; at least one Deployment Agent Module (108, 208) for provisioning the test environment based on test scripts; at least one Reporting Agent Module (110, 210) for collecting and aggregating test results from the Data Store (220) and displaying them in the portal; at least one Metric Store (102, 202) for storing the list of metrics; and at least one Data Store (118, 220) for storing test results from the Performance Test Tool (114, 218) and the Metric Logger (216).
Another aspect of the present invention provides for at least one Metric Agent Module (106, 206) and at least one Metric Store (102, 202) which are used for the Metric Suggestion Functionality, wherein the Metric Suggestion Functionality is based on the application used by the developer and the data stored in the Data Store (118, 220).
A further aspect of the present invention provides for at least one Deployment Agent Module (108, 208) for provisioning the test environment based on test scripts. The said Deployment Agent (108, 208) has means for communicating with the Cloud Controller to provision the virtual test environment based on the requirements stated in the test scripts; communicating with the Performance Test Tool to configure the test plan; and installing and configuring the System under Test (SUT) in the virtual test environment.
Another aspect of the present invention provides a method (300, 400) to automate test environment deployment utilizing a metric recommender for performance testing on IaaS (Infrastructure-as-a-Service) Cloud. The method comprises the steps of initializing the test by setting performance goals (302); provisioning the test environment based on test scripts (304); performing the test execution (306, 412); monitoring, capturing and storing test results based on pre-defined metrics (414); analyzing, reporting and retesting to ensure the test completed as expected (308); and provisioning the test tool by extracting test results and displaying them in the portal (416).
A further aspect of the present invention provides the further steps for initializing the test by setting performance goals. The said method comprises the further steps of introducing the list of suitable metrics to be captured according to the performance goals (402); obtaining the list of suggested metrics to create the test specifications (404); and preparing the test scripts for test deployment (404).

Another aspect of the present invention provides a method for provisioning the test environment based on test scripts. The method comprises the steps of communicating with the Cloud Controller to provision the virtual test environment based on the requirements stated in the test scripts (406); communicating with the Performance Test Tool to configure the test plan (408); and installing and configuring the System under Test (SUT) in the virtual test environment (410). Communicating with the Cloud Controller to provision the virtual test environment further comprises launching the SUT environment, which consists of a plurality of System under Test Virtual Machines (SUTVMs).

Another aspect of the present invention provides a method for provisioning the test tool by extracting test results and displaying them in the portal. The method comprises the steps of communicating with the API (Application Programming Interface) of the Performance Test Tool (502); and retrieving the configuration settings from the test scripts, which contain the load test design (504).
A further aspect of the present invention provides a method for communicating with the Performance Test Tool to configure the test plan. The method comprises the steps of launching the test tool as a SaaS (Software as a Service) application and aligning the test according to the test goals (602); inserting the test scripts into the Test Tools (604); forwarding the test case on median goals (606); and retrieving the test results (608).
Another aspect of the present invention provides a method for installing and configuring the System under Test in the virtual test environment. The method comprises the steps of realigning the SUTVMs to ensure that the VMs will not compete with each other for resources (702); and realigning the SUTVMs to reflect the stability of the test results (704).
The present invention consists of features and a combination of parts hereinafter fully described and illustrated in the accompanying drawings, it being understood that various changes in the details may be made without departing from the scope of the invention or sacrificing any of the advantages of the present invention.
BRIEF DESCRIPTION OF ACCOMPANYING DRAWINGS
To further clarify various aspects of some embodiments of the present invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail through the accompanying drawings, in which:
FIG. 1.0 illustrates the system architecture of the present invention.
FIG. 2.0 illustrates the components of the system of the present invention.
FIG. 3.0 illustrates a simplified method of the present invention to automate test environment deployment utilizing a metric recommender for performance testing on IaaS (Infrastructure-as-a-Service) Cloud.
FIG. 4.0 is a flowchart illustrating a detailed method of the present invention to automate test environment deployment utilizing a metric recommender for performance testing on IaaS (Infrastructure-as-a-Service) Cloud.
FIG. 5.0 is a flowchart illustrating the method for provisioning the test tool by extracting test results and displaying them in a portal.
FIG. 6.0 is a flowchart illustrating the method for communicating with the Performance Test Tool to configure the test plan.
FIG. 7.0 is a flowchart illustrating the method for installing and configuring the System under Test in the virtual test environment.
TABLE 1.0 is a table detailing the metrics to be stored in the Metric Store.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
The present invention provides a system and method for automating test environment deployment utilizing a metric recommender for performance testing on IaaS (Infrastructure-as-a-Service) Cloud. Hereinafter, this specification will describe the present invention according to the preferred embodiments. It is to be understood that limiting the description to the preferred embodiments of the invention is merely to facilitate discussion, and it is envisioned that modifications may be made without departing from the scope of the appended claims.
Reference is first made to FIG. 1.0 and FIG. 2.0 respectively. FIG. 1.0 illustrates the system architecture of the present invention, while FIG. 2.0 illustrates the components of the system of the present invention to automate test environment deployment utilizing a metric recommender for performance testing on IaaS (Infrastructure-as-a-Service) Cloud. The present invention is accessed via a web portal which is hosted on either a physical or a virtual machine. Configuration of the test plan and default metrics, as well as aggregation of the test results into a dashboard, is conducted on the portal, which acts as the test platform. All parameters for initialization are compiled by the portal and thereafter forwarded to the Metrics Agent. Initialization is conducted on parameters such as the performance criteria (e.g. 5 seconds for 100 concurrent users), the load criteria (e.g. 100 concurrent user transactions with 50 Kb per user) and the application architecture (e.g. Tomcat, JBoss and MySQL with load balancing).
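By way of illustration, the initialization parameters compiled by the portal might be represented as in the following minimal sketch. All field names are hypothetical; the specification does not prescribe a concrete schema.

```python
# A minimal sketch of the initialization parameters compiled by the portal
# and forwarded to the Metrics Agent. Field names are illustrative only.
test_initialization = {
    "performance_criteria": {
        "max_response_time_s": 5,    # e.g. 5 seconds for 100 concurrent users
        "concurrent_users": 100,
    },
    "load_criteria": {
        "concurrent_users": 100,
        "payload_per_user_kb": 50,   # 50 Kb per user transaction
    },
    "application_architecture": {
        "web_server": "Apache Tomcat",
        "application_server": "JBoss",
        "database_server": "MySQL",
        "load_balanced": True,
    },
}
```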
As illustrated in FIG. 1.0 and FIG. 2.0, the system architecture of the present invention comprises a user interface (104) for the tester to set up the performance test; a Metric Agent Module (106, 206) for introducing the list of suitable metrics to be captured according to the performance goals; an Interface Agent Module (212) for interfacing with the API (Application Programming Interface) of the Performance Test Tool and the Cloud Controller; a Deployment Agent Module (108, 208) for provisioning the test environment based on test scripts; a Reporting Agent Module (110, 210) for collecting and aggregating test results from the Data Store (220) and displaying them in the portal; a Metric Store (102, 202) for storing the list of metrics; a Data Store (118, 220) for storing test results from the Performance Test Tool (114, 218); and a Metric Logger (216) for monitoring and collecting the required metrics during test execution. The Metric Agent Module (106, 206) and the Metric Store (102, 202) are used for the Metric Suggestion Functionality, wherein the Metric Suggestion Functionality is based on the application used by the developer and the data stored in the Data Store (118, 220). The list of suggested metrics to be stored in the Metric Store (102, 202) is detailed in Table 1.0.

Reference is now made to FIG. 3.0 and FIG. 4.0 respectively. FIG. 3.0 illustrates a simplified method of the present invention to automate test environment deployment utilizing a metric recommender for performance testing on IaaS (Infrastructure-as-a-Service) Cloud, while FIG. 4.0 is a flowchart illustrating a detailed method of the same. As illustrated in FIG. 3.0, automated test environment deployment with a metric recommender for performance testing on IaaS Cloud is initialized by first setting the performance goals (302). The methodology for setting the performance goals further comprises the steps of introducing the list of suitable metrics to be captured according to the performance goals (402); obtaining the list of suggested metrics to create the test specifications (404); and preparing the test scripts for test deployment (404).
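The Metric Suggestion Functionality can be pictured as a simple lookup keyed by application type and OS flavour, as in the sketch below. The example keys and metric names are illustrative assumptions; the actual metric lists are those detailed in Table 1.0.

```python
# A sketch of the Metric Agent Module consulting the Metric Store for
# suggested metrics. Keys and metric names are illustrative assumptions.
METRIC_STORE = {
    ("java-web", "linux"): ["cpu", "memory", "jvm_heap", "jboss_service",
                            "network_rx_tx", "disk_read_write"],
    ("php-web", "linux"): ["cpu", "memory", "apache_workers",
                           "network_rx_tx", "disk_read_write"],
}

def suggest_metrics(application_type: str, os_flavour: str) -> list[str]:
    """Return the list of suggested metrics for the given application and OS."""
    return METRIC_STORE.get((application_type, os_flavour), ["cpu", "memory"])

# Example: a Java-based web application on a Linux guest.
print(suggest_metrics("java-web", "linux"))
```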
To perform the test, the tester must first select a test tool, such as JMeter or IBM RPT, and design the test cases. The test tool must adhere to the SUT's interaction style, whether it is SOAP, REST or another API. Each test case must reflect the objective of the performance goal, for instance ensuring that the system is able to sustain at least 1000 concurrent connections or requests with each request answered within a 5 second time frame. Suitable metrics are then to be identified, and identifying suitable metrics involves a further step of collecting and aggregating system metrics to ensure problems or bottlenecks are tracked at the right spots.
Thereafter, the test environment is provisioned based on the test scripts (304), as the tester needs to configure the test environment to mimic the production environment. In a complex multi-tier system, it may consist of a database, an application server and a web server. Provisioning physical servers for testing is expensive; a Cloud Computing environment provides faster setup and provisioning in a controlled environment.
Upon successful creation of the test specification by the Metric Agent Module (106, 206), the said Metric Agent Module (106, 206) communicates with the Interface Agent Module (212) for the deployment process. Subsequently, the Interface Agent Module (212) uses the test specifications, which comprise the test environment setup and the performance tool configurations, to deploy and perform the test execution (306, 412). The Interface Agent Module (212) communicates with the Cloud Controller to launch the required virtual resources as stated in the specifications. The Cloud Controller, which runs on a specific platform, is used to manage, launch, terminate and configure the virtual test resources by exposing web-service Application Programming Interfaces (APIs) for communication with the Interface Agent Module (212). The default setup of the present invention comprises a web server (e.g. Apache Tomcat or Apache PHP), an application server (e.g. JBoss, PHP or .NET) and a database server (e.g. MySQL or Postgres).
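The interaction with the Cloud Controller might look like the following hedged sketch. The endpoint, payload layout and response fields are hypothetical, since the specification only states that the Cloud Controller exposes web-service APIs.

```python
# A sketch of the Interface Agent Module asking the Cloud Controller to
# launch the virtual resources stated in the test specification. The
# /instances endpoint and the payload layout are assumptions.
import json
import urllib.request

def launch_test_bed(controller_url: str, spec: dict) -> dict:
    """POST the required virtual resources to the Cloud Controller API."""
    request = urllib.request.Request(
        f"{controller_url}/instances",               # hypothetical endpoint
        data=json.dumps(spec).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)  # e.g. identifiers of the launched SUTVMs

# Example specification for the default three-tier setup.
default_spec = {
    "web_server": {"image": "apache-tomcat", "count": 1},
    "application_server": {"image": "jboss", "count": 1},
    "database_server": {"image": "mysql", "count": 1},
}
# launched = launch_test_bed("http://cloud-controller.example", default_spec)
```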
Test cases are configured based on the specification through the Interface Agent Module (212), wherein the Interface Agent Module (212) communicates with the performance test tool (114, 218) through APIs. For example, the test specifications consist of the number of virtual users, the type of test plan, the load size, the test period and so on. The performance test tool (114, 218) includes software for measuring performance, reliability, compatibility and related characteristics, such as JMeter, LoadRunner or Rational. The said performance tool can be hosted on either physical or virtual resources. Test results obtained from the test execution are stored in the Data Store (118, 220) for analysis and reporting, while the collection of metrics is in accordance with the specifications provided by the Metrics Agent Module (106, 206).
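For a tool such as JMeter, the test plan can be driven headlessly from the specification, as in this sketch. The plan file and property names are illustrative assumptions; the -n, -t, -l and -J options are standard JMeter command-line flags.

```python
# A sketch of running a JMeter test plan non-interactively with the load
# profile taken from the test specification. The .jmx plan and the property
# names (users, duration) are illustrative assumptions.
import subprocess

def run_jmeter_plan(plan: str, results: str, users: int, period_s: int) -> None:
    """Run a JMeter test plan headlessly and write raw results to a file."""
    subprocess.run(
        ["jmeter", "-n",
         "-t", plan,                   # test plan built from the test scripts
         "-l", results,                # file receiving the raw test results
         f"-Jusers={users}",           # properties consumed by the plan
         f"-Jduration={period_s}"],
        check=True,
    )

# run_jmeter_plan("load_test.jmx", "results.jtl", users=1000, period_s=600)
```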
The Deployment Agent Module (108, 208) provisions the test environment in accordance with the test scripts to ensure the accuracy of the test. Upon the availability of the test beds (222), the Deployment Agent (108, 208) begins the installation and configuration of the SUT. Thereafter, the required application and database schema are uploaded by the tester via the portal, which invokes the Deployment Agent (108, 208) to communicate with the deployed virtual test beds (222). The said Deployment Agent (108, 208) provisions the test environment based on the test scripts by communicating with the Cloud Controller to provision the virtual test environment based on the requirements stated in the test scripts (406); communicating with the Performance Test Tool to configure the test plan (408); and thereafter installing and configuring the SUT in the virtual test environment (410).

Uploaded files are transferred to the virtual machines using the File Transfer Protocol (FTP). Installation and configuration are performed by a script which is bundled together with the virtual machine; the Deployment Agent triggers the script once the files are successfully transferred. Communicating with the Cloud Controller to provision the virtual test environment based on the requirements stated in the test scripts further comprises launching the SUT environment, which consists of a plurality of System under Test Virtual Machines (SUTVMs).
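A hedged sketch of the file transfer step is given below, using Python's standard ftplib. Host, credentials and file names are placeholders, and the trigger for the bundled installation script is only indicated by a comment because its mechanism is not detailed in the specification.

```python
# A sketch of the Deployment Agent transferring the tester's application and
# database schema to a SUT virtual machine over FTP. All connection details
# are placeholders.
from ftplib import FTP

def transfer_artifacts(host: str, user: str, password: str,
                       files: list[str]) -> None:
    """Upload the tester's files to a virtual test bed via FTP."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        for path in files:
            with open(path, "rb") as handle:
                ftp.storbinary(f"STOR {path}", handle)
    # On success, the Deployment Agent triggers the installation script
    # bundled with the virtual machine image.

# transfer_artifacts("sutvm-01.example", "deploy", "secret",
#                    ["application.war", "schema.sql"])
```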
Reference is now made to FIG. 6.0 and FIG. 7.0 respectively. FIG. 6.0 is a flowchart illustrating the method for communicating with the Performance Test Tool to configure the test plan, and FIG. 7.0 is a flowchart illustrating the method for installing and configuring the System under Test in the virtual test environment. As illustrated in FIG. 6.0, communicating with the Performance Test Tool to configure the test plan further comprises the steps of launching the test tool as a SaaS (Software as a Service) application and aligning the test according to the test goals (602); inserting the test scripts into the Test Tools (604); forwarding the test case on median goals (606); and retrieving the test results (608). Thereafter, as illustrated in FIG. 7.0, the methodology for installing and configuring the SUT in the virtual test environment comprises the steps of realigning the SUTVMs to ensure that the VMs will not compete with each other for resources (702); and realigning the SUTVMs to reflect the stability of the test results (704). Once the SUT is successfully installed and configured, the script notifies the tester that the system is ready to be tested.
The test execution process is initiated by the tester through the portal, which triggers the performance test tool to initiate the test plan. During initiation and execution of the test (306, 412), the required metrics are monitored and captured by the Metric Logger (216). These metrics are provided by the Metrics Agent (106, 206) in the test specifications (e.g. for a Java-based web application running on Apache Tomcat, JBoss and a MySQL server). By default, the response time, throughput, latency and bandwidth are monitored by the performance test tool (114, 218). At the server side, the Metric Logger (216) monitors the performance of the CPU, memory, network RX/TX, disk read-write, the JBoss service and so on, which are not covered by the said tools. At least one Metric Logger (216) is embedded in every server to monitor and collect the required metrics during test execution. Test results collected are stored in the Data Store (118, 220) in XML format for the analysis and reporting process; the system thereby monitors, captures and stores the test results in accordance with the pre-defined metrics (414). The Reporting Agent (110, 210) is responsible for formatting the test results stored in the Data Store into a readable, graphical representation. The results obtained enable the tester to identify the areas that produce the greatest bottleneck and resolve the issue. The tester has the option to analyze, report and retest using the same resources with different test specifications to ensure the test is completed as expected (308).
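A server-side Metric Logger of this kind might be sketched as follows, assuming the third-party psutil package for the system counters; the XML layout is illustrative, since the specification only states that results are stored in XML format.

```python
# A sketch of a Metric Logger sampling CPU, memory, network RX/TX and disk
# read-write counters and writing them to the Data Store as XML. The element
# names are illustrative assumptions.
import time
import xml.etree.ElementTree as ET

import psutil  # third-party package assumed available on the SUT servers

def log_metrics(samples: int, interval_s: float, out_file: str) -> None:
    """Capture the pre-defined metrics and store them in XML format."""
    root = ET.Element("test_results")
    for _ in range(samples):
        net = psutil.net_io_counters()
        disk = psutil.disk_io_counters()
        sample = ET.SubElement(root, "sample", timestamp=str(time.time()))
        ET.SubElement(sample, "cpu_percent").text = str(psutil.cpu_percent())
        ET.SubElement(sample, "memory_percent").text = str(
            psutil.virtual_memory().percent)
        ET.SubElement(sample, "net_rx_bytes").text = str(net.bytes_recv)
        ET.SubElement(sample, "net_tx_bytes").text = str(net.bytes_sent)
        ET.SubElement(sample, "disk_read_bytes").text = str(disk.read_bytes)
        ET.SubElement(sample, "disk_write_bytes").text = str(disk.write_bytes)
        time.sleep(interval_s)
    ET.ElementTree(root).write(out_file)

# log_metrics(samples=60, interval_s=1.0, out_file="metrics.xml")
```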
Reference is now made to FIG. 5.0. FIG. 5.0 is a flowchart illustrating the method for provisioning the test tool by extracting test results and displaying them in a portal. Test tools are provisioned by extracting the test results and displaying the formatted results on the portal dashboard (416), by communicating with the API (Application Programming Interface) of the Performance Test Tool (502) and retrieving the configuration settings from the test scripts, which contain the load test design (504). Upon completion of the test cycle, the formatted results are extracted and displayed on the portal dashboard.
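The Reporting Agent's aggregation step can be pictured as parsing the stored XML back and reducing it to dashboard figures, as in this sketch. The element names match the illustrative logger sketch above and are not prescribed by the specification.

```python
# A sketch of the Reporting Agent aggregating stored XML test results into
# summary figures for the portal dashboard.
import xml.etree.ElementTree as ET

def summarize_results(result_file: str) -> dict:
    """Aggregate the stored samples into averages for graphical reporting."""
    root = ET.parse(result_file).getroot()
    samples = root.findall("sample")
    cpu = [float(s.findtext("cpu_percent", "0")) for s in samples]
    mem = [float(s.findtext("memory_percent", "0")) for s in samples]
    return {
        "samples": len(samples),
        "avg_cpu_percent": sum(cpu) / len(cpu) if cpu else 0.0,
        "avg_memory_percent": sum(mem) / len(mem) if mem else 0.0,
    }

# print(summarize_results("metrics.xml"))
```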
The system and method of the present invention provide automated test environment deployment utilizing a metric recommender for performance testing on IaaS (Infrastructure-as-a-Service) Cloud. The metric recommender of the present invention enables automated test environment deployment wherein the performance test metrics are automated using the physical resources of virtual machines; the complexity of setting up a test environment is overcome by using cloud computing to set up the virtual test environment and by aggregating the correct metrics using metric agents to achieve the performance goals.
The present invention may be embodied in other specific forms without departing from its essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.
Claims
1. A system (100, 200) to automate test environment deployment utilizing metric recommender for performance testing on IaaS (Infrastructure-as-a-Service) Cloud, the system comprising:
at least one user interface (104) for setting up test performance; at least one Metric Agent Module (106, 206) for introducing list of suitable metrics to be captured according to performance goals;
at least one Interface Agent Module (212) for interfacing with API
(Application Programming Interface) of Performance Test Tool and at least one Cloud Controller;
at least one Deployment Agent Module (108, 208) for provisioning test environment based on test scripts;
at least one Reporting Agent Module (110, 210) for collecting and aggregating test result from Data Store (220) and displaying in portal; at least one Metric Store (102, 202) for storing list of metrics; at least one Metric Logger (216) for monitoring and collecting required metrics during test execution;
at least one Data Store (118, 220) for storing test results from Performance Test Tool (114, 218) and Metric Logger (216).
2. A system (100, 200) according to Claim 1, wherein the at least one Metric Agent Module (106, 206) and the at least one Metric Store (102, 202) is used for Metric Suggestion Functionality wherein Metric Suggestion Functionality is based on application used by developer and data stored in Data Store (118, 220).
3. A system (100, 200) according to Claim 1, wherein the at least one Deployment Agent Module (108, 208) for provisioning test environment based on test scripts further having means for:
communicating with Cloud Controller to provision virtual test environment based on requirements stated in test scripts;
communicating with Performance Test Tool to configure test plan; and installing and configuring System under Test (SUT) in virtual test environment.
4. A method (300, 400) to automate test environment deployment utilizing metric recommender for performance testing on IaaS (Infrastructure-as-a-Service) Cloud, the method comprising steps of:
initializing test by setting performance goals (302);
provisioning test environment based on test scripts (304);
performing test execution (306, 412);
monitoring, capturing and storing test results based on pre-defined metrics (414);
analyzing, reporting and retesting to ensure test completed as expected (308); and
provisioning test tool by extracting test results and displaying in portal (416).
5. A method (400) according to Claim 4, wherein initializing test by setting performance goals further comprising steps of:
introducing list of suitable metrics to be captured according to performance goals (402);
obtaining list of suggested metrics to create test specifications (404); and preparing test scripts for test deployment (404).
6. A method (400) according to Claim 4, wherein provisioning test environment based on test scripts further comprising steps of:
communicating with Cloud Controller to provision virtual test environment based on requirements stated in test scripts (406);
communicating with Performance Test Tool to configure test plan (408); and
installing and configuring System under Test (SUT) in virtual test environment (410).
7. A method (500) according to Claim 4, wherein provisioning test tool by extracting test results and displaying in portal further comprising steps of:
communicating with API (Application Programming Interface) of Performance Test Tool (502); and
retrieving configuration setting from test scripts which contains load test design (504).
8. A method according to Claim 6, wherein communicating with Cloud Controller to provision virtual test environment based on requirements stated in test scripts further comprising launching SUT environment which consist of a plurality of System under Test Virtual Machines (SUTVMs).
9. A method (600) according to Claim 6, wherein communicating with Performance Test Tool to configure test plan further comprising steps of:
launching test tool at SaaS (Software as a Service) application and aligning test according to test goals (602);
inserting test scripts to Test Tools (604);
forwarding test case on median goals (606); and
retrieving test results (608).
10. A method (700) according to Claim 6, wherein installing and configuring System under Test in virtual test environment further comprising steps of:
realigning SUTVMs to ensure that each VMs will not compete with other resources (702); and
realigning SUTVMs to reflect stability of test results (704).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
MYPI2012005289A MY176486A (en) | 2012-12-06 | 2012-12-06 | Automated test environment deployment with metric recommender for performance testing on iaas cloud |
MYPI2012005289 | 2012-12-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014088398A1 (en) | 2014-06-12 |
Family
ID=50073398
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/MY2013/000218 WO2014088398A1 (en) | 2012-12-06 | 2013-11-29 | Automated test environment deployment with metric recommender for performance testing on iaas cloud |
Country Status (2)
Country | Link |
---|---|
MY (1) | MY176486A (en) |
WO (1) | WO2014088398A1 (en) |
- 2012
  - 2012-12-06: MY MYPI2012005289A patent/MY176486A/en (unknown)
- 2013
  - 2013-11-29: WO PCT/MY2013/000218 patent/WO2014088398A1/en (active Application Filing)
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050154559A1 (en) * | 2004-01-12 | 2005-07-14 | International Business Machines Corporation | System and method for heuristically optimizing a large set of automated test sets |
Non-Patent Citations (2)
Title |
---|
J D MEIER ET AL: "Performance Testing Guidance for Web Applications", 1 January 2007 (2007-01-01), pages 1 - 221, XP055106205, Retrieved from the Internet <URL:http://perftestingguide.codeplex.com/downloads/get/17955> [retrieved on 20140307] * |
JOHN GRUNDY ET AL: "SoftArch/MTE: Generating Distributed System Test-Beds from High-Level Software Architecture Descriptions", AUTOMATED SOFTWARE ENGINEERING, vol. 12, no. 1, 1 January 2005 (2005-01-01), pages 5 - 39, XP055044958, ISSN: 0928-8910, DOI: 10.1023/B:AUSE.0000049207.62380.74 * |
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9274782B2 (en) | 2013-12-20 | 2016-03-01 | International Business Machines Corporation | Automated computer application update analysis |
US10860458B2 (en) | 2014-07-31 | 2020-12-08 | Micro Focus Llc | Determining application change success ratings |
WO2016018333A1 (en) * | 2014-07-31 | 2016-02-04 | Hewlett-Packard Development Company, L.P. | Determining application change success ratings |
US9619371B2 (en) | 2015-04-16 | 2017-04-11 | International Business Machines Corporation | Customized application performance testing of upgraded software |
US10031831B2 (en) | 2015-04-23 | 2018-07-24 | International Business Machines Corporation | Detecting causes of performance regression to adjust data systems |
US9886367B2 (en) | 2015-04-29 | 2018-02-06 | International Business Machines Corporation | Unified processing test structure |
WO2017142393A1 (en) * | 2016-02-17 | 2017-08-24 | Mimos Berhad | System for managing user experience test in controlled test environment and method thereof |
KR102089284B1 (en) | 2016-02-26 | 2020-03-17 | 노키아 솔루션스 앤드 네트웍스 오와이 | Cloud verification and test automation |
KR20180120203A (en) * | 2016-02-26 | 2018-11-05 | 노키아 솔루션스 앤드 네트웍스 오와이 | Cloud validation and test automation |
WO2017144432A1 (en) * | 2016-02-26 | 2017-08-31 | Nokia Solutions And Networks Oy | Cloud verification and test automation |
CN105843725A (en) * | 2016-03-17 | 2016-08-10 | 广州杰赛科技股份有限公司 | Monitoring method and device of IaaS platform |
WO2017177783A1 (en) * | 2016-04-11 | 2017-10-19 | 中兴通讯股份有限公司 | Test system and method invoking third-party test tool |
US11650797B2 (en) * | 2020-04-17 | 2023-05-16 | Jpmorgan Chase Bank, N.A. | Cloud portability code scanning tool |
US20210326121A1 (en) * | 2020-04-17 | 2021-10-21 | Jpmorgan Chase Bank, N.A. | Cloud portability code scanning tool |
US12001815B2 (en) | 2020-04-17 | 2024-06-04 | Jpmorgan Chase Bank, N.A. | Cloud portability code scanning tool |
US11102081B1 (en) | 2020-09-28 | 2021-08-24 | Accenture Global Solutions Limited | Quantitative network testing framework for 5G and subsequent generation networks |
EP3975482A1 (en) * | 2020-09-28 | 2022-03-30 | Accenture Global Solutions Limited | Quantitative network testing framework for 5g and subsequent generation networks |
CN112395176A (en) * | 2020-11-16 | 2021-02-23 | 公安部第三研究所 | Method, device, system, equipment, processor and storage medium for testing distributed cloud storage performance |
CN112699041A (en) * | 2021-01-04 | 2021-04-23 | 中车青岛四方车辆研究所有限公司 | Automatic deployment method, system and equipment for embedded software |
CN112699041B (en) * | 2021-01-04 | 2024-03-26 | 中车青岛四方车辆研究所有限公司 | Automatic deployment method, system and equipment for embedded software |
US20240004966A1 (en) * | 2021-06-29 | 2024-01-04 | Capital One Services, Llc | Onboarding of Monitoring Tools |
CN117055905A (en) * | 2023-10-12 | 2023-11-14 | 北京谷器数据科技有限公司 | Method for rapidly and locally deploying SaaS platform |
CN118550841A (en) * | 2024-07-30 | 2024-08-27 | 深圳市遨游通讯设备有限公司 | Method and system for realizing cross-layer communication of intercom module based on hong Monte-cover system |
Also Published As
Publication number | Publication date |
---|---|
MY176486A (en) | 2020-08-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014088398A1 (en) | Automated test environment deployment with metric recommender for performance testing on iaas cloud | |
CN109302522B (en) | Test method, test device, computer system, and computer medium | |
US11755919B2 (en) | Analytics for an automated application testing platform | |
US9465718B2 (en) | Filter generation for load testing managed environments | |
US9582391B2 (en) | Logistics of stress testing | |
US8731896B2 (en) | Virtual testbed for system verification test | |
US20100153780A1 (en) | Techniques for generating a reusable test script for a multiple user performance test | |
US11237948B2 (en) | Rendering engine component abstraction system | |
CN110750458A (en) | Big data platform testing method and device, readable storage medium and electronic equipment | |
US9946639B2 (en) | Transactional boundaries for virtualization within a software system | |
JP2015524596A (en) | System and method for configuring a cloud computing system | |
US20100153087A1 (en) | Techniques for generating a reusable test script for a single user performance test | |
CN103365770A (en) | Mobile terminal software testing system and software testing method | |
CN112241360A (en) | Test case generation method, device, equipment and storage medium | |
CN106815142A (en) | A kind of method for testing software and system | |
CN117597669A (en) | Test method, system and device | |
JP2013524312A (en) | Method, computer program, and apparatus for verifying task execution in an adaptive computer system | |
JP5400873B2 (en) | Method, system, and computer program for identifying software problems | |
TWI627528B (en) | System and method applied to cloud virtual machine automated test environment deployment | |
CN108009086B (en) | System Automation Testing Method Based on Use Case Decomposition and Functional Learning | |
KR102430523B1 (en) | Mobile Device verification automation platform system based on web and verification method of the same | |
US8997048B1 (en) | Method and apparatus for profiling a virtual machine | |
TW201640371A (en) | Method and architecture for cluster implementation of cloud desktop efficacy detector | |
CN114003457B (en) | Data acquisition method and device, storage medium and electronic device | |
KR20170044320A (en) | Method of analyzing application objects based on distributed computing, method of providing item executable by computer, server performing the same and storage media storing the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13829012; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 13829012; Country of ref document: EP; Kind code of ref document: A1 |