WO2017142393A1 - System for managing a user experience test in a controlled test environment and method thereof - Google Patents

System for managing a user experience test in a controlled test environment and method thereof

Info

Publication number
WO2017142393A1
Authority
WO
WIPO (PCT)
Prior art keywords
test
user experience
user
plan
system under
Prior art date
Application number
PCT/MY2016/050085
Other languages
English (en)
Inventor
Fook Ann LOO
Shi Tzuaan SOO
Muhammad Dhiauddin MOHAMED SUFFIAN
Ashok SIVAJI
Original Assignee
Mimos Berhad
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mimos Berhad filed Critical Mimos Berhad
Publication of WO2017142393A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/30 Monitoring
    • G06F11/34 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3438 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment monitoring of user actions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/30 Monitoring
    • G06F11/34 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3409 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
    • G06F11/3414 Workload generation, e.g. scripts, playback
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/30 Monitoring
    • G06F11/34 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3466 Performance evaluation by tracing or monitoring
    • G06F11/3476 Data logging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/40 Data acquisition and logging

Definitions

  • The present invention relates generally to arrangements for user experience testing. More particularly, the present invention relates to an improved system and method for managing a user experience test for a system under test in a controlled test environment.
  • Usability testing, or user experience testing, is an important aspect of the user-centered design approach and has become a major focus of research activities. It is usually conducted once the product has reached certain design and development stages and the respondents have been duly recruited.
  • The user experience test provides insights, understanding and recommendations about user needs in relation to a product under testing, which are useful for further improvements to the product later on.
  • The respondents, or groups of potential users, are required to complete specified routine tasks while being monitored by a moderator or tester who observes, listens and takes notes. This can be done in a usability lab, remotely, or on-site with portable equipment.
  • The user experience test has greatly benefited many technological fields, both digital and physical, in terms of product review and efficiency, such as website development and applications or firmware for smartphones.
  • U.S. Publication No. 2004/0015867 A1 discloses an automated usability testing system and method.
  • The system comprises a test plan creator and a data logger for constructing a test plan and for collecting test data, respectively.
  • The system also comprises a log analyzer for summarizing the data log in a report.
  • Another problem is that the user experience test is typically conducted in a limited number of test environments due to cost factors, which leads to a non-holistic review of a product.
  • The user experience test of a website, for instance, is conducted through only one network connection, such as a wired LAN connection, so the real environment, which contains many different network connections, is never reproduced.
  • The different network connections in general have different characteristics.
  • The present invention provides a system for managing a user experience test in a controlled test environment.
  • The system of the present invention can be characterized by a test controller configured for providing and executing a test plan associated with the user experience test for a system under test to a set of users via a plurality of network connections through a user interface agent; an adaptive load generator agent configured for establishing a user traffic flow by simulating an anticipated system load of the system under test in accordance with the test plan, wherein the user traffic flow represents actions and types of virtual users; an adaptive network emulator agent configured for establishing one or more emulated network connections by modifying at least three connection parameters, including bandwidth, latency and packet loss, of one or more of the plurality of network connections in accordance with the test plan; an analyzer and issue management module configured for providing performance analysis and computing a feedback score in relation to user feedback transmitted by the test controller, wherein the feedback score is compared against a score threshold to trigger execution of another user experience test, a functional test and a performance test on a selected part of the test plan for reporting root cause analysis; and a software release post-commit trigger module, in communication with the analyzer and issue management module, configured for monitoring a software release associated with the system under test and triggering download and execution of the same.
  • The system further comprises a test report and test log module configured for managing reports and logs generated during the user experience test, and a test plan management module configured for preparing the test plan of the user experience test, including test tasks, test questions, test configurations, test environment settings, user guides, performance test scripts and functional test scripts.
  • The one or more of the plurality of network connections is selected based on Internet protocol (IP) address, network ports, media access control (MAC) address of source or destination, and network protocol.
  • The functional test relates to assessment of the feature functions of the system under test.
  • The performance test relates to assessment of the performance of the system under test under the user traffic flow and the one or more emulated network connections.
  • The root cause analysis is a comparison of results obtained from the user experience test, the functional test and the performance test.
  • The software release is an upgraded version of the system under test developed based on the root cause analysis.
  • The analyzer and issue management module generates a problem report, comprising the user experience test, the functional test, the performance test and the results thereof, together with the root cause analysis, to a developer of the system under test for upgrading the system under test prior to generation of the software release.
  • The method of the present invention can be characterized by providing and executing a test plan associated with the user experience test for a system under test to a set of users via a plurality of network connections; establishing a user traffic flow by simulating an anticipated system load of the system under test in accordance with the test plan; establishing one or more emulated network connections by modifying at least three connection parameters, including bandwidth, latency and packet loss, of one or more of the plurality of network connections in accordance with the test plan; providing performance analysis, thereby computing a feedback score based on the user feedback received; comparing the feedback score against a score threshold to trigger execution of another user experience test, a functional test and a performance test on a selected part of the test plan for reporting root cause analysis; and monitoring a software release associated with the system under test, thereby triggering download and execution of the same.
  • The method further comprises obtaining a problem report comprising the user experience test, the functional test, the performance test and the results thereof, together with the root cause analysis; performing a version upgrade of the system under test based on the problem report; and generating a software release associated with the system under test.
  • The present invention provides integration and connectivity with other suitable tests, such as functional and performance tests, which are closely related to the user experience test, so that the tester is able to draw a conclusion about the system under test accurately and quickly. Furthermore, the present invention advantageously provides root cause analysis based on the series of tests performed, allowing immediate identification and resolution of the usability, functional and performance issues raised against a system under test. It is therefore another advantage of the present invention that the test environment closely resembles the real environment in which the product will be used, in terms of user traffic flow and network connections.
  • The present invention allows the test environments of the user experience testing to be created indefinitely. This is achieved by manipulating the user traffic flow in respect of the actions and types of virtual users, and the network connections in respect of their connection parameters.
  • Figure 1 is a schematic block diagram showing a system for managing a user experience test according to one embodiment of the present invention.
  • Figure 2 is a schematic block diagram showing a test controller and other modules associated with the system for managing a user experience test according to one embodiment of the present invention.
  • Figure 3 is a flow diagram depicting a method of managing a user experience test according to one embodiment of the present invention.
  • Figure 4 is a flow diagram depicting the step of comparing a feedback score against a score threshold of the method of managing a user experience test according to one embodiment of the present invention.
  • The present invention provides an improved system and method for managing an unmoderated remote user experience test (UET) within a controlled test environment that provides integration and connectivity with functional and performance tests. It is further an object of the present invention to closely resemble the real test environment for the UET in terms of user traffic flow and network connections. It is also an object of the present invention to allow indefinite creation of test environments for the UET.
  • The present invention is developed in a highly compact, cost-effective and simple manner, without the use of complicated and sophisticated components.
  • The system comprises a plurality of modules and agents or engines suitably adapted and interconnected with each other to deliver the desired effects and objectives.
  • The system comprises a test controller 100, a user interface agent 101, an adaptive load generator agent 102, an adaptive network emulator agent 103, an analyzer and issue management module 104, a software release post-commit trigger module 105, a test setup agent 106, a test report and test log module 107, a test plan management module 108 and a functional test execution agent 109.
  • The test controller 100 is the main test execution module of the system of the present invention. It is connected to every other module and agent employed by the system.
  • The test controller 100, on a first part, is preferably connected to the user interface agent 101, the adaptive load generator agent 102, the adaptive network emulator agent 103, the test setup agent 106 and the functional test execution agent 109, which further allows indirect connection of the test controller 100 to a set of users and the system under test (SUT), as assembled in Figure 1.
  • The test controller 100 is also connected to the analyzer and issue management module 104, the software release post-commit trigger module 105, the test report and test log module 107 and the test plan management module 108, besides the user interface agent 101.
  • The arrangement of the modules and agents is best depicted in Figure 2.
  • The test controller 100 may execute the test plan associated with the UET, loaded from the test plan management module 108, via a plurality of network connections through the user interface agent 101.
  • The test controller 100 is further configured to monitor and control all the modules and agents in the system of the present invention.
  • The test controller 100 monitors and controls the test setup agent 106 to set up the SUT, the analyzer and issue management module 104 to analyze user feedback and provide performance analysis, the adaptive network emulator agent 103 to start network emulation, the adaptive load generator agent 102 to generate the desired user traffic flows for the testing, and the user interface agent 101 to give the test tasks to the users in question.
  • The test controller 100 receives user feedback from the users through the user interface agent 101.
  • The test tasks and the user feedback may be in the form of graphical user interface (GUI) inputs, text, photos, videos and more.
  • The user interface agent 101 may comprise a user interface that is used as the communication means between the system, particularly the test controller 100, and the users.
  • The adaptive load generator agent 102 can be adopted to create and establish a user traffic flow for the UET.
  • The adaptive load generator agent 102 simulates an anticipated system load of the SUT, as specified by the test plan.
  • The user traffic flow represents the actions and types of virtual users that shall be applied as a system load during the UET to resemble the real environment of the SUT.
  • The user traffic flow generated thereby simulates the virtual users performing different actions on the SUT, the reason being to simulate actual usage of the SUT in the real environment.
  • There are different user types and user actions in the system of the present invention.
  • Examples of user types can include, but are not limited to, project managers, developers, test engineers, process engineers, manufacturing team and marketing team, who exist virtually in the system.
  • Examples of user actions can include, but are not limited to, project managers generating reports from the SUT, developers updating development progress in the SUT, and test engineers uploading test results to the SUT.
  • The tester or administrators can specify the preferred system load to be generated and used on the SUT during the testing.
  • One of the purposes is to allow the system of the present invention to simulate the anticipated system load of the real environment.
  • As an example, the tester can simulate five project managers who are generating reports from the SUT, 100 developers who are updating development progress in the SUT, and 50 test engineers who are uploading test results to the SUT.
  • The tester then uploads the associated automation test scripts of the user actions and user types to the test plan management module 108 to be compiled with the rest of the information in the test plan.
  • The adaptive load generator agent 102, under the control of the test controller 100, simulates the user traffic flow accordingly.
  • The system load of the SUT will then be the same as in the real environment and will project the actual user experience as in the real environment.
  • The test results will be more accurate, as the response time of the SUT is similar to its response time in the real environment.
  • The adaptive load generator agent 102 may be configured to simulate many other anticipated system loads, for example during special conditions such as the year-end shopping season, school holiday season, stock clearance sales, etc.
  • The tester can also execute or run the UET with a series of different system loads to check the changes in the user experience scoring as the system load and response time change. The result from this exercise can be used as a reference for the administrator to upgrade the SUT when, for example, the response time drops below a certain threshold, which reflects an unacceptably bad user experience.
  • The adaptive load generator agent 102 is an adaptive module capable of automatically changing the number of simulated users based on the number of real users participating in the UET; a sketch of this adjustment is given below. For example, say the administrator would like to conduct a UET with five users doing registration and 20 users searching for products on a website. If there is one real user doing the registration process and five real users searching for products on the website, the adaptive load generator agent 102 will adaptively adjust the system to simulate four virtual users doing registration and 15 virtual users searching for products. Alternatively, the adaptive load generator agent 102 can be used to simulate a fixed number of user traffic types regardless of the number of real users participating in the UET, which is implementable in the test plan. The adaptive load generator agent 102 is also configured to run the performance test on the SUT, the results of which will be used by the analyzer and issue management module 104 to find the root cause of the user experience related issues.
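  • The following is a minimal, illustrative sketch (not part of the patent itself) of the adaptive adjustment described above; the class, function and field names are assumptions introduced purely for illustration.

```python
# Hypothetical sketch: how many virtual users the adaptive load generator
# agent would need to simulate so that real plus virtual participants match
# the system load specified in the test plan.

from dataclasses import dataclass


@dataclass
class TrafficProfile:
    user_type: str        # e.g. "registration" or "product_search"
    planned_users: int    # total load specified in the test plan
    real_users: int = 0   # real participants currently taking part in the UET


def virtual_users_needed(profile: TrafficProfile, adaptive: bool = True) -> int:
    """Return the number of virtual users to simulate for one traffic type.

    In adaptive mode the generator tops up the real participants so that the
    combined load matches the plan; in fixed mode it always simulates the
    planned number regardless of how many real users take part.
    """
    if not adaptive:
        return profile.planned_users
    return max(profile.planned_users - profile.real_users, 0)


# The example from the description: 5 registration users and 20 search users
# planned; 1 real user registering and 5 real users searching.
profiles = [
    TrafficProfile("registration", planned_users=5, real_users=1),
    TrafficProfile("product_search", planned_users=20, real_users=5),
]
for p in profiles:
    print(p.user_type, "->", virtual_users_needed(p), "virtual users")
# prints: registration -> 4 virtual users, product_search -> 15 virtual users
```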
  • The adaptive network emulator agent 103 can be configured to create and establish one or more emulated network connections for the UET. It is a network emulator module capable of changing the network connection parameters.
  • The emulated network connections can be achieved by modifying at least three connection parameters of one or more selected network connections of the plurality of network connections, in line with the test plan.
  • The at least three connection parameters include bandwidth, latency and packet loss. Nevertheless, other connection parameters may also be used, such that the modifications and variations do not depart from the present invention.
  • The tester or administrator is able to specify the preferred types of network connection to be emulated for the UET. Examples of network connections as well as emulated network connections can include, but are not limited to, cellular 3G network connections, cellular long term evolution (LTE) or 4G network connections and wireless local area network connections of the 802.11 family.
  • The adaptive network emulator agent 103 acts as a layer-2 network switch that is totally transparent to the network traffic. When network traffic of the network connections passes through the adaptive network emulator agent 103, the adaptive network emulator agent 103 changes the connection parameters of the selected network connections as specified in the test plan. It is imperative to note that not all network connections that pass through the adaptive network emulator agent 103 will have their connection parameters changed. The adaptive network emulator agent 103 will only perform network emulation on the selected network connections, as instructed by the test controller 100 through its test plan. That is to say, other non-selected network connections will not be affected and will pass through the adaptive network emulator agent 103 without any changes in their connection parameters. The network connections are selected based on certain requirements or criteria.
  • The requirements or criteria can include the source or destination Internet protocol (IP) address, source or destination network ports, source or destination media access control (MAC) address and the network protocol used.
  • The SUT will thus behave as in the real environment, as it projects the actual user experience in terms of network connection as in the real environment.
  • Suppose a real user is connected to the SUT at 100 Mbps, 0.01 ms latency and 0.001% packet loss.
  • The adaptive network emulator agent 103 will change the original network connection to 1.5 Mbps, 120 ms latency and 1.5% packet loss to emulate a cellular 3G data connection, i.e. an emulated network connection.
  • The real users will then be connected to the SUT via the emulated network connections; a sketch of this selective emulation is given after this example.
  • The response time of the application will be affected because, in the emulated network connection, the bandwidth is lower and the latency and packet loss are higher than in the original network connection.
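  • As an illustration only, the selective emulation described above can be pictured as follows; the data structures, names and the default profile values (taken from the 3G example) are assumptions for the sketch, not a definition of the agent's actual implementation.

```python
# Hypothetical sketch: apply an emulated connection profile only to flows
# that match the selection criteria from the test plan (IP address, port,
# MAC address, protocol); all other flows keep the original parameters.

from dataclasses import dataclass


@dataclass
class LinkParams:
    bandwidth_mbps: float
    latency_ms: float
    packet_loss_pct: float


# Original wired connection and the emulated cellular 3G profile from the example.
ORIGINAL_LINK = LinkParams(bandwidth_mbps=100.0, latency_ms=0.01, packet_loss_pct=0.001)
PROFILES = {"cellular_3g": LinkParams(bandwidth_mbps=1.5, latency_ms=120.0, packet_loss_pct=1.5)}


@dataclass
class Flow:
    src_ip: str
    dst_ip: str
    dst_port: int
    src_mac: str
    protocol: str


def params_for_flow(flow: Flow, selection: dict, profile_name: str) -> LinkParams:
    """Return the connection parameters the emulator should apply to a flow."""
    matches = all(getattr(flow, field) == wanted for field, wanted in selection.items())
    return PROFILES[profile_name] if matches else ORIGINAL_LINK


flow = Flow(src_ip="10.0.0.5", dst_ip="10.0.0.9", dst_port=443,
            src_mac="aa:bb:cc:dd:ee:ff", protocol="tcp")
print(params_for_flow(flow, {"dst_port": 443, "protocol": "tcp"}, "cellular_3g"))
# -> LinkParams(bandwidth_mbps=1.5, latency_ms=120.0, packet_loss_pct=1.5)
```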
  • The analyzer and issue management module 104 can be configured to provide performance analysis and to compute a feedback score in relation to the user feedback transmitted from the test controller 100.
  • The user feedback received from the users is preferably analyzed appropriately according to the test plan.
  • The user feedback, which is in response to the test questions, is gathered and computed based on the types of test questions the test controller 100 has put to the users.
  • The scores for each of the test questions given by the users are summed up to generate a feedback score.
  • The feedback score is then compared against a score threshold set by the tester or administrator.
  • The feedback score may be normalized to a common scale, such as an average value, which may also be compared against a corresponding score threshold.
  • Alternatively, an individual score for each of the test questions may be compared with an individual score threshold at this stage. Based on the comparison between the feedback score and the score threshold, the analyzer and issue management module 104 will trigger execution of another UET, a functional test and a performance test on the selected part of the test plan where the issues arise.
  • The functional test preferably relates to assessment of the feature functions of the SUT, while the performance test relates to assessment of the performance of the SUT under the user traffic flow and the emulated network connections.
  • The functional automation test script and the performance automation test script are uploaded by the tester into the test plan at the system initialization stage.
  • The functional test is preferably run by the functional test execution agent 109.
  • The analyzer and issue management module 104 will decide whether or not to execute the other stipulated tests based on the comparison result.
  • The results obtained from the UET, the functional test and the performance test are consolidated and further analyzed for root cause analysis.
  • The analyzer and issue management module 104 will immediately find the root cause of the usability or user experience issue(s) based on the results obtained from the three tests.
  • The root cause analysis is a comparison of the results obtained from the UET, the functional test and the performance test. Examples of the analysis done by the analyzer and issue management module 104 can include, but are not limited to, the following (a sketch of these rules follows this list): i. If the users are not satisfied with a feature function of the SUT, but the functional test passes, then it can be concluded that the issue is not caused by the feature functions; ii. If the users are satisfied with the feature function of the SUT, but are not satisfied with the feature function when the SUT is executed at a high system load of user traffic flow, then it can be concluded that the system performance is poor at that user traffic flow and the issue is caused by its performance; iii. If the users are not satisfied with the feature function regardless of whether the system load is high or low, then it can be concluded that the issue is not caused by its performance but is instead caused by the system design, such as workflow, layout, colors, etc. The tester can obtain more detail about such a system design issue from the users by using different types of test question, such as open questions.
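  • A rough, purely illustrative sketch of the three decision rules above is given here, under the simplifying assumption that each test outcome can be reduced to a boolean; the function name and return strings are not defined by the patent.

```python
# Hypothetical sketch of the root cause analysis rules i-iii: the outcomes of
# the UET (at normal and at high system load) and of the functional test are
# compared to point at the likely cause of a user experience issue.

def likely_root_cause(satisfied_normal_load: bool,
                      satisfied_high_load: bool,
                      functional_test_passed: bool) -> str:
    if not functional_test_passed:
        # Outside rules i-iii: the functional test itself already failed.
        return "feature function defect"
    # Rule i: the functional test passed, so the feature functions themselves
    # are not the cause of any remaining dissatisfaction.
    if satisfied_normal_load and not satisfied_high_load:
        # Rule ii: the issue only appears under a high user traffic flow.
        return "poor performance at that user traffic flow"
    if not satisfied_normal_load and not satisfied_high_load:
        # Rule iii: the system load does not matter, pointing to system design
        # (workflow, layout, colors, ...) rather than performance.
        return "system design (workflow, layout, colors)"
    return "no usability issue detected"


print(likely_root_cause(True, False, True))    # -> poor performance at that user traffic flow
print(likely_root_cause(False, False, True))   # -> system design (workflow, layout, colors)
```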
  • The analyzer and issue management module 104 will then generate a problem report to the developer.
  • The problem report preferably comprises the UET, the functional test, the performance test and the results thereof, together with the user feedback and the root cause analysis for the SUT.
  • The report can be generated by bug tracking software such as JIRA, Bugzilla, etc.
  • The developer, upon receiving the problem report, will next decide on upgrading or modifying the SUT in order to solve the issues.
  • The root cause analysis will facilitate the developer in identifying or locating the parts of the SUT that need to be fixed.
  • Suppose, for example, the test questions are scale questions where the users respond by selecting a score from a scale of 1 to 5, e.g. where scale 1 is "very bad" and 5 is "very good", and the score threshold is 3.
  • The analyzer and issue management module 104 will then calculate an average score of all the scores from the five users and compare the average score against the score threshold of 3, as sketched below. If the average score falls below the score threshold, the analyzer and issue management module 104 will trigger the test controller 100 to execute the functional test, the performance test and the UET on that particular function of the SUT. Thereafter, the results from the tests will be gathered and further scrutinized for the root cause analysis. Finally, the analyzer and issue management module 104 will generate a problem report and forward the same to the developer.
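  • The scoring step in this example could look roughly like the following; this is only an illustrative sketch with assumed names, not the module's actual implementation.

```python
# Hypothetical sketch: five users answer a 1-to-5 scale question, the scores
# are averaged, and an average below the threshold of 3 triggers another UET,
# the functional test and the performance test on the affected function.

from statistics import mean

SCORE_THRESHOLD = 3.0


def needs_follow_up_tests(scores: list, threshold: float = SCORE_THRESHOLD) -> bool:
    """True when the averaged feedback score falls below the score threshold."""
    return mean(scores) < threshold


print(needs_follow_up_tests([2, 3, 2, 1, 3]))  # average 2.2 -> True, trigger the tests
print(needs_follow_up_tests([4, 5, 3, 4, 4]))  # average 4.0 -> False, no usability issue
```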
  • The software release generally refers to the distribution of a new or upgraded version of computer software, such as the SUT. Such an action therefore triggers a signal transmission to the software release post-commit trigger module 105.
  • The signal is generated by the "post-commit hook" in the centralized repository.
  • The post-commit hook is run after the upgrade is committed and a new version is created.
  • The software release post-commit trigger module 105, which is in communication with the analyzer and issue management module 104, can be configured to monitor any issuance and distribution of a software release associated with the SUT, in particular from the analyzer and issue management module 104.
  • The software release post-commit trigger module 105 will subsequently trigger download of the software release at the test controller 100. Further, it will also trigger execution of the same at the test controller 100, whereby the test controller 100 re-executes the test plan or executes a new UET on the upgraded SUT. Once the re-execution or new execution of the UET is completed, the analyzer and issue management module 104 will perform its portion by providing analysis, comparing the feedback score with the score threshold, and eventually issuing a problem report for the SUT. The iteration cycle of the above-mentioned steps continues accordingly.
  • Initialization of the controlled test environment for the UET, including setup and configuration, is preferably taken care of by the test setup agent 106. It is essentially capable of setting up the operating environment in which the SUT should be tested during the UET, in accordance with the test plan, the user traffic flow and the emulated network connections.
  • The setup and configuration of the operating environment, i.e. the controlled test environment, can include, but are not limited to, the type of hypervisor and virtual machines, size of random access memory (RAM), type and number of central processing unit (CPU) cores, size and type of hard disk, public cloud vendor, location of the public cloud virtual machine, type of operating system, type of server and type of application runtime; an illustrative specification is sketched below.
  • The test setup agent 106 basically runs based on instructions from the test controller 100. For instance, the test setup agent 106 will install and set up a new software release of the SUT, released by the developer and containing bug fixes, upon an instruction from the test controller 100.
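  • Purely for illustration, a controlled-test-environment specification covering the settings listed above might be written as a simple structure like the one below; the keys and values are assumptions, not defined by the patent.

```python
# Hypothetical sketch of an environment specification the test setup agent
# could act on when initializing the controlled test environment for the UET.

test_environment = {
    "hypervisor": "KVM",
    "virtual_machines": 3,
    "ram_gb": 16,
    "cpu": {"type": "x86_64", "cores": 8},
    "hard_disk": {"type": "SSD", "size_gb": 200},
    "public_cloud": {"vendor": "example-cloud", "vm_location": "ap-southeast-1"},
    "operating_system": "Ubuntu 20.04",
    "server": "nginx",
    "application_runtime": "OpenJDK 11",
}


def setup_environment(spec: dict) -> None:
    """Stand-in for the provisioning the test setup agent performs on
    instruction from the test controller (e.g. installing a new SUT release)."""
    for key, value in spec.items():
        print(f"configuring {key}: {value}")


setup_environment(test_environment)
```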
  • The test report and test log module 107 can be configured to manage the reports and logs for the SUT generated during the UET.
  • The test logs and test reports will be stored in its persistent storage.
  • Figures 3 and 4 describe, in a step-by-step manner, the method according to one preferred embodiment of the present invention.
  • The method begins, upon being triggered by the test controller 100, with an initialization of the test environment, including setup and configuration of the SUT, by the test setup agent 106, based on a test plan previously prepared at the test plan management module 108 by the tester or administrators.
  • The test plan associated with the UET for the SUT, as provided by the test controller 100 (which receives the same from the test plan management module 108), is then provided and executed (see step 200).
  • The adaptive load generator agent 102 establishes a user traffic flow by simulating an anticipated system load of the SUT in accordance with the test plan (see step 201).
  • The adaptive network emulator agent 103, in step 202, establishes one or more emulated network connections by modifying at least three connection parameters, including bandwidth, latency and packet loss, of one or more of the plurality of network connections in accordance with the test plan.
  • The analyzer and issue management module 104 provides performance analysis to thereby compute a feedback score based on the user feedback received (see step 203). Thereafter, the feedback score is compared against a score threshold to trigger execution of another UET, a functional test and a performance test on a selected part of the test plan for reporting the root cause analysis, as in step 204.
  • This step 204 is preferably executed by the analyzer and issue management module 104 (see "A").
  • The analyzer and issue management module 104 checks for any usability issue pertaining to the SUT. This is done by comparing the feedback score with the score threshold. If there is no usability issue detected, then a UET or test report is immediately generated. However, if there is a usability issue, the analyzer and issue management module 104 triggers execution of another UET, a functional test and a performance test on the selected part of the test plan where the issues arise. The results obtained from the UET, the functional test and the performance test are consolidated and further analyzed for root cause analysis. The analyzer and issue management module 104 will immediately find the root cause of the usability or user experience issue(s) based on the results obtained from the three tests. Once the analyzer and issue management module 104 has found the root cause of the usability or user experience issue(s), it will generate a problem report to the developer.
  • Step 205 indicates a step of monitoring a software release associated with the SUT, thereby triggering download and execution of the same, which shall be executed in a continuous manner.
  • The software release is preferably generated once the problem report has been created by the analyzer and issue management module 104.
  • A version upgrade is performed on the SUT based on the input from the developer who fixed the issues (see step 207).
  • A software release associated with the SUT will then be generated in step 208.
  • The software release will next be made available to the software release post-commit trigger module 105, or other modules with similar functions, so that its presence will be apparent and noticed.
  • The software release post-commit trigger module 105 will subsequently detect the software release. It will thus trigger download of the software release at the test controller 100 and further trigger execution of the same at the test controller 100, whereby the test controller 100 re-executes the test plan or executes a new UET on the upgraded SUT; a high-level sketch of this iteration follows.
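  • A high-level, self-contained sketch of this iteration (steps 200 to 205 and the re-test on a new release) is given below; every function is a stand-in stub and only the control flow mirrors the described method.

```python
# Hypothetical sketch of the UET cycle and the post-commit retest loop.
# All module interactions are reduced to print statements and a dummy score.

import random


def run_uet_cycle(score_threshold: float) -> None:
    print("step 200: execute the test plan via the user interface agent")
    print("step 201: establish the user traffic flow (adaptive load generator agent)")
    print("step 202: establish the emulated network connections (adaptive network emulator agent)")
    feedback_score = random.uniform(1, 5)  # step 203: stand-in for the analyzer's feedback score
    print(f"step 203: feedback score = {feedback_score:.2f}")
    if feedback_score < score_threshold:
        print("step 204: trigger another UET, a functional test and a performance test;"
              " consolidate results for root cause analysis and a problem report")
    else:
        print("step 204: no usability issue detected; generate the test report")


def on_software_release(score_threshold: float) -> None:
    # Step 205 onwards: the post-commit trigger module detects a new release of
    # the SUT, the test controller downloads and installs it, and the cycle repeats.
    print("step 205: new software release detected (post-commit hook fired)")
    print("downloading and installing the upgraded SUT ...")
    run_uet_cycle(score_threshold)


on_software_release(score_threshold=3.0)
```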
  • The terms “a” and “an,” as used herein, are defined as one or more than one.
  • The term “plurality,” as used herein, is defined as two or more than two.
  • The term “another,” as used herein, is defined as at least a second or more.
  • The terms “including” and/or “having,” as used herein, are defined as comprising (i.e., open language).

Abstract

The present invention relates to an improved system and method for managing a user experience test (UET) for a system under test (SUT) in a controlled test environment. The system comprises a test controller (100), a user interface agent (101), an adaptive load generator agent (102), an adaptive network emulator agent (103), an analyzer and issue management module (104), a software release post-commit trigger module (105), a test setup agent (106), a test report and test log module (107), a test plan management module (108) and a functional test execution agent (109). The method consists of providing a test plan for the SUT to users via network connections (200); establishing a user traffic flow by simulating the system load (201); establishing one or more emulated network connections by modifying at least three connection parameters (202); providing performance analysis, thereby computing a feedback score (203); comparing the feedback score against a score threshold to trigger execution of another UET, a functional test and a performance test for reporting root cause analysis (204); and monitoring a software release, thereby triggering its download and execution (205).
PCT/MY2016/050085 2016-02-17 2016-11-30 System for managing a user experience test in a controlled test environment and method thereof WO2017142393A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
MYPI2016000283 2016-02-17
MYPI2016000283 2016-02-17

Publications (1)

Publication Number Publication Date
WO2017142393A1 (fr) 2017-08-24

Family

ID=59625336

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/MY2016/050085 WO2017142393A1 (fr) 2016-02-17 2016-11-30 System for managing a user experience test in a controlled test environment and method thereof

Country Status (1)

Country Link
WO (1) WO2017142393A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040015867A1 (en) * 2002-07-16 2004-01-22 Macko John Steven Travis Automated usability testing system and method
US20120311128A1 (en) * 2011-05-31 2012-12-06 Pechanec Jiri Performance testing in a cloud environment
US20130159774A1 (en) * 2011-12-19 2013-06-20 Siemens Corporation Dynamic reprioritization of test cases during test execution
WO2014088398A1 (fr) * 2012-12-06 2014-06-12 Mimos Berhad Déploiement d'environnement de test automatisé avec système de recommandation de métriques pour test de performance sur infrastructure infonuagique en tant que service (iaas)
US20150363304A1 (en) * 2014-06-17 2015-12-17 Kishan Nagamalla Self-learning and self-validating declarative testing

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107894936A (zh) * 2017-11-13 2018-04-10 重庆首亨软件有限公司 Single-chip microcomputer test software
CN113029329A (zh) * 2020-09-15 2021-06-25 山东华科信息技术有限公司 Detection system and detection method for batch testing of vibration sensors
CN113218432A (zh) * 2020-09-15 2021-08-06 山东华科信息技术有限公司 Detection system and detection method for batch testing of temperature and humidity sensors
US11520686B2 (en) 2021-01-26 2022-12-06 The Toronto-Dominion Bank System and method for facilitating performance testing
US11681607B2 (en) 2021-01-26 2023-06-20 The Toronto-Dominion Bank System and method for facilitating performance testing
WO2023202037A1 (fr) * 2022-04-20 2023-10-26 中兴通讯股份有限公司 Service testing method and apparatus, and related device
CN115437351A (zh) * 2022-09-06 2022-12-06 中国第一汽车股份有限公司 Automated testing system and method, electronic device and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16890788

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16890788

Country of ref document: EP

Kind code of ref document: A1