WO2016124230A1 - Method for automated testing of distributed software and test unit - Google Patents

Method for automated testing of distributed software and test unit

Info

Publication number
WO2016124230A1
WO2016124230A1 · PCT/EP2015/052261 · EP2015052261W
Authority
WO
WIPO (PCT)
Prior art keywords
test
event
client
component
local
Prior art date
Application number
PCT/EP2015/052261
Other languages
English (en)
Inventor
Prakriya Venkata Ramana Murthy
Andreas Ulrich
Original Assignee
Siemens Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Aktiengesellschaft filed Critical Siemens Aktiengesellschaft
Priority to PCT/EP2015/052261 priority Critical patent/WO2016124230A1/fr
Publication of WO2016124230A1 publication Critical patent/WO2016124230A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06F11/3684 Test management for test design, e.g. generating new test cases
    • G06F11/3692 Test management for test results analysis

Definitions

  • The invention relates to a method for automated testing of distributed software and a testing unit for automated software testing.
  • The invention relates to a method for automated testing of distributed software or a distributed software product.
  • The software comprises at least two components which are distributed over a computer network.
  • A master test component and at least two client test components are used for the testing.
  • The test case is executed comprising the following steps: reading a test case specification describing at least a test procedure for at least some of the client test components, comprising at least one concurrent event that comprises at least one local event denoting a test procedure on a client test component for execution in a timely coordinated manner.
  • A test case is executed, thereby dispatching at least one local event of the concurrent event to the respective client test components.
  • At least one local test verdict from the execution of the local event on the respective client test component is received and, by the master test component, a test verdict is determined using the at least one local verdict.
  • One aspect of this method is that such a piece of software, in particular distributed software that runs, for example, on different computer platforms, can be tested automatically.
  • At least one of the client test components provides an interface, such as e.g. a graphical user interface (GUI) or application programming interface (API).
  • The invention further relates to a corresponding test system and computer program product.
  • Fig. 1: An example of a piece of distributed software consisting of three interconnected components which are distributed over an arbitrary computer network topology;
  • Fig. 2: A flow diagram of the test design and test execution process;
  • Fig. 3: A generic test architecture that supports testing of distributed software using a concurrent event sequence as a test case;
  • Fig. 4: A schematic of a test case comprising a concurrent event sequence to exemplify a test scenario of an incoming emergency call to a call centre.
  • Fig. 1 depicts an exemplary embodiment of an abstract view of a piece of distributed software consisting of several components, a first component COMP1, a second component COMP2 and a third component COMP3, connected amongst each other via a network interface at the respective component and arbitrarily distributed over an underlying computer network.
  • The connection can be realised wired or wireless, in both directions or only one, or any combination thereof.
  • The number of connections between any two components is not restricted to one. In particular, combinations of various connection types can be applied, such as one or more wireless connections in combination with one or more wired connections.
  • The second component COMP2 and the third component COMP3 offer individual interfaces I2a, I2b and I3a to interact with the environment of the system, while the first component COMP1 has no interface for external interaction. In automated testing these external interfaces are used to stimulate the piece of distributed software and observe its responses.
  • Test procedures implement test actions at the interfaces of the SUT. These test actions need to be identified for a given SUT first. They take the form of:
  • Stimulus, i.e. an input to the SUT, which is controllable by a tester.
  • Typical examples thereof in case of a GUI are: a mouse click on a button, input to a text field, etc.
  • A consistent verdict setting mechanism, called verdict arbitration, is applied when executing a test case. Given that test actions are executed at distributed interfaces and potentially concurrently, a test verdict mechanism needs to be put in place that unambiguously calculates the verdict of an executed test case, which is conventionally denoted as PASS, FAIL or INCONCLUSIVE.
  • The test case specification requires support to express the concurrency and coordination of the identified test actions in a way that is easy for test engineers to understand.
  • Additionally, the test case specification approach shall also support the automatic generation of executable test scripts. While the identification of proper test actions is always SUT-specific and a general solution cannot be provided (challenge 1), testing of distributed software in general can, for example, be carried out by automating test cases written in the test specification language TTCN-3 (Testing and Test Control Notation version 3, defined by ETSI) to deal with the concurrent nature of the SUT (challenge 2).
  • TTCN-3 also defines a test execution framework and deals with the test verdict arbitration when executing concurrent test actions (challenge 4).
  • However, TTCN-3 does not offer proper solutions for coordination and fault localization (challenges 3 and 5), which need to be developed ad hoc for a given SUT and test case.
  • Moreover, test cases specified in TTCN-3 tend to be complex and have a low readability for the aforementioned reasons (challenge 6).
  • In Fig. 2 an exemplary embodiment of a test design and test execution process is depicted as a concurrent flow diagram. It defines the following test process for testing distributed software through STEPs 1 to 4 that is suitable for the manual creation of test cases:
  • STEP 1: An identification of the relevant SUT interfaces, which are fixed and will not change during test execution, and of the test actions, i.e. the stimuli and responses, at each interface is performed.
  • Additionally, auxiliary actions are identified that will be executed within the test system itself and support the test case execution, e.g. actions that trigger directly or indirectly execution of the SUT via hidden interfaces that are not directly accessible to the test system.
  • STEP 2: The implementation of test actions and auxiliary actions as test procedures takes place in STEP 2, which can be executed in parallel with STEP 3, i.e. the test implementation is largely independent from the test case specification.
  • A test procedure implementing test actions or an auxiliary action is called a local event, or event for short.
  • A local event a at interface n shall be written as a@n.
  • A test procedure usually implements one test action.
  • However, certain stimulus/response pairs might be implemented in a single test procedure when appropriate, i.e. when stimulus and response occur at the same interface and the response immediately follows the stimulus. Therefore the following types of events can be distinguished: (A) Stimulus event: a test procedure that provides a stimulus test action to the SUT;
  • (B) Response event: a test procedure that evaluates a response test action of the SUT; (C) Stimulus/response event: a test procedure that provides a stimulus test action and immediately evaluates the response test action in a single step;
  • (D) Test system event: a test procedure that implements an auxiliary action of the test system.
  • These test procedures refer to test actions that serve a local interface for a given SUT component.
  • Technically, each test procedure comprises four parts, namely (1) a communication interface to call the test procedure remotely from the master test component; (2) the implementation of the test action itself; (3) the calculation of the local test verdict from the execution of the test action, based on an evaluation of the response received from the SUT, in case of event types B and C; (4) error and exception handling.
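The four parts of such a test procedure can be sketched, for instance, as a small Python class. All names here (`LocalResult`, `run_remote`, the stand-in SUT callable) are illustrative assumptions, not part of the patent text.

```python
from dataclasses import dataclass

@dataclass
class LocalResult:
    verdict: str        # NONE, PASS, FAIL or ERROR
    detail: str = ""

class StimulusResponseEvent:
    """Type-C test procedure: stimulate the SUT and evaluate its response."""

    def __init__(self, stimulus, expected_response):
        self.stimulus = stimulus
        self.expected = expected_response

    def run_remote(self, sut):
        # (1) communication interface: this method is what the master test
        #     component would invoke remotely on the client test component
        try:
            response = sut(self.stimulus)            # (2) the test action itself
            if response == self.expected:            # (3) local verdict from the response
                return LocalResult("PASS")
            return LocalResult("FAIL", f"unexpected response: {response!r}")
        except Exception as exc:                     # (4) error and exception handling
            return LocalResult("ERROR", str(exc))

# Usage with a trivial stand-in for an SUT interface:
event = StimulusResponseEvent("ring", "RING")
print(event.run_remote(lambda s: s.upper()).verdict)  # PASS
```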
  • If i = 1, i.e. the concurrent event contains only one local event, the concurrent event degrades to this local event.
  • The specification of test cases using concurrent events is a sufficient means to address the needs of testing distributed software, because it captures concurrency between local events in a single concurrent event and allows coordination and synchronization between two concurrent events.
  • Thus a relation in time and a relation between local events at different client test components is described.
  • The consecutive ordering of local events can be varied. Grouping into concurrent events is done by considering their dependencies among each other to meet different coverage criteria. For example, at one extreme the degree of concurrency can be maximised by attempting to parallelize the execution of all types of events A to D to the largest possible extent. At the other extreme a strict sequential ordering can be imposed for all events that are controllable by the tester, i.e. events of type A, C, and D.
  • Response events (type B) refer to response test actions that are uncontrollable by the tester or testing software; their ordering in a test case cannot be enforced. Therefore a possible strategy to treat them is to execute a response event as soon as it becomes enabled.
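One minimal way to represent such a test case in code is a list of concurrent events, each grouping local events a@n tagged with their type. The action and interface names below are invented for this sketch and are not taken from the patent.

```python
# A test case as a sequence of concurrent events; each local event is a
# tuple (action, interface, event_type) following the A-D typology above.
test_case = [
    [("start_call_generator", "TestSys", "D")],                    # concurrent event 1
    [("make_call", "SIPGen", "A"), ("open_gui", "Client1", "A")],  # concurrent event 2
    [("check_incoming_call", "Client1", "B")],                     # concurrent event 3
]

for i, co_ev in enumerate(test_case, start=1):
    # local events within one row may run in parallel; rows are synchronized
    print(f"CO-EV {i}:", ", ".join(f"{a}@{n} ({t})" for a, n, t in co_ev))
```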
  • STEP 4: In STEP 4 the test case execution takes place.
  • The execution of a test case described as a sequence of concurrent events is based on a test architecture that is described in the following.
  • The SUT is a piece of distributed software, which implies that it comprises components distributed over a computer network.
  • It is assumed that one SUT component shall be equipped with one interface used for testing and that one SUT component shall be deployed on one computer platform each, cf. Fig. 1.
  • Accordingly, the test system is preferably decomposed into several test components that reflect the decomposition of the SUT.
  • One test component is assigned to one SUT component to perform tests via its interface and resides on the same computer platform.
  • Additionally, a so-called Master Test Component (MTC) is introduced that controls and coordinates the execution of tests of the former test components, which are henceforth called Client Test Components (CTCs).
  • The MTC is assumed to be deployed on a separate computer platform. It is connected to all client test components via a proper communication link, cf. Fig. 3.
  • Each client test component integrates its own test framework to execute tests at the interface of the respective SUT component.
  • In Fig. 3 there exists exactly one master test component MTC and three client test components CTC1, CTC2 and CTC3. According to another embodiment at least some of the client test components have or are connected to a GUI.
  • Each or some of the client test component(s) is integrated with a chosen GUI test framework that performs the execution of stimuli and evaluates the responses at the respective local GUI.
  • This test framework can be one of the tools commonly used for local testing, e.g. an existing GUI test framework.
  • STEP 4a: This variant of STEP 4 refers to the so-called normal test case execution mode. It comprises the execution of a test case specified as a concurrent event sequence. In this mode, the master test component MTC and the client test components perform the following algorithms.
  • The master test component MTC algorithm takes the test case specified as a sequence of concurrent events as input; reads the next concurrent event; dispatches the local events of a concurrent event to the respective client test components; collects back their local verdicts; and, at the end of the test case, returns the final verdict from its execution, which is one of the following values: PASS, INCONCLUSIVE, FAIL or ERROR.
  • The client test component algorithm takes a local event from a concurrent event as input, executes the test procedures associated with this local event and returns an intermediate local test verdict from the execution to the master test component MTC, which is one of the following values: PASS, FAIL or ERROR.
  • The following test verdict values are available: NONE, PASS, INCONCLUSIVE, FAIL, and ERROR.
  • Setting the local test verdict in a client test component is defined by the following rules: a test procedure implementing event types A or D shall return NONE if execution was successful, or ERROR if an error occurred during the execution. A test procedure implementing event types B or C shall return PASS if it could successfully validate the expected response test action, FAIL if the validation fails, or ERROR if an error occurred during execution of the test procedure.
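The verdict-setting rules just described can be condensed into one hypothetical helper; the function name and the boolean flags are assumptions made for illustration.

```python
def local_verdict(event_type, succeeded, response_valid=None):
    """Local verdict of a client test component per the rules above.

    event_type     -- 'A'..'D' as defined in STEP 2
    succeeded      -- False if an error occurred while executing the procedure
    response_valid -- for types B/C: did the expected response validate?
    """
    if not succeeded:
        return "ERROR"
    if event_type in ("A", "D"):          # stimulus / test system events
        return "NONE"
    if event_type in ("B", "C"):          # response-evaluating events
        return "PASS" if response_valid else "FAIL"
    raise ValueError(f"unknown event type: {event_type}")

print(local_verdict("A", True))            # NONE
print(local_verdict("B", True, True))      # PASS
print(local_verdict("C", True, False))     # FAIL
print(local_verdict("D", False))           # ERROR
```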
  • The master test component MTC reads the test case specification and dispatches events to the respective client test components according to the test case specification. If a concurrent event contains more than one event, it dispatches the events concurrently to the different client test components. After dispatching the events, the master test component MTC waits for termination of the events and collects the local test verdicts from the executing client test components. Based on these local results, the master test component MTC arbitrates the global test verdict for this test case.
  • The test verdict values are ordered as follows (from low to high): NONE, PASS, INCONCLUSIVE, FAIL, ERROR.
  • Initially, the global test verdict has the value NONE.
  • Upon receipt of a local test verdict, the new global test verdict is calculated as the maximum of the current global test verdict and the received local test verdict. This approach ensures that, for example, the current global verdict FAIL cannot be overwritten by a lower local verdict value such as PASS.
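Putting dispatching and verdict arbitration together, a master test component loop might look like the following sketch. The thread pool and the `dispatch` callback are implementation assumptions, not prescribed by the text.

```python
from concurrent.futures import ThreadPoolExecutor

VERDICTS = ["NONE", "PASS", "INCONCLUSIVE", "FAIL", "ERROR"]  # low -> high

def run_test_case(test_case, dispatch):
    """Execute a sequence of concurrent events and arbitrate the global verdict.

    dispatch(local_event) sends one local event to its client test component
    and blocks until that component's local test verdict comes back.
    """
    global_verdict = "NONE"
    with ThreadPoolExecutor() as pool:
        for concurrent_event in test_case:
            # dispatch all local events of this concurrent event in parallel,
            # then wait for their termination and collect local verdicts
            for local in pool.map(dispatch, concurrent_event):
                # the global verdict only ever moves up the ordering, so e.g.
                # FAIL can never be overwritten by a later PASS
                if VERDICTS.index(local) > VERDICTS.index(global_verdict):
                    global_verdict = local
    return global_verdict

# Usage with a fake dispatcher that returns canned local verdicts:
canned = {"ev1": "PASS", "ev2": "FAIL", "ev3": "PASS"}
print(run_test_case([["ev1", "ev2"], ["ev3"]], canned.get))  # FAIL
```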
  • A client test component algorithm comprises the steps described in the following.
  • STEP 4b is a variant of STEP 4 that refers to the so-called debug test case execution mode.
  • The normal test execution mode processes all concurrent events in a test case in a sequential order.
  • In debug mode, a step-wise execution approach is often desirable.
  • The master test component MTC algorithm presented above can be extended to include a procedure that takes a snapshot of the SUT in a well-defined global state. The necessary extension requires inserting the following line after line 1 and before line 2.
  • The client test component execution algorithm is not affected. 1a: If there are no enabled local events of type (B), then take a snapshot.
  • An exemplary way of executing a master test component MTC algorithm in debug mode consists in taking a snapshot if the selected concurrent event contains no local events of type B, i.e. response events. This approach requires that a response event is specified within a concurrent event as soon as it becomes enabled and is, thus, executable.
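The debug-mode extension can be sketched as a wrapper around the normal loop: before a concurrent event is executed, a snapshot is taken whenever that event contains no type-B (response) events. `take_snapshot` and the tuple layout of local events are assumed stand-ins for whatever global-state collection the test system provides.

```python
def run_debug_mode(test_case, dispatch, take_snapshot):
    """Debug execution mode: snapshot before concurrent events that contain
    no response events (type B), i.e. when the SUT is in a well-defined
    global state. Local events are (action, interface, event_type) tuples."""
    for concurrent_event in test_case:
        if not any(t == "B" for (_, _, t) in concurrent_event):
            take_snapshot()   # e.g. collect logs and GUI screenshots, hand control to the tester
        for local_event in concurrent_event:
            dispatch(local_event)

# Usage with counting stand-ins for the dispatcher and snapshot procedure:
snapshots = []
run_debug_mode(
    [[("a", "I1", "A")], [("b", "I1", "B")], [("c", "I2", "D")]],
    dispatch=lambda ev: None,
    take_snapshot=lambda: snapshots.append(1),
)
print(len(snapshots))  # 2: before the first and third concurrent events
```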
  • The procedure "take a snapshot" implements all means that are necessary to collect the current global state of the SUT. This can comprise the collection of logs from the execution of the distributed SUT components, the taking of screenshots of the GUIs of the SUT, and other things. Moreover, the control over the test case execution can be returned during a snapshot to the test engineer who conducts the test. In this way the test execution becomes interactive.
  • The test architecture contains one or more client test components that execute the dispatched events independently from each other and perform the test action attached to each event using a GUI test framework.
  • A client test component executes an algorithm that waits for a dispatched event from the master test component MTC and then, when it is received, executes the designated test procedure within a given GUI test framework. The algorithm also reports the local test verdict as the result of this execution back to the master test component MTC.
  • In Fig. 3 an exemplary architecture of a test system that is capable of implementing and running test cases specified as a sequence of concurrent events is given. It consists of one master test component MTC that implements the functionality described under STEP 4 above.
  • Furthermore, the test system consists of as many client test components as interfaces are specified and used in the test case. Typically one client test component is assigned to test one interface of a component of the SUT. Additional client test components are needed to cover test system events (of type D) executed at specific tester components.
  • The master test component MTC and all client test components are interconnected star-wise.
  • The implementation of the master test component MTC can be made generic and independent from a concrete test case and from the piece of distributed software to be tested.
  • The implementation of the client test component contains a SUT-independent part, which is the communication interface to the master test component MTC, and a SUT-dependent part, which is the implementation of the test event as a test procedure.
  • Existing test frameworks can be reused to ease the implementation effort of the test procedure. It is not detailed how this reuse is supported inside a client test component.
  • The test system follows a "divide and conquer" approach that preserves investments in stand-alone test automation solutions developed for testing non-distributed systems.
  • The complexity of distributing, coordinating and synchronizing test events for a distributed SUT, as well as test verdict arbitration and a snapshot mechanism, is solely managed by the master test component MTC.
  • The master test component MTC allows for the automatic execution of a test case and provides additional means of distributed testing such as synchronization of the execution order of concurrent events, coordination with the client test components, test verdict arbitration, and a snapshot mechanism.
  • The client test component algorithm on a local computer platform can be integrated with the master test component MTC that typically runs on another computer platform. It offers minimal overhead over testing single-GUI software and a seamless integration with the test system. Due to the chosen design of the test system, it is possible to offer the calculation of valid snapshots, i.e. the characterization of the global state of the system under test after the execution of a given sequence of concurrent events.
  • Fig. 4 shows a schematic of a test case of an incoming emergency call to a call centre. It is described as a UML (Unified Modelling Language) activity diagram.
  • The columns refer (from left to right) to an auxiliary component of the test system, a SIP Call Generator, and components of the SUT, Call Clients 1-3. All components, the SIP Call Generator and the SUT Call Clients, provide appropriate interfaces used for testing.
  • Each row denotes a concurrent event of the test case, denoted CO-EV 1-6, consisting of one to three local events executed at the respective interfaces.
  • The local events in rounded rectangles are marked with their types; here types A, B, and D are used.
  • The bars between concurrent events denote synchronization between them, and the arrows denote the control flow from top to bottom needed to respect the chosen UML activity diagram notation.
  • The test case can be described in a more abstract notation as the following sequence of concurrent events: ⟨"make
  • The test case requires the provision of a test system with 4 client test components interconnected with one master test component MTC, as illustrated in Fig. 3. Each client test component will test the local events of the particular interface it is assigned to.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention relates to a test unit comprising a master test component and at least two client test components for the automated software testing of a piece of distributed software comprising at least two components, each having an interface, distributed over a computer network, the test unit being configured to carry out the following steps: reading a test case specification describing at least one test procedure for at least some of the client test components, comprising at least one concurrent event which comprises at least one local event denoting a test procedure on a client test component for execution in a timely coordinated manner; executing the test case, whereby the master test component dispatches at least one local event of the concurrent event to the respective client test components; receiving, by the master test component, at least one local test verdict from the execution of the local event on the respective client test component; and determining, by the master test component, a test verdict using the at least one local verdict. The invention further relates to a corresponding method.
PCT/EP2015/052261 2015-02-04 2015-02-04 Method for automated testing of distributed software and test unit WO2016124230A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2015/052261 WO2016124230A1 (fr) 2015-02-04 2015-02-04 Method for automated testing of distributed software and test unit

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2015/052261 WO2016124230A1 (fr) 2015-02-04 2015-02-04 Method for automated testing of distributed software and test unit

Publications (1)

Publication Number Publication Date
WO2016124230A1 (fr) 2016-08-11

Family

ID=52450110

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2015/052261 WO2016124230A1 (fr) 2015-02-04 2015-02-04 Method for automated testing of distributed software and test unit

Country Status (1)

Country Link
WO (1) WO2016124230A1 (fr)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090249216A1 (en) * 2008-03-28 2009-10-01 International Business Machines Corporation Interacting with multiple browsers simultaneously using linked browsers controlled from a primary browser interface
US20100070230A1 (en) * 2008-09-16 2010-03-18 Verizon Data Services Llc Integrated testing systems and methods
US20110289489A1 (en) * 2010-05-20 2011-11-24 Verizon Patent And Licensing Inc. Concurrent cross browser testing


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107861880A (zh) * 2017-11-28 2018-03-30 曲明成 Optimization method for suspicious dependencies between process task blocks based on a recursive online algorithm
CN107861880B (zh) * 2017-11-28 2021-06-22 曲明成 Discovery method for suspicious dependencies between process task blocks based on a recursive online algorithm
US10649884B2 (en) 2018-02-08 2020-05-12 The Mitre Corporation Methods and system for constrained replay debugging with message communications
CN111190822A (zh) * 2019-12-26 2020-05-22 曙光信息产业股份有限公司 Method and apparatus for automated testing of distributed system software
US11762858B2 (en) 2020-03-19 2023-09-19 The Mitre Corporation Systems and methods for analyzing distributed system data streams using declarative specification, detection, and evaluation of happened-before relationships

Similar Documents

Publication Publication Date Title
US20180196739A1 (en) System and method for safety-critical software automated requirements-based test case generation
US5371883A (en) Method of testing programs in a distributed environment
CN111190812 Automated testing framework based on embedded devices
WO2016124230A1 (fr) Method for automated testing of distributed software and test unit
Arora et al. Web application testing: A review on techniques, tools and state of art
US20130159774A1 (en) Dynamic reprioritization of test cases during test execution
Sanches et al. J-swfit: A java software fault injection tool
US8752007B2 (en) Automatic generation of run-time instrumenter
CN109947535 Fault injection suite for virtual machines
CN105739481B Test method, apparatus and system for industrial control software
Yang et al. Efsm-based test case generation: Sequence, data, and oracle
CN110519107A Method and apparatus for metropolitan area network circuit capacity expansion
CN113485928A Automated testing method and apparatus for switches
CN114238127A Interface testing method, apparatus, device and storage medium
CN114090423A Automated control method for chip verification
WO2014075471A1 (fr) System and method for generating an integrated application for an Internet of Things terminal
Abdul et al. Implementing Continuous Integration towards rapid application development
Ahmadi et al. mCUTE: a model-level concolic unit testing engine for UML state machines
JP5545087B2 Distributed control system test execution management device
Saddler et al. EventFlowSlicer: a tool for generating realistic goal-driven GUI tests.
Bucchiarone et al. Model-checking plus testing: from software architecture analysis to code testing
Andrade et al. Testing interruptions in reactive systems
Murthy et al. Distributed GUI test automation
Machado et al. Component-based integration testing from UML interaction diagrams
Hassine et al. Applying reduction techniques to software functional requirement specifications

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15702757

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15702757

Country of ref document: EP

Kind code of ref document: A1