WO2014174362A1 - Feature model based testing - Google Patents

Feature model based testing

Info

Publication number
WO2014174362A1
WO2014174362A1 (PCT/IB2014/000605)
Authority
WO
WIPO (PCT)
Prior art keywords
features
test cases
mpfm
perspectives
variation
Prior art date
Application number
PCT/IB2014/000605
Other languages
English (en)
French (fr)
Inventor
Sachin Patel
Priya Gupta
Vipul Arvind SHAM
Sampatkumar N. DIXIT
Original Assignee
Tata Consultancy Services Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tata Consultancy Services Limited filed Critical Tata Consultancy Services Limited
Publication of WO2014174362A1 publication Critical patent/WO2014174362A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases

Definitions

  • the present subject matter relates, in general, to testing of software product(s) and particularly to testing of software product(s) based on a Feature Model (FM).
  • FM Feature Model
  • SPL Software Product Lines
  • Feature Model which is a representation of the features of the products of the SPL, is generated to determine the valid feature combinations for testing.
  • Fig. 1a illustrates a feature map of a Multi-Perspective Feature Model (MPFM).
  • Fig. lb illustrates components of a software testing system, in accordance with an embodiment of the present subject matter.
  • Fig. 2 illustrates a method to perform software testing based on a Multiple Perspective Feature Model (MPFM).
  • Feature Modeling is one popular technique to model variability in the SPL.
  • Feature Modeling is a tool to develop Feature Models (FMs), where the developed FMs represent a product in a SPL by means of features.
  • FMs Feature Models
  • inclusion of feature(s) in the SPL results in interactions among the features, thus varying the behavior of other features.
  • developing a FM for complex systems, such as an entire SPL, is a tedious and cumbersome task.
  • the Feature Model (FM) associated with a SPL is generated, wherein the FM is representative of the features of the SPL.
  • the generation of a FM may be achieved by means of tools existing in the state of the art, for example, a feature modeler.
  • the generated FM associated with the SPL may have variability information scattered all over the FM. Therefore, Separation of Concerns (SoC) is performed in the FM associated with the SPL, in order to modularize the FM.
  • SoC Separation of Concerns
  • the SoC in the FM is based on identifying a source of variation with a common cause of variation.
  • the features that are sources of variations may have a common cause of variation.
  • a FM for an internationalized website may include Stock Quote and Gold rate chart as features under a perspective domain.
  • the FM may also include Indian Rupee, Euro and Pound as sub-features of the features Stock Quote and Gold rate chart.
  • the determined common cause of variation in the FM may be identified as a perspective.
  • the common cause of variation 'currency' may be identified as a new perspective of the FM. Since the SPL may implement a plurality of such perspectives, where multiple new perspectives may be identified based on SoC, the generated FM is referred to as a MPFM.
  • each identified perspective is further modeled into one or more features.
  • test cases may also be identified based on the perspectives and the features of the MPFM.
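The Separation of Concerns step described above can be sketched in code: features whose sets of sub-features are identical are taken to share a common cause of variation, which is then promoted to a new perspective. This is an illustrative sketch under that assumption, not the patent's prescribed algorithm; the feature names follow the website example.

```python
# Sketch of the Separation-of-Concerns step: features whose sets of
# sub-features are identical are assumed to share a common cause of
# variation, which is promoted to a new perspective (e.g. 'currency').
def find_common_causes(features):
    """Group features by identical sub-feature sets; shared sets are
    common causes of variation and candidate perspectives."""
    groups = {}
    for feature, sub_features in features.items():
        groups.setdefault(frozenset(sub_features), []).append(feature)
    return {subs: feats for subs, feats in groups.items() if len(feats) > 1}

domain = {
    "Stock Quote": ["Indian Rupee", "Euro", "Pound"],
    "Gold Rate Chart": ["Indian Rupee", "Euro", "Pound"],
    "Blogs": ["English"],
}
common = find_common_causes(domain)
# Stock Quote and Gold Rate Chart share the currency variations, so
# 'currency' becomes a candidate perspective of the MPFM.
```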
  • [0013] More often than not, SPLs, once developed, are tested to validate their correctness and to ensure that a desired output is generated. This implies that when a new feature is included or changed, it has to be ensured that the new feature works in all configurations in conjunction with all the other variations that each configuration might have.
  • inclusion of a feature like currency in an internationalized website can affect several other features, such as Stock Quote and Gold Rate Chart, which vary with the change in currency.
  • the new feature has to be tested in combination with other features in different browsers like Internet Explorer, Google Chrome, Firefox, different Operating Systems (OS), such as Windows, Linux, Android, and different devices, such as Desktop, Tablet and Smartphone.
  • OS Operating Systems
  • the feature interactions in a FM are highly convoluted in a real-world SPL, and testing them can prove to be a daunting task where the number of variations and configurations is large.
  • a Multiple Perspective Feature Model relates to modularization of the FM by performing Separation of Concerns (SoC) in the FM to obtain the MPFM. The identification of test cases is based on perspective selection parameters.
  • SoC Separation of Concerns
  • a MPFM includes perspectives, such as domain, internationalization, architecture and operating environment, and features, such as Weather Update, Celsius, Blogs, Articles, 3-column page layout, 2-column page layout, Internet Explorer, Windows, Linux and Firefox.
  • when a test case for a user planning on testing Weather Update in Celsius, in a 3-column page layout, on an Internet Explorer browser, on a device running the Windows Operating System has to be generated, the test case can be identified as [Weather Update, Celsius, 3-column, Internet Explorer, Windows].
  • Such a selection of test cases based on perspectives and features may allow effective testing of the SPL.
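A test case of the kind shown above pairs one feature from each perspective. A minimal sketch, assuming the MPFM is held as a mapping of perspectives (parameters) to features (values); the Fahrenheit, Firefox and Linux entries are added purely for illustration:

```python
from itertools import product

# A MPFM as a mapping of perspectives to features; a test case picks
# exactly one feature from each perspective.
mpfm = {
    "domain": ["Weather Update", "Blogs", "Articles"],
    "internationalization": ["Celsius", "Fahrenheit"],
    "architecture": ["3-column page layout", "2-column page layout"],
    "browser": ["Internet Explorer", "Firefox"],
    "operating system": ["Windows", "Linux"],
}

def all_test_cases(mpfm):
    """Enumerate every combination of one feature per perspective."""
    return [list(combo) for combo in product(*mpfm.values())]

cases = all_test_cases(mpfm)
# The test case from the text, [Weather Update, Celsius, 3-column page
# layout, Internet Explorer, Windows], is among the identified cases.
```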
  • the implementation of the described system and methods of the present subject matter may provide an alternative way to model variations and measure test coverage.
  • the present subject matter leverages the basic technique of modularization of redundant features by creating a MPFM, thus creating a model that is modular and easy to maintain. Further, since the method relies on the use of the MPFM in testing by generating pair-wise combinations, exhaustive coverage of possible contexts and a notion of complete testing is available. Furthermore, prioritization of the test cases helps software testers sort the features based on their importance and arrive at a minimum set of tests that provides maximum coverage of variability, thereby reducing the effort.
  • Fig. 1a illustrates a feature map of a Multi-Perspective Feature Model (MPFM) 100.
  • the MPFM 100 is representative of a plurality of perspectives 102-1, 102-2, 102-3 and 102-4, which may be collectively referred to as perspectives 102. Each of these perspectives 102 may be further associated with features 104-1, 104-2, ... 104-12, which may be collectively referred to as features 104.
  • the features 104 may further include sub-features 106-1, 106-2, 106-3, ... 106-22, which may be collectively referred to as sub-features 106.
  • the generation of a FM may be achieved by means of tools existing in the state of the art, for example, a feature modeler.
  • the feature map mentioned herein shall be used for the purposes of explanation of Fig. 1b.
  • Fig. lb illustrates components of a software testing system 152, in accordance with an embodiment of the present subject matter.
  • the software testing system 152 includes processor(s) 154.
  • the processor 154 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions.
  • the processor(s) 154 is configured to fetch and execute computer-readable instructions stored in the memory.
  • the software testing system 152 may also include a memory 158.
  • the memory 158 may be coupled to the processor 154.
  • the memory 158 can include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
  • the software testing system 152 may include module(s) 160 and data 162. The modules 160 and the data 162 may be coupled to the processors 154.
  • the modules 160 include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types.
  • the modules 160 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions.
  • the module(s) 160 include a modeling module 164, a modularizing module 166, a validating module 168, a testing module 170, and a ranking module 172.
  • the data 162 includes modular data 176, test data 178, and other data 180.
  • the other data 180 amongst other things, may serve as a repository for storing data that is processed, received, or generated as a result of the execution of one or more modules in the module(s).
  • although the data is shown internal to the software testing system 152, it may be understood that the data 162 can reside in an external repository (not shown in the figure), which may be coupled to the software testing system 152.
  • the software testing system 152 may communicate with the external repository through the interface(s) 156 to obtain information from the data 162.
  • the software testing system 152 facilitates software application testing based on a Multiple Perspective Feature Model (MPFM).
  • the modeling module 164 of the software testing system 152 is configured to generate a Feature Model (FM) associated with a Software Product Line (SPL), where the generated FM is representative of a plurality of perspectives 102, features 104 and sub-features 106.
  • the generation of the FM may be performed by existing tools available in the state of the art, such as a feature modeler.
  • the FM for an internationalized website (Site X) may contain a domain perspective 102-1.
  • the domain 102-1 can further include features like Stock Quote 104-2 and Gold Rate Chart 104-4, and each of these features 104 can include further sub-features 106 like Indian Rupee 106-3, Dollar 106-4 and Pound 106-5.
  • the FM for a website may contain Operating Environment 102-4 as a perspective.
  • the Operating Environment 102-4 may further include features 104 like browser 104-10, Operating System (OS) 104-11 and device 104-12.
  • Each of these features may include further sub-features 106; for example, the browser 104-10 may include sub-features 106 like Internet Explorer (IE) 106-14, Opera 106-16 and Firefox 106-15, and different configurations of each of the listed sub-features, like IE 7, IE 8 and so on.
  • the OS may further include sub-features like Windows 106-17, Linux 106-18 and Android 106-19.
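The Site X model described above can be pictured as nested mappings from perspectives to features to sub-features. This is only an illustrative data layout, not a format the patent prescribes; the reference numerals 102/104/106 from the description are echoed in comments.

```python
# Illustrative layout of the Site X Feature Model as nested mappings:
# perspective (102) -> feature (104) -> sub-features (106).
site_x_fm = {
    "Domain": {                        # perspective 102-1
        "Stock Quote": ["Indian Rupee", "Dollar", "Pound"],      # 104-2
        "Gold Rate Chart": ["Indian Rupee", "Dollar", "Pound"],  # 104-4
    },
    "Operating Environment": {         # perspective 102-4
        "Browser": ["Internet Explorer", "Opera", "Firefox"],    # 104-10
        "OS": ["Windows", "Linux", "Android"],                   # 104-11
        "Device": ["Desktop", "Tablet", "Smartphone"],           # 104-12
    },
}
```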
  • the modularizing module 166 of the software testing system 152 may modularize the generated FM by performing Separation of Concerns (SoC) in the FM.
  • SoC Separation of Concerns
  • the SoC may be performed based on identifying a common cause of variation for a source of variation.
  • the source of variation is considered as a relationship between a source feature and a target feature, such that the source feature is the feature of the perspective that is implemented with many variations and the target features are the variations of the source feature.
  • the features 104 may be referred to as the source features, while the sub-features 106 may be referred to as the target features.
  • the source of variation here is the relationship between the source feature, i.e., Stock Quote 104-2, and the target features, i.e., the Rupee 106-3, Dollar 106-4 and Pound 106-5.
  • another source of variation is the relationship between the source feature, i.e., Gold Rate Chart 104-4, and its target features, i.e., the currency sub-features Rupee, Dollar and Pound.
  • the software testing system 152 may further modularize the generated FM by identifying 'currency' 104-6 as the common cause of variation.
  • the modularizing module 166 of the software testing system 152 may determine the common cause of variation as a perspective to generate a Multiple Perspective Feature Model (MPFM).
  • MPFM Multiple Perspective Feature Model
  • each perspective 102 of the MPFM 100 includes one or more features 104.
  • the MPFM thus generated is stored in the modular data 176.
  • 'currency' 104-6, which is identified as the common cause of variation, is determined as a perspective of the FM for Site X by the modularizing module 166.
  • the perspective currency 104-6 may further be linked to the sub-features Rupee 106-3, Dollar 106-4 and Pound 106-5.
  • test cases are identified based on utilizing perspectives 102 as parameters and one or more features 104 from the perspectives 102 as values.
  • the validating module 168 may identify test cases and validate the identified test cases and store the validated test cases in the test data 178.
  • the identification of the test cases is based on perspective selection parameters.
  • a MPFM of a SPL consists of perspectives like Operating System, Browser, Protocol, Central Processing Unit (CPU) and Data Base Management Systems (DBMS) and each of these perspectives include one or more features as listed below.
  • the validating module 168 may identify test cases based on utilizing perspectives as parameters and features as values.
  • the various perspectives, and the features under each of the perspectives, may include an operating system (Windows, iOS, Android), a browser (Internet Explorer, Firefox), a protocol stack (IPv4, IPv6), a processor (Intel, AMD), and a database (MySQL, Sybase, Oracle). Ten test cases that may cover all pair-wise combinations of features may be identified, and the cases are listed below in Table 1.
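A pair-wise covering suite like the one in Table 1 can be derived with a simple greedy heuristic: repeatedly pick the candidate configuration that covers the most still-uncovered feature pairs. This is a sketch of one common approach, not necessarily the method used to produce Table 1, and the greedy result is typically around ten cases for these parameters rather than exactly ten.

```python
from itertools import combinations, product

def greedy_pairwise(mpfm):
    """Greedy sketch of pair-wise selection: keep picking the candidate
    configuration that covers the most still-uncovered feature pairs."""
    perspectives = list(mpfm)
    candidates = [dict(zip(perspectives, combo))
                  for combo in product(*(mpfm[p] for p in perspectives))]

    def pairs(case):
        return {(p1, case[p1], p2, case[p2])
                for p1, p2 in combinations(perspectives, 2)}

    uncovered = set().union(*(pairs(c) for c in candidates))
    suite = []
    while uncovered:
        best = max(candidates, key=lambda c: len(pairs(c) & uncovered))
        suite.append(best)
        uncovered -= pairs(best)
    return suite

spl_mpfm = {
    "Operating System": ["Windows", "iOS", "Android"],
    "Browser": ["Internet Explorer", "Firefox"],
    "Protocol": ["IPv4", "IPv6"],
    "CPU": ["Intel", "AMD"],
    "DBMS": ["MySQL", "Sybase", "Oracle"],
}
suite = greedy_pairwise(spl_mpfm)
# Around ten configurations usually suffice to cover every feature pair,
# versus 72 cases for the full cartesian product.
```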
  • test cases once identified may be validated by the validating module 168 of the software testing system 152, where every feature of the identified test case may be compared to features of a product configuration to generate constraints such that only valid test cases are identified.
  • An exact matching of the features of the test cases to features of the product configuration is indicative of a valid test case.
  • consider test case 1 as listed in Table 1.
  • the test case is identified as [Windows, Internet Explorer, IPv4, Intel, MySQL]. Now, each feature of the test case, namely Windows, Internet Explorer, IPv4, Intel and MySQL, is mapped to the features of the product configuration.
  • such a test case is indicative of a valid test case, and further steps can be taken to ensure that all possible pair-wise combinations of features in the validated test case are tested. However, if the features of the identified test case do not exist in the product configuration, the test case is determined to be invalid and further steps to perform testing of such an invalid test case are aborted. For example, in a situation where the Android Operating System and iOS in the product configuration do not support the Internet Explorer browser, test cases 5 and 7 may be discarded as these test cases are deemed to be invalid.
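The validation step above can be sketched as a membership-and-constraint check: a case is valid only if every one of its features exists in the product configuration and no excluded feature pair occurs together. The supported-feature set and the exclusion constraints below are illustrative assumptions mirroring the Internet Explorer example.

```python
# Sketch of test-case validation against a product configuration. The
# supported set and the exclusions (Android/iOS lacking Internet
# Explorer) are invented for illustration.
supported = {"Windows", "iOS", "Android", "Internet Explorer", "Firefox",
             "IPv4", "IPv6", "Intel", "AMD", "MySQL", "Sybase", "Oracle"}
exclusions = {("Android", "Internet Explorer"), ("iOS", "Internet Explorer")}

def is_valid(test_case, supported, exclusions):
    """Exact matching of case features to the product configuration,
    subject to the exclusion constraints."""
    if not all(feature in supported for feature in test_case):
        return False
    return not any(a in test_case and b in test_case for a, b in exclusions)

# Test case 1 maps cleanly onto the configuration and is valid:
assert is_valid(["Windows", "Internet Explorer", "IPv4", "Intel", "MySQL"],
                supported, exclusions)
# A case pairing Android with Internet Explorer is discarded:
assert not is_valid(["Android", "Internet Explorer", "IPv6", "AMD", "Oracle"],
                    supported, exclusions)
```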
  • a combinatorial test of the validated test cases may be performed, wherein the combinatorial test includes generation of pair- wise combination of features of the validated test cases to ensure that all possible combinations in the validated test case are tested.
  • the testing module 170 of the software testing system 152 may test feature interactions by generating combinations of the features of the MPFM.
  • the generated combinations are pair-wise combinations, and they are tested by passing each perspective of the MPFM as a parameter and each feature of the perspective as a value.
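The pair-wise criterion can be checked mechanically: a suite of validated test cases is complete when every pair of features drawn from two different perspectives co-occurs in at least one case. A hedged sketch, with a deliberately small, invented model:

```python
from itertools import combinations, product

# Checking the pair-wise criterion: coverage is complete when every pair
# of features from two different perspectives co-occurs in some case.
def uncovered_pairs(mpfm, test_cases):
    perspectives = list(mpfm)
    required = {(i, f1, j, f2)
                for (i, p1), (j, p2) in combinations(enumerate(perspectives), 2)
                for f1, f2 in product(mpfm[p1], mpfm[p2])}
    covered = {(i, f1, j, f2)
               for case in test_cases
               for (i, f1), (j, f2) in combinations(enumerate(case), 2)}
    return required - covered

small = {"OS": ["Windows", "Linux"],
         "Browser": ["Internet Explorer", "Firefox"]}
full = [["Windows", "Internet Explorer"], ["Windows", "Firefox"],
        ["Linux", "Internet Explorer"], ["Linux", "Firefox"]]
# All four cases together leave no feature pair uncovered; dropping the
# two Linux cases leaves the two Linux/browser pairs uncovered.
```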
  • the validated test cases may be prioritized based on statistical parameters.
  • the ranking module 172 of the software testing system 152 may prioritize test cases based on statistical parameters.
  • the statistical parameters include probability of error and probability of usage.
  • a social networking application might have to be tested in different browsers like Internet Explorer, Firefox, Google Chrome, and different configurations of each of the browsers, devices like tablets, desktop, and the like, different screens with different resolutions, different page layouts, different portals, languages, time zones and so on and so forth. It may not be practically possible to perform testing under all conditions. Under such scenarios, it may be helpful for a software tester to prioritize the test cases.
  • testing based on the probability of usage suggests prioritizing the testing of those features that are widely used by people of a particular geographic location over testing of the less used features. For example, consider a scenario where a particular social networking application has to be tested on different Operating Systems, like the Windows Operating System, Blackberry Operating System, Android Operating System and iOS. If this particular application is to be launched in a country where the usage of the Windows Operating System and the Android Operating System is widespread, the features of the application are first tested on the Windows Operating System and the Android Operating System prior to testing the features of the application on the Blackberry Operating System and iOS.
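Prioritization by probability of usage can be sketched as a simple sort of the validated test cases by the usage weight of their features. The probabilities below are invented for illustration; in practice they would come from field data for the target geography.

```python
# Sketch of prioritization by probability of usage: validated test cases
# are sorted so that widely used configurations are tested first.
usage_probability = {
    "Windows": 0.45, "Android": 0.40, "Blackberry": 0.10, "iOS": 0.05,
}

test_cases = [["Blackberry"], ["Windows"], ["iOS"], ["Android"]]

def prioritize_by_usage(cases, usage):
    """Rank cases by the summed usage weight of their features."""
    return sorted(cases,
                  key=lambda case: sum(usage.get(f, 0.0) for f in case),
                  reverse=True)

ordered = prioritize_by_usage(test_cases, usage_probability)
# Windows and Android cases are tested before Blackberry and iOS.
```

The same sort works for prioritization by probability of error: replace the usage weights with per-feature failure rates drawn from past test-case history.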
  • prioritizing based on the probability of error suggests prioritizing the testing of those features that have failed to generate the desired output over testing of those features that have generated an expected output, based on the performance history of the features in past test cases. For example, consider a feature like the ability to open multiple tabs in a browser. If this particular feature has to be tested in different browsers, like Internet Explorer, Firefox, Chrome, etc., it is first tested on those browsers where the ability to open multiple tabs has failed repeatedly in past test cases.
  • [0037] Further, the prioritized test cases are tested to determine if the developed SPL generates the desired output. In one implementation, the testing module 170 may test the prioritized test cases based on a combinatorial testing method to determine the success of the developed SPL.
  • [0038] Fig. 2 illustrates a method 200 to generate a Multiple Perspective Feature Model (MPFM).
  • the order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method, or any alternative methods. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware.
  • the method may be described in the general context of computer executable instructions.
  • computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types.
  • the method may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network.
  • computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
  • steps of the method can be performed by programmed computers.
  • the embodiments are also intended to cover program storage devices, for example, digital data storage media, which are machine or computer readable and encode machine-executable or computer-executable programs of instructions, where said instructions perform some or all of the steps of the described method.
  • the program storage devices may be, for example, digital memories, magnetic storage media, such as magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage media.
  • the embodiments are also intended to cover communication networks and communication devices configured to perform said steps of the described method.
  • a Feature Model (FM) associated with a Software Product Line (SPL) is generated, wherein the FM is representative of the features of the SPL.
  • the FM is generated by means of tools existing in the state of the art.
  • the modeling module 164 is configured to generate a FM based on the features of a SPL.
  • SoC Separation of Concerns
  • the SoC is performed by identifying at least one source of variation with a common cause of variation.
  • the source of variation is identified as those features that may be implemented with several variations.
  • the sources of variation that have a common cause of variation are separated out as concerns.
  • the modularizing module 166 may perform SoC in the FM to generate a model that is modular and easy to maintain.
  • the common cause of variation that was identified as a concern is determined to be a perspective. Since several such concerns are identified in a real-world system for a SPL, several perspectives are determined, and the model thus generated is a Multiple Perspective Feature Model (MPFM).
  • MPFM Multiple Perspective Feature Model
  • each perspective of the MPFM further includes at least one feature.
  • the modularizing module 166 may generate a MPFM from a FM associated with a SPL.
  • test cases are identified based on the perspectives and one or more features of the MPFM, where the identification of the test cases is based on utilizing perspectives as parameters and features from the perspectives as values.
  • the test cases may be identified based on perspective selection parameters, and the identified test cases are further validated by comparing the features of the identified test cases to features of the product configuration.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Stored Programmes (AREA)
  • Debugging And Monitoring (AREA)
PCT/IB2014/000605 2013-04-25 2014-04-24 Feature model based testing WO2014174362A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN1526/MUM/2013 2013-04-25
IN1526MU2013 IN2013MU01526A (de) 2013-04-25 2014-04-24

Publications (1)

Publication Number Publication Date
WO2014174362A1 true WO2014174362A1 (en) 2014-10-30

Family

ID=51791129

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/000605 WO2014174362A1 (en) 2013-04-25 2014-04-24 Feature model based testing

Country Status (2)

Country Link
IN (1) IN2013MU01526A (de)
WO (1) WO2014174362A1 (de)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5652835A (en) * 1992-12-23 1997-07-29 Object Technology Licensing Corp. Method and apparatus for generating test data for an automated software testing system
US20080313501A1 (en) * 2007-06-14 2008-12-18 National Tsing Hua University Method and system for assessing and analyzing software reliability
US20090018811A1 (en) * 2007-07-09 2009-01-15 International Business Machines Corporation Generation of test cases for functional testing of applications

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107002662A (zh) * 2014-11-05 2017-08-01 Avl里斯脱有限公司 用于运行泵的方法和装置
CN107002662B (zh) * 2014-11-05 2019-02-19 Avl里斯脱有限公司 用于运行泵的方法和装置
CN106021107A (zh) * 2016-05-19 2016-10-12 浪潮电子信息产业股份有限公司 一种模块化的测试用例分发方法
EP3299963A1 (de) * 2016-09-26 2018-03-28 Wipro Limited Verfahren und systeme zum generieren von testszenarien, um anwendungen, die menschliche sinne anwenden, zu testen
US10255169B2 (en) 2016-09-26 2019-04-09 Wipro Limited Testing applications using application features grouped into categories of human senses
WO2019104917A1 (zh) * 2017-11-29 2019-06-06 平安科技(深圳)有限公司 基金系统测试用例的测试方法、装置、设备及存储介质

Also Published As

Publication number Publication date
IN2013MU01526A (de) 2015-04-10

Similar Documents

Publication Publication Date Title
US10565097B2 (en) Orchestrating and providing a regression test
US10713690B2 (en) Configurable relevance service test platform
JP6518794B2 (ja) プラグイン化パッケージング方法、装置及び端末
US10042903B2 (en) Automating extract, transform, and load job testing
US10366112B2 (en) Compiling extract, transform, and load job test data cases
US9652368B2 (en) Using linked data to determine package quality
US11816190B2 (en) Systems and methods to analyze open source components in software products
US10324710B2 (en) Indicating a trait of a continuous delivery pipeline
WO2019078954A1 (en) PRODUCTION OF INDEPENDENT BATTERY TRACE SUMMARY OF USER-LEGIBLE LANGUAGE
CN106873960A (zh) 一种应用软件的更新方法和设备
WO2014174362A1 (en) Feature model based testing
JP7155626B2 (ja) フィールドデバイスコミッショニングシステムおよびフィールドデバイスコミッショニング方法
CN111797312A (zh) 模型训练的方法及装置
JP2017174418A (ja) モデルチェックのためのデータ構造抽象化
CN111813739A (zh) 数据迁移方法、装置、计算机设备及存储介质
CN115794641A (zh) 基于业务流程的造数方法、装置、设备及存储介质
US20150287059A1 (en) Forecasting device return rate
CN112541688B (zh) 业务数据校验方法、装置、电子设备及计算机存储介质
US20160019564A1 (en) Evaluating device readiness
Markiegi et al. Dynamic test prioritization of product lines: An application on configurable simulation models
US10176276B2 (en) Determining an optimal global quantum for an event-driven simulation
CN109376285A (zh) 基于json格式的数据排序验证方法、电子设备及介质
CN109697141B (zh) 用于可视化测试的方法及装置
CA2947893C (en) Orchestrating and providing a regression test
US11474816B2 (en) Code review using quantitative linguistics

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14788748

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14788748

Country of ref document: EP

Kind code of ref document: A1