WO2018060777A1 - Method and system for optimizing software testing


Info

Publication number: WO2018060777A1
Authority: WO - WIPO (PCT)
Application number: PCT/IB2017/050103
Other languages: French (fr)
Inventors: Hajimastan SHAIK; Nivashkumar Ramachandran
Original Assignee: Yokogawa Electric Corporation
Priority application: IN201641033388
Application filed by Yokogawa Electric Corporation
Publication of WO2018060777A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G06F 40/205
    • G06F 40/30

Abstract

The invention enables optimizing test case coverage for improved software testing. The invention comprises (i) parsing text of a requirements specification, (ii) performing parts-of-speech (POS) tagging of the parsed text, (iii) correlating a first set of elements comprising a plurality of POS tagged words with a second set of elements represented within a software ontology, (iv) categorizing each element within the first set of elements as a state or as an event, and (v) generating a set of test cases, wherein generating said set of test cases comprises generating a Cartesian product of the set of determined states and the set of determined events.

Description

METHOD AND SYSTEM FOR OPTIMIZING SOFTWARE TESTING

Field of the Invention

[001] The present invention relates to optimization of software testing. In particular, the invention provides methods, systems and computer programs for optimizing test case coverage for improved software testing.

Background

[002] The role of software testing is critical to the development and implementation of a software product or system. Software testing enables engineers to identify problems that require rectification prior to implementation or release of a product. Software testing inter alia involves testing multiple scenarios (referred to as test cases) that involve certain functions, features or lines of code relating to the product or system under test. These test cases are run and the observed outcomes of the test cases are compared against predicted / intended outcomes to ascertain whether the software product is operating in an intended and predictable manner.

[003] The reliability of the testing process is directly correlated to the scope and comprehensiveness of the test cases that are identified and run during the testing process. Conventionally, test cases are developed by programmers / the software development team - by identifying various features or components of the software product or system that need to be tested and by specifying corresponding test cases. This approach presents multiple drawbacks.

[004] First, the identification of features that require testing is typically carried out on an anecdotal or ad hoc basis, resulting in incomplete test case coverage as a consequence of inadvertent omission of test cases corresponding to one or more software features or software modules. Further, where test cases are being generated for software that has been developed by a third party software developer, the testing team may have insufficient information or knowledge about the software product or system to develop test cases that ensure appropriate test coverage of all features or modules of the product or system under test.

[005] There is accordingly a need for methods and systems for optimizing test case coverage for a software product under test.

Summary

[006] The invention presents methods, systems and computer program products for optimizing testing of a software system.

[007] In an embodiment, a method according to the teachings of the present invention comprises implementing, on at least one processor, the steps of (i) parsing text of a requirements specification corresponding to the software system, (ii) performing parts-of-speech (POS) tagging of the parsed text, (iii) correlating a first set of elements comprising a plurality of POS tagged words with a second set of elements comprising one or more entities, relationships or actions represented within a software ontology, (iv) categorizing each element within the first set of elements as a state or as an event, wherein: (a) categorization of each element within the first set of elements is based on attributes of a correlated element within the second set of elements, and (b) said categorization of each element within the first set of elements results in a set of determined states and a set of determined events, and (v) generating a set of test cases for testing the software system, wherein generating said set of test cases comprises generating a Cartesian product of the set of determined states and the set of determined events.
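The generation step (v) reduces to a cross product of the determined states and events. The following is a minimal Python sketch of that step alone, using hypothetical state and event names; steps (i)-(iv) (parsing, tagging and ontology correlation) are assumed to have already produced the two sets:

```python
from itertools import product

def generate_test_cases(states, events):
    """Step (v): one test case per unique state-event combination."""
    return [(state, event) for state, event in product(states, events)]

# Illustrative outputs of steps (i)-(iv); the names are hypothetical.
determined_states = ["login_page", "session_active"]
determined_events = ["submit_credentials", "click_logout"]

cases = generate_test_cases(determined_states, determined_events)
# |states| x |events| test cases, each a unique combination.
assert len(cases) == 4 and len(set(cases)) == 4
```

This guarantees that no state-event combination is omitted, which is the coverage property the specification claims over ad hoc test case selection.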

[008] In a method embodiment, each generated test case within the set of test cases comprises a unique state-event combination. The method may additionally comprise implementation of software testing for one or more test cases within the set of test cases.

[009] The method step of correlating the first set of elements with the second set of elements may in an embodiment include (i) semantic analysis of one or more POS tagged words within the first set of elements and (ii) identification of entities, relationships or actions that are represented in the software ontology that share an identical or similar meaning or context as the one or more semantically analysed POS tagged words.

[0010] The method may additionally include generating a state transition diagram based on the identified states and events.

[0011] The generated set of test cases for testing the software system may comprise a first set of test cases and a second set of test cases, wherein: (i) said first set of test cases comprises combinations of states and events that are intended to be implemented by the software system, and (ii) said second set of test cases comprises combinations of states and events that are intended not to be implemented by the software system.

[0012] The second set of test cases may include one or more state-event combinations that are not represented within a state transition diagram generated based on the identified states and events.

[0013] In an embodiment of the method, (i) a first test method is implemented for testing one or more test cases selected from within the first set of test cases, and (ii) a second test method is implemented for testing one or more test cases selected from within the second set of test cases, wherein said first and second test methods are distinct.

[0014] The first test method may comprise a test method for signaling an error state when the software system fails to achieve an expected outcome in a test case selected from within the first set of test cases, and the second test method may comprise a test method for signaling an error state when the software system detects occurrence of a combination of state and event corresponding to a test case selected from within the second set of test cases.
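The two test methods of paragraph [0014] can be sketched as follows. This is an illustrative reading, not the specification's implementation: `run_case` and the outcome values are placeholder stand-ins.

```python
def first_test_method(run_case, expected_outcome):
    """Signal an error when the system fails to achieve the expected outcome."""
    return "ERROR" if run_case() != expected_outcome else "PASS"

def second_test_method(observed_pairs, forbidden_pair):
    """Signal an error when a forbidden state-event combination is observed."""
    return "ERROR" if forbidden_pair in observed_pairs else "PASS"

# Hypothetical usage: the first method checks intended behaviour; the
# second watches for state-event combinations that must never occur.
assert first_test_method(lambda: "session_active", "session_active") == "PASS"
assert second_test_method({("logged_out", "click_logout")},
                          ("logged_out", "click_logout")) == "ERROR"
```

The asymmetry matters: a test from the first set fails when nothing happens, while a test from the second set fails when something does.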

[0015] The invention additionally includes a system for optimizing testing of a software system. In an embodiment, the system comprises (i) at least one processor, (ii) a parsing controller configured to parse text of a requirements specification corresponding to the software system, (iii) a parts-of-speech (POS) tagging controller configured to tag parsed text, (iv) a correlation controller configured to correlate a first set of elements comprising a plurality of POS tagged words, with a second set of elements comprising one or more entities, relationships or actions represented within a software ontology, (v) a state-event identification controller configured to categorize each element within the first set of elements as a state or as an event, wherein (a) categorization of each element within the first set of elements is based on attributes of a correlated element within the second set of elements, and (b) said categorization of each element within the first set of elements results in a set of determined states and a set of determined events, and (vi) a test case generator configured to generate a set of test cases for testing the software system, wherein generating said set of test cases comprises generating a Cartesian product of the set of determined states and the set of determined events.

[0016] The test case generator may be configured such that each generated test case within the set of test cases comprises a unique state-event combination.

[0017] The correlation controller may be configured such that correlating the first set of elements with the second set of elements includes (i) semantic analysis of one or more POS tagged words within the first set of elements and (ii) identification of entities, relationships or actions that are represented in the software ontology that share an identical or similar meaning or context as the one or more semantically analysed POS tagged words.

[0018] The state-event identification controller may be configured to generate a state transition diagram based on the identified states and events.

[0019] In a system embodiment, the test case generator may be configured such that the generated set of test cases for testing the software system comprises a first set of test cases and a second set of test cases, and wherein (i) said first set of test cases comprises combinations of states and events that are intended to be implemented by the software system, and (ii) said second set of test cases comprises combinations of states and events that are intended not to be implemented by the software system.

[0020] The second set of test cases may in an embodiment include one or more state-event combinations that are not represented within a state transition diagram generated based on the identified states and events.

[0021] The system may be further configured to (i) signal an error state when the software system fails to achieve an expected outcome in a test case selected from within the first set of test cases, and (ii) signal an error state when the software system detects occurrence of a combination of state and event corresponding to a test case selected from within the second set of test cases.

[0022] The invention additionally provides a computer program product for optimizing testing of a software system, comprising a non-transitory computer usable medium having a computer readable program code embodied therein, the computer readable program code comprising instructions for (i) parsing text of a requirements specification corresponding to the software system, (ii) performing parts-of-speech (POS) tagging of the parsed text, (iii) correlating a first set of elements comprising a plurality of POS tagged words with a second set of elements comprising one or more entities, relationships or actions represented within a software ontology, (iv) categorizing each element within the first set of elements as a state or as an event, wherein (a) categorization of each element within the first set of elements is based on attributes of a correlated element within the second set of elements and (b) said categorization of each element within the first set of elements results in a set of determined states and a set of determined events, and (v) generating a set of test cases for testing the software system, wherein generating said set of test cases comprises generating a Cartesian product of the set of determined states and the set of determined events.

Brief Description of the Accompanying Drawings

[0023] Figure 1 illustrates an exemplary task flow of processes and operations associated with a software application.

[0024] Figure 2 illustrates an ontology of a common or generic set of functions associated with a software application or system.

[0025] Figure 3 provides an illustration of a parse tree output of lexical and / or syntactical analysis implemented by a POS tagger.

[0026] Figure 4 illustrates a method of generating an optimized set of test cases based on natural language processing (NLP) of information extracted from a software requirements specification.

[0027] Figure 5 illustrates an exemplary state transition diagram.

[0028] Figure 6 illustrates an exemplary system configured to implement one or more method embodiments of the present invention.

[0029] Figure 7 illustrates an exemplary computing system for implementing the present invention.

Detailed Description

[0030] The present invention provides novel and inventive methods and systems for optimizing test case coverage for testing a software product.

[0031] Figure 1 illustrates an exemplary task flow of processes and operations associated with a software application - which task flow is represented in the form of process flow chart 100. Process flow chart 100 comprises nodes 102, 104, 106, 108 and 110, respectively interconnected by edges 103, 105, 107 and 109. Nodes within the process flow chart represent states that the exemplary software application is configured to assume, while edges are representative of events or actions that effect a change in state of the software application. In the illustration, node 102 represents a state where the software application is ready to be run / launched. Edge 103 represents an event whereby double clicking on the application icon launches the software application and opens a login page, as represented by node 104. Edge 105 represents an event where the user inputs a username and password at the login page - responsive to which the software application proceeds to node 106 - comprising display of information related to a user session e.g. user account information. Edge 107 represents the user clicking on a logout button - causing the software application to log out from the user session and to display a logout page at node 108. Edge 109 represents the user clicking on an "X" button (i.e. a button configured to close a software application window/page/screen) to close the software application - resulting in the application transitioning to a closed state represented at node 110.

[0032] The present invention uses task flow information corresponding to a software application for generating an optimized set of test cases for software testing. For this purpose, the invention relies on information extracted from (i) one or more software requirements specifications and (ii) one or more software ontologies.

[0033] For the purposes of the invention, the term "ontology" shall be understood to mean a data model that represents a domain of interest, and that is used to reason about or explain the behaviour of objects in that domain and the relationships between them. An ontology comprises a set of interconnected concepts and relationships within a particular domain. Concepts may be understood as classes of instances / things, while relationships signify the relation or linkages between such classes. By way of example, an ontology for traffic may include concepts such as vehicles, street lights, geographical locations, time of day and traffic volume. The traffic ontology also includes relationships between concepts. For example, a specific vehicle may be "halted" at a particular street light that is in turn "located within" a specified geographical location. Each instance of a concept may have different attributes - for example, vehicles may be classified as cars, buses, two wheeled vehicles etc. Attributes of concept instances may be defined in terms of corresponding attribute information - for example, in the case of vehicles, each concept instance may be defined in terms of attribute information such as size, make, model, number of wheels, public or private transport etc.
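The traffic ontology described above can be encoded, purely for illustration, as a set of concepts plus (subject, relation, object) triples. The names below are taken from the example in the text, not from any standard ontology format:

```python
# Concepts with illustrative attribute information.
concepts = {
    "vehicle": {"kinds": ["car", "bus", "two-wheeler"],
                "attributes": ["size", "make", "model", "number_of_wheels"]},
    "street_light": {},
    "location": {},
}

# Relationships between concepts, stored as (subject, relation, object) triples.
relationships = [
    ("vehicle", "halted_at", "street_light"),
    ("street_light", "located_within", "location"),
]

def related(subject, relation):
    """Return every object linked to `subject` by `relation`."""
    return [o for s, r, o in relationships if s == subject and r == relation]

assert related("vehicle", "halted_at") == ["street_light"]
```

Real systems would typically use an ontology language such as OWL with an RDF triple store, but the triple structure is the same idea.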

[0034] Figure 2 illustrates an ontology of a common or generic set of functions associated with a software application or system. The illustrated ontology includes a plurality of nodes representing discrete functional modules or entities related to a software application, including the application itself (illustrated as the "Application"), components of said software application (i.e. "login page", "submit button", "close button", "logout button", and "information screen"), the environment associated with the software application (i.e. the application is launched from the desktop), entities permitted to operate the software application (i.e. "user") and information that is used as input to the software application (e.g. "user name" and "password"). For each node, the corresponding edges or links represent and describe the relationships (or events that have cause or effect relationships with states represented by such nodes) between a module / entity represented by such node and at least one other entity represented by another node. For example, the ontology represents at least the following relationships: (i) user name and password are input into the login page, (ii) the submit button and close button are part of or associated with the login page, (iii) a user may launch the application, (iv) the application is launched from an operating system desktop, (v) information screens and close buttons are both part of or both associated with the application, and (vi) the logout button is part of or belongs to an information screen.

[0035] For the purposes of the invention, the terms "software requirements specification" or "requirements specification" shall be understood to mean a specification setting out a description of the software application or software system under development - including by setting out functional and / or non-functional requirements of the software system, and optionally defining use cases that describe user interactions that the software must provide.

[0036] The present invention relies on extraction of information from a software requirements specification associated with the software system or software application for obtaining and interpreting task flow information for the purpose of generating an optimized set of test cases. Specifically, the invention relies on natural language processing for extraction and analysis of such information.

[0037] For the purposes of the present invention, it would be understood that references to natural language processing refer to conventionally understood processing steps including one or more of sentence segmentation and tokenization, lexical analysis, syntactical analysis and / or semantic analysis of sentences, and topic identification, to generate logically structured and tagged representations of human readable text - which representations may thereafter be used for machine processing of natural language statements. Briefly, "sentence segmentation" comprises delimiting the human readable text characters into one or more sentences. "Tokenization" comprises segmenting each sentence into a list of tokens (each token comprising a sequence of characters that are grouped together as a useful semantic unit for processing - for example, in a sentence, tokens typically correspond to discrete words or terms). "Lexical analysis" (or "morphological analysis") of sentences comprises tagging each word with its corresponding part of speech - for example using a processor implemented Parts of Speech (POS) tagger. "Syntactical analysis" comprises assigning a syntactic structure or a parse tree to a given natural language sentence (an illustration of lexical and syntactical analysis that results in a parse tree is exemplarily described in more detail in connection with Figures 3 and 4). "Semantic analysis" comprises translating a syntactic structure of words / sentences into a semantic representation that is a precise and unambiguous representation of the meaning expressed by said words / sentences. Semantic analysis may be achieved based on knowledge about the structure of words and sentences - with a view to stipulating the meaning of words, phrases, sentences and texts, and subsequently their purpose and consequences.
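The first three stages can be illustrated with a deliberately tiny sketch. A production system would use a trained POS tagger (e.g. from an NLP toolkit); the hand-written lexicon below covers only the running example sentence and is entirely an assumption of this sketch:

```python
import re

# Toy lexicon; tags follow Penn Treebank conventions (DT, NN, VBZ, IN).
LEXICON = {"the": "DT", "user": "NN", "launches": "VBZ",
           "application": "NN", "from": "IN", "desktop": "NN"}

def segment(text):
    """Sentence segmentation: delimit text into sentences."""
    return [s.strip() for s in re.split(r"[.!?]", text) if s.strip()]

def tokenize(sentence):
    """Tokenization: split a sentence into word tokens."""
    return re.findall(r"\w+", sentence.lower())

def pos_tag(tokens):
    """Lexical analysis: tag each token with its part of speech."""
    return [(tok, LEXICON.get(tok, "NN")) for tok in tokens]

sentence = segment("The user launches the application from desktop.")[0]
tagged = pos_tag(tokenize(sentence))
assert tagged[1] == ("user", "NN") and tagged[2] == ("launches", "VBZ")
```

Syntactical and semantic analysis (parse trees and meaning representations) build on this tagged output, as described in connection with Figures 3 and 4.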

[0038] Figure 3 provides an illustration of a parse tree output of lexical and / or syntactical analysis implemented by a POS tagger based on an input sentence - "The user launches the application from desktop." As illustrated in Figure 3, the POS tagger extracts each word / token from the input sentence and tags them with their respective categories and sub-categories of parts of speech. The POS tagging results in the parse tree structure illustrated in Figure 3, wherein:

• "The" is tagged as a determiner within a noun phrase

• "user" is tagged as a noun within the noun phrase

• "launches" is tagged as a verb (third person singular present) within a verb phrase

• "the" is tagged as a determiner within a noun phrase within the verb phrase

• "application" is tagged as a noun within the noun phrase within the verb phrase

• "from" is tagged as a preposition within a preposition phrase within the verb phrase

• "desktop" is tagged as a noun within the preposition phrase within the verb phrase
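The parse tree described by the bullets above can be recorded, for illustration, as nested (label, children) tuples with (tag, word) pairs at the leaves. The tuple encoding is an assumption of this sketch, not a representation used by the patent:

```python
# Figure 3's parse tree for "The user launches the application from desktop",
# encoded as nested (label, children) tuples; leaves are (POS tag, word) pairs.
parse_tree = (
    "S", [
        ("NP", [("DT", "The"), ("NN", "user")]),
        ("VP", [
            ("VBZ", "launches"),
            ("NP", [("DT", "the"), ("NN", "application")]),
            ("PP", [("IN", "from"), ("NP", [("NN", "desktop")])]),
        ]),
    ],
)

def leaves(node):
    """Collect (tag, word) leaf pairs in left-to-right order."""
    label, children = node
    out = []
    for child in children:
        if isinstance(child[1], list):   # interior node: recurse
            out.extend(leaves(child))
        else:                            # leaf: (tag, word)
            out.append(child)
    return out

assert [w for _, w in leaves(parse_tree)] == \
    ["The", "user", "launches", "the", "application", "from", "desktop"]
```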

[0039] As discussed below in connection with Figure 4, the present invention implements natural language processing methods wherein the output of lexical and syntactical parsing is used for generating an optimized set of test cases for software testing.

[0040] Figure 4 illustrates a method of generating an optimized set of test cases based on natural language processing (NLP) of information extracted from a software requirements specification associated with the software under test. For the purposes of explaining the method steps of Figure 4, we shall discuss the method steps in the context of the following extract from an exemplary software requirements specification:

SENTENCE 1 : The user launches the application from desktop

SENTENCE 2 : The user will provide a valid user name and a valid password

SENTENCE 3 : The user will click on the submit button

SENTENCE 4 : The application displays information associated to the user

SENTENCE 5 : The user clicks on the logout button to exit the application

SENTENCE 6 : The user will close the application

[0041] Steps 402 and 404 of Figure 4 respectively comprise (i) parsing text of the software requirements specification and (ii) performing POS tagging of the parsed text. It would be understood that in an embodiment, steps 402 and 404 may implement the lexical and syntactical analysis stages of natural language processing procedures. In the exemplary case, where steps 402 and 404 parse Sentence 1 (i.e. "The user launches the application from desktop"), said steps result in the parse tree illustrated in Figure 3 - where each word from the sentence is tagged with its respective categories and sub-categories of parts of speech. Likewise, implementation of steps 402 and 404 on each of Sentences 2 to 6 from the exemplary text reproduced above would result in a corresponding parse tree structure for each sentence.

[0042] Step 406 subsequently identifies a set of states (comprising at least one state and preferably a plurality of states) and a set of events or actions (comprising at least one event / action and preferably a plurality of events / actions) that result in transitions between states - which states and events correspond to, or are represented by, the tagged words arising from the parse tree(s) of steps 402 and 404. The identification of said states and events (or actions) that can result in transitions between states may in an embodiment be achieved by:

• correlating (i) a first set of elements comprising a plurality of the tagged words with (ii) a second set of elements comprising entities, actions and/or events represented in the software ontology. Said correlation is implemented based on (i) semantic analysis of each tagged word to identify a representation of the meaning or context expressed by said tagged word and (ii) identification of entities, relationships, actions and / or events that are represented in the software ontology having an identical or similar meaning or context;

• identifying classes, concepts, attributes, properties or relationships corresponding to elements of the second set of elements (i.e. entities, actions and /or events); and

• using the identified classes, concepts, attributes, properties or relationships (associated with each element of the second set of elements) to categorize each corresponding element of the first set of elements as (i) a state or (ii) an event (or action) that results in a transition from a first state to a second state.
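The categorization step above can be compressed into a sketch in which each correlated ontology element carries an attribute deciding whether the matched word denotes a state or an event. The "kind" attribute and the element names are illustrative assumptions, not the exact labels of Figure 2:

```python
# Hypothetical ontology fragment: each element's attributes decide whether a
# correlated tagged word is categorized as a state or as an event.
ONTOLOGY = {
    "login page": {"kind": "screen"},          # screens map to states
    "information screen": {"kind": "screen"},
    "launch": {"kind": "action"},              # actions map to events
    "logout": {"kind": "action"},
}

def categorize(first_set):
    """Split correlated elements into determined states and determined events."""
    states, events = set(), set()
    for element in first_set:
        attrs = ONTOLOGY.get(element)
        if attrs is None:
            continue  # no correlated element in the second set
        (states if attrs["kind"] == "screen" else events).add(element)
    return states, events

states, events = categorize(["login page", "launch", "logout"])
assert states == {"login page"} and events == {"launch", "logout"}
```

The returned sets correspond to the "set of determined states" and "set of determined events" from which the test cases are generated.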

[0043] In an embodiment of the invention, in addition to relying on correlation with one or more software ontologies, the identification of states and events may be based on one or more predefined rules or one or more retrievable items of knowledge or expert information that are relevant for identifying states and / or events based on semantic analysis of output from a POS tagger.

[0044] Step 408 thereafter comprises generating a state transition diagram based on the states and events identified at step 406. For the purposes of the invention, the term "state transition diagram" shall mean a representation of a software product or system or any part thereof, in terms of a finite number of states that can be achieved by the software or system, and in terms of relationships (i.e. events) relevant to such states. The state transition diagram may be recorded or represented using any appropriate data structure known in the art, including without limitation, one or more of a directed graph, a tree structure, or any other appropriately configured record data structure.
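As a concrete illustration of the directed-graph option, the state transition diagram can be stored as an adjacency mapping from state to {event: next_state}. The state and event names below are shortened, hypothetical versions of the Figure 1 task flow:

```python
# State transition diagram as a directed graph: state -> {event: next_state}.
transitions = {
    "ready":      {"double_click": "login_page"},
    "login_page": {"valid_credentials": "session"},
    "session":    {"click_logout": "logged_out"},
    "logged_out": {"click_x": "closed"},
}

def next_state(state, event):
    """Follow one edge; returns None if the event is undefined for the state."""
    return transitions.get(state, {}).get(event)

assert next_state("login_page", "valid_credentials") == "session"
assert next_state("session", "double_click") is None  # no such edge
```

State-event pairs that resolve to None are exactly the combinations that fall outside the diagram, which is how the second set of test cases is characterized later.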

[0045] Figure 5 illustrates an exemplary state transition diagram that has been generated by:

(i) implementing steps 402 and 404 of Figure 4 by parsing and POS tagging Sentences 1 to 6 described above,

(ii) implementing step 406 of Figure 4 to identify states and events corresponding to the tagged words in each of the resulting parse trees - wherein the identification of states and events is based on information extracted from the exemplary software ontology represented in Figure 2, and

(iii) representing the identified states and events in the state transition diagram of Figure 5.

[0046] A brief explanation of one embodiment of the method in which the exemplary state transition diagram of Figure 5 may be derived is provided below.

• At steps 402 and 404, each of Sentences 1 to 6 are parsed and tagged using a POS tagger to generate a corresponding parse tree of the type illustrated in Figure 3

• Step 406 may thereafter be repeated for each parse tree generated at steps 402 and 404 - wherein based on each parse tree and a software ontology (which for the purposes of this example is the ontology of Figure 2), step 406 (i) correlates a plurality of tagged words from the parse tree with entities, actions and / or events represented in the software ontology, (ii) identifies one or more of classes, concepts, attributes, properties or relationships corresponding to said entities, actions and / or events, and (iii) uses the identified classes, concepts, attributes, properties or relationships to categorize tagged words from the parse tree as a state or as an event.

o Taking Sentence 1 as an example, implementation of steps 402 and 404 using a POS tagger results in generation of the parse tree represented in Figure 3. Analysis of the parse tree establishes that "user", "application" and "desktop" are nouns, while "launches" is a verb. The action in Sentence 1 is therefore the action of launching. Thereafter, by applying semantic analysis to correlate the plurality of tagged words from the parse tree with the ontology of Figure 2, and examining attributes and relationships between the correlated nodes of the ontology, it can be determined that:

• a relationship exists between the user and the application (which is determinable from the ontology)

• a relationship exists between the desktop and the application (which is determinable from the ontology)

• the user is the actor and the application is the object of the action (which is determinable by the word "launch" in the parse tree - which indicates a relationship between the user and the application)

• the application can be launched from the desktop (which is determinable from the ontology)

o Taking Sentence 2, implementation of steps 402 and 404 using a POS tagger would result in generation of a parse tree of the type represented in Figure 3. Lexical and syntactical analysis of Sentence 2 using a parse tree establishes that "user", "username" and "password" are nouns, while "provides" is a verb. The presence of the adjective "valid" indicates a condition that translates to the existence of an already known condition, or to a requirement for meeting certain criteria.

o Taking Sentence 3, implementation of steps 402 and 404 using a POS tagger would result in generation of a parse tree of the type represented in Figure 3. Lexical and syntactical analysis of Sentence 3 using a parse tree would establish that "user" and "button" are nouns, while "click" and "submit" are verbs. Thereafter, by applying semantic analysis to correlate the plurality of tagged words with the ontology of Figure 2, and examining attributes and relationships between the correlated nodes of the ontology, it can be determined that:

• a relationship exists between the user and the button (which is determinable from the ontology)

• the relationship between the user and the button is the action of clicking (which is determinable from the ontology)

• the user is the actor and the submit button is the object of the action (which is determinable from the word "click" in the parse tree - which indicates the relationship between the user and the button)

o Similarly, Sentences 4 to 6 may be analyzed for the purpose of identifying states and events corresponding to the contents of said sentences.

[0047] Based on the above analysis of Sentences 1 to 6, a state transition diagram of the type illustrated in Figure 5 may be generated. The illustrated state transition diagram comprises Nodes 1 to 5 and Edges [1] to [11].

[0048] Nodes 1 to 5 of the illustrated state transition diagram respectively represent the following states:

• Node 1 : represents an application launch state

• Node 2: represents display of the login page/screen

• Node 3: represents display of user session information

• Node 4: represents a logged out state

• Node 5: represents a closed application state

[0049] Edges [1] to [11] of the illustrated state transition diagram respectively represent the following events:

• Edge [1]: Double click fail event - in which the software application does not launch despite double clicking the application icon - and which results in state of the software application remaining unchanged

• Edge [2]: Double click succeed event - in which the software application launches in response to double clicking the application icon - and which results in a state transition from Node 1 to Node 2

• Edge [3]: Selection of the [No] option in response to presentation of a "confirm" dialog box that seeks to confirm a user's selection / clicking of the "X" button at the login page/screen - which results in the state of the software application remaining unchanged

• Edge [4]: Input of an invalid user name and / or invalid password at the login page/screen - which results in the software application not proceeding to a user session, and the state of the software application remaining unchanged

• Edge [5]: Selection of the [Yes] option in response to presentation of a "confirm" dialog box that seeks to confirm a user's selection / clicking of the "X" button at the login page/screen - which results in a state transition from Node 2 to Node 5, i.e. results in closure of the software application

• Edge [6]: Input of a valid user name and valid password at the login page/screen - which results in a state transition from Node 2 to Node 3, i.e. wherein the software application proceeds to display user session information

• Edge [7]: Selection of the [No] option in response to presentation of a "confirm" dialog box that seeks to confirm a user's selection / clicking of the "X" button at a page/screen displaying user session information - which results in the state of the software application remaining unchanged

• Edge [8]: Clicking of the logout button - which results in the state of the software application transitioning from Node 3 to Node 4, i.e. which results in the application logging out of a user session

• Edge [9]: Selection of the [Yes] option in response to presentation of a "confirm" dialog box that seeks to confirm a user's selection / clicking of the "X" button at a page displaying user session information - which results in a state transition from Node 3 to Node 5, i.e. results in closure of the software application

• Edge [10]: Selection of the [No] option in response to presentation of a "confirm" dialog box that seeks to confirm a user's selection / clicking of the "X" button at a logout page/screen - which results in the state of the software application remaining unchanged

• Edge [11]: Selection of the [Yes] option in response to presentation of a "confirm" dialog box that seeks to confirm a user's selection / clicking of the "X" button at a logout page/screen - which results in a state transition from Node 4 to Node 5, i.e. results in closure of the software application
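The state transition diagram of Figure 5 can be represented as a simple adjacency structure. The sketch below encodes a subset of the edges listed above; the node and event labels are abbreviated from paragraphs [0048]-[0049], and the dictionary layout is an implementation assumption.

```python
# Nodes of Figure 5 and a subset of Edges [1]-[11], encoded as
# {state: {event: next_state}}.
TRANSITIONS = {
    "application launch state": {
        "double click fail": "application launch state",      # Edge [1]
        "double click succeed": "display of login page",      # Edge [2]
    },
    "display of login page": {
        "valid credentials": "display of user session info",  # Edge [6]
        "confirm close [Yes]": "closed application state",    # Edge [5]
    },
    "display of user session info": {
        "logout click": "logged out state",                   # Edge [8]
    },
    "logged out state": {
        "confirm close [Yes]": "closed application state",    # Edge [11]
    },
}

def next_state(state, event):
    """Follow an edge if one exists; undefined combinations return None."""
    return TRANSITIONS.get(state, {}).get(event)

print(next_state("display of login page", "valid credentials"))
```

State-event combinations absent from the structure (returning None) correspond to the "undefined" combinations discussed below for Table C.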

[0050] Step 410 comprises generating an optimized set of test cases for testing the software product or system, wherein the optimized set of test cases comprises a cartesian product of the set of states and the set of events identified at step 406. In an illustrative example, based on the states and events represented in the state transition diagram of Figure 5, if Table A is representative of the set of states identified at step 406 and Table B is representative of the set of events identified at step 406, then Table C records the optimized set of test cases comprising a cartesian product of the set of states and set of events respectively.
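The cartesian product of step 410 can be sketched with Python's itertools.product. The state and event labels below are abbreviations of the Table A and Table B entries, and only three of the eleven events are shown; the full tables would yield 5 x 11 = 55 test cases.

```python
from itertools import product

# Abbreviated state labels (Table A) and a few abbreviated event labels
# (Table B); each (state, event) pair is one candidate test case.
states = ["launch", "login page", "session info", "logged out", "closed"]
events = ["dbl-click fail", "dbl-click ok", "valid credentials"]

test_cases = [{"id": i + 1, "state": s, "event": e}
              for i, (s, e) in enumerate(product(states, events))]

print(len(test_cases))   # 5 states x 3 events
print(test_cases[0])
```

Because product enumerates every pairing exactly once, each generated test case is a unique state-event combination, as required by the method.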

[0051] Table A

• Application launch state

• Display of login page / screen

• Display of user session information

• Logged out state

• Closed application state

[0052] Table B

1. Double click fail event

2. Double click succeed event

3. Selection of the [No] option in response to presentation of a "confirm" dialog box that seeks to confirm a user's selection / clicking of the "X" button at the login page/screen

4. Selection of the [Yes] option in response to presentation of a "confirm" dialog box that seeks to confirm a user's selection / clicking of the "X" button at the login page/screen

5. Input of an invalid user name and / or invalid password at the login page/screen

6. Input of a valid user name and valid password at the login page/screen

7. Selection of the [No] option in response to presentation of a "confirm" dialog box that seeks to confirm a user's selection / clicking of the "X" button at a page/screen displaying user session information

8. Selection of the [Yes] option in response to presentation of a "confirm" dialog box that seeks to confirm a user's selection / clicking of the "X" button at a page displaying user session information

9. Clicking of the logout button

10. Selection of the [No] option in response to presentation of a "confirm" dialog box that seeks to confirm a user's selection / clicking of the "X" button at a logout page/screen

11. Selection of the [Yes] option in response to presentation of a "confirm" dialog box that seeks to confirm a user's selection / clicking of the "X" button at a logout page/screen

[0053] Table C

Event abbreviations used below (E1 to E11, numbered in the order the events are listed in Table B):

E1 - Double click fail event
E2 - Double click succeed event
E3 - Selection of the [No] option on the "confirm" close dialog at the login page/screen
E4 - Selection of the [Yes] option on the "confirm" close dialog at the login page/screen
E5 - Input of an invalid user name and / or invalid password at the login page/screen
E6 - Input of a valid user name and valid password at the login page/screen
E7 - Selection of the [No] option on the "confirm" close dialog at a page/screen displaying user session information
E8 - Selection of the [Yes] option on the "confirm" close dialog at a page displaying user session information
E9 - Clicking of the logout button
E10 - Selection of the [No] option on the "confirm" close dialog at a logout page/screen
E11 - Selection of the [Yes] option on the "confirm" close dialog at a logout page/screen

| Test case | State | Event | Expected outcome | Expected new state |
| 1-5 | Application launch state | E1-E5 | (see Figure imgf000021_0001; not reproduced in the extracted text) | (see Figure imgf000021_0001) |
| 6-11 | Application launch state | E6-E11 | Undefined | Undefined |
| 12-13 | Display of login page/screen | E1-E2 | Undefined | Undefined |
| 14 | Display of login page/screen | E3 | Remain at the login page/screen | Display of the login page/screen |
| 15 | Display of login page/screen | E4 | Application closed | Closed application state |
| 16 | Display of login page/screen | E5 | Display invalid user name / invalid password message | Display of the login page/screen |
| 17 | Display of login page/screen | E6 | Display user session information / screen | Display user session information / screen |
| 18-22 | Display of login page/screen | E7-E11 | Undefined | Undefined |
| 23-28 | Display of user session information | E1-E6 | Undefined | Undefined |
| 29 | Display of user session information | E7 | Remain at the page/screen displaying user session information | Display of page/screen with user session information |
| 30 | Display of user session information | E8 | Application closed | Closed application state |
| 31 | Display of user session information | E9 | Log out successfully | Logged out state |
| 32-33 | Display of user session information | E10-E11 | Undefined | Undefined |
| 34-42 | Logged out state | E1-E9 | Undefined | Undefined |
| 43 | Logged out state | E10 | Remain at the logout page/screen | Display of logout page/screen |
| 44 | Logged out state | E11 | Application closed | Closed application state |
| 45 | Closed application state | E1 | Not able to launch - display contact administrator message | Application launch state |
| 46 | Closed application state | E2 | Login page launched | Login page |
| 47-55 | Closed application state | E3-E11 | Undefined | Undefined |

[0054] In the exemplary data illustrated in Table C, in addition to presenting an optimized set of test cases, the table additionally represents an expected outcome for each combination of state and event (i.e. each test case), and an expected new state for each test case (i.e. in case of occurrence of such combination of state and event). In an embodiment of the invention, the expected outcomes and new states corresponding to each test case are predicted based on information extracted from a pre-existing software ontology and / or retrievable predefined sets of rules and / or retrievable expert knowledge stored in an expert knowledge repository. By way of example, a predefined rule or an item of expert knowledge retrieved from an expert repository may comprise the information that implementing an action (such as clicking) on a user interface control element (such as a button) will result in either "success" or a "failure" of the action sought to be implemented by clicking on the control element.
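The rule-based prediction described in paragraph [0054] can be sketched as a lookup over (state, event) keys. The rule table below is a hypothetical fragment standing in for the expert knowledge repository, not the repository itself.

```python
# Hypothetical fragment of an expert-knowledge rule table mapping a
# (state, event) pair to (expected outcome, expected new state).
RULES = {
    ("login page", "valid credentials"):
        ("display user session information", "session info page"),
    ("login page", "invalid credentials"):
        ("display invalid user name / password message", "login page"),
    ("session info page", "logout click"):
        ("log out successfully", "logged out state"),
}

def predict(state, event):
    """Combinations with no matching rule are tagged 'Undefined', as in Table C."""
    return RULES.get((state, event), ("Undefined", "Undefined"))

print(predict("login page", "valid credentials"))
print(predict("logged out state", "valid credentials"))
```

The fallback to "Undefined" mirrors how Table C marks combinations that the requirements specification does not contemplate.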

[0055] In an embodiment of the invention, the data structure or table (for example Table C) representing the set of test cases generated at step 410 of Figure 4 may additionally identify combinations of states and events that are legitimately permitted by the software product or system, and/or combinations of states and events that are impermissible or not contemplated for implementation by the software product or system. For example, in Table C, combinations of states and events that are impermissible or not intended for implementation by the software product or system are identified by tagging the expected outcome and new state data records corresponding to such combinations as "undefined". In an embodiment of the invention, the data structure or table representing the set of test cases generated at step 410 of Figure 4 may include a first set of test cases comprising combinations of states and events that are legitimately permitted or contemplated for implementation by the software or software system, and a second set of test cases comprising combinations of states and events that are impermissible or not intended for implementation by the software or software system. In one embodiment of the invention, the set of test cases comprising combinations of states and events that are impermissible or not intended for implementation by the software or software system may include one or more state-event combinations that are not represented within the state transition diagram generated at step 408.
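Splitting the generated cases into the permitted and impermissible sets described above can be sketched as a filter on the "Undefined" tag. The field names and sample records are illustrative assumptions drawn from Table C.

```python
# Partition test cases into the first set (defined by the specification)
# and the second set (tagged "Undefined" in Table C).
cases = [
    {"id": 14, "outcome": "remain at login page", "new_state": "login page"},
    {"id": 15, "outcome": "application closed", "new_state": "closed"},
    {"id": 18, "outcome": "Undefined", "new_state": "Undefined"},
]

permitted = [c for c in cases if c["outcome"] != "Undefined"]
impermissible = [c for c in cases if c["outcome"] == "Undefined"]

print([c["id"] for c in permitted], [c["id"] for c in impermissible])
```

Each generated case lands in exactly one of the two sets, which is what allows a different test method to be applied to each set at step 412.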

[0056] Step 412 of Figure 4 comprises selecting one or more test cases from the optimized set of test cases generated at step 410 for software testing. In a preferred embodiment of the invention, (i) the data structure or table representing the set of test cases generated at step 410 of Figure 4 may include a first set of test cases comprising combinations of states and events that are legitimately permitted or contemplated for implementation by the software or software system, and a second set of test cases comprising combinations of states and events that are impermissible or not intended for implementation by the software or software system, and (ii) step 412 comprises executing a first test method on one or more test cases selected from within the first set of test cases and a second test method on one or more test cases selected from within the second set of test cases. In an embodiment, the first test method is a test method capable of identifying or signaling an error state when the software or software system fails to achieve an expected outcome or an expected new state in a test case selected from within the first set of test cases. In a further embodiment, the second test method is a test method capable of identifying or signaling an error state in response to detecting occurrence of a combination of state and event corresponding to a test case selected from within the second set of test cases. In an embodiment, the first test method and the second test method are different from each other.
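The two test methods of step 412 can be sketched as follows: the first asserts the expected outcome for a permitted case, while the second flags an error if an impermissible state-event combination is ever observed. The function names, case fields and observed-run format are illustrative assumptions.

```python
def first_test_method(case, observed_outcome):
    """Signal an error when a permitted case misses its expected outcome."""
    return "PASS" if observed_outcome == case["expected"] else "ERROR"

def second_test_method(case, observed_pairs):
    """Signal an error if an impermissible (state, event) pair is observed."""
    return "ERROR" if (case["state"], case["event"]) in observed_pairs else "PASS"

permitted_case = {"state": "login page", "event": "valid credentials",
                  "expected": "display user session information"}
impermissible_case = {"state": "logged out", "event": "logout click"}

print(first_test_method(permitted_case, "display user session information"))
print(second_test_method(impermissible_case, {("logged out", "logout click")}))
```

Note the asymmetry: the first method fails when an expected behaviour does not occur, while the second fails when a forbidden behaviour does occur.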

[0057] It would be understood that for the purposes of the invention, the first and second test methods may comprise any test methods capable of achieving the intended testing objective, including without limitation one or more of equivalence partitioning, boundary value analysis, all-pairs testing, state transition tables, decision table testing, fuzz testing, model-based testing, use case testing, exploratory testing and specification-based testing.

[0058] Figure 6 illustrates an exemplary system 600 configured to implement one or more method embodiments of the present invention. As illustrated in Figure 6, system 600 comprises a parsing controller 604 and POS tagging controller 606 that are communicably coupled to each other and configured to extract text from software requirements specification 602 and to implement steps 402 and 404 of Figure 4 using the extracted text. POS tagging controller 606 is in turn communicably coupled with semantic analyzer and correlation controller 608, which is configured to receive the tagged text output from POS tagging controller 606, and to compare and correlate the tagged text output against entities, actions and / or events that are represented in one or more software ontologies extracted from ontology repository 610. The output from semantic analyzer and correlation controller 608 is processed by state-event identification controller 612, which is configured to identify a first set of states and a second set of events associated with the software product or system (in accordance with the teachings of step 406) for which an optimized set of test cases is intended to be generated. State-event identification controller 612 may in an embodiment additionally be configured to generate a state transition diagram, or a data structure representing a state transition diagram, in accordance with the teachings of step 408 of Figure 4.
In an embodiment, state-event identification controller 612 is communicably coupled to test case generator 614, wherein test case generator 614 may be configured to generate an optimized set of test cases in accordance with the teachings of step 410 of Figure 4.
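The component chain of Figure 6 (602 → 604/606 → 608 → 612 → 614) can be sketched as a simple function pipeline. All function bodies below are toy placeholders standing in for the controllers described above; in particular, the capitalization-based tagging heuristic is purely for illustration.

```python
from itertools import product

def parse_text(spec):                 # parsing controller 604
    return spec.split()

def pos_tag(tokens):                  # POS tagging controller 606 (toy heuristic)
    return [(t, "NOUN" if t[0].isupper() else "VERB") for t in tokens]

def correlate(tagged):                # semantic analyzer / correlation controller 608
    return tagged  # would match tokens against the ontology repository 610

def identify(correlated):             # state-event identification controller 612
    states = [w for w, t in correlated if t == "NOUN"]
    events = [w for w, t in correlated if t == "VERB"]
    return states, events

def generate_cases(states, events):   # test case generator 614
    return list(product(states, events))

states, events = identify(correlate(pos_tag(parse_text("User clicks Button"))))
cases = generate_cases(states, events)
print(cases)
```

The pipeline shape, rather than the placeholder bodies, is the point: each controller consumes the previous controller's output, ending in the cartesian-product test case set of step 410.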

[0059] For the purposes of the invention, it would be understood that each of components 604, 606, 608, 610, 612 and 614 of system 600 as illustrated in Figure 6 may be implemented using one or more processors and one or more transitory or non-transitory computer readable memories.

[0060] It has been found that following the teachings of the present invention results in generation of an optimized set of test cases, which optimized set of test cases provides significantly improved test case coverage in comparison with conventional methods of generating test cases for software testing. It has additionally been found that generating test cases in accordance with the teachings of the present invention enables a software tester to differentiate between events that are permissible under the software requirements specification and events that are impermissible under the software requirements specification, and to apply different tests to each type of event - thereby resulting in more meaningful and accurate test results.

[0061] Figure 7 illustrates an exemplary computing system for implementing the present invention.

[0062] The computer system 702 comprises one or more processors 704 and at least one memory 706. Processor 704 is configured to execute program instructions - and may be a real processor or a virtual processor. It will be understood that computer system 702 does not suggest any limitation as to scope of use or functionality of described embodiments. The computer system 702 may include, but is not limited to, one or more of a general-purpose computer, a programmed microprocessor, a microcontroller, an integrated circuit, and other devices or arrangements of devices that are capable of implementing the steps that constitute the method of the present invention. Exemplary embodiments of a system 702 in accordance with the present invention may include one or more servers, desktops, laptops, tablets, smart phones, mobile phones, mobile communication devices, phablets and personal digital assistants. In an embodiment of the present invention, the memory 706 may store software for implementing various embodiments of the present invention. The computer system 702 may have additional components. For example, the computer system 702 may include one or more communication channels 708, one or more input devices 710, one or more output devices 712, and storage 714. An interconnection mechanism (not shown) such as a bus, controller, or network, interconnects the components of the computer system 702. In various embodiments of the present invention, operating system software (not shown) provides an operating environment for various software executing in the computer system 702 using a processor 704, and manages different functionalities of the components of the computer system 702.

[0063] The communication channel(s) 708 allow communication over a communication medium to various other computing entities. The communication medium provides information such as program instructions, or other data in a communication media. The communication media includes, but is not limited to, wired or wireless methodologies implemented with an electrical, optical, RF, infrared, acoustic, microwave, Bluetooth or other transmission media.

[0064] The input device(s) 710 may include, but is not limited to, a touch screen, a keyboard, mouse, pen, joystick, trackball, a voice device, a scanning device, or any other device that is capable of providing input to the computer system 702. In an embodiment of the present invention, the input device(s) 710 may be a sound card or similar device that accepts audio input in analog or digital form. The output device(s) 712 may include, but not be limited to, a user interface on CRT, LCD, LED display, or any other display associated with any of servers, desktops, laptops, tablets, smart phones, mobile phones, mobile communication devices, phablets and personal digital assistants, printer, speaker, CD/DVD writer, or any other device that provides output from the computer system 702.

[0065] The storage 714 may include, but not be limited to, magnetic disks, magnetic tapes, CD-ROMs, CD-RWs, DVDs, any types of computer memory, magnetic stripes, smart cards, printed barcodes or any other transitory or non-transitory medium which can be used to store information and can be accessed by the computer system 702. In various embodiments of the present invention, the storage 714 may contain program instructions for implementing any of the described embodiments.

[0066] In an embodiment of the present invention, the computer system 702 is part of a distributed network or a part of a set of available cloud resources.

[0067] The present invention may be implemented in numerous ways including as a system, a method, or a computer program product such as a computer readable storage medium or a computer network wherein programming instructions are communicated from a remote location.

[0068] The present invention may suitably be embodied as a computer program product for use with the computer system 702. The method described herein is typically implemented as a computer program product, comprising a set of program instructions which is executed by the computer system 702 or any other similar device. The set of program instructions may be a series of computer readable codes stored on a tangible medium, such as a computer readable storage medium (storage 714), for example, diskette, CD-ROM, ROM, flash drives or hard disk, or transmittable to the computer system 702, via a modem or other interface device, over either a tangible medium, including but not limited to optical or analogue communications channel(s) 708. The implementation of the invention as a computer program product may be in an intangible form using wireless techniques, including but not limited to microwave, infrared, Bluetooth or other transmission techniques. These instructions can be preloaded into a system or recorded on a storage medium such as a CD-ROM, or made available for downloading over a network such as the Internet or a mobile telephone network. The series of computer readable instructions may embody all or part of the functionality previously described herein.

While the exemplary embodiments of the present invention are described and illustrated herein, it will be appreciated that they are merely illustrative. It will be understood by those skilled in the art that various modifications in form and detail may be made therein without departing from or offending the spirit and scope of the invention as defined by the appended claims.

Claims

We claim:
1. A method for optimizing testing of a software system, comprising implementing on at least one processor, the steps of:
parsing text of a requirements specification corresponding to the software system;
performing parts-of-speech (POS) tagging of parsed text;
correlating a first set of elements comprising a plurality of POS tagged words with a second set of elements comprising one or more entities, relationships or actions represented within a software ontology;
categorizing each element within the first set of elements as a state or as an event, wherein:
categorization of each element within the first set of elements is based on attributes of a correlated element within the second set of elements; and said categorization of each element within the first set of elements results in a set of determined states and a set of determined events; and
generating a set of test cases for testing the software system, wherein generating said set of test cases comprises generating a cartesian product of the set of determined states and the set of determined events.
2. The method as claimed in claim 1, wherein each generated test case within the set of test cases comprises a unique state-event combination.
3. The method as claimed in claim 1, comprising implementation of software testing for one or more test cases within the set of test cases.
4. The method as claimed in claim 1, wherein the step of correlating the first set of elements with the second set of elements includes (i) semantic analysis of one or more POS tagged words within the first set of elements and (ii) identification of entities, relationships or actions that are represented in the software ontology and that share an identical or similar meaning or context as the one or more semantically analyzed POS tagged words.
5. The method as claimed in claim 1, further comprising generating a state transition diagram based on the identified states and events.
6. The method as claimed in claim 1, wherein the generated set of test cases for testing the software system comprises a first set of test cases and a second set of test cases, wherein:
said first set of test cases comprises combinations of states and events that are intended to be implemented by the software system; and
said second set of test cases comprises combinations of states and events that are intended not to be implemented by the software system.
7. The method as claimed in claim 6, wherein the second set of test cases includes one or more state-event combinations that are not represented within a state transition diagram generated based on the identified states and events.
8. The method as claimed in claim 6, wherein
a first test method is implemented for testing one or more test cases selected from within the first set of test cases;
a second test method is implemented for testing one or more test cases selected from within the second set of test cases;
and wherein said first and second test methods are distinct.
9. The method as claimed in claim 8, wherein:
the first test method is a test method for signaling an error state when the software system fails to achieve an expected outcome in a test case selected from within the first set of test cases; and
the second test method is a test method for signaling an error state when the software system detects occurrence of a combination of state and event corresponding to a test case selected from within the second set of test cases.
10. A system for optimizing testing of a software system, the system comprising: at least one processor;
a parsing controller configured to parse text of a requirements specification corresponding to the software system; a parts-of-speech (POS) tagging controller configured to tag parsed text;
a correlation controller configured to correlate a first set of elements comprising a plurality of POS tagged words, with a second set of elements comprising one or more entities, relationships or actions represented within a software ontology;
a state-event identification controller configured to categorize each element within the first set of elements as a state or as an event, wherein:
categorization of each element within the first set of elements is based on attributes of a correlated element within the second set of elements; and said categorization of each element within the first set of elements results in a set of determined states and a set of determined events; and
a test case generator configured to generate a set of test cases for testing the software system, wherein generating said set of test cases comprises generating a cartesian product of the set of determined states and the set of determined events.
11. The system as claimed in claim 10, wherein the test case generator is configured such that each generated test case within the set of test cases comprises a unique state-event combination.
12. The system as claimed in claim 10, wherein the correlation controller is configured such that correlating the first set of elements with the second set of elements includes (i) semantic analysis of one or more POS tagged words within the first set of elements and (ii) identification of entities, relationships or actions that are represented in the software ontology and that share an identical or similar meaning or context as the one or more semantically analyzed POS tagged words.
13. The system as claimed in claim 10, wherein the state-event identification controller is configured to generate a state transition diagram based on the identified states and events.
14. The system as claimed in claim 10, wherein the test case generator is configured such that the generated set of test cases for testing the software system comprises a first set of test cases and a second set of test cases, and wherein:
said first set of test cases comprises combinations of states and events that are intended to be implemented by the software system; and said second set of test cases comprises combinations of states and events that are intended not to be implemented by the software system.
15. The system as claimed in claim 14, wherein the second set of test cases includes one or more state-event combinations that are not represented within a state transition diagram generated based on the identified states and events.
16. The system as claimed in claim 14, wherein the system is further configured to:
signal an error state when the software system fails to achieve an expected outcome in a test case selected from within the first set of test cases; and
signal an error state when the software system detects occurrence of a combination of state and event corresponding to a test case selected from within the second set of test cases.
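The two error-signaling conditions of claim 16 can be sketched as a single evaluation pass. The result and observation structures below are hypothetical; the claim does not prescribe a data format.

```python
def evaluate(first_set_results, observed_combinations, second_set):
    """Flag an error per claim 16 when (a) a positive test case fails to
    achieve its expected outcome, or (b) a forbidden state-event
    combination from the second set is observed at runtime."""
    errors = []
    for case, achieved_expected in first_set_results.items():
        if not achieved_expected:
            errors.append(("expected outcome not achieved", case))
    for combo in observed_combinations:
        if combo in second_set:
            errors.append(("forbidden combination occurred", combo))
    return errors

# Hypothetical run: one failed positive case, one forbidden combination seen.
first_results = {("Idle", "start"): True, ("Running", "stop"): False}
observed = [("Idle", "stop")]
second_set = [("Idle", "stop"), ("Running", "start")]
errors = evaluate(first_results, observed, second_set)
```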
17. A computer program product for optimizing testing of a software system, comprising a non-transitory computer usable medium having a computer readable program code embodied therein, the computer readable program code comprising instructions for:
parsing text of a requirements specification corresponding to the software system;
performing parts-of-speech (POS) tagging of the parsed text;
correlating a first set of elements comprising a plurality of POS tagged words with a second set of elements comprising one or more entities, relationships or actions represented within a software ontology;
categorizing each element within the first set of elements as a state or as an event, wherein:
categorization of each element within the first set of elements is based on attributes of a correlated element within the second set of elements; and said categorization of each element within the first set of elements results in a set of determined states and a set of determined events; and
generating a set of test cases for testing the software system, wherein generating said set of test cases comprises generating a cartesian product of the set of determined states and the set of determined events.
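The end-to-end pipeline of claim 17 can be sketched as follows. The toy lexicon-based tagger and the miniature "software ontology" are hypothetical stand-ins; a real implementation would use an NLP toolkit for POS tagging and a curated domain ontology for correlation.

```python
from itertools import product

# Hypothetical stand-ins for a POS tagger and a software ontology.
LEXICON = {"pump": "NN", "valve": "NN", "starts": "VB", "stops": "VB",
           "the": "DT", "when": "WRB"}
ONTOLOGY = {"pump": {"kind": "entity"}, "valve": {"kind": "entity"},
            "starts": {"kind": "action"}, "stops": {"kind": "action"}}

def pipeline(requirement_text):
    # (1) Parse and POS-tag the requirement text.
    tokens = requirement_text.lower().replace(".", "").split()
    tagged = [(tok, LEXICON.get(tok, "UNK")) for tok in tokens]
    # (2) Correlate tagged words with elements of the ontology.
    correlated = [tok for tok, _ in tagged if tok in ONTOLOGY]
    # (3) Categorize each correlated element as a state (entity) or
    #     an event (action) based on its ontology attributes.
    states = sorted({t for t in correlated if ONTOLOGY[t]["kind"] == "entity"})
    events = sorted({t for t in correlated if ONTOLOGY[t]["kind"] == "action"})
    # (4) Generate the Cartesian product of states and events.
    return list(product(states, events))

cases = pipeline("The pump starts when the valve stops.")
# 2 states x 2 events -> 4 test cases
```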
PCT/IB2017/050103 2016-09-29 2017-01-10 Method and system for optimizing software testing WO2018060777A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
IN201641033388 2016-09-29
IN201641033388 2016-09-29

Publications (1)

Publication Number Publication Date
WO2018060777A1 true WO2018060777A1 (en) 2018-04-05

Family

ID=61762556

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2017/050103 WO2018060777A1 (en) 2016-09-29 2017-01-10 Method and system for optimizing software testing

Country Status (1)

Country Link
WO (1) WO2018060777A1 (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6289304B1 (en) * 1998-03-23 2001-09-11 Xerox Corporation Text summarization using part-of-speech
US6505342B1 (en) * 2000-05-31 2003-01-07 Siemens Corporate Research, Inc. System and method for functional testing of distributed, component-based software
US7007007B2 (en) * 1998-05-14 2006-02-28 Microsoft Corporation Test generator for database management systems providing tight joins
US7027974B1 (en) * 2000-10-27 2006-04-11 Science Applications International Corporation Ontology-based parser for natural language processing
US7836346B1 (en) * 2007-06-11 2010-11-16 Oracle America, Inc. Method and system for analyzing software test results
US20110161067A1 (en) * 2009-12-29 2011-06-30 Dynavox Systems, Llc System and method of using pos tagging for symbol assignment
US8286133B2 (en) * 2007-12-19 2012-10-09 Microsoft Corporation Fuzzing encoded data
US20140325486A1 (en) * 2013-04-28 2014-10-30 International Business Machines Corporation Techniques for testing software
US9038026B2 (en) * 2011-10-17 2015-05-19 International Business Machines Corporation System and method for automating test automation
US20150324274A1 (en) * 2014-05-09 2015-11-12 Wipro Limited System and method for creating universal test script for testing variants of software application
US9244819B2 (en) * 2011-10-31 2016-01-26 International Business Machines Corporation Attribute value properties for test selection with cartesian product models


Similar Documents

Publication Publication Date Title
Moreno et al. Automatic generation of natural language summaries for java classes
Overmyer et al. Conceptual modeling through linguistic analysis using LIDA
US9229800B2 (en) Problem inference from support tickets
US20090024385A1 (en) Semantic parser
US8543913B2 (en) Identifying and using textual widgets
US20120303661A1 (en) Systems and methods for information extraction using contextual pattern discovery
US8561014B2 (en) Extracting a system modelling meta-model language model for a system from a natural language specification of the system
JP2015505082A (en) Generation of natural language processing model for information domain
Linares-Vásquez et al. Mining android app usages for generating actionable gui-based execution scenarios
US7840521B2 (en) Computer-based method and system for efficient categorizing of digital documents
Leopold et al. Supporting process model validation through natural language generation
US8370278B2 (en) Ontological categorization of question concepts from document summaries
WO2014100475A1 (en) Editor visualizations
US9665826B2 (en) Automated problem inference from bug repositories
Gu et al. "What Parts of Your Apps are Loved by Users?" (T)
Bajwa et al. Object oriented software modeling using NLP based knowledge extraction
Ciurumelea et al. Analyzing reviews and code of mobile apps for better release planning
EP2664997B1 (en) System and method for resolving named entity coreference
CN1627300A (en) Learning and using generalized string patterns for information extraction
US8468391B2 (en) Utilizing log event ontology to deliver user role specific solutions for problem determination
US20160306846A1 (en) Visual representation of question quality
US8650022B2 (en) Method and an apparatus for automatic semantic annotation of a process model
Antunes et al. The process model matching contest 2015
US20120089394A1 (en) Visual Display of Semantic Information
US20070179777A1 (en) Automatic Grammar Generation Using Distributedly Collected Knowledge

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 17855085; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 17855085; Country of ref document: EP; Kind code of ref document: A1)