WO2019193479A1 - Cognitive robotic system for test data management activities and method employed thereof - Google Patents

Info

Publication number
WO2019193479A1
Authority
WO
WIPO (PCT)
Prior art keywords
cognitive
user
data
artificial intelligence
test data
Prior art date
Application number
PCT/IB2019/052651
Other languages
French (fr)
Inventor
Venkata Krishna Pratyusha Challa
Original Assignee
Venkata Krishna Pratyusha Challa
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Venkata Krishna Pratyusha Challa filed Critical Venkata Krishna Pratyusha Challa
Publication of WO2019193479A1 publication Critical patent/WO2019193479A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases

Definitions

  • the disclosed subject matter relates generally to the field of software testing. More particularly, the present disclosure relates to an artificial intelligence cognitive robotic system for test data management activities and method employed thereof.
  • Quality assurance testers require data to perform testing and execute the test scenarios in order to assess the correctness of the software product having a software code.
  • the QA testers commonly depend on multiple resources.
  • the multiple resources include, but are not limited to, database administrators, operating system teams, data analysts and data engineers to discuss the requirements, analyze the requirements, load the data to a staging database, cleanse the data to de-sensitize any sensitive information, generate data for scenarios, and the like.
  • there exist testing tools which perform data management activities.
  • the testing tools need a very high proficiency for a user to get accustomed to and will again require skilled resources to operate them.
  • These tools do not help a novice operate them and will definitely need a thorough understanding of data management activities.
  • An objective of the present disclosure is directed towards identifying the data relevant for the software testing while addressing other factors such as infrastructure costs, adherence to personal data protection guidelines, and challenges in identifying resources with the test data management capabilities.
  • Another objective of the present disclosure is directed towards providing a service line to steer the overall software testing in an optimized, compliant and cost-effective manner.
  • Another objective of the present disclosure is directed towards advising the user on the test data management strategy for the software testing or the quality analysis framework.
  • Another objective of the present disclosure is directed towards guiding the user in taking right or appropriate actions related to the test data management.
  • Another objective of the present disclosure is directed towards performing various activities such as designing test data management strategy, demand capturing, estimations for project planning, engineering the test data, data analysis and identification, data subset, data masking, and database comparison.
  • Another objective of the present disclosure is directed towards overcoming the challenges using artificial intelligence abilities.
  • Another objective of the present disclosure is directed towards assisting the users to determine the appropriate test data strategy for the software testing and also executing the actions which the users deem appropriate.
  • Another objective of the present disclosure is directed towards reducing the failed scenarios over a period of time through a consistent learning exercise.
  • Another objective of the present disclosure is directed towards making it easy to handle a complex test data management life cycle and also acting as a virtual assistant to resolve queries around the test data management activities.
  • a method comprising a step of conversing with a user after the user provides input data.
  • the method further comprising a step of analyzing the requirement and gathering essential and any missing information.
  • the method further comprising a step of connecting to an application source database, and connecting to an application target database.
  • the method further comprising a step of comparing the tables in the databases relevant to the requirement.
  • the method further comprising a step of generating information pertaining to various stages of test data management life cycle.
  • the method further comprising a step of presenting the information to the user based on the queries as a part of the conversation, performing test data reservation to block the data for a test case, informing the user (e.g., the admin) about a failed or succeeded scenario, and learning from the conversation logs to enhance its rule dictionary as well as its bag of keywords to reduce the number of failure scenarios.
  • a cognitive robotic system comprises at least one processing device, at least one computer-readable medium comprising computer-executable instructions, that when executed by the at least one processing device, instruct at least one end-user device to carry out test data management activities within the computer-implemented environment.
  • the cognitive robotic system comprises a cognitive robotic test data management module stored in the at least one computer-readable medium. The cognitive robotic test data management module is configured to interact with a user via a user interface accessible to the user via the at least one end-user device, and to give a response in return after the user provides input data; the response depends on the user requirements, and based on the same, interaction with various downstream modules is established. The cognitive robotic test data management module comprises a cognitive or artificial intelligence engine configured to process the user input, analyze it, and respond to the user with answers to queries; the cognitive or artificial intelligence engine is also configured to understand the requirement scope and estimate the effort to complete the end-to-end activities within a task.
  • FIG. 1 is a block diagram representing an example environment in which aspects of the present disclosure can be implemented. Specifically, FIG. 1 depicts a schematic representation of environment for test data management activities according to an embodiment of the present disclosure.
  • FIG. 2 is a block diagram depicting the cognitive robotic test data management module 108 shown in FIG. 1, in accordance with one or more embodiments.
  • FIG. 3 is a block diagram depicting the data masking module 208 shown in FIG. 2, in accordance with one or more exemplary embodiments.
  • FIG. 4 is a block diagram depicting the data analysis and identification module 210, shown in FIG. 2, in accordance with one or more exemplary embodiments.
  • FIG. 5 is a block diagram depicting the data subset module 206, shown in FIG. 2, in accordance with one or more exemplary embodiments.
  • FIG. 6 is a block diagram depicting the data generation module 212 shown in FIG. 2, in accordance with one or more exemplary embodiments.
  • FIG. 7 is a flow diagram depicting an exemplary method for performing test data management activities by the cognitive or artificial intelligence engine 202, in accordance with one or more exemplary embodiments.
  • FIG. 8 is a flow diagram depicting an exemplary method for performing the actions with the data analysis and identification module 210 by the cognitive or artificial intelligence engine 202, in accordance with one or more exemplary embodiments.
  • FIG. 9 is a flow diagram depicting an exemplary method for performing the actions with data subset module 206 by the cognitive or artificial intelligence engine, in accordance with one or more exemplary embodiments.
  • FIG. 10 is a flow diagram depicting an exemplary method for performing the actions with data generation module 212 by the cognitive or artificial intelligence engine 202, in accordance with one or more exemplary embodiments.
  • FIG. 11 is a flow diagram depicting an exemplary method for performing the actions with data masking module 208 by the cognitive or artificial intelligence engine 202, in accordance with one or more exemplary embodiments.
  • FIG. 12 is a block diagram illustrating the details of digital processing system in which various aspects of the present disclosure are operative by execution of appropriate software instructions.
  • FIG. 1 is a block diagram 100 representing an example environment in which aspects of the present disclosure can be implemented.
  • FIG. 1 depicts a schematic representation of environment for test data management activities according to an embodiment of the present disclosure.
  • the example environment 100 is shown containing only representative devices and systems for illustration. However, real-world environments may contain more or fewer systems or devices.
  • FIG. 1 depicts an end-user device 102, a processing device 104, and a computer-readable medium 106.
  • the end-user device 102 may include a display which may not be limited to a cathode ray tube (CRT), a liquid crystal display (LCD), a light-emitting diode (LED) display, or a plasma display, and may be a smart phone, a tablet personal computer ("PC"), a mobile phone, a video telephone, an electronic book (e-book) reader, a desktop PC, a laptop PC, a netbook computer, a personal digital assistant ("PDA"), and so forth.
  • the processing device 104 may include data processing device for executing program components for executing user or system generated requests.
  • the end-user device 102 may include a user interface for a natural language conversation with a cognitive robotic test data management module 108.
  • the cognitive robotic test data management module 108 may be stored in the computer-readable medium 106.
  • the end-user device 102 supported by the environment 100 is realized as a computer-implemented or computer-based device having the hardware or firmware, software, and/or processing logic needed to carry out the intelligent test data management techniques and computer-implemented methodologies described in more detail herein.
  • the processing device 104 and/or the computer-readable medium 106 may be connected to the end-user device 102 over a network (not shown).
  • the end-user device 102 may be configured to display features by the cognitive robotic test data management module 108.
  • the computer-readable medium 106 may include the cognitive robotic test data management module 108, which may be accessed as a mobile application, a web application, or software that offers the functionality of accessing mobile applications and viewing/processing interactive pages, implemented in the end-user device 102, as will be apparent to one skilled in the relevant arts by reading the disclosure provided herein.
  • the cognitive robotic test data management module 108 may be downloaded from the cloud server (not shown).
  • the cognitive robotic test data management module 108 may be any suitable application downloaded from GOOGLE PLAY® (for Google Android devices), Apple Inc.'s APP STORE® (for Apple devices), or any other suitable database.
  • the cognitive robotic test data management module 108 may be software, firmware, or hardware that is integrated into the end-user device 102.
  • the cognitive robotic test data management module 108 may be an artificial intelligence powered, needs-based, social networking service to enable real-time conversations between users.
  • the cognitive robotic test data management module 108 may be configured to interact with users via the user interface accessible to the users via the end-user device 102.
  • the users may include, but are not limited to, QA testers, data analysts, database administrators, technical subject-matter experts, developers, business analysts, and the like.
  • the end-user device 102 may include a medium where one or more users can initiate and have a conversation with the cognitive robotic test data management module 108.
  • the interface may be developed in JAVA or Python where any input data by the user may be analyzed by the cognitive robotic test data management module 108.
  • the cognitive robotic test data management module 108 may be configured to virtually assist the users in each activity across the entire test data life cycle.
  • FIG. 2 is a block diagram 200 depicting the cognitive robotic test data management module 108 shown in FIG. 1, in accordance with one or more embodiments.
  • the cognitive robotic test data management module 108 may include a cognitive or artificial intelligence engine 202, a database 204, a data subset module 206, a data masking module 208, a data analysis and identification module 210, a data generation module 212, a data comparison module 214, and a data strategy module 216.
  • the cognitive robotic test data management module 108 may further include an application source database 218 and an application target database 220.
  • the cognitive robotic test data management module 108 may be configured to give a response in return to the user after providing an input data by the user.
  • the responses of the cognitive robotic test data management module 108 may depend on a user's requirement; based on the same, interaction with various downstream modules is established by the cognitive or artificial intelligence engine 202.
  • the cognitive or artificial intelligence engine 202 may be a key component of the cognitive robotic test data management module 108 which processes the user inputs, analyzes them, and responds to the user with answers to the queries. All the conversational logs may be stored in the database 204 and are used by the cognitive or artificial intelligence engine 202 for learning and enhancing its knowledge repository, which is also available in the database 204.
  • the database 204 may be a solution database.
  • the database 204 may include the essentials for the cognitive or the artificial intelligence engine 202 to perform actions using each module. These essentials may vary for different business requirements.
  • the database 204 may further include a bag-of-keywords corresponding to business requirements. The bag-of-keywords may be established including some of the discriminative words identified for each business requirement. Each keyword may have an action mapped against it which assists the cognitive or artificial intelligence engine 202 to interpret which module 206-216 needs to be triggered to respond to the user as a part of its conversation.
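The bag-of-keywords lookup described above can be sketched as follows. This is an illustrative sketch only; the keywords and action names are hypothetical examples, not the actual dictionary stored in the database 204.

```python
# Illustrative sketch of a bag-of-keywords lookup: each discriminative
# keyword maps to the module action the engine should trigger.
BAG_OF_KEYWORDS = {
    "mask": "data_masking",
    "obfuscate": "data_masking",
    "subset": "data_subset",
    "generate": "data_generation",
    "compare": "data_comparison",
    "strategy": "data_strategy",
}

def resolve_action(utterance):
    """Return the action mapped to the first matching keyword in the
    user's utterance, or None when nothing matches (a failed scenario)."""
    for token in utterance.lower().split():
        action = BAG_OF_KEYWORDS.get(token.strip(".,?!"))
        if action:
            return action
    return None
```

A request such as "Please mask the customer table" would resolve to the masking action, while an utterance with no known keyword would be logged as a failed scenario.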
  • the cognitive or artificial intelligence engine 202 may be configured to understand the test data requirements out of functional test requirements and propose a test data strategy.
  • the cognitive or artificial intelligence engine 202 may also be configured to automate the creation of reference information and rules that are essential to perform the test data management activities on the non-production environment as well as for data analysis in general.
  • the cognitive or artificial intelligence engine 202 may read through the requirements written in a particular language (for example, English), interpret them in the application terminology and advise the user about the data provisioning approach.
  • the cognitive or artificial intelligence engine 202 may also be configured to understand the requirement scope and estimate the effort (in a number of days) to complete the end to end activities within the task.
  • the cognitive or artificial intelligence engine 202 may engage in conversation with the users in order to understand what the user requires, guide the user on actions required to accomplish the task if needed, predict the test data based on the learnings from user conversations for a given functional testing scenario, and create the data set for the regression test cycles.
  • the cognitive or artificial intelligence engine 202 may be configured to engage in conversations with automation bots and supply the information required for functional test automation and defect management.
  • the cognitive or artificial intelligence engine 202 may be configured to allow the user to build his own conditions on a graphical user interface. This feature helps the cognitive or artificial intelligence engine 202 learn the scenarios which it may not otherwise interpret.
  • the cognitive or artificial intelligence engine 202 brings in a data reservation functionality which ensures the test data is not duplicated across the users while the non-production environment is shared.
  • the cognitive or artificial intelligence engine 202 may be configured to generate a template in various formats and load the data into the database 204.
  • the formats may include, but are not limited to, .pdf, .xls, .csv, .txt, .xml, and so forth.
  • the cognitive or artificial intelligence engine 202 may be configured to enable the formats by gathering the required information from the rule or domain module (not shown), information obtained by reading through the schema of the database 204 and requirements obtained from the user.
  • the cognitive or artificial intelligence engine 202 may be configured to connect to the database 204 and, upon command, gather the details of the schema, the tables in the database 204, the columns in the tables and the nature of the data, and feed them to the cognitive or artificial intelligence engine 202.
  • the database 204 may be configured to capture the conversation logs happening between the users and the cognitive robotic test data management module 108. For example, these logs comprise successful scenarios as well as failed scenarios. The successful scenarios are the ones where the cognitive robotic test data management module 108 is able to respond to the user with an appropriate response, and the failed scenarios are where the cognitive robotic test data management module 108 could not interpret the user requirement. These logs are used by the cognitive or artificial intelligence engine 202 to learn and enhance the bag of keywords and the actions corresponding to the business requirements.
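One way the learning loop above could work is sketched below: failed scenarios are mined from the logs, and a keyword/action pair approved by the admin is promoted into the bag of keywords so the same request succeeds next time. The log entries, keywords, and function names are hypothetical.

```python
# Hypothetical conversation logs: one success and one failed scenario.
conversation_logs = [
    {"utterance": "mask the accounts table", "status": "success"},
    {"utterance": "scramble the ssn column", "status": "failed"},
]

bag_of_keywords = {"mask": "data_masking"}

def learn_from_failures(logs, bag, approved):
    """Promote approved (keyword, action) pairs found in failed utterances."""
    for entry in logs:
        if entry["status"] != "failed":
            continue
        for token in entry["utterance"].split():
            if token in approved:
                bag[token] = approved[token]
    return bag

# The admin reviews the failure and maps "scramble" to the masking action.
learn_from_failures(conversation_logs, bag_of_keywords,
                    approved={"scramble": "data_masking"})
```

After this pass, the previously failing word "scramble" resolves to the masking action, reducing the number of failure scenarios over time.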
  • the data analysis and identification module 210 may be configured to assist the user to analyze the requirements which user mentions in the conversations with the cognitive or artificial intelligence engine 202 using the end-user device 102.
  • the cognitive or artificial intelligence engine 202 may be configured to identify any missing information, question the user to obtain the missing information, and process the requirement.
  • the data subset module 206 may be configured to assist the users to analyze the data subset requirements which the user mentions in the conversation with the cognitive or artificial intelligence engine 202 using the end-user device 102.
  • the cognitive or artificial intelligence engine 202 may be configured to analyze the data size in the application source database 218 and the capacity in the application target database 220.
  • the cognitive or artificial intelligence engine 202 may be configured to advise the user of the status of the database 204 and then, based on the decision obtained through its conversation with the user and its own analysis of the requirement, design a data subset criteria.
  • the data generation module 212 may be configured to assist the users to analyze the data generation requirements which the user mentions in the conversation with the cognitive or artificial intelligence engine 202 using the end-user device 102. Furthermore, the cognitive or artificial intelligence engine 202 may be configured to assess the requirement to identify if any essential information is missing; the engine obtains the missing details from the user through its conversation. In addition, the cognitive or artificial intelligence engine 202 may also be configured to analyze the database structure, relations across tables and referential integrity. The cognitive or artificial intelligence engine 202 may then create data generation scripts ensuring the RI (Referential Integrity) is not impacted and insert the data into the databases 204, 218 and 220.
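A minimal sketch of RI-safe data generation is shown below, using an in-memory SQLite database in place of the application databases; the schema and row counts are hypothetical. The essential point is the insertion order: parents first, then children that reference existing parent keys.

```python
import sqlite3

# SQLite stands in for the application databases; schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, "
             "customer_id INTEGER REFERENCES customer(id))")

def generate_test_data(conn, n):
    # Insert parents first, then children referencing existing parent keys,
    # so referential integrity is never violated.
    for i in range(1, n + 1):
        conn.execute("INSERT INTO customer VALUES (?, ?)", (i, "cust-%d" % i))
    for i in range(1, n + 1):
        conn.execute("INSERT INTO orders VALUES (?, ?)", (i, i))
    conn.commit()

generate_test_data(conn, 3)
```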
  • the data comparison module 214 may be configured to assist the users to analyze the database comparison requirements which the user mentions in the conversation with the cognitive or artificial intelligence engine 202 through the end-user device 102. Furthermore, the cognitive or artificial intelligence engine 202 may be configured to assess the information given by the user, identify if there are any missing details and obtain them by questioning the user back through the conversation on the end-user device 102. In addition, the cognitive or artificial intelligence engine 202 may be configured to obtain information about the tables relevant to the requirement and their parent-child relationships, and design a script to insert the data or call an API to populate the data in the tables.
  • the cognitive or artificial intelligence engine 202 may be configured to connect to the application target database 220 via communication channels to access database schemas and/or models.
  • the cognitive or artificial intelligence engine 202 may be configured to extract the list of entities and the attributes within each of those entities, primary key and foreign key details, and the relationships across each of the entities. Furthermore, the cognitive or artificial intelligence engine 202 may also be configured to store the details in the database 204. These details may be used by the cognitive or artificial intelligence engine 202 to perform the actions corresponding to the keywords which are extracted from the user conversations.
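The schema discovery described above can be sketched as follows; SQLite stands in for the application database, and the tables are hypothetical. The sketch lists each entity, its attributes, and its foreign-key relationships.

```python
import sqlite3

# Hypothetical schema standing in for the application database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, "
             "customer_id INTEGER REFERENCES customer(id))")

def extract_schema(conn):
    """Return {table: {"columns": [...], "foreign_keys": [(col, ref_table)]}}."""
    schema = {}
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    for t in tables:
        cols = [r[1] for r in conn.execute("PRAGMA table_info(%s)" % t)]
        # foreign_key_list rows: (id, seq, ref_table, from_col, to_col, ...)
        fks = [(r[3], r[2])
               for r in conn.execute("PRAGMA foreign_key_list(%s)" % t)]
        schema[t] = {"columns": cols, "foreign_keys": fks}
    return schema

schema = extract_schema(conn)
```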
  • the cognitive robotic test data management module 108 may ask the user to enter the details which may enable it to connect to the respective databases 204, 218 and 220.
  • the users may be required to register with the application source database 218 and the application target database 220 to assist the cognitive or artificial intelligence engine 202 in connecting to them and executing the actions. These actions may be performed by the cognitive or artificial intelligence engine 202 using one of the multiple modules 206-216.
  • the data strategy module 216 may be configured to determine the appropriate test data strategy for the users for the software testing and also execute the actions which the users deem appropriate.
  • the users may initiate the conversation with the cognitive robotic test data management module 108 using the end-user device 102 for the natural language conversation.
  • the cognitive or artificial intelligence engine 202 may be configured to analyze the user conversations using the rules and keywords available in the database 204.
  • the cognitive or artificial intelligence engine 202 may be configured to decide which module 206-216 needs to be triggered based on the action defined against the keywords, and execute it.
  • the cognitive or artificial intelligence engine 202 may be configured to understand the user requirement and perform the actions with the data masking module 208.
  • the cognitive or artificial intelligence engine 202 may also be configured to identify the requirement relevant to the data masking module 208 based on the keywords extracted from the user requirement and compare the extracted keywords with the bag of keywords and its corresponding action available in the database 204.
  • the cognitive or artificial intelligence engine 202 may also be configured to acquire all the essential required data using the data masking module 208 to execute the action.
  • FIG. 3 is a block diagram 300 depicting the data masking module 208 shown in FIG. 2, in accordance with one or more exemplary embodiments.
  • the data masking module 208 may include sub-modules which may not be limited to a rule or domain module 302, a profile module 304, a database connection module 306, and a masking module 308.
  • the cognitive or artificial intelligence engine 202 may understand the requirement and trigger the rule or domain module 302 through the data masking module 208.
  • the rule or domain module 302 may be configured to perform the functions of displaying the rules which are already configured in the database 204 and of adding or editing the masking rules.
  • the rules may assist the cognitive or artificial intelligence engine 202 in determining the sensitive attributes in the application databases 218 and 220 upon a request from the users.
  • the cognitive or artificial intelligence engine 202 may be configured to feed the analysis back to the end-user device 102 for the natural language conversation, and the users may be prompted with follow-up queries, in case of any missing information, to obtain a few more details in order to execute the request.
  • the cognitive or artificial intelligence engine 202 may be configured to prompt the user via the database connection module 306 to provide the connection details in order to connect to the application source database 218 and the application target database 220.
  • the cognitive or artificial intelligence engine 202 may be configured to trigger the profile module 304 if the user requests to perform metadata profiling, data profiling or generation of a sensitivity report.
  • the profile module 304 may use the rules which are existing or added using the rule or domain module 302 and run the rules against the application source database 218 and the application target database 220.
  • the profile module 304 may be configured to use regular expressions and scan the metadata or data depending on the user requirement.
  • the profile module 304 may also be configured to generate a report which lists the attributes corresponding to the entities within the application source database 218 and the application target database 220 that are sensitive in nature and may be de-sensitized or obfuscated.
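The metadata profiling step above can be sketched as a regular-expression scan of column names. The rule names and patterns below are illustrative assumptions, not the rules stored in the database 204.

```python
import re

# Hypothetical masking rules: regexes that flag sensitive column names.
SENSITIVITY_RULES = {
    "ssn": re.compile(r"ssn|social.?security", re.I),
    "email": re.compile(r"e.?mail", re.I),
    "phone": re.compile(r"phone|mobile", re.I),
}

def profile_metadata(schema):
    """Return (table, column, rule) triples for attributes deemed sensitive."""
    report = []
    for table, columns in schema.items():
        for column in columns:
            for rule, pattern in SENSITIVITY_RULES.items():
                if pattern.search(column):
                    report.append((table, column, rule))
    return report

report = profile_metadata({"customer": ["id", "name", "email_addr", "ssn"]})
```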
  • the data masking module 208 may be configured to help in reading the sensitivity report, assessing the appropriate masking model for the column deemed sensitive, generating a masking script, identifying parent or child relationships, generating follow-up masking scripts that ensure the same values are used to mask the attributes across the entities which are similar, merging the scripts as a stored procedure and executing the script.
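The requirement that the same values be used to mask similar attributes across entities can be met with deterministic masking, sketched below: the same source value always maps to the same masked value, so masked attributes remain joinable across entities. The salt, field names, and data are hypothetical.

```python
import hashlib

# Deterministic masking: identical inputs yield identical masked outputs.
def mask_value(value, salt="demo-salt"):
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return "MASKED-" + digest[:8]

customers = [{"id": 1, "ssn": "123-45-6789"}]
orders = [{"id": 10, "customer_ssn": "123-45-6789"}]

# Mask the sensitive attribute in both entities independently.
for row in customers:
    row["ssn"] = mask_value(row["ssn"])
for row in orders:
    row["customer_ssn"] = mask_value(row["customer_ssn"])
```

Because the mapping is deterministic, the masked SSN in the customer entity still matches the masked SSN in the orders entity, preserving the parent-child join.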
  • the cognitive or artificial intelligence engine 202 may be configured to perform all the actions upon receiving the requirement from the users using the end-user device 102 for the natural language conversation.
  • FIG. 4 is a block diagram 400 depicting the data analysis and identification module 210, shown in FIG. 2, in accordance with one or more exemplary embodiments.
  • the cognitive or artificial intelligence engine 202 may be configured to determine the requirement as a data analysis and identification requirement from the conversation (e.g., the conversation between the user and the cognitive robotic test data management module 108) through the end-user device 102 for the natural language conversation.
  • the cognitive or artificial intelligence engine 202 may be configured to trigger the data analysis and identification module 210 for processing the request.
  • the data analysis and identification module 210 may include sub-modules which may not be limited to, a data analyzer module 402, a script designer module 404, and a data extraction module 406.
  • the cognitive or artificial intelligence engine 202 may be configured to trigger the data analyzer module 402, which gathers all the keywords from the user conversation. Furthermore, the cognitive or artificial intelligence engine 202 may determine if the input is valid based on the bag of keywords available in the database 204.
  • the data analyzer module 402 may be configured to analyze if any essential information is missing and obtain the same through its conversation with the user. Once all the essential details are available, the data analyzer module 402 may be configured to pass the input data to the script designer module 404.
  • the script designer module 404 may be configured to perform the function of gathering the information from the data analyzer module 402 and parse through the bag of keywords in the database 204.
  • the script designer module 404 may be configured to gather the actions against the keywords, entities, attribute details corresponding to the entities, parent-child relations across the entities and connects to the application target database 220.
  • the cognitive or artificial intelligence engine 202 may be configured to prompt the user through the conversations about the connection details of the application target database 220.
  • the script designer module 404 may be configured to use all the information gathered to design a data extraction script.
  • the script designer module 404 may be configured to pass the designed script to the data extraction module 406.
  • the data extraction module 406 may be configured to execute the script against the application target database 220 or trigger the application programming interface or message queue interface (API/MQ), which is configured by the user (e.g., the application team), to extract the details as an alternative option.
  • the API/MQ details which are used as an alternative option may be obtained from the database 204.
  • the data which is extracted may be displayed for the user on the end-user device 102 for the natural language conversation or stored in an Excel spreadsheet in the working directory.
  • FIG. 5 is a block diagram 500 depicting the data subset module 206, shown in FIG. 2, in accordance with one or more exemplary embodiments.
  • the cognitive or artificial intelligence engine 202 may be configured to determine the requirement as data subset requirement from the conversation (e.g., the conversation between the user and the cognitive robotic test data management module 108) through the end-user device 102 for the natural language conversation.
  • the cognitive or artificial intelligence engine 202 may be configured to trigger the data subset module 206 for processing the request.
  • the data subset module 206 may include sub-modules which may not be limited to a data storage and capacity analyzer module 502 and a driver file generation module 504.
  • the cognitive or artificial intelligence engine 202 may be configured to identify the requirement relevant to the data subset module 206 based on the keywords extracted from the user requirement and comparing them with the bag of keywords and its corresponding action available in the database 204.
  • the data subset module 206 may be configured to establish connection with the application source database 218 and the application target database 220. The users may be prompted through the conversation for the database connection details.
  • the data subset module 206 may be configured to trigger the sub-modules which may not be limited to the data storage and capacity analyzer module 502.
  • the data storage and capacity analyzer module 502 may be configured to analyze the data size in the application source database 218 which in general is measured in bytes.
  • the data storage and capacity analyzer module 502 may also be configured to measure the size of the application target database 220.
  • the data storage and capacity analyzer module 502 may present the storage and capacity analysis to the user through the conversation on the end-user device 102 for the natural language conversation.
  • the cognitive or artificial intelligence engine 202 may further be configured to collect the opinion of the user to understand whether to proceed with complete database restore for the application target database 220 from the application source database 218.
  • the data storage and capacity analyzer module 502 may pass on the user feedback to the driver file generation module 504.
  • the driver file generation module 504 may be configured to gather information from the user through the conversations (e.g., the conversation between the user and the cognitive robotic test data management module 108).
  • the gathered information may include, but is not limited to, the entities or database tables essential for the software testing phase, and any conditions at the database attribute or database row level to filter down the data, thereby reducing the volume and size of the data.
  • the driver file generation module 504 may also be configured to gather the information and design a driver file which includes a script necessary to extract the cut-down version of the data. Further, the driver file generation module 504 may be configured to trigger the data storage and capacity analyzer module 502 to validate whether the application target database 220 can accommodate the reduced volumes. In the event of the application target database 220 having enough space to accommodate the newer version of the data files, the data subset module 206 may be configured to proceed with the execution of the driver file generation module 504 output. The driver file may be used to export the data from the application source database 218 and import it into the application target database 220.
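The driver file design and the capacity validation described above can be sketched together. The function name, the per-table filter dictionary, and the byte-size accounting are illustrative assumptions; the disclosure itself does not specify the script format.

```python
def design_driver_file(tables, row_filters, table_sizes, target_free_bytes):
    """Build a driver script of per-table extraction statements, applying any
    row-level filter conditions, and check whether the reduced volume fits
    the free space available in the target database (measured in bytes)."""
    statements = []
    reduced_size = 0
    for table in tables:
        stmt = f"SELECT * FROM {table}"
        condition = row_filters.get(table)
        if condition:
            stmt += f" WHERE {condition}"   # row-level filter cuts the volume
        statements.append(stmt + ";")
        reduced_size += table_sizes.get(table, 0)
    return "\n".join(statements), reduced_size <= target_free_bytes
```

When the fit check fails, the module would loop back to the user conversation to tighten the filter conditions before executing the export.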
  • FIG. 6 is a block diagram 600 depicting the data generation module 212 shown in FIG. 2, in accordance with one or more exemplary embodiments.
  • the data generation module 212 may include sub-modules which may not be limited to a data generation analyzer module 602, a generation script designer module 604, and a data generator module 606.
  • the cognitive or artificial intelligence engine 202 may be configured to identify the requirement relevant to the data generation module 212 based on the keywords extracted from the user requirement and comparing the extracted keywords with the bag of keywords and its corresponding action available in the database 204.
  • the cognitive or artificial intelligence engine 202 may be configured to trigger the data generation analyzer sub-module, which gathers all the keywords from the user conversation.
  • the cognitive or artificial intelligence engine 202 may be configured to determine if the input is valid based on the bag of keywords available in the database 204.
  • the data generation analyzer module 602 may be configured to analyze if any essential information is missing and obtain the same through its conversations with the user. Once all the essential details are available, the data generation analyzer module 602 may be configured to pass the input data to the generation script designer module 604.
  • the generation script designer module 604 may be configured to perform the function of gathering the information from the data generation analyzer module 602 and parse through the bag of keywords in the database 204.
  • the generation script designer module 604 may be configured to gather the actions against the keywords, the entities, the attribute details corresponding to the entities, and the parent-child relations across the entities, and to connect to the application target database 220.
  • the cognitive or artificial intelligence engine 202 may be configured to prompt the user through the conversation about the connection details of the application target database 220.
  • the generation script designer module 604 may be configured to use all the information gathered above to design a data generation script.
  • the script is then passed on by the generation script designer module 604 to the data generator module 606.
  • the data generator module 606 may be configured to execute the script against the application target database 220, or to trigger the API/MQ which is configured by the user (e.g., the application team) to insert the details as an alternative option.
  • the API/MQ details which may be used as an alternative option may be obtained from the database 204.
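Because the generation script designer gathers parent-child relations across the entities, generated inserts must load parent rows before child rows. A minimal sketch of that ordering step follows; the table names and the dependency map are invented for illustration.

```python
def insertion_order(parents):
    """Given a map from each table to the set of tables it references,
    return an insertion order in which every parent precedes its children
    (a depth-first topological sort over the parent-child relations)."""
    ordered, seen = [], set()

    def visit(table):
        if table in seen:
            return
        seen.add(table)
        for parent in parents.get(table, ()):
            visit(parent)          # ensure referenced tables come first
        ordered.append(table)

    for table in parents:
        visit(table)
    return ordered
```

The generation script designer would emit its INSERT statements in this order so that referential integrity holds while the data generator executes the script.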
  • FIG. 7 is a flow diagram 700 depicting an exemplary method for performing test data management activities by the cognitive or artificial intelligence engine 202, in accordance with one or more exemplary embodiments.
  • the method 700 may be carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, and FIG. 6.
  • the method 700 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.
  • the exemplary method 700 includes the step of conversing with the user after the user provides input data at step 702.
  • the exemplary method 700 further includes the step of analyzing the requirement at step 704.
  • the exemplary method 700 further includes the step of gathering essential and any missing information at step 706.
  • the exemplary method 700 further includes the step of connecting to the application source database at step 708.
  • the exemplary method 700 further includes the step of connecting to the application target database at step 710.
  • the exemplary method 700 further includes the step of comparing the tables in the databases relevant to the requirement at step 712.
  • the exemplary method 700 further includes the step of generating the report with the details of the comparison at step 714.
  • the exemplary method 700 further includes the step of determining if the performed test data management activities have succeeded at step 716.
  • if the performed activities have succeeded at step 716, the exemplary method 700 further includes the step of stopping at step 718. If the performed activities have not succeeded at step 716, the exemplary method 700 further includes the step of informing the user (e.g., the admin) about the failed scenario at step 720.
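The table comparison and report generation of steps 712-716 can be sketched as below. The report fields and the success criterion (no source tables missing from the target) are illustrative assumptions.

```python
def compare_tables(source_tables, target_tables):
    """Compare the table inventories of the source and target databases
    (steps 712-714) and report whether the activity succeeded (step 716)."""
    missing = sorted(set(source_tables) - set(target_tables))
    extra = sorted(set(target_tables) - set(source_tables))
    return {
        "missing_in_target": missing,   # would trigger the failure path (step 720)
        "extra_in_target": extra,
        "succeeded": not missing,       # success path stops at step 718
    }
```

A fuller implementation would also compare column definitions and row counts per table, but the succeed/fail decision at step 716 reduces to a check like this one.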
  • FIG. 8 is a flow diagram 800 depicting an exemplary method for performing the actions with the data analysis and identification module 210 by the cognitive or artificial intelligence engine 202, in accordance with one or more exemplary embodiments.
  • the method 800 may be carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 6, and FIG. 7.
  • the method 800 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.
  • the exemplary method 800 includes the step of conversing with the user after the user provides input data at step 802.
  • the exemplary method 800 further includes the step of parsing the user input through the reference information in the database at step 804.
  • the exemplary method 800 further includes the step of determining if the input data is valid at step 806. If the input data is valid at step 806, the exemplary method 800 further includes a step of determining whether all essential input data has been received at step 808. If all essential input data has been received at step 808, the exemplary method 800 further includes a step of designing the scripts for the data extraction at step 810.
  • the exemplary method 800 further includes the step of determining if the data is available at step 812. If the data is available at step 812, the exemplary method 800 includes the step of displaying the data on the end-user device 102 with an Excel export option at step 814. The exemplary method 800 further includes the step of storing the conversational log repository in the database 204 at step 816. The exemplary method 800 further includes the step of learning and enhancing the bag of keywords and actions using the conversational log repository in the database 204 at step 818. If the data is not available at step 812, the exemplary method 800 includes the step of triggering communications to the user (e.g., an admin) about the scenario at step 820.
  • the communications may include, but are not limited to, e-mails, SMS text messages, video calls, audio calls, and the like. If all essential input data has not been received at step 808, then the exemplary method 800 continues at step 804. If the input data is not valid at step 806, then the exemplary method 800 continues at step 816.
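The branching of method 800 (validity check, availability check, admin notification) can be sketched as a single routing function. The outcome labels and the notification callback are illustrative names, not terms from the disclosure.

```python
def handle_extraction_request(keywords, bag_of_keywords, extracted_rows, notify_admin):
    """Route one data-extraction request through the checks of method 800
    (simplified sketch): invalid input logs and learns, missing data
    notifies the admin, available data is displayed with an export option."""
    if not set(keywords) & set(bag_of_keywords):
        return "log_and_learn"          # invalid input: continue at steps 816/818
    if not extracted_rows:
        notify_admin("no data available for the requested scenario")
        return "admin_notified"         # step 820
    return "display_with_export"        # step 814
```

The real module would dispatch the notification over e-mail, SMS, or a call as listed above; here a callback stands in for that channel.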
  • FIG. 9 is a flow diagram 900 depicting an exemplary method for performing the actions with data subset module 206 by the cognitive or artificial intelligence engine, in accordance with one or more exemplary embodiments.
  • the method 900 may be carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 6, FIG. 7, and FIG. 8.
  • the method 900 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.
  • the exemplary method 900 includes the step of conversing with the user after the user provides input data at step 902.
  • the exemplary method 900 further includes the step of gathering information from the conversation at step 904.
  • the exemplary method 900 further includes the step of analyzing the data storage and capacity using the data storage and capacity analyzer module 502 at step 906.
  • the exemplary method 900 further includes the step of presenting the analysis data to the user on the end-user device 102 at step 908.
  • the exemplary method 900 further includes the step of obtaining the user’s opinion on the subset analysis at step 910.
  • whether the subset analysis is required is determined based on the database and test requirement analysis.
  • the exemplary method 900 further includes the step of determining if the subset analysis and test requirement analysis are required at step 912.
  • if the subset analysis and test requirement analysis are required at step 912, the exemplary method 900 includes the step of designing the subset criteria at step 914.
  • the exemplary method 900 further includes the step of connecting to the application source database 218 and exporting the data files based on the subset criteria at step 916.
  • the exemplary method 900 further includes the step of importing or loading the data files to the application target database 220 at step 918. If the subset analysis and test requirement analysis are not required at step 912, the exemplary method 900 includes the step of performing the data loads without any subset criteria at step 920.
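The decision between a full restore and a subset load in method 900 can be sketched as a capacity check over sizes measured in bytes, as described for the data storage and capacity analyzer module 502. The function name and the three outcome labels are illustrative assumptions.

```python
def plan_load(source_size_bytes, target_capacity_bytes, subset_size_bytes):
    """Decide how to populate the target database: full restore when the
    whole source fits, subset load when only the reduced volume fits,
    else report that the target lacks capacity."""
    if source_size_bytes <= target_capacity_bytes:
        return "full_restore"           # step 920: data load without subset criteria
    if subset_size_bytes <= target_capacity_bytes:
        return "subset_load"            # steps 914-918: design criteria, export, import
    return "insufficient_capacity"      # escalate back to the user conversation
```

The user's opinion gathered at step 910 would override this default, for example forcing a subset load even when a full restore fits.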
  • FIG. 10 is a flow diagram 1000 depicting an exemplary method for performing the actions with data generation module 212 by the cognitive or artificial intelligence engine 202, in accordance with one or more exemplary embodiments.
  • the method 1000 may be carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, and FIG. 4, FIG. 5, FIG. 6, FIG. 7, FIG. 8, and FIG. 9.
  • the method 1000 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.
  • the exemplary method 1000 includes the step of conversing with the user after the user provides input data at step 1002.
  • the exemplary method 1000 further includes the step of analyzing the requirement at step 1004.
  • the exemplary method 1000 further includes the step of gathering essential and any missing information at step 1006.
  • the exemplary method 1000 further includes the step of parsing through the database with the keywords at step 1008.
  • the exemplary method 1000 further includes the step of obtaining the entity details and actions corresponding to the entities at step 1010.
  • the exemplary method 1000 further includes the step of identifying the referential integrity (RI) and parent or child table details at step 1010.
  • the exemplary method 1000 further includes the step of designing a script to insert the data, or calling the application programming interface (API) which inserts the data, at step 1012.
  • the exemplary method 1000 further includes the step of informing the user about the successful data generation at step 1014.
  • the exemplary method 1000 further includes the step of assisting the user to connect to the database and validate the data created at step 1016.
  • the exemplary method 1000 further includes the step of collecting the feedback about the activity performed at step 1018.
  • the exemplary method 1000 further includes the step of determining if the performed activity has succeeded at step 1020. If the performed activity has succeeded at step 1020, the exemplary method 1000 includes the step of stopping at step 1022. If the performed activity has not succeeded at step 1020, the exemplary method 1000 further includes the step of informing the user (e.g., the admin) about the failed scenario at step 1024.
  • FIG. 11 is a flow diagram 1100 depicting an exemplary method for performing the actions with data masking module 208 by the cognitive or artificial intelligence engine 202, in accordance with one or more exemplary embodiments.
  • the method 1100 may be carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 6, FIG. 7, FIG. 8, FIG. 9, and FIG. 10.
  • the method 1100 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.
  • the exemplary method 1100 includes the step of conversing with the user after the user provides input data at step 1102.
  • the exemplary method 1100 further includes the step of analyzing the requirement at step 1104.
  • the exemplary method 1100 further includes the step of interacting with the data masking module 208 at step 1106.
  • the exemplary method 1100 further includes the step of displaying the rules in the database 204 at step 1108.
  • the exemplary method 1100 further includes the step of adding or editing the masking rules at step 1110.
  • the exemplary method 1100 further includes the step of providing the connection details in order to connect to the application databases 218 and 220 at step 1112.
  • connection details may include, but are not limited to, host name, port details, server name, user name and password, and the like.
  • the exemplary method 1100 further includes the step of using the regular expressions and scanning the metadata and data depending on the user requirement at step 1114.
  • the exemplary method 1100 further includes the step of generating the sensitivity report which lists down the attributes corresponding to the entities within the application databases 218 and 220 at step 1116.
  • the exemplary method 1100 further includes the step of reading the sensitivity report, assessing the appropriate masking model for the column deemed sensitive, generating a masking script, identifying parent/child relationships, generating follow-up masking scripts, merging the scripts as a stored procedure and executing the script at step 1118.
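The regex-based sensitivity scan (step 1114) and the masking-script generation (step 1118) can be sketched together. The patterns, category names, and fixed masking values below are hypothetical examples; the disclosure stores its actual rules in the database 204 and supports adding or editing them at step 1110.

```python
import re

# Hypothetical sensitivity rules: category -> regular expression over column names.
SENSITIVE_PATTERNS = {
    "email": re.compile(r"e[-_]?mail", re.IGNORECASE),
    "ssn": re.compile(r"ssn|social[-_]?security", re.IGNORECASE),
    "phone": re.compile(r"phone|mobile", re.IGNORECASE),
}

def scan_columns(columns):
    """Return (column, category) pairs for columns whose names match a
    sensitivity pattern -- a simplified sensitivity report (step 1116)."""
    report = []
    for col in columns:
        for category, pattern in SENSITIVE_PATTERNS.items():
            if pattern.search(col):
                report.append((col, category))
                break
    return report

def masking_script(table, report):
    """Generate one UPDATE statement masking every column deemed sensitive,
    using a fixed placeholder per category as a toy masking model."""
    sets = ", ".join(f"{col} = 'MASKED_{cat.upper()}'" for col, cat in report)
    return f"UPDATE {table} SET {sets};" if sets else ""
```

A production masking module would scan the data values as well as the metadata, honour parent/child relationships with follow-up scripts, and merge the result into a stored procedure, as step 1118 describes.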
  • FIG. 12 is a block diagram illustrating the details of digital processing system 1200 in which various aspects of the present disclosure are operative by execution of appropriate software instructions.
  • Digital processing system 1200 may be used for implementing the cognitive robotic test data management module 108 (or any other system in which the various features disclosed above can be implemented).
  • Digital processing system 1200 may contain one or more processors such as a central processing unit (CPU) 1210, random access memory (RAM) 1220, secondary memory 1230, graphics controller 1260, display unit 1270, network interface 1280, and an input interface 1290. All the components except display unit 1270 may communicate with each other over communication path 1250, which may contain several buses as is well known in the relevant arts. The components of Figure 12 are described below in further detail.
  • CPU 1210 may execute instructions stored in RAM 1220 to provide several features of the present disclosure.
  • CPU 1210 may contain multiple processing units, with each processing unit potentially being designed for a specific task.
  • CPU 1210 may contain only a single general-purpose processing unit.
  • RAM 1220 may receive instructions from secondary memory 1230 using communication path 1250.
  • RAM 1220 is shown currently containing software instructions, such as those used in threads and stacks, constituting shared environment 1225 and/or user programs 1226.
  • Shared environment 1225 includes operating systems, device drivers, virtual machines, etc., which provide a (common) run time environment for execution of user programs 1226.
  • Graphics controller 1260 generates display signals (e.g., in RGB format) to display unit 1270 based on data/instructions received from CPU 1210.
  • Display unit 1270 contains a display screen to display the images defined by the display signals.
  • Input interface 1290 may correspond to a keyboard and a pointing device (e.g., touch-pad, mouse) and may be used to provide inputs.
  • Network interface 1280 provides connectivity to a network (e.g., using Internet Protocol), and may be used to communicate with other systems (such as those shown in Figure 1, network 104) connected to the network.
  • Secondary memory 1230 may contain hard drive 1235, flash memory 1236, and removable storage drive 1237. Secondary memory 1230 may store the data and software instructions (e.g., for performing the actions noted above with respect to the Figures), which enable digital processing system 1200 to provide several features in accordance with the present disclosure.
  • the data and instructions may be provided on the removable storage unit 1240, and read and provided by removable storage drive 1237 to CPU 1210.
  • A floppy drive, magnetic tape drive, CD-ROM drive, DVD drive, flash memory, or a removable memory chip (PCMCIA card, EEPROM) are examples of such a removable storage drive 1237.
  • removable storage unit 1240 may be implemented using medium and storage format compatible with removable storage drive 1237 such that removable storage drive 1237 can read the data and instructions.
  • removable storage unit 1240 includes a computer readable (storage) medium having stored therein computer software and/or data.
  • the computer (or machine, in general) readable medium can be in other forms (e.g., non-removable, random access, etc.).
  • Non-volatile media includes, for example, optical disks, magnetic disks, or solid-state drives, such as secondary memory 1230.
  • Volatile media includes dynamic memory, such as RAM 1220.
  • storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid-state drive, magnetic tape or any other magnetic data storage medium, a CD-ROM or any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, an NVRAM, and any other memory chip or cartridge.
  • Storage media is distinct from but may be used in conjunction with transmission media.
  • Transmission media participates in transferring information between storage media.
  • transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 1250.
  • transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.


Abstract

Exemplary embodiments of the present disclosure are directed towards a cognitive robotic test data management system and method for test data management activities, comprising: a processing device; and a computer-readable medium comprising computer-executable instructions that, when executed by the processing device, implement a cognitive robotic test data management module configured to interact with a user via a user interface accessible to the user via an end-user device. The cognitive robotic test data management module is configured to return a response to the user after the user provides input data. The cognitive robotic test data management module comprises a cognitive or artificial intelligence engine configured to process the user input, analyze it, and respond to the user with answers to queries. The cognitive or artificial intelligence engine is also configured to understand the requirement scope and estimate the effort to complete the end-to-end activities within a task.

Description

“COGNITIVE ROBOTIC SYSTEM FOR TEST DATA MANAGEMENT ACTIVITIES
AND METHOD EMPLOYED THEREOF”
TECHNICAL FIELD
[001] The disclosed subject matter relates generally to the field of software testing. More particularly, the present disclosure relates to an artificial intelligence cognitive robotic system for test data management activities and method employed thereof.
BACKGROUND
[002] The need for quality assurance of a software product is essential to the success of information technology organizations. Quality assurance mainly involves testing the software product at various stages to minimize defects. Quality assurance (QA) testers require data to perform testing and execute the test scenarios in order to assess the correctness of the software product having a software code. The QA testers commonly depend on multiple resources. The multiple resources include, but are not limited to, database administrators, operating system teams, data analysts and data engineers to discuss the requirements, analyze the requirements, load the data to a staging database, cleanse the data to de-sensitize any sensitive information, generate data for scenarios, and the like. Moreover, QA testers require many technical skills, such as understanding the test cases, understanding the business flow or logic, understanding the database structures, the ability to design masking scripts or stored procedures, and the ability to assess the complexity of requirements in terms of writing queries on databases. Thus, there is a challenge in identifying the right set of resources that can support the end-to-end data management activities, which involve data requirement analysis, data subsetting, data masking, and data generation. All these end-to-end data management activities play a major role in the successful completion of the software testing phase and an earlier time to market.
[003] In this context, automation in the area of software testing has grown, and many testing tools are now available to increase efficiency and reduce cost. Existing testing tools can perform data management activities; however, these tools require very high proficiency for a user to get accustomed to them and again require skilled resources to operate them. These tools do not help a novice in operating them and demand a thorough understanding of data management activities.
[004] Existing software testing tools do not help the user with all the data management related activities. The existing software testing tools cannot learn from the user conversations to enhance the rules and their ability to provide the right advice. The available software tools frequently depend on resources with a combination of skills spanning functional testing, data management, database administration and test data provisioning. Additionally, there is a need for a solution to assess the situation and advise or guide the user in taking the right or appropriate actions related to the test data management activities.
[005] In the light of the aforementioned discussion, there exists a need for a system with novel methodologies that would overcome the above-mentioned disadvantages.
SUMMARY
[006] The following presents a simplified summary of the disclosure in order to provide a basic understanding of the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements of the invention or delineate the scope of the invention. Its sole purpose is to present some concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
[007] An objective of the present disclosure is directed towards identifying the data relevant for the software testing and other factors such as infrastructure costs, adhering to personal data protection guidelines, challenges in identifying resources with the test data management capabilities.
[008] Another objective of the present disclosure is directed towards providing a service line to steer the overall software testing in an optimized, compliant and cost-effective manner.
[009] Another objective of the present disclosure is directed towards advising the user on the test data management strategy for the software testing or the quality analysis framework.
[0010] Another objective of the present disclosure is directed towards guiding the user in taking right or appropriate actions related to the test data management.
[0011] Another objective of the present disclosure is directed towards performing various activities such as designing test data management strategy, demand capturing, estimations for project planning, engineering the test data, data analysis and identification, data subset, data masking, and database comparison.
[0012] Another objective of the present disclosure is directed towards overcoming the challenges using artificial intelligence abilities.
[0013] Another objective of the present disclosure is directed towards assisting the users to determine the appropriate test data strategy for the software testing and also executing the actions which the users get convinced as appropriate.
[0014] Another objective of the present disclosure is directed towards reducing the failed scenarios over a period of time by the consistent learning exercise.
[0015] Another objective of the present disclosure is directed towards making it easy to handle a complex test data management life cycle and also acting as a virtual assistant to resolve queries around the test data management activities.
[0016] In an embodiment of the present disclosure, a method comprises a step of conversing with a user after the user provides input data.
[0017] In another embodiment of the present disclosure, the method further comprises a step of analyzing the requirement and gathering essential and any missing information.
[0018] In another embodiment of the present disclosure, the method further comprises a step of connecting to an application source database, and connecting to an application target database.
[0019] In another embodiment of the present disclosure, the method further comprises a step of comparing the tables in the databases relevant to the requirement.
[0020] In another embodiment of the present disclosure, the method further comprises a step of generating information pertaining to various stages of the test data management life cycle.
[0021] In another embodiment of the present disclosure, the method further comprises the steps of presenting the information to the user based on the queries as a part of the conversation, performing test data reservation to block the data for a test case, informing the user (e.g., the admin) about a failed or succeeded scenario, and learning from the conversation logs to enhance the rule dictionary as well as the bag of keywords to reduce the number of failure scenarios.
[0022] In another embodiment of the present disclosure, a cognitive robotic system comprises at least one processing device, and at least one computer-readable medium comprising computer-executable instructions that, when executed by the at least one processing device, instruct at least one end-user device to carry out test data management activities within the computer-implemented environment.
[0023] In another embodiment of the present disclosure, the cognitive robotic system comprises a cognitive robotic test data management module stored in the at least one computer-readable medium. The cognitive robotic test data management module is configured to interact with a user via a user interface accessible to the user via the at least one end-user device, and to return a response to the user after the user provides input data; the response depends on the user requirements and, based on the same, on the interaction with various downstream modules. The cognitive robotic test data management module comprises a cognitive or artificial intelligence engine configured to process the user input, analyze it, and respond to the user with answers to queries. The cognitive or artificial intelligence engine is also configured to understand the requirement scope and estimate the effort to complete the end-to-end activities within a task.
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] In the following, numerous specific details are set forth to provide a thorough description of various embodiments. Certain embodiments may be practiced without these specific details or with some variations in detail. In some instances, certain features are described in less detail so as not to obscure other aspects. The level of detail associated with each of the elements or features should not be construed to qualify the novelty or importance of one feature over the others.
[0025] FIG. 1 is a block diagram representing an example environment in which aspects of the present disclosure can be implemented. Specifically, FIG. 1 depicts a schematic representation of environment for test data management activities according to an embodiment of the present disclosure.
[0026] FIG. 2 is a block diagram depicting the cognitive robotic test data management module 108 shown in FIG. 1, in accordance with one or more embodiments.
[0027] FIG. 3 is a block diagram depicting the data masking module 208 shown in FIG. 2, in accordance with one or more exemplary embodiments.
[0028] FIG. 4 is a block diagram depicting the data analysis and identification module 210, shown in FIG. 2, in accordance with one or more exemplary embodiments.
[0029] FIG. 5 is a block diagram depicting the data subset module 206, shown in FIG. 2, in accordance with one or more exemplary embodiments.
[0030] FIG. 6 is a block diagram depicting the data generation module 212 shown in FIG. 2, in accordance with one or more exemplary embodiments.
[0031] FIG. 7 is a flow diagram depicting an exemplary method for performing test data management activities by the cognitive or artificial intelligence engine 202, in accordance with one or more exemplary embodiments.
[0032] FIG. 8 is a flow diagram depicting an exemplary method for performing the actions with the data analysis and identification module 210 by the cognitive or artificial intelligence engine 202, in accordance with one or more exemplary embodiments.
[0033] FIG. 9 is a flow diagram depicting an exemplary method for performing the actions with data subset module 206 by the cognitive or artificial intelligence engine, in accordance with one or more exemplary embodiments.
[0034] FIG. 10 is a flow diagram depicting an exemplary method for performing the actions with data generation module 212 by the cognitive or artificial intelligence engine 202, in accordance with one or more exemplary embodiments.
[0035] FIG. 11 is a flow diagram depicting an exemplary method for performing the actions with data masking module 208 by the cognitive or artificial intelligence engine 202, in accordance with one or more exemplary embodiments.
[0036] FIG. 12 is a block diagram illustrating the details of digital processing system in which various aspects of the present disclosure are operative by execution of appropriate software instructions.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0037] It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The present disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.
[0038] The use of “including”, “comprising” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item. Further, the use of the terms “first”, “second”, “third”, and the like, herein does not denote any order, quantity, or importance, but rather is used to distinguish one element from another.
[0039] Referring to FIG. 1, a block diagram 100 represents an example environment in which aspects of the present disclosure can be implemented. Specifically, FIG. 1 depicts a schematic representation of an environment for test data management activities according to an embodiment of the present disclosure. The example environment 100 is shown containing only representative devices and systems for illustration. However, real-world environments may contain more or fewer systems or devices. FIG. 1 depicts an end-user device 102, a processing device 104, and a computer-readable medium 106. The end-user device 102 may include a display and may be, but is not limited to, a cathode ray tube (CRT) display, a liquid crystal display (LCD), a light-emitting diode (LED) display, a plasma display, a smart phone, a tablet personal computer ("PC"), a mobile phone, a video telephone, an electronic book (e-book) reader, a desktop PC, a laptop PC, a netbook computer, a personal digital assistant ("PDA"), and so forth. The processing device 104 may include a data processing device for executing program components to serve user- or system-generated requests. The end-user device 102 may include a user interface for a natural language conversation with a cognitive robotic test data management module 108. The cognitive robotic test data management module 108 may be stored in the computer-readable medium 106. The end-user device 102 supported by the environment 100 is realized as a computer-implemented or computer-based device having the hardware or firmware, software, and/or processing logic needed to carry out the intelligent test data management techniques and computer-implemented methodologies described in more detail herein. The processing device 104 and/or the computer-readable medium 106 may be connected to the end-user device 102 over a network (not shown). The end-user device 102 may be configured to display features provided by the cognitive robotic test data management module 108.
[0040] The computer-readable medium 106 may include the cognitive robotic test data management module 108, which may be accessed as a mobile application, a web application, or software that offers the functionality of accessing mobile applications and viewing/processing interactive pages, implemented in the end-user device 102, as will be apparent to one skilled in the relevant arts by reading the disclosure provided herein. The cognitive robotic test data management module 108 may be downloaded from a cloud server (not shown). For example, the cognitive robotic test data management module 108 may be any suitable application downloaded from GOOGLE PLAY® (for Google Android devices), Apple Inc.'s APP STORE® (for Apple devices), or any other suitable database. In some embodiments, the cognitive robotic test data management module 108 may be software, firmware, or hardware that is integrated into the end-user device 102.
[0041] The cognitive robotic test data management module 108 may be an artificial intelligence powered, needs-based, social networking service to enable real-time conversations between users. The cognitive robotic test data management module 108 may be configured to interact with users via the user interface accessible to the users via the end-user device 102. The users may include, but are not limited to, QA testers, data analysts, database administrators, technical subject-matter experts, developers, business analysts, and the like. The end-user device 102 may include a medium where one or more users can initiate and have a conversation with the cognitive robotic test data management module 108. The interface may be developed in JAVA or Python, where any input data provided by the user may be analyzed by the cognitive robotic test data management module 108. The cognitive robotic test data management module 108 may be configured to virtually assist the users in each activity across the entire test data life cycle.
[0042] Referring to FIG. 2, a block diagram 200 depicts the cognitive robotic test data management module 108 shown in FIG. 1, in accordance with one or more embodiments. The cognitive robotic test data management module 108 may include a cognitive or artificial intelligence engine 202, a database 204, a data subset module 206, a data masking module 208, a data analysis and identification module 210, a data generation module 212, a data comparison module 214, and a data strategy module 216.
[0043] The cognitive robotic test data management module 108 may further include an application source database 218 and an application target database 220. The cognitive robotic test data management module 108 may be configured to return a response after the user provides input data. The responses of the cognitive robotic test data management module 108 may depend on a user’s requirement, and based on that requirement, interaction with the various downstream modules is established by the cognitive or artificial intelligence engine 202. The cognitive or artificial intelligence engine 202 may be a key component of the cognitive robotic test data management module 108 which processes the user inputs, analyzes them, and responds to the user with answers to the queries. All the conversational logs may be stored in the database 204 and are used by the cognitive or artificial intelligence engine 202 for learning and enhancing its knowledge repository, which is also available in the database 204. Here, the database 204 may be a solution database. The database 204 may include the essentials for the cognitive or artificial intelligence engine 202 to perform actions using each module. These essentials may vary for different business requirements. The database 204 may further include a bag-of-keywords corresponding to business requirements. The bag-of-keywords may be established including some of the discriminative words identified for each business requirement. Each keyword may have an action mapped against it which assists the cognitive or artificial intelligence engine 202 in interpreting which module 206-216 needs to be triggered to respond to the user as a part of its conversation.
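By way of illustration only, the keyword-to-action mapping described above might be sketched as follows in Python. The dictionary contents, module names, and the function `route_request` are hypothetical and do not form part of the disclosed embodiments; they merely show how discriminative keywords extracted from a user utterance could be mapped to a downstream module to be triggered.

```python
# Hypothetical sketch: map discriminative keywords from a user
# utterance to the downstream module action to be triggered.
KEYWORD_ACTIONS = {
    "mask": "data_masking_module",
    "obfuscate": "data_masking_module",
    "subset": "data_subset_module",
    "generate": "data_generation_module",
    "compare": "data_comparison_module",
    "strategy": "data_strategy_module",
}

def route_request(utterance: str) -> list[str]:
    """Return the modules whose keywords appear in the user input."""
    modules = []
    for token in utterance.lower().split():
        action = KEYWORD_ACTIONS.get(token.strip(".,?!"))
        if action and action not in modules:
            modules.append(action)
    return modules

print(route_request("Please mask and subset the customer data"))
# → ['data_masking_module', 'data_subset_module']
```

An utterance with no recognized keyword would yield an empty list, corresponding to a failed scenario logged for later learning.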
[0044] The cognitive or artificial intelligence engine 202 may be configured to understand the test data requirements out of the functional test requirements and propose a test data strategy. The cognitive or artificial intelligence engine 202 may also be configured to automate the creation of reference information and rules that are essential to perform the test data management activities on the non-production environment as well as for data analysis in general. The cognitive or artificial intelligence engine 202 may read through the requirements written in a particular language (for example, English), interpret them in the application terminology, and advise the user about the data provisioning approach. The cognitive or artificial intelligence engine 202 may also be configured to understand the requirement scope and estimate the effort (in a number of days) to complete the end-to-end activities within the task. The cognitive or artificial intelligence engine 202 may engage in conversation with the users in order to understand what the user requires and, if needed, guide the user on the actions required to accomplish the task, and may also predict the test data based on the learnings from user conversations for a given functional testing scenario and create the data set for the regression test cycles.
[0045] The cognitive or artificial intelligence engine 202 may be configured to engage in conversations with automation bots and supply the information required for functional test automation and defect management. The cognitive or artificial intelligence engine 202 may be configured to allow the user to build his or her own conditions on a graphical user interface. This feature assists the cognitive or artificial intelligence engine 202 in learning the scenarios which may not otherwise be interpreted. The cognitive or artificial intelligence engine 202 brings in a data reservation functionality which ensures the test data is not duplicated across the users while the non-production environment is shared. The cognitive or artificial intelligence engine 202 may be configured to generate a template in various formats and load the data into the database 204. The formats may include, but are not limited to, .pdf, .xls, .csv, .txt, .xml, and so forth. The cognitive or artificial intelligence engine 202 may be configured to enable the formats by gathering the required information from the rule or domain module (not shown), information obtained by reading through the schema of the database 204, and requirements obtained from the user. The cognitive or artificial intelligence engine 202 may be configured to connect to the database 204 and, upon command, gather the details of the schema, the tables in the database 204, the columns in the tables, and the nature of the data, and feed these to the cognitive or artificial intelligence engine 202.
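The schema-gathering step described above might, purely as an illustrative sketch, be realized as follows. SQLite is used here only for the sake of a self-contained example; the function name `gather_schema` and the sample table are assumptions, not part of the disclosed embodiments.

```python
# Hypothetical sketch: connect to a database and collect the table
# and column details, as described for the database 204.
import sqlite3

def gather_schema(conn: sqlite3.Connection) -> dict[str, list[str]]:
    """Map each table name to its list of column names."""
    schema = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        schema[table] = [col[1] for col in cols]  # col[1] is the column name
    return schema

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
print(gather_schema(conn))
# → {'customer': ['id', 'name']}
```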
[0046] The database 204 may be configured to capture the conversation logs between the users and the cognitive robotic test data management module 108. For example, these logs comprise successful scenarios as well as failed scenarios. The successful scenarios are the ones where the cognitive robotic test data management module 108 is able to respond to the user with an appropriate response, and the failed scenarios are those where the cognitive robotic test data management module 108 could not interpret the user requirement. These logs are used by the cognitive or artificial intelligence engine 202 to learn and enhance the bag of keywords and the actions corresponding to the business requirements. The data analysis and identification module 210 may be configured to assist the user in analyzing the requirements which the user mentions in the conversations with the cognitive or artificial intelligence engine 202 using the end-user device 102. The cognitive or artificial intelligence engine 202 may be configured to identify the missing information (if any), question the user to obtain the missing information (if any), and process the requirement. The data subset module 206 may be configured to assist the users in analyzing the data subset requirements which the user mentions in the conversation with the cognitive or artificial intelligence engine 202 using the end-user device 102. The cognitive or artificial intelligence engine 202 may be configured to analyze the data size in the application source database 218 and the capacity in the application target database 220. The cognitive or artificial intelligence engine 202 may be configured to advise the user of the status of the database 204 and then, based on the decision obtained through its conversation with the user and its own analysis of the requirement, design data subset criteria.
[0047] The data generation module 212 may be configured to assist the users in analyzing the data generation requirements which the user mentions in the conversation with the cognitive or artificial intelligence engine 202 using the end-user device 102. Furthermore, the cognitive or artificial intelligence engine 202 may be configured to assess the requirement to identify if any essential information is missing. The engine obtains the missing details from the user through its conversation. In addition, the cognitive or artificial intelligence engine 202 may also be configured to analyze the database structure, the relations across tables, and referential integrity. The cognitive or artificial intelligence engine 202 may then create data generation scripts ensuring the RI (referential integrity) is not impacted and insert the data into the databases 204, 218 and 220. The data comparison module 214 may be configured to assist the users in analyzing the database comparison requirements which the user mentions in the conversation with the cognitive or artificial intelligence engine 202 through the end-user device 102. Furthermore, the cognitive or artificial intelligence engine 202 may be configured to assess the information given by the user, identify if there are any missing details, and obtain them by questioning the user through the conversation on the end-user device 102. In addition, the cognitive or artificial intelligence engine 202 may be configured to obtain information about the tables relevant to the requirement and their parent-child relationships, and design a script to insert the data or call an API to populate the data in the tables.

[0048] The cognitive or artificial intelligence engine 202 may be configured to connect to the application target database 220 via communication channels to access database schemas and/or models.
The cognitive or artificial intelligence engine 202 may be configured to extract the list of entities and the attributes within each of those entities, primary key and foreign key details, and the relationships across each of the entities. Furthermore, the cognitive or artificial intelligence engine 202 may also be configured to store the details in the database 204. These details may be used by the cognitive or artificial intelligence engine 202 to perform the actions corresponding to the keywords which are extracted from the user conversations. The cognitive robotic test data management module 108 may ask the user to enter the details which enable it to connect to the respective databases 204, 218 and 220. The users may be required to register with the application source database 218 and the application target database 220 to assist the cognitive or artificial intelligence engine 202 in connecting to the databases and executing the actions. These actions may be performed by the cognitive or artificial intelligence engine 202 using one of the multiple modules 206-216. The data strategy module 216 may be configured to determine the appropriate test data strategy for the users for the software testing and also execute the actions which the user approves as appropriate.
[0049] The users may initiate the conversation with the cognitive robotic test data management module 108 using the end-user device 102 for the natural language conversation. The cognitive or artificial intelligence engine 202 may be configured to analyze the user conversations using the rules and keywords available in the database 204. The cognitive or artificial intelligence engine 202 may be configured to decide which module 206-216 needs to be triggered based on the action defined against the keywords and execute it. The cognitive or artificial intelligence engine 202 may be configured to understand the user requirement and perform the actions with the data masking module 208. The cognitive or artificial intelligence engine 202 may also be configured to identify the requirement relevant to the data masking module 208 based on the keywords extracted from the user requirement and compare the extracted keywords with the bag of keywords and its corresponding action available in the database 204. The cognitive or artificial intelligence engine 202 may also be configured to acquire all the essential data required by the data masking module 208 to execute the action.

[0050] Referring to FIG. 3, a block diagram 300 depicts the data masking module 208 shown in FIG. 2, in accordance with one or more exemplary embodiments. The data masking module 208 may include sub-modules which may not be limited to a rule or domain module 302, a profile module 304, a database connection module 306, and a masking module 308. The cognitive or artificial intelligence engine 202 may understand the requirement and trigger the rule or domain module 302 through the data masking module 208. The rule or domain module 302 may be configured to perform the functions of displaying the rules which are already configured in the database 204 and adding or editing the masking rules.
The rules may assist the cognitive or artificial intelligence engine 202 in determining the sensitive attributes in the application databases 218 and 220 upon a request from the users. The cognitive or artificial intelligence engine 202 may be configured to feed the analysis back to the end-user device 102 for the natural language conversation, and the users may be prompted with follow-up queries, in case of any missing information, to obtain further details in order to execute the request. Furthermore, the cognitive or artificial intelligence engine 202 may be configured to prompt the user via the database connection module 306 to provide the connection details in order to connect to the application source database 218 and the application target database 220.
[0051] The cognitive or artificial intelligence engine 202 may be configured to trigger the profile module 304 if the user requests to perform metadata profiling, data profiling, or generation of a sensitivity report. The profile module 304 may use the rules which exist or are added using the rule or domain module 302 and run the rules against the application source database 218 and the application target database 220. The profile module 304 may be configured to use regular expressions and scan the metadata or data depending on the user requirement. The profile module 304 may also be configured to generate a report which lists the attributes corresponding to the entities within the application source database 218 and the application target database 220 that are sensitive in nature and may be de-sensitized or obfuscated. The data masking module 208 may be configured to help in reading the sensitivity report, assessing the appropriate masking model for each column deemed sensitive, generating a masking script, identifying parent or child relationships, generating follow-up masking scripts to ensure the same values are used to mask similar attributes across the entities, merging the scripts as a stored procedure, and executing the script. The cognitive or artificial intelligence engine 202 may be configured to perform all the actions upon receiving the requirement from the users using the end-user device 102 for the natural language conversation.
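As an illustrative sketch only, the regular-expression-based scan performed by the profile module 304 could resemble the following; the rule names, patterns, and the function `profile_columns` are assumptions introduced for illustration and are not part of the disclosed embodiments.

```python
# Hypothetical sketch: run regular-expression rules against column
# metadata to flag sensitive attributes for a sensitivity report.
import re

MASKING_RULES = {
    "email": re.compile(r"e.?mail", re.IGNORECASE),
    "ssn": re.compile(r"ssn|social.?security", re.IGNORECASE),
    "phone": re.compile(r"phone|mobile", re.IGNORECASE),
}

def profile_columns(columns: list[str]) -> dict[str, str]:
    """Return {column: matched rule} for every column deemed sensitive."""
    findings = {}
    for column in columns:
        for rule, pattern in MASKING_RULES.items():
            if pattern.search(column):
                findings[column] = rule
                break  # first matching rule wins
    return findings

print(profile_columns(["cust_id", "email_addr", "mobile_no"]))
# → {'email_addr': 'email', 'mobile_no': 'phone'}
```

A production profiler would also scan data values, not only metadata, as the paragraph above notes.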
[0052] Referring to FIG. 4, a block diagram 400 depicts the data analysis and identification module 210 shown in FIG. 2, in accordance with one or more exemplary embodiments. The cognitive or artificial intelligence engine 202 may be configured to determine the data requirement from the conversation (e.g., the conversation between the user and the cognitive robotic test data management module 108) through the end-user device 102 for the natural language conversation. The cognitive or artificial intelligence engine 202 may be configured to trigger the data analysis and identification module 210 for processing the request. The data analysis and identification module 210 may include sub-modules which may not be limited to a data analyzer module 402, a script designer module 404, and a data extraction module 406. The cognitive or artificial intelligence engine 202 may be configured to trigger the data analyzer module 402, which gathers all the keywords from the user conversation. Furthermore, the cognitive or artificial intelligence engine 202 may determine if the input is valid based on the bag of keywords available in the database 204. The data analyzer module 402 may be configured to analyze if any essential information is missing and obtain the same through its conversation with the user. Once all the essential details are available, the data analyzer module 402 may be configured to pass the input data to the script designer module 404. The script designer module 404 may be configured to perform the function of gathering the information from the data analyzer module 402 and parsing through the bag of keywords in the database 204.
[0053] Furthermore, the script designer module 404 may be configured to gather the actions against the keywords, the entities, the attribute details corresponding to the entities, and the parent-child relations across the entities, and connect to the application target database 220. The cognitive or artificial intelligence engine 202 may be configured to prompt the user through the conversations for the connection details of the application target database 220. The script designer module 404 may be configured to use all the information gathered to design a data extraction script. The script designer module 404 may be configured to pass the designed script to the data extraction module 406. The data extraction module 406 may be configured to execute the script against the application target database 220 or trigger the application programming interface or message queue interface (API/MQ) which is configured by the user (e.g., the application team) to extract the details as an alternative option. The API/MQ details which are used as an alternative option may be obtained from the database 204. The data which is extracted may be displayed for the user on the end-user device 102 for the natural language conversation or stored in an Excel spreadsheet in the working directory.
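Purely for illustration, the script design step might compose an extraction statement from the gathered entities, attributes, and filter conditions as sketched below; `build_extraction_script` and its parameters are hypothetical names, not part of the disclosed module, and a real implementation would use parameterized queries rather than string interpolation.

```python
# Hypothetical sketch: turn entities and filter conditions gathered
# from the conversation into a data-extraction SQL statement.
def build_extraction_script(entity: str, attributes: list[str],
                            conditions: dict[str, str]) -> str:
    """Compose a SELECT statement for the requested entity."""
    select_list = ", ".join(attributes) if attributes else "*"
    where = " AND ".join(f"{col} = '{val}'" for col, val in conditions.items())
    script = f"SELECT {select_list} FROM {entity}"
    if where:
        script += f" WHERE {where}"
    return script

print(build_extraction_script("customer", ["id", "name"], {"country": "US"}))
# → SELECT id, name FROM customer WHERE country = 'US'
```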
[0054] Referring to FIG. 5, a block diagram 500 depicts the data subset module 206 shown in FIG. 2, in accordance with one or more exemplary embodiments. The cognitive or artificial intelligence engine 202 may be configured to determine the requirement as a data subset requirement from the conversation (e.g., the conversation between the user and the cognitive robotic test data management module 108) through the end-user device 102 for the natural language conversation. The cognitive or artificial intelligence engine 202 may be configured to trigger the data subset module 206 for processing the request. The data subset module 206 may include sub-modules which may not be limited to a data storage and capacity analyzer module 502 and a driver file generation module 504. The cognitive or artificial intelligence engine 202 may be configured to identify the requirement relevant to the data subset module 206 based on the keywords extracted from the user requirement and compare them with the bag of keywords and its corresponding action available in the database 204. The data subset module 206 may be configured to establish a connection with the application source database 218 and the application target database 220. The users may be prompted through the conversation for the database connection details. The data subset module 206 may be configured to trigger the sub-modules, which may not be limited to the data storage and capacity analyzer module 502. The data storage and capacity analyzer module 502 may be configured to analyze the data size in the application source database 218, which in general is measured in bytes. The data storage and capacity analyzer module 502 may also be configured to measure the size of the application target database 220. The data storage and capacity analyzer module 502 may present the storage and capacity analysis to the user through the conversation on the end-user device 102 for the natural language conversation.
The cognitive or artificial intelligence engine 202 may further be configured to collect the opinion of the user to understand whether to proceed with a complete database restore for the application target database 220 from the application source database 218. The data storage and capacity analyzer module 502 may pass on the user feedback to the driver file generation module 504. The driver file generation module 504 may be configured to gather information from the user through the conversations (e.g., the conversation between the user and the cognitive robotic test data management module 108). The gathered information may include, but is not limited to, the entities or database tables essential for the software testing phase and any conditions at the database attribute or database row level to filter down the data, thereby reducing the volume and size of the data. The driver file generation module 504 may also be configured to gather the information and design a driver file which includes the script necessary to extract the cut-down version of the data. Further, the driver file generation module 504 may be configured to trigger the data storage and capacity analyzer module 502 to validate whether the application target database 220 can accommodate the reduced volumes. In the event of the application target database 220 having enough space to accommodate the newer version of the data files, the data subset module 206 may be configured to proceed with the execution of the driver file generation module 504 output. The driver file may be used to export the data from the application source database 218 and import it to the application target database 220.
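The capacity validation performed before the driver file output is executed could, as a sketch under assumed names and units (bytes), look like the following; the safety headroom of 10% is an illustrative assumption, not a disclosed parameter.

```python
# Hypothetical sketch: check that the reduced data subset fits in
# the target database before exporting and importing the data files.
def can_accommodate(subset_size_bytes: int,
                    target_capacity_bytes: int,
                    headroom: float = 0.1) -> bool:
    """Require the subset plus a safety headroom to fit in the target."""
    required = subset_size_bytes * (1 + headroom)
    return required <= target_capacity_bytes

# Source holds 500 GB; subset criteria cut it to 40 GB; target has 50 GB.
GB = 1024 ** 3
print(can_accommodate(40 * GB, 50 * GB))
# → True
```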
[0055] Referring to FIG. 6, a block diagram 600 depicts the data generation module 212 shown in FIG. 2, in accordance with one or more exemplary embodiments. The data generation module 212 may include sub-modules which may not be limited to a data generation analyzer module 602, a generation script designer module 604, and a data generator module 606. The cognitive or artificial intelligence engine 202 may be configured to identify the requirement relevant to the data generation module 212 based on the keywords extracted from the user requirement and compare the extracted keywords with the bag of keywords and its corresponding action available in the database 204. The cognitive or artificial intelligence engine 202 may be configured to trigger the data generation analyzer module 602, which gathers all the keywords from the user conversation. Furthermore, the cognitive or artificial intelligence engine 202 may be configured to determine if the input is valid based on the bag of keywords available in the database 204. The data generation analyzer module 602 may be configured to analyze if any essential information is missing and obtain the same through its conversations with the user. Once all the essential details are available, the data generation analyzer module 602 may be configured to pass the input data to the generation script designer module 604. The generation script designer module 604 may be configured to perform the function of gathering the information from the data generation analyzer module 602 and parsing through the bag of keywords in the database 204. Furthermore, the generation script designer module 604 may be configured to gather the actions against the keywords, the entities, the attribute details corresponding to the entities, and the parent-child relations across the entities, and connect to the application target database 220.
The cognitive or artificial intelligence engine 202 may be configured to prompt the user through the conversation for the connection details of the application target database 220. The generation script designer module 604 may be configured to use all the information gathered above to design a data generation script. The script is then passed by the generation script designer module 604 to the data generator module 606. The data generator module 606 may be configured to execute the script against the application target database 220 or trigger the API/MQ which is configured by the user (e.g., the application team) to insert the details as an alternative option. The API/MQ details which may be used as an alternative option may be obtained from the database 204.
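One way a data generation script can respect the referential integrity discussed in paragraph [0047] is to insert parent tables before their children. The sketch below, with an assumed table graph and the hypothetical function `insertion_order`, topologically orders tables on their foreign-key dependencies; it is illustrative only and not the disclosed implementation.

```python
# Hypothetical sketch: order tables so that parent rows exist before
# child rows referencing them are inserted (referential integrity).
def insertion_order(parents: dict[str, list[str]]) -> list[str]:
    """Return tables ordered so every parent precedes its children."""
    ordered, visiting = [], set()

    def visit(table: str):
        if table in ordered:
            return
        if table in visiting:
            raise ValueError(f"cyclic dependency at {table}")
        visiting.add(table)
        for parent in parents.get(table, []):
            visit(parent)
        visiting.discard(table)
        ordered.append(table)

    for table in parents:
        visit(table)
    return ordered

# order_line references order and product; order references customer.
deps = {"customer": [], "product": [], "order": ["customer"],
        "order_line": ["order", "product"]}
print(insertion_order(deps))
# → ['customer', 'product', 'order', 'order_line']
```

Deleting generated data would use the reverse of this order.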
[0056] Referring to FIG. 7, a flow diagram 700 depicts an exemplary method for performing test data management activities by the cognitive or artificial intelligence engine 202, in accordance with one or more exemplary embodiments. As an option, the method 700 may be carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, and FIG. 6. However, the method 700 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.
[0057] As illustrated in the flow diagram, the exemplary method 700 includes the step of conversing with the user after the user provides input data at step 702. The exemplary method 700 further includes the step of analyzing the requirement at step 704. The exemplary method 700 further includes the step of gathering essential and any missing information at step 706. The exemplary method 700 further includes the step of connecting to the application source database at step 708. The exemplary method 700 further includes the step of connecting to the application target database at step 710. The exemplary method 700 further includes the step of comparing the tables in the databases relevant to the requirement at step 712. The exemplary method 700 further includes the step of generating the report with the details of the comparison at step 714. The exemplary method 700 further includes the step of determining if the performed test data management activities have succeeded at step 716. If the performed activities have succeeded at step 716, the exemplary method 700 further includes the step of stopping at step 718. If the performed activities have not succeeded at step 716, the exemplary method 700 further includes the step of informing the user (e.g., the admin) about the failed scenario at step 720.
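The comparison and reporting steps (steps 712 and 714 above) can be sketched, under assumed names, as a diff of the rows of a table present in both databases; `compare_tables` and the sample rows are illustrative assumptions, not the disclosed implementation.

```python
# Hypothetical sketch: diff the rows of a table held in both the
# source and target databases and summarize the result in a report.
def compare_tables(source_rows: set, target_rows: set) -> dict:
    """Summarize rows only in source, only in target, and in common."""
    return {
        "only_in_source": sorted(source_rows - target_rows),
        "only_in_target": sorted(target_rows - source_rows),
        "common": len(source_rows & target_rows),
    }

report = compare_tables({(1, "Ann"), (2, "Bob")}, {(2, "Bob"), (3, "Cy")})
print(report)
# → {'only_in_source': [(1, 'Ann')], 'only_in_target': [(3, 'Cy')], 'common': 1}
```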
[0058] Referring to FIG. 8, a flow diagram 800 depicts an exemplary method for performing the actions with the data analysis and identification module 210 by the cognitive or artificial intelligence engine 202, in accordance with one or more exemplary embodiments. As an option, the method 800 may be carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 6, and FIG. 7. However, the method 800 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.
[0059] As illustrated in the flow diagram, the exemplary method 800 includes the step of conversing with the user after the user provides input data at step 802. In response to the conversation, the exemplary method 800 further includes the step of parsing the user input through the reference information in the database at step 804. The exemplary method 800 further includes the step of determining if the input data is valid at step 806. If the input data is valid at step 806, the exemplary method 800 further includes the step of determining if the receipt of all essential input data is confirmed at step 808. If the receipt of all essential input data is confirmed at step 808, the exemplary method 800 further includes the step of designing the scripts for the data extraction at step 810. As the data extraction progresses, the exemplary method 800 further includes the step of determining if the data is available at step 812. If the data is available at step 812, the exemplary method 800 includes the step of displaying the data on the end-user device 102 with an Excel export option at step 814. The exemplary method 800 further includes the step of storing the conversational log repository in the database 204 at step 816. The exemplary method 800 further includes the step of learning and enhancing the bag of keywords and actions using the conversational log repository in the database 204 at step 818. If the data is not available at step 812, the exemplary method 800 includes the step of triggering communications to the user (e.g., an admin) about the scenario at step 820. Here, the communications may include, but are not limited to, e-mails, SMS text messages, video calls, audio calls, and the like. If the receipt of all essential input data is not confirmed at step 808, then the exemplary method 800 continues at step 804. If the input data is not valid at step 806, then the exemplary method 800 continues at step 816.
[0060] More illustrative information will now be set forth regarding various optional architectures and uses in which the foregoing method may or may not be implemented, per the desires of the user. It should be strongly noted that the following information is set forth for illustrative purposes and should not be construed as limiting in any manner. Any of the following features may be optionally incorporated with or without the exclusion of other features described.
[0061] Referring to FIG. 9, a flow diagram 900 depicts an exemplary method for performing the actions with the data subset module 206 by the cognitive or artificial intelligence engine, in accordance with one or more exemplary embodiments. As an option, the method 900 may be carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 6, FIG. 7, and FIG. 8. However, the method 900 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.
[0062] As illustrated in the flow diagram, the exemplary method 900 includes the step of conversing with the user after the user provides input data at step 902. The exemplary method 900 further includes the step of gathering information from the conversation at step 904. The exemplary method 900 further includes the step of analyzing the data storage and capacity using the data storage and capacity analyzer module 502 at step 906. The exemplary method 900 further includes the step of presenting the analysis data to the user on the end-user device 102 at step 908. The exemplary method 900 further includes the step of obtaining the user’s opinion on the subset analysis at step 910. Here, the subset analysis is required based on the database and test requirement analysis. The exemplary method 900 further includes the step of determining whether the subset analysis and test requirement analysis are required at step 912. If the subset analysis and test requirement analysis are required at step 912, the exemplary method 900 includes the step of designing the subset criteria at step 914. The exemplary method 900 further includes the step of connecting to the application source database 218 and exporting the data files based on the subset criteria at step 916. The exemplary method 900 further includes the step of importing or loading the data files to the application target database 220 at step 918. If the subset analysis and test requirement analysis are not required at step 912, the exemplary method 900 includes the step of performing the data loads without any subset criteria at step 920.
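The capacity-driven branch at step 912 can be sketched as follows. The sizing inputs and the fraction-based subset criterion are illustrative assumptions, not the patented analysis:

```python
# Hypothetical sketch of the FIG. 9 subset flow: compare source data size with
# target capacity (steps 906-912) and either design a cut-down subset (steps
# 914-918) or perform the loads without subset criteria (step 920).

def plan_data_load(source_size_gb, target_capacity_gb, subset_fraction=0.1):
    # Steps 906-908: storage and capacity analysis presented to the user.
    analysis = {"source_gb": source_size_gb, "target_gb": target_capacity_gb}

    # Step 912: subsetting is required when the full source will not fit.
    if source_size_gb > target_capacity_gb:
        # Step 914: design the subset criteria (here, a simple row fraction).
        analysis["subset_required"] = True
        analysis["criteria"] = f"sample {subset_fraction:.0%} of rows per table"
        analysis["estimated_load_gb"] = round(source_size_gb * subset_fraction, 2)
    else:
        # Step 920: no subset criteria; load the full data set directly.
        analysis["subset_required"] = False
        analysis["estimated_load_gb"] = source_size_gb
    return analysis
```

A real implementation would derive the criteria from the test requirement analysis rather than a fixed fraction.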
[0063] Referring to FIG. 10, a flow diagram 1000 depicts an exemplary method for performing the actions with the data generation module 212 by the cognitive or artificial intelligence engine 202, in accordance with one or more exemplary embodiments. As an option, the method 1000 may be carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 6, FIG. 7, FIG. 8, and FIG. 9. However, the method 1000 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.
[0064] As illustrated in the flowchart, the exemplary method 1000 includes the step of conversing with the user after the user provides input data at step 1002. The exemplary method 1000 further includes the step of analyzing the requirement at step 1004. The exemplary method 1000 further includes the step of gathering essential and any missing information at step 1006. The exemplary method 1000 further includes the step of parsing through the database with the keywords at step 1008. The exemplary method 1000 further includes the step of obtaining the entity details and actions corresponding to the entities at step 1010. The exemplary method 1000 further includes the step of identifying the referential integrity (RI) and parent or child table details at step 1010. The exemplary method 1000 further includes the step of designing a script to insert the data or calling the application programming interface (API) that inserts the data at step 1012. The exemplary method 1000 further includes the step of informing the user about the successful data generation at step 1014. The exemplary method 1000 further includes the step of assisting the user to connect to the database and validate the data created at step 1016. The exemplary method 1000 further includes the step of collecting the feedback about the activity performed at step 1018. The exemplary method 1000 further includes the step of determining whether the performed activity succeeded at step 1020. If the performed activity succeeded at step 1020, the exemplary method 1000 includes the step of stopping at step 1022. If the performed activity did not succeed at step 1020, the exemplary method 1000 further includes the step of informing the user (e.g., the admin) about the failed scenario at step 1024.
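The referential-integrity-aware script design of steps 1010 and 1012 can be sketched with a topological ordering so that parent rows are inserted before child rows. The RI map and table names are hypothetical:

```python
# Hypothetical sketch of steps 1010-1012 of FIG. 10: given parent/child
# (referential-integrity) relationships, order the entities so parents are
# inserted before children, then emit one insert statement per table.

from graphlib import TopologicalSorter

def design_insert_scripts(ri_map, row_values):
    # ri_map maps each child table to its set of parent tables (step 1010).
    # static_order() yields predecessors (parents) before their dependents.
    order = list(TopologicalSorter(ri_map).static_order())
    # Step 1012: design one insert statement per table, in dependency order.
    return [f"INSERT INTO {table} VALUES ({row_values[table]})" for table in order]
```

Ordering the inserts this way prevents foreign-key violations when the generated script runs against the target database.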
[0065] Referring to FIG. 11, a flow diagram 1100 depicts an exemplary method for performing the actions with the data masking module 208 by the cognitive or artificial intelligence engine 202, in accordance with one or more exemplary embodiments. As an option, the method 1100 may be carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 6, FIG. 7, FIG. 8, FIG. 9, and FIG. 10. However, the method 1100 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.
[0066] As illustrated in the flowchart, the exemplary method 1100 includes the step of conversing with the user after the user provides input data at step 1102. The exemplary method 1100 further includes the step of analyzing the requirement at step 1104. The exemplary method 1100 further includes the step of interacting with the data masking module 208 at step 1106. The exemplary method 1100 further includes the step of performing the function of displaying the rules in the database 204 at step 1108. The exemplary method 1100 further includes the step of adding or editing the masking rules at step 1110. The exemplary method 1100 further includes the step of providing the connection details in order to connect to the application databases 218 and 220 at step 1112. Here, the connection details may include, but are not limited to, host name, port details, server name, user name and password, and the like. The exemplary method 1100 further includes the step of using the regular expressions and scanning the metadata and data depending on the user requirement at step 1114. The exemplary method 1100 further includes the step of generating the sensitivity report which lists the attributes corresponding to the entities within the application databases 218 and 220 at step 1116. The exemplary method 1100 further includes the step of reading the sensitivity report, assessing the appropriate masking model for each column deemed sensitive, generating a masking script, identifying parent/child relationships, generating follow-up masking scripts, merging the scripts as a stored procedure, and executing the script at step 1118.

[0067] More illustrative information will now be set forth regarding various optional architectures and uses in which the foregoing method may or may not be implemented, per the desires of the user.
It should be strongly noted that the following information is set forth for illustrative purposes and should not be construed as limiting in any manner. Any of the following features may be optionally incorporated with or without the exclusion of other features described.
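The scan-and-mask sequence of steps 1114 through 1118 in FIG. 11 may be sketched as follows. The regular-expression patterns, table layout, and `mask()` placeholder are illustrative assumptions, not the patented rule set:

```python
# Hypothetical sketch of steps 1114-1118 of FIG. 11: scan column names and
# sample values with regular expressions, produce a sensitivity report, and
# generate masking statements for the columns deemed sensitive.

import re

NAME_PATTERNS = re.compile(r"(ssn|email|phone|card)", re.IGNORECASE)
# Value shapes checked here: US-style SSNs and e-mail addresses (illustrative).
VALUE_PATTERNS = re.compile(r"\b\d{3}-\d{2}-\d{4}\b|[\w.]+@[\w.]+")

def sensitivity_report(table, columns):
    # Steps 1114/1116: flag a column if its name or sampled data looks sensitive.
    report = []
    for name, samples in columns.items():
        by_name = bool(NAME_PATTERNS.search(name))
        by_value = any(VALUE_PATTERNS.search(str(v)) for v in samples)
        if by_name or by_value:
            report.append({"table": table, "column": name,
                           "reason": "name" if by_name else "data"})
    return report

def masking_script(report):
    # Step 1118: generate one masking statement per sensitive column;
    # mask() stands in for whichever masking model the report selects.
    return [f"UPDATE {r['table']} SET {r['column']} = mask({r['column']});"
            for r in report]
```

A production flow would additionally walk parent/child relationships and merge the statements into a stored procedure, as step 1118 describes.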
[0068] Referring to FIG. 12, a block diagram illustrates the details of a digital processing system 1200 in which various aspects of the present disclosure are operative by execution of appropriate software instructions. Digital processing system 1200 may be used for implementing the cognitive robotic test data management module 108 (or any other system in which the various features disclosed above can be implemented).
[0069] Digital processing system 1200 may contain one or more processors such as a central processing unit (CPU) 1210, random access memory (RAM) 1220, secondary memory 1230, graphics controller 1260, display unit 1270, network interface 1280, and an input interface 1290. All the components except display unit 1270 may communicate with each other over communication path 1250, which may contain several buses as is well known in the relevant arts. The components of FIG. 12 are described below in further detail.
[0070] CPU 1210 may execute instructions stored in RAM 1220 to provide several features of the present disclosure. CPU 1210 may contain multiple processing units, with each processing unit potentially being designed for a specific task. Alternatively, CPU 1210 may contain only a single general-purpose processing unit.
[0071] RAM 1220 may receive instructions from secondary memory 1230 using communication path 1250. RAM 1220 is shown currently containing software instructions, such as those used in threads and stacks, constituting shared environment 1225 and/or user programs 1226. Shared environment 1225 includes operating systems, device drivers, virtual machines, etc., which provide a (common) run time environment for execution of user programs 1226.

[0072] Graphics controller 1260 generates display signals (e.g., in RGB format) to display unit 1270 based on data/instructions received from CPU 1210. Display unit 1270 contains a display screen to display the images defined by the display signals. Input interface 1290 may correspond to a keyboard and a pointing device (e.g., touch-pad, mouse) and may be used to provide inputs. Network interface 1280 provides connectivity to a network (e.g., using Internet Protocol), and may be used to communicate with other systems (such as those shown in FIG. 1, network 104) connected to the network.
[0073] Secondary memory 1230 may contain hard drive 1235, flash memory 1236, and removable storage drive 1237. Secondary memory 1230 may store the data and software instructions (e.g., for performing the actions noted above with respect to the Figures), which enable digital processing system 1200 to provide several features in accordance with the present disclosure.
[0074] Some or all of the data and instructions may be provided on the removable storage unit 1240, and the data and instructions may be read and provided by removable storage drive 1237 to CPU 1210. A floppy drive, magnetic tape drive, CD-ROM drive, DVD drive, flash memory, and a removable memory chip (PCMCIA card, EEPROM) are examples of such a removable storage drive 1237.
[0075] The removable storage unit 1240 may be implemented using medium and storage format compatible with removable storage drive 1237 such that removable storage drive 1237 can read the data and instructions. Thus, removable storage unit 1240 includes a computer readable (storage) medium having stored therein computer software and/or data. However, the computer (or machine, in general) readable medium can be in other forms (e.g., non-removable, random access, etc.).
[0076] In this document, the term "computer program product" is used to generally refer to the removable storage unit 1240 or hard disk installed in hard drive 1235. These computer program products are means for providing software to digital processing system 1200. CPU 1210 may retrieve the software instructions, and execute the instructions to provide various features of the present disclosure described above.

[0077] The term “storage media/medium” as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical disks, magnetic disks, or solid-state drives, such as secondary memory 1230. Volatile media includes dynamic memory, such as RAM 1220. Common forms of storage media include, for example, a floppy disk, a flexible disk, hard disk, solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, and any other memory chip or cartridge.
[0078] Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 1250. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
[0079] Reference throughout this specification to “one embodiment”, “an embodiment”, or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases “in one embodiment”, “in an embodiment”, and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0080] Although the present disclosure has been described in terms of certain preferred embodiments and illustrations thereof, other embodiments and modifications to preferred embodiments may be possible that are within the principles and spirit of the invention. The above descriptions and figures are therefore to be regarded as illustrative and not restrictive.
[0081] Thus the scope of the present disclosure is defined by the appended claims and includes both combinations and sub-combinations of the various features described hereinabove as well as variations and modifications thereof, which would occur to persons skilled in the art upon reading the foregoing description.

Claims

CLAIMS

I Claim:
1. A cognitive robotic system for test data management activities, comprising:
at least one processing device;
at least one computer-readable medium comprising computer-executable instructions that, when executed by the at least one processing device, instruct at least one end-user device to carry out test data management activities within the computer-implemented environment; and
a cognitive robotic test data management module stored in the at least one computer-readable medium, the cognitive robotic test data management module configured to interact with a user via a user interface accessible to the user via the at least one end-user device, the cognitive robotic test data management module is configured to give a response in return to the user after the user provides input data, whereby the response depends on user requirements and on the interaction with various downstream modules, the cognitive robotic test data management module comprises a cognitive or artificial intelligence engine configured to process the user input, analyze, and assist in responding to the user with answers to queries, the cognitive or artificial intelligence engine is also configured to understand the requirement scope and estimate the effort to complete the end-to-end activities within a task.
2. The cognitive robotic system of claim 1, wherein the cognitive robotic test data management module is configured to virtually assist the user, who can be a QA engineer, an operations engineer, a test data engineer, a data analyst, a data steward, etc., in each test data management activity.
3. The cognitive robotic system of claim 1, wherein the cognitive robotic test data management module comprises a data generation module configured to assist the users to analyze the data generation requirements which the user mentions in the conversation with the cognitive or artificial intelligence engine using the at least one end-user device.
4. The cognitive robotic system of claim 1, wherein the cognitive robotic test data management module comprises a database configured to store conversational logs which are used by the user for learning and enhancing knowledge repository which is also available in the database.
5. The cognitive robotic system of claim 1, wherein the cognitive robotic test data management module comprises a data subset module configured to assist the users to analyze the data subset requirements which the user mentions in the conversation with the cognitive or artificial intelligence engine using the at least one end-user device.
6. The cognitive robotic system of claim 1, wherein the cognitive or artificial intelligence engine is configured to understand the test data requirements out of the functional test and propose a test data strategy and also allow the user to build his own conditions on a graphical user interface.
7. The cognitive robotic system of claim 1, wherein the cognitive or artificial intelligence engine reads through the requirements written in a particular language, interprets them in the application terminology, and advises the user about the data provisioning approach.
8. The cognitive robotic system of claim 1, wherein the cognitive or artificial intelligence engine is configured to engage in conversations with automation bots and supply the information required for functional test automation and defect management.
9. The cognitive robotic system of claim 1, wherein the cognitive or artificial intelligence engine brings in a data reservation functionality which ensures the test data is not duplicated across the users while the non-production environment is shared.
10. The cognitive robotic system of claim 1, wherein the cognitive or artificial intelligence engine is configured to generate a template in various formats and load the data into the database and also enable the formats by gathering the required information from the rule or domain module, information obtained by reading through the schema of the database, and requirements obtained from the user.
11. The cognitive robotic system of claim 1, wherein the cognitive or artificial intelligence engine is configured to connect to the database and, upon command, gather the details of the schema, the tables in the database, the columns in the tables, and the nature of the data and feed it to the cognitive or artificial intelligence engine.
12. The cognitive robotic system of claim 1, wherein the cognitive or artificial intelligence engine is configured to automate the creation of reference information and rules that are essential to perform the test data management activities on the non-production environment as well as for data analysis in general.
13. The cognitive robotic system of claim 1, wherein the cognitive or artificial intelligence engine is configured to engage in conversation with the user in order to understand what the user requires and, if needed, guide the user on actions required to accomplish the task and also predict the test data based on the learnings from the conversations for a given functional testing scenario.
14. The cognitive robotic system of claim 1, wherein the cognitive or artificial intelligence engine is configured to analyze the data size in an application source database and the capacity in an application target database.
15. A method for test data management activities, comprising:
conversing with a cognitive robotic test data management module after a user provides input data via at least one end-user device;
analyzing the requirement and gathering essential and any missing information by the cognitive robotic test data management module, whereby the cognitive robotic test data management module interacts with the user via a user interface accessible to the user via the at least one end-user device, the cognitive robotic test data management module comprises a cognitive or artificial intelligence engine that processes the user input, analyzes it, and assists in responding to the user with answers to queries, the cognitive robotic test data management module comprises a data subset module that assists the users to analyze the data subset requirements which the user mentions in the conversation with the cognitive or artificial intelligence engine using the at least one end-user device;
connecting the cognitive or artificial intelligence engine to an application source database and an application target database, the cognitive or artificial intelligence engine comprises a data comparison module that compares tables in the application source database and the application target database relevant to a requirement;
generating information pertaining to various stages and presenting the generated information to the user based on the queries as part of the conversation by a data generation module; and
performing test data reservation to block the data for a test case by the cognitive or artificial intelligence engine and informing the user about a failed or succeeded scenario, with an ability to learn from the conversation logs and enhance its rule dictionary as well as a bag of keywords to reduce the number of failure scenarios.
16. The method of claim 15, wherein the input data parses through reference information in a database.
17. The method of claim 15, wherein the cognitive robotic test data management module comprises a data analysis and identification module that assists the user to analyze the requirements which the user mentions in the conversations with the cognitive or artificial intelligence engine using the at least one end-user device.
18. The method of claim 17, wherein the data analysis and identification module comprises a data analyzer module that gathers a plurality of keywords from the user conversation, analyzes essential information, and obtains the same through its conversation with the user.
19. The method of claim 17, wherein the data analysis and identification module comprises a script designer module that performs the function of gathering the information from the data analyzer module and parses through the plurality of keywords in the database.
20. The method of claim 19, wherein the script designer module passes the designed script to a data extractor module and the data extractor module executes the script against the application target database.
21. A computer program product comprising a non-transitory computer-readable medium having a computer-readable program code embodied therein to be executed by one or more processors, said program code including instructions to:
converse with a cognitive robotic test data management module after a user provides input data via at least one end-user device; analyze the requirement and gather essential and any missing information by the cognitive robotic test data management module, whereby the cognitive robotic test data management module interacts with the user via a user interface accessible to the user via the at least one end-user device, the cognitive robotic test data management module comprises a cognitive or artificial intelligence engine that processes the user input, analyzes it, and assists in responding to the user with answers to queries, the cognitive robotic test data management module comprises a data subset module that assists the users to analyze the data subset requirements which the user mentions in the conversation with the cognitive or artificial intelligence engine using the at least one end-user device; connect the cognitive or artificial intelligence engine to an application source database and an application target database, the cognitive or artificial intelligence engine comprises a data comparison module that compares tables in the application source database and the application target database relevant to a requirement; generate information pertaining to various stages and present the generated information to the user based on queries as part of the conversation by a data generation module; and perform test data reservation to block the data for a test case by the cognitive or artificial intelligence engine and inform the user about a failed or succeeded scenario, with an ability to learn from the conversation logs and enhance its rule dictionary as well as a bag of keywords to reduce the number of failure scenarios.
22. The computer program product of claim 21, wherein the data subset module comprises a data storage and capacity analyzer module that measures the size of the application target database.
23. The computer program product of claim 22, wherein the data storage and capacity analyzer module presents the storage and capacity analysis to the user through the conversation on the at least one end-user device for the natural language conversation.
24. The computer program product of claim 22, wherein the data subset module comprises a driver file generation module that gathers information from the user through the conversations and also designs a driver file which includes a script necessary to extract the cut-down version of the data.
25. The computer program product of claim 24, wherein the driver file generation module triggers the data storage and capacity analyzer module to validate if the application target database accommodates the reduced volumes.
PCT/IB2019/052651 2018-04-05 2019-04-01 Cognitive robotic system for test data management activities and method employed thereof WO2019193479A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201841013063 2018-04-05
IN201841013063 2018-04-05

Publications (1)

Publication Number Publication Date
WO2019193479A1 (en) 2019-10-10

Family

ID=68100163

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2019/052651 WO2019193479A1 (en) 2018-04-05 2019-04-01 Cognitive robotic system for test data management activities and method employed thereof

Country Status (1)

Country Link
WO (1) WO2019193479A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113051152A (en) * 2021-02-20 2021-06-29 武汉木仓科技股份有限公司 Task data generation method and device and processing equipment
US11907402B1 (en) 2021-04-28 2024-02-20 Wells Fargo Bank, N.A. Computer-implemented methods, apparatuses, and computer program products for frequency based operations

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106201898A (en) * 2016-07-26 2016-12-07 北京班墨科技有限责任公司 A kind of method and device of test software based on artificial intelligence
WO2017041372A1 (en) * 2015-09-07 2017-03-16 百度在线网络技术(北京)有限公司 Man-machine interaction method and system based on artificial intelligence



Legal Events

- 121 (Ep: the epo has been informed by wipo that ep was designated in this application): Ref document number: 19781100; Country of ref document: EP; Kind code of ref document: A1
- NENP (Non-entry into the national phase): Ref country code: DE
- 122 (Ep: pct application non-entry in european phase): Ref document number: 19781100; Country of ref document: EP; Kind code of ref document: A1