US20220239609A1 - System and method for executing operations in a performance engineering environment
- Publication number
- US20220239609A1 (application US 17/248,461)
- Authority
- US
- United States
- Prior art keywords
- operations
- application
- environment
- endpoint
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L43/00—Arrangements for monitoring or testing data switching networks
- H04L43/50—Testing arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/3003—Monitoring arrangements specially adapted to the computing system or computing system component being monitored
- G06F11/302—Monitoring arrangements specially adapted to the computing system or computing system component being monitored where the computing system component is a software system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3409—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
- G06F11/3419—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment by assessing time
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3664—Environments for testing or debugging software
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/30—Semantic analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/40—Processing or translation of natural language
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/46—Multiprogramming arrangements
- G06F9/54—Interprogram communication
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L43/00—Arrangements for monitoring or testing data switching networks
- H04L43/08—Monitoring or testing based on specific metrics, e.g. QoS, energy consumption or environmental parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/02—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail using automatic reactions or user delegation, e.g. automatic replies or chatbot-generated messages
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/04—Real-time or near real-time messaging, e.g. instant messaging [IM]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L43/00—Arrangements for monitoring or testing data switching networks
- H04L43/16—Threshold monitoring
Definitions
- the following relates generally to executing operations in a performance engineering environment, such as in an application testing and/or application development environment.
- Mobile performance testing typically measures key performance indicators (KPIs) from three perspectives, namely the end-user perspective, the network perspective, and the server perspective.
- the end-user perspective looks at installation, launch, transition, navigation, and uninstallation processes.
- the network perspective looks at network performance on different network types.
- the server perspective looks at transaction response times, throughput, bandwidth, and latency. This type of testing is performed in order to identify root causes of application performance bottlenecks to fix performance issues, lower the risk of deploying systems that do not meet business requirements, reduce hardware and software costs by improving overall system performance, and support individual, project-based testing and centers of excellence.
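As a hedged illustration of the server-perspective metrics named above, the following sketch derives average response time, throughput, and bandwidth from a batch of sampled transactions; the sample values, field names, and measurement window are invented for the example and are not part of the disclosure.

```python
# Illustrative derivation of server-perspective KPIs from sampled
# transactions; all values and field names are hypothetical.
samples = [
    {"response_ms": 120, "bytes": 2048},
    {"response_ms": 180, "bytes": 4096},
    {"response_ms": 150, "bytes": 1024},
]
window_s = 10  # assumed measurement window, in seconds

# Average transaction response time across the window.
avg_response_ms = sum(s["response_ms"] for s in samples) / len(samples)

# Throughput: completed transactions per second.
throughput_tps = len(samples) / window_s

# Bandwidth: bytes transferred per second.
bandwidth_bps = sum(s["bytes"] for s in samples) / window_s

print(avg_response_ms, throughput_tps, bandwidth_bps)  # → 150.0 0.3 716.8
```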
- performance engineers are typically faced with several different testing and monitoring platforms and often require specialized knowledge or training in order to use the platforms and associated applications. This can limit the number of people who are able to execute performance engineering tasks involved in implementing application testing and application development.
- performance engineering tasks can include interacting with various systems to load builds, initiate and execute tests, gather test results, analyze test results, and package or provide the results to interested parties.
- these various tasks may require access to several different systems and can require a technical understanding of how these systems work and what output they provide.
- many performance engineering tasks, such as data gathering and data analysis, can involve significant manual effort, consuming additional time. With several systems in use, it can also be difficult for a single user to manage all of the required tasks.
- FIG. 1 is a schematic diagram of an example computing environment.
- FIG. 2 is a block diagram of an example configuration of an application development environment.
- FIG. 3 is a block diagram of an example configuration of an application testing environment.
- FIG. 4 is a schematic diagram of an example of a task automation system integrated with application development and testing environments.
- FIG. 5 is a block diagram of an example configuration of a task automation system.
- FIG. 6 is a block diagram of an example configuration of an enterprise system.
- FIG. 7 is a block diagram of an example configuration of a test device used to test an application build in the application testing environment.
- FIG. 8 is a block diagram of an example configuration of a client device used to interface with, for example, the task automation system.
- FIG. 9 is a flow diagram of an example of computer executable instructions for executing tasks in a performance engineering environment such as an application testing or development environment.
- FIG. 10 is a screen shot of an example of a graphical user interface (GUI) for a chat user interface.
- FIG. 11 is a screen shot of the chat user interface of FIG. 10 after subsequent messaging.
- FIG. 12 is a flow diagram of an example of computer executable instructions for initiating periodic testing based on acquiring a latest build for an application.
- the following generally relates to a task automation system that can be integrated with or within a performance engineering environment, such as a computing environment that includes an application testing environment and/or an application development environment, to enable operations and instructions to be requested and executed using a conversational chat user interface (UI). This simplifies interactions with such an environment, as well as the gathering and consumption of data generated through performance engineering tasks and operations.
- the task automation system described herein provides both chat UI and background functionality to automate certain performance engineering tasks, in order to enable, for example, non-technical persons to execute technical or specialized work items, to enable more efficient operations within these environments, and to decrease time, effort and costs associated with manual efforts normally associated with performance engineering execution and monitoring.
- a conversational-based logic system is also provided that can be implemented with third party or built-in natural language processing (NLP) and/or natural language understanding (NLU), a user friendly GUI, and a conversation automation module or “dialog manager”.
- the logic system and GUI can allow a non-technical user to initiate tests, analyze results and trigger actions through a conversational exchange with a chatbot that is tied into a background system to automate as much of the desired process as possible.
- users can execute mobile device management, such as build download and installation for “build on build” performance engineering.
- Users can also be provided with a test execution assistant via the chatbot, to allow the user to request and initiate any type of test, such as performance tests, UI tests (for both browser and mobile versions), etc.
- the chatbot can also provide a conversational tool to allow users to conduct basic conversations about the performance testing, fetch or generate test results or status updates, determine what the results mean, what the user may be
- a “build” may refer to the process of creating an application program for a software release, by taking all the relevant source code files and compiling them and then creating build artifacts, such as binaries or executable program(s), etc.
- “Build data” may therefore refer to any files or other data associated with a build.
- the terms “build” and “build data” (or “build file”) may also be used interchangeably to commonly refer to a version or other manifestation of an application, or otherwise the code or program associated with an application that can be tested for performance related metrics.
- a device for executing operations in such a performance engineering environment includes a processor, a communications module coupled to the processor, and a memory coupled to the processor.
- the memory stores computer executable instructions that when executed by the processor cause the processor to receive via the communications module, a request to implement a task within the environment, from an input to a chat user interface.
- the computer executable instructions when executed, also cause the processor to communicate with a logic system to determine an intent from the request; determine one or more executable instructions to implement one or more operations associated with the task, based on the determined intent; and communicate via the communications module, with at least one endpoint to trigger execution of the one or more operations using the one or more executable instructions.
- the computer executable instructions when executed, also cause the processor to receive via the communications module, data from the at least one endpoint, the data associated with execution of the one or more operations; generate a conversational response to the request based on or including the data received from the at least one endpoint; and have the conversational response rendered in the chat user interface.
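The flow summarized above — receive a chat request, resolve an intent, map the intent to operations, trigger execution at an endpoint, and render a conversational reply from the returned data — can be sketched as follows. This is a minimal illustration, not the patented implementation: every name (the intent table, `resolve_intent`, the endpoint callable) is hypothetical, and the keyword matching stands in for the logic system's NLU step.

```python
# Minimal sketch of the claimed request-handling flow.
# All names and the keyword heuristic are illustrative assumptions.

INTENT_TO_OPERATIONS = {
    "start_performance_test": ["trigger_test"],
    "check_test_status": ["query_status"],
    "download_build": ["fetch_latest_build"],
}

def resolve_intent(request_text):
    """Stand-in for the logic system's NLU step: map free text to an intent."""
    text = request_text.lower()
    if "status" in text:
        return "check_test_status"
    if "test" in text:
        return "start_performance_test"
    if "build" in text:
        return "download_build"
    return "unknown"

def execute_operations(operations, endpoint):
    """Trigger each operation at the endpoint and collect the data it returns."""
    return {op: endpoint(op) for op in operations}

def handle_chat_request(request_text, endpoint):
    intent = resolve_intent(request_text)
    operations = INTENT_TO_OPERATIONS.get(intent, [])
    results = execute_operations(operations, endpoint)
    # Generate a conversational response based on or including endpoint data.
    if not results:
        return "Sorry, I didn't understand that request."
    summary = "; ".join(f"{op}: {data}" for op, data in results.items())
    return f"Done. {summary}"

# Example usage with a fake endpoint standing in for a real system:
fake_endpoint = lambda op: "initiated" if op != "query_status" else "running"
print(handle_chat_request("Please run a performance test", fake_endpoint))
# → Done. trigger_test: initiated
```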
- a method of executing operations in a performance engineering environment is executed by a device having a communications module.
- the method includes receiving via the communications module, a request to implement a task within the environment, from an input to a chat user interface; communicating with a logic system to determine an intent from the request; and determining one or more executable instructions to implement one or more operations associated with the task, based on the determined intent.
- the method also includes communicating via the communications module, with at least one endpoint to trigger execution of the one or more operations using the one or more executable instructions; receiving via the communications module, data from the at least one endpoint, the data associated with execution of the one or more operations; generating a conversational response to the request based on or including the data received from the at least one endpoint; and having the conversational response rendered in the chat user interface.
- non-transitory computer readable medium for executing operations in a performance engineering environment.
- the computer readable medium includes computer executable instructions for receiving via a communications module, a request to implement a task within the environment, from an input to a chat user interface.
- the computer readable medium also includes instructions for communicating with a logic system to determine an intent from the request; determining one or more executable instructions to implement one or more operations associated with the task, based on the determined intent; and communicating via the communications module, with at least one endpoint to trigger execution of the one or more operations using the one or more executable instructions.
- the computer readable medium also includes instructions for receiving via the communications module, data from the at least one endpoint, the data associated with execution of the one or more operations; generating a conversational response to the request based on or including the data received from the at least one endpoint; and having the conversational response rendered in the chat user interface.
- the device can further have the logic system access a language server to determine the intent from the request. This may include having the logic system use the language server to apply a natural language understanding (NLU) process to correlate the request to the one or more operations associated with the task.
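One simple way to picture the NLU correlation step is as a similarity score between the request and example utterances for each operation; the sketch below uses bag-of-words overlap purely for illustration, whereas the disclosure contemplates a language server or trained NLU model. The operation names and utterances are invented.

```python
# Illustrative correlation of a chat request to an operation via
# bag-of-words overlap; a stand-in for the NLU process, not the real one.

OPERATION_UTTERANCES = {
    "initiate_build_download": "download install latest application build",
    "initiate_performance_test": "run start execute performance test",
    "check_test_status": "check status of running test",
}

def correlate(request_text):
    """Return the operation whose example utterance best overlaps the request."""
    req = set(request_text.lower().split())
    scores = {
        op: len(req & set(words.split()))
        for op, words in OPERATION_UTTERANCES.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

print(correlate("please start the performance test"))  # → initiate_performance_test
```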
- the device can generate the executable instructions to implement the one or more operations in a format that is understandable to the at least one endpoint, wherein the device is configured to generate executable instructions in formats understandable to a plurality of disparate systems.
- the device can have the logic system communicate with the at least one endpoint via a respective application programming interface (API) exposed by a respective endpoint, to trigger execution of the one or more operations.
- the one or more operations can include initiating an application build download process, wherein the corresponding endpoint returns an initiation status.
- the one or more operations can include initiating a performance test, wherein the corresponding endpoint returns test data.
- the one or more operations can include checking the status of an executed or currently executing performance test, wherein the corresponding endpoint returns a status indicator.
- the request can be translated from a first spoken language to a second spoken language processed by the logic system.
- the device can have the logic system access a model generated by a machine learning system.
- the machine learning system can use messages exchanged via the chat user interface to build and/or refine the model over time.
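As a loose sketch of how chat messages could build and refine such a model over time, the example below logs (message, confirmed intent) pairs and refits a trivial keyword-count model from the log; the real machine learning system is not described at this level of detail, and everything here is an illustrative assumption.

```python
# Illustrative only: a keyword-count "model" refit from logged chat
# exchanges, standing in for the machine learning system described above.
from collections import Counter, defaultdict

# Chat exchanges logged as (message, confirmed intent) pairs.
training_log = [
    ("run the nightly performance test", "start_test"),
    ("kick off a load test please", "start_test"),
    ("is my test finished yet", "check_status"),
    ("what's the status of run 12", "check_status"),
    ("grab the newest android build", "download_build"),
    ("install the latest build", "download_build"),
]

def build_model(log):
    """Count which words appear per intent; refit as new exchanges accumulate."""
    word_counts = defaultdict(Counter)
    for message, intent in log:
        word_counts[intent].update(message.lower().split())
    return word_counts

def predict(model, message):
    """Pick the intent whose logged vocabulary best matches the message."""
    words = message.lower().split()
    return max(model, key=lambda intent: sum(model[intent][w] for w in words))

model = build_model(training_log)
print(predict(model, "please start a performance test"))  # → start_test
```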
- the performance engineering environment includes an application testing environment and an application development environment.
- the device can be coupled to both the application testing and application development environments to provide a centrally placed chat user interface to execute operations and receive data and status information from a plurality of disparate systems.
- FIG. 1 illustrates an exemplary computing environment 8 .
- the computing environment 8 may include an application testing environment 10 , an application development environment 12 , and a communications network 14 connecting one or more components of the computing environment 8 .
- the computing environment 8 may also include or otherwise be connected to an application deployment environment 16 , which provides a platform, service, or other entity responsible for posting or providing access to applications that are ready for use by client devices.
- the computing environment 8 may also include or otherwise be connected to a task automation system 24 , which provides both chat UI and background functionality to automate certain performance engineering tasks, in order to enable, for example, non-technical workforce to execute technical or specialized work items, to enable more efficient operations within these environments, and to decrease time, effort and costs associated with manual efforts normally associated with performance engineering execution and monitoring.
- the application development environment 12 includes or is otherwise coupled to one or more repositories or other data storage elements for storing application build data 18 .
- the application build data 18 can include any computer code and related data and information for an application to be deployed, e.g., for testing, execution or other uses.
- the application build data 18 can be provided via one or more repositories and include the data and code required to perform application testing on a device or simulator.
- While FIG. 1 illustrates a number of test devices 22 that resemble mobile communication devices, such test devices 22 can also include simulators, simulation devices, or simulation processes, all of which may be collectively referred to herein as “test devices 22 ” for ease of illustration.
- the application testing environment 10 may include or otherwise have access to one or more repositories or other data storage elements for storing application test data 20 , which includes any files, reports, information, results, metadata or other data associated with and/or generated during a test implemented within the application testing environment 10 .
- Also shown in FIG. 1 is a client device 26 , which may represent any electronic device that can be operated by a user to interact with or otherwise use the task automation system 24 as herein described.
- the computing environment 8 may be part of an enterprise or other organization that both develops and tests applications.
- in some implementations, the communication network 14 may not be required to provide connectivity between the application development environment 12 , the task automation system 24 , and the application testing environment 10 , as such connectivity can instead be provided by an internal network.
- the application development environment 12 , task automation system 24 , and application testing environment 10 may also be integrated into the same enterprise environment as subsets thereof. That is, the configuration shown in FIG. 1 is illustrative only.
- the computing environment 8 can include multiple enterprises or organizations, e.g., wherein separate organizations are configured to, and responsible for, application testing and application development. For example, an organization may contract a third-party to develop an app for their organization but perform testing internally to meet proprietary or regulatory requirements.
- the application deployment environment 16 may likewise be implemented in several different ways.
- the deployment environment 16 may include an internal deployment channel for employee devices, may include a public marketplace such as an app store, or may include any other channel that can make the app available to clients, consumers or other users.
- One example of the computing environment 8 may include a financial institution system (e.g., a commercial bank) that provides financial services accounts to users and processes financial transactions associated with those financial service accounts.
- Such a financial institution system may provide to its customers various browser-based and mobile applications, e.g., for mobile banking, mobile investing, mortgage management, etc.
- Test devices 22 can be, or be simulators for, client communication devices that would normally be associated with one or more users. Users may be referred to herein as customers, clients, correspondents, or other entities that interact with the enterprise or organization associated with the computing environment 8 via one or more apps. Such customer communication devices are not shown in FIG. 1 since such devices would typically be used outside of the computing environment 8 in which the development and testing occurs.
- Client device 26 shown in FIG. 1 may be a similar type of device as a customer communication device and is shown to illustrate a manner in which an individual can interact with the task automation system 24 . However, it may be noted that such customer communication devices and/or client device 26 may be connectable to the application deployment environment 16 , e.g., to download newly developed apps, to update existing apps, etc.
- a user may operate a customer communication device such that the device performs one or more processes consistent with what is being tested in the disclosed embodiments.
- for example, the user may use a customer device to engage and interface with a mobile or web-based banking application which has been developed and tested within the computing environment 8 as herein described.
- test devices 22 , customer devices, and client device 26 can include, but are not limited to, a personal computer, a laptop computer, a tablet computer, a notebook computer, a hand-held computer, a personal digital assistant, a portable navigation device, a mobile phone, a wearable device, a gaming device, an embedded device, a smart phone, a virtual reality device, an augmented reality device, third party portals, an automated teller machine (ATM), and any additional or alternate computing device, and may be operable to transmit and receive data across communication networks such as the communication network 14 shown by way of example in FIG. 1 .
- Communication network 14 may include a telephone network, cellular, and/or data communication network to connect different types of electronic devices.
- the communication network 14 may include a private or public switched telephone network (PSTN), mobile network (e.g., code division multiple access (CDMA) network, global system for mobile communications (GSM) network, and/or any 3G, 4G, or 5G wireless carrier network, etc.), WiFi or other similar wireless network, and a private and/or public wide area network (e.g., the Internet).
- the computing environment 8 may also include a cryptographic server (not shown) for performing cryptographic operations and providing cryptographic services (e.g., authentication (via digital signatures), data protection (via encryption), etc.) to provide a secure interaction channel and interaction session, etc.
- a cryptographic server can also be configured to communicate and operate with a cryptographic infrastructure, such as a public key infrastructure (PKI), certificate authority (CA), certificate revocation service, signing authority, key server, etc.
- the cryptographic server and cryptographic infrastructure can be used to protect the various data communications described herein, to secure communication channels therefor, authenticate parties, manage digital certificates for such parties, manage keys (e.g., public and private keys in a PKI), and perform other cryptographic operations that are required or desired for particular applications of the application development environment 12 , task automation system 24 , and/or application testing environment 10 .
- the cryptographic server may be used to protect data within the computing environment 8 (including the application build data 18 and/or application test data 20 ) by way of encryption for data protection, digital signatures or message digests for data integrity, and by using digital certificates to authenticate the identity of the users and entity devices with which the application development environment 12 , task automation system 24 , and application testing environment 10 communicate to inhibit data breaches by adversaries. It can be appreciated that various cryptographic mechanisms and protocols can be chosen and implemented to suit the constraints and requirements of the particular deployment of the application development environment 12 and application testing environment 10 as is known in the art.
- the application development environment 12 may include an editor module 30 , a version and access control manager 32 , one or more libraries 34 , and a compiler 36 , which would be typical components utilized in application development.
- the application development environment 12 also includes the application build data 18 , which, while shown within the environment 12 , may also be a separate entity (e.g., repository) used to store and provide access to the stored build files.
- the application development environment 12 also includes or is provided with (e.g., via an application programming interface (API)), a development environment interface 38 .
- the development environment interface 38 provides communication and data transfer capabilities between the application development environment 12 and the application testing environment 10 from the perspective of the application development environment 12 . As shown in FIG. 2 , the development environment interface 38 can connect to the communication network 14 to send/receive data and communications to/from the application testing environment 10 , including instructions or commands initiated by/from the task automation system 24 , as discussed further below.
- the editor module 30 can be used by a developer/programmer to create and edit program code associated with an application being developed. This can include interacting with the version and access control manager 32 to control access to current build files and libraries 34 while honoring permissions and version controls.
- the compiler 36 may then be used to compile an application build file and other data to be stored with the application build data 18 .
- a typical application or software development environment 12 may include other functionality, modules, and systems, details of which are omitted for brevity and ease of illustration. It can also be appreciated that the application development environment 12 may include modules, accounts, and access controls for enabling multiple developers to participate in developing an application, and modules for enabling an application to be developed for multiple platforms.
- a mobile application may be developed by multiple teams, each team potentially having multiple programmers. Also, each team may be responsible for developing the application on a different platform, such as Apple iOS or Google Android for mobile versions, and Google Chrome or Microsoft Edge for web browser versions. Similarly, applications may be developed for deployment on different device types, even with the same underlying operating system.
- the application testing environment 10 can automatically obtain and deploy the latest builds to perform application testing in different scenarios.
- Such scenarios can include not only different device types, operating systems, and versions, but also the same build under different operating conditions.
- the application development environment 12 may be implemented using one or more computing devices such as terminals, servers, and/or databases, having one or more processors, communications modules, and database interfaces.
- Such communications modules may include the development environment interface 38 , which enables the application development environment 12 to communicate with one or more other components of the computing environment 8 , such as the application testing environment 10 , via a bus or other communication network, such as the communication network 14 .
- While not delineated in FIG. 2 , the application development environment 12 (and any of its devices, servers, databases, etc.) includes at least one memory or memory device that can include a tangible and non-transitory computer-readable medium having stored therein computer programs, sets of instructions, code, or data to be executed by the one or more processors.
- FIG. 2 illustrates examples of modules, tools and engines stored in memory within the application development environment 12 . It can be appreciated that any of the modules, tools, and engines shown in FIG. 2 may also be hosted externally and be available to the application development environment 12 , e.g., via communications modules such as the development environment interface 38 .
- the application testing environment 10 includes a testing environment interface 40 , which is coupled to the development environment interface 38 in the application development environment 12 , a testing execution module 42 , and one or more testing hosts 44 .
- the testing environment interface 40 can provide a UI for personnel or administrators in the application testing environment 10 to coordinate an automated build management process as herein described and to initiate or manage a test execution process as herein described.
- the testing environment interface 40 can also include, as illustrated in FIG. 3 , the task automation system 24 to provide such UI for personnel or administrators, e.g., via a chat UI as described in greater detail below.
- the testing environment interface 40 can provide a platform on which the task automation system 24 can operate to instruct the development environment interface 38 , e.g., by sending a message or command via the communication network 14 , to access the application build data 18 to obtain the latest application build(s) based on the number and types of devices being tested by the testing host(s) 44 .
- the latest application builds are then returned to the application testing environment 10 by the development environment interface 38 to execute an automated build retrieval operation.
- the application build data 18 can be sent directly to the testing host(s) 44 and thus the testing host(s) 44 can also be coupled to the communication network 14 .
- the application build data 18 can also be provided to the testing host(s) 44 via the testing environment interface 40 , e.g., through messages handled by the task automation system 24 via the chat UI 52 (see also FIG. 4 ).
- the host(s) 44 in this example have access to a number of test devices 22 which, as discussed above, can be actual devices or simulators for certain devices.
- the testing host(s) 44 are also scalable, allowing for additional test devices 22 to be incorporated into the application testing environment 10 . For example, a new test device 22 may be added when a new device type is released and will be capable of using the application being tested.
- the application on each test device 22 can be configured to point to the appropriate environment under test and other settings can be selected/deselected.
- the test devices 22 are also coupled to the testing execution module 42 to allow the testing execution module 42 to coordinate tests 46 to evaluate metrics, for example, by executing tests for application traffic monitoring, determining UI response times, examining device logs, and determining resource utilization metrics (with Test 1, Test 2, . . . , Test N shown in FIG. 3 for illustrative purposes).
- the tests 46 can generate data logs, reports and other outputs, stored as application test data 20 , which can be made available to various entities or components, such as a dashboard 48 .
- the framework shown in FIG. 3 enables the application testing environment 10 to download the latest builds from the respective repositories for the respective device/OS platform(s) and run a UI flow on all test devices 22 to configure the environment, disable system pop-ups, and set feature flags. In this way, the framework can automate the build download and installation process.
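The automated build-and-setup flow described above can be sketched as follows. This is a hedged illustration only: the function name, step labels, build fields, and feature flags are assumptions made for the example, not the actual framework API.

```python
# Illustrative sketch: install the latest build on a test device, point the
# application at the environment under test, disable system pop-ups, and set
# feature flags. All identifiers here are hypothetical.
def prepare_test_device(device, build, feature_flags, environment="staging"):
    """Return the ordered setup steps the UI flow would run on one device."""
    steps = [
        ("install_build", build["version"]),
        ("point_to_environment", environment),
        ("disable_system_popups", True),
    ]
    for flag, value in feature_flags.items():
        steps.append(("set_feature_flag", (flag, value)))
    return {"device": device, "steps": steps}

plan = prepare_test_device("ios-sim-01", {"version": "2.4.1"}, {"new_login": True})
```

In a real deployment each step would be executed against the device or simulator via the testing host, rather than returned as data.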
- the framework shown in FIG. 3 can also enable tests 46 to be initiated, status updates for such tests 46 to be obtained, and other information gathered concerning the tests 46 and/or test data 20 , through inputs interpreted by a chat UI of the task automation system 24 .
- While the testing environment interface 40 , the testing host(s) 44 , and the testing execution module 42 are shown as separate modules in FIG. 3 , such modules may be combined in other configurations and thus the delineations shown in FIG. 3 are for illustrative purposes.
- Turning now to FIG. 4 , a schematic diagram of the task automation system 24 , integrated with the application testing environment 10 and application development environment 12 , is shown.
- the configuration shown in FIG. 4 provides a backend system that can be implemented with a conversational-style message exchange user interface, also referred to herein as a “chat” user interface (UI) 52 .
- a task automation user interface 50 is provided, which includes the chat UI 52 , namely an application or front-end for users of client devices 26 to interact with the task automation system 24 .
- the task automation user interface 50 is coupled to a logic system 54 , which is used to determine an intent or instruction from a request made to the chat UI 52 via a chat message.
- the logic system 54 can be implemented using, for example, the open source Botpress™ conversational AI platform.
- the logic system 54 is coupled to a language server 58 in this example to access and leverage NLP and NLU modules/processes to assist in determining the intent from the request. It can be appreciated that while the language server 58 is provided as a separate component in FIG. 4 , the logic system 54 can also adopt or otherwise include or provide the functionalities of the language server 58 .
- the language server 58 can also be used to allow spoken language translation, e.g., to allow users to input messages in one spoken language that can be translated and interpreted without having the user perform any translation at his/her end. In this way, the task automation system 24 can provide a convenient way for users to interact in a language with which they are comfortable, which can make interpreting the requests more accurate.
- the logic system 54 includes an API module 56 that is configured to make API calls to the different endpoints that can/should be reached based on the intent of the user's request.
- the API module 56 can therefore maintain the appropriate API calls for the various endpoints in the computing environment 8 that can be interacted with or controlled via the chat UI 52 .
- the API module 56 is therefore coupled to one or more internal APIs 62 that connect into the various endpoints.
- the API module 56 can interface with a conversation automation module 60 , which can be an internal component or third party component that can initiate a conversation in the chat UI 52 and collect responses as if it was a regular or routine conversation.
- the internal APIs 62 in this example enable the logic system 54 to communicate with an application build download module 64 , mobile UI testing module 66 , reporting and logging module 68 , and performance testing module 69 by way of example. It can be appreciated that other types of endpoints with API access can be plugged into the configuration shown in FIG. 4 .
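The mapping from an inferred intent to one of the pluggable endpoints above can be sketched as a simple dispatch table. This is a hypothetical illustration: the intent names, endpoint identifiers, and call format are assumptions for the example, not the actual internal APIs 62.

```python
# Hypothetical intent-to-endpoint routing, standing in for the API module's
# maintained API calls (cf. modules 64, 66, 68, 69 described above).
ENDPOINTS = {
    "download_build": "application_build_download",
    "run_ui_test": "mobile_ui_testing",
    "get_report": "reporting_and_logging",
    "run_perf_test": "performance_testing",
}

def dispatch(intent: str, params: dict) -> dict:
    """Compose the internal API call for a recognized intent."""
    endpoint = ENDPOINTS.get(intent)
    if endpoint is None:
        raise ValueError(f"no endpoint registered for intent {intent!r}")
    return {"endpoint": endpoint, "params": params}

call = dispatch("run_ui_test", {"platform": "android"})
```

New endpoint types would then be "plugged in" by registering an additional entry in the table, mirroring the extensibility noted above.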
- the user can, e.g., via their client device 26 , initiate a request at stage 1 .
- the task automation user interface 50 calls the logic system 54 to process the request, e.g., by inferring an intent from the text in a message input to the chat UI 52 .
- the logic system 54 may return data at stage 3 , either in response to the call at stage 2 or later, if additional data such as statistics, dashboards, logs, etc., needs to be gathered.
- the logic system 54 can interface and communicate with the language server 58 to apply NLP/NLU processes to determine an intent from the request that can be translated into an actionable set of commands or executable instructions via one or more API calls to the internal APIs 62 at stage 5 .
- the API module 56 can also be executed at stage 6 , to initiate a conversational response that incorporates any feedback from the API calls.
- Stage 7 in this example includes four sub-stages 7 a , 7 b , 7 c , and 7 d , each of which corresponds to an API call directed to a corresponding endpoint.
- sub-stage 7 a includes a call to the application build download module 64 to have the application testing environment 10 request the latest build from the application development environment 12 .
- testing execution or status updates can be initiated/fetched (sub-stages 7 b , 7 d ), and reports or logs can be requested at sub-stage 7 c.
- Sub-stages 8 a , 8 b , 8 c , and 8 d represent individual endpoint responses to the request generated by the respective module 64 , 66 , 68 , and 69 . These responses can be collected by the conversation automation module 60 to generate one or more responses to the request that can be rendered in the chat UI 52 at stage 9 . This allows the user of the client device 26 to view or otherwise observe such response(s) at stage 10 , e.g., using a chat application 138 (see FIG. 8 ) on the client device 26 .
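The stage 1 through stage 10 flow above can be compressed into a short sketch: a chat message comes in, an intent is inferred, an endpoint is called, and the collected response is rendered conversationally. The inference and endpoint layers are stubbed here with trivial stand-ins; none of these function names belong to the actual system.

```python
# Toy end-to-end walk-through of the request flow (stages 1-10), with the
# logic system, language server, and endpoints replaced by stubs.
def infer_intent(message: str) -> str:
    # Stand-in for the logic system / language server (stages 2-5).
    return "run_ui_test" if "test" in message.lower() else "unknown"

def call_endpoint(intent: str) -> str:
    # Stand-in for an internal API call and endpoint response (stages 7-8).
    return f"{intent}: started"

def handle_request(message: str) -> str:
    # Stand-in for conversation automation rendering the reply (stages 6, 9).
    intent = infer_intent(message)
    return f"Done - {call_endpoint(intent)}"

reply = handle_request("Please initiate a UI test on the latest mobile release")
```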
- the task automation system 24 thus provides a backend platform on which the chat UI 52 can sit to enable the user of a client device 26 to interact with the performance engineering environment (i.e., the computing environment 8 in this example) to execute, initiate, request or perform various operations, tasks or routines, without necessarily requiring knowledge or expertise of the underlying system(s) that are required to perform these operations.
- the task automation system 24 also leverages a logic system 54 and language server 58 to provide a seamless conversational experience for the user while implementing largely technical background tasks, by inferring the intent of the user's request from the messages exchanged with the chat UI 52 . That is, the chat UI 52 can provide both a front-end messaging-based user interface as well as an embedded or background “chatbot” with which to communicate.
- the task automation system 24 may include one or more processors 70 , a communications module 72 , and a database interface module 74 for interfacing with the datastores for the build data 18 and test data 20 to retrieve, modify, and store (e.g., add) data.
- Communications module 72 enables the task automation system 24 to communicate with one or more other components of the computing environment 8 , such as client device 26 (or one of its components), via a bus or other communication network, such as the communication network 14 . While not delineated in FIG. 5 , the task automation system 24 includes at least one memory or memory device that can include a tangible and non-transitory computer-readable medium having stored therein computer programs, sets of instructions, code, or data to be executed by processor 70 .
- FIG. 5 illustrates examples of modules, tools and engines stored in memory on the task automation system 24 and operated by the processor 70 . It can be appreciated that any of the modules, tools, and engines shown in FIG. 5 may also be hosted externally and be available to the task automation system 24 , e.g., via the communications module 72 . In the example embodiment shown in FIG.
- the task automation system 24 includes the logic system 54 , which includes a recommendation engine 76 , a machine learning engine 78 , a classification module 80 , a training module 82 , and a trained model 84 .
- the logic system 54 also includes or has access to a language server interface module 88 to interface and/or communicate with the language server 58 as described above.
- the task automation system 24 also includes an access control module 86 and the task automation user interface 50 .
- the task automation user interface 50 includes or has access to the chat UI 52 as shown in FIG. 4 .
- the task automation system 24 also includes the conversation automation module 60 , the API module 56 (which may instead be part of the logic system 54 ), and an enterprise system interface module 87 .
- the recommendation engine 76 is used by the logic system 54 of the task automation system 24 to generate one or more recommendations for the task automation system 24 and/or a client device 26 that is/are related to an association between inputs (requests) to the chat UI 52 and responses to these requests.
- the logic system 54 can obtain a textual input from a user of the client device 26 requesting that a test be initiated, a status update be obtained, or other data be gathered with respect to a performance engineering task. As discussed above, this can include accessing the language server 58 via the language server interface module 88 in order to apply NLP/NLU processes.
- a recommendation as used herein may refer to a prediction, suggestion, inference, association or other recommended identifier that can be used to generate a suggestion, notification, command, instruction or other data that can be viewed, used or consumed by the task automation system 24 , the testing environment interface 40 and/or the client devices 26 interacting with same.
- the recommendation engine 76 can access chat data (not shown) stored for/by the chat UI 52 and apply one or more inference processes to generate the recommendation(s).
- the recommendation engine 76 may utilize or otherwise interface with the machine learning engine 78 to both classify data currently being analyzed to generate a suggestion or recommendation, and to train classifiers using data that is continually being processed and accumulated by the task automation system 24 .
- the recommendation engine 76 can learn request- or response-related preferences and revise and refine classifications, rules or other analytics-related parameters over time.
- the logic system 54 can be used to update and refine the trained model 84 using the training module 82 as client devices 26 interact with the chat UI 52 , to improve the NLP/NLU parameters and the understanding of how users interact with the performance engineering environment.
- the machine learning engine 78 may also perform operations that classify the chat data in accordance with corresponding classifications parameters, e.g., based on an application of one or more machine learning algorithms to the data or groups of the data (also referred to herein as “chat content”, “conversation content”, “user requests” or “user intent”).
- the machine learning algorithms may include, but are not limited to, a one-dimensional, convolutional neural network model (e.g., implemented using a corresponding neural network library, such as Keras®), and the one or more machine learning algorithms may be trained against, and adaptively improved, using elements of previously classified profile content identifying suitable matches between content identified and potential actions to be executed.
- the recommendation engine 76 may further process each element of the content to identify, and extract, a value characterizing the corresponding one of the classification parameters, e.g., based on an application of one or more additional machine learning algorithms to each of the elements of the chat-related content.
- the additional machine learning algorithms may include, but are not limited to, an adaptive NLP algorithm that, among other things, predicts starting and ending indices of a candidate parameter value within each element of the content, extracts the candidate parameter value in accordance with the predicted indices, and computes a confidence score for the candidate parameter value that reflects a probability that the candidate parameter value accurately represents the corresponding classification parameter.
- the one or more additional machine learning algorithms may be trained against, and adaptively improved using, the locally maintained elements of previously classified content.
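The index-prediction step described above (predicting starting and ending indices of a candidate parameter value, then computing a confidence score) can be illustrated with a toy stand-in. A real system would use the trained adaptive NLP model; here a keyword lookup substitutes, purely so the shape of the output is visible, and the fixed confidence value is a placeholder rather than a trained estimate.

```python
# Toy stand-in for candidate-parameter extraction: locate a value in the
# message, report its start/end indices, and attach a placeholder score.
def extract_candidate(message: str, keyword: str):
    start = message.lower().find(keyword)
    if start == -1:
        return None
    end = start + len(keyword)
    return {
        "value": message[start:end],
        "start": start,
        "end": end,
        "confidence": 0.5,  # placeholder, not a model-derived probability
    }

cand = extract_candidate("run the perf suite on android", "android")
```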
- Classification parameters may be stored and maintained using the classification module 80 , and training data may be stored and maintained using the training module 82 .
- the trained model 84 may also be created, stored, refined, updated, re-trained, and referenced by the task automation system 24 (e.g., by way of the logic system 54 ) to determine associations between request messages and suitable responses or actions, and/or content related thereto. Such associations can be used to generate recommendations or suggestions for improving the conversational exchange and understanding of the users' intents via the text or other information input to the chat UI 52 .
- classification data stored in the classification module 80 may identify one or more parameters, e.g., “classification” parameters, that facilitate a classification of corresponding elements or groups of recognized content based on any of the exemplary machine learning algorithms or processes described herein.
- the one or more classification parameters may correspond to parameters that can indicate an affinity or compatibility between the request and response (chat) data, and certain potential actions.
- a request to initiate a test can include recognition of the message as being a request and the parsing of the message to determine a suitable endpoint and instruction, command or request to forward along to that endpoint.
- the additional, or alternate, machine learning algorithms may include one or more adaptive, NLP algorithms capable of parsing each of the classified portions of the content and predicting a starting and ending index of the candidate parameter value within each of the classified portions.
- adaptive, NLP algorithms include, but are not limited to, NLP models that leverage machine learning processes or artificial neural network processes, such as a named entity recognition model implemented using a SpaCy® library.
- Examples of these adaptive, machine learning processes include, but are not limited to, one or more artificial, neural network models, such as a one-dimensional, convolutional neural network model, e.g., implemented using a corresponding neural network library, such as Keras®.
- the one-dimensional, convolutional neural network model may implement one or more classifier functions or processes, such as a Softmax® classifier, capable of predicting an association between an element of event data (e.g., a value or type of data being augmented with an event or workflow) and a single classification parameter and additionally, or alternatively, multiple classification parameters.
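The final classification step can be illustrated with the softmax function itself: class scores are mapped to probabilities and the highest-probability class is selected. In this hedged sketch the scores are hard-coded assumptions; in the described system they would come from the trained convolutional model, and the class labels here are invented for the example.

```python
# Softmax over hypothetical class scores, standing in for the classifier
# output of the one-dimensional convolutional model described above.
import math

def softmax(scores):
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

classes = ["download_build", "run_ui_test", "get_report"]
scores = [0.2, 2.1, 0.4]          # hypothetical model logits
probs = softmax(scores)
predicted = classes[probs.index(max(probs))]
```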
- machine learning engine 78 may perform operations that classify each of the discrete elements of event- or workflow-related content as a corresponding one of the classification parameters, e.g., as obtained from classification data stored by the classification module 80 .
- the outputs of the machine learning algorithms or processes may then be used by the recommendation engine 76 to generate one or more suggested recommendations, instructions, commands, notifications, rules, or other instructional or observational elements that can be presented to the client device 26 via the chat UI 52 .
- the access control module 86 may be used to apply a hierarchy of permission levels or otherwise apply predetermined criteria to determine what chat data or other client/user, financial or transactional data can be shared with which entity in the computing environment 8 .
- the task automation system 24 may have been granted access to certain sensitive user profile data for a user, which is associated with a certain client device 26 in the computing environment 8 .
- certain client data may include potentially sensitive information such as age, date of birth, or nationality, which may not necessarily be needed by the task automation system 24 to execute certain actions (e.g., to more accurately determine the spoken language or conversational style of that user).
- the access control module 86 can be used to control the sharing of certain client data or chat data based on a permission or preference, or any other restriction imposed by the computing environment 8 or application in which the task automation system 24 is used.
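A hierarchy of permission levels of the kind described above can be sketched as a small lookup. The level names, field names, and default policy here are assumptions made for illustration; the actual access control module 86 may apply different and richer criteria.

```python
# Hypothetical permission hierarchy: a field may be shared only with a
# requester whose level is at least the field's required level. Unknown
# fields default to the most restrictive level.
LEVELS = {"public": 0, "internal": 1, "sensitive": 2}

FIELD_LEVELS = {
    "preferred_language": "internal",
    "date_of_birth": "sensitive",
}

def may_share(field: str, requester_level: str) -> bool:
    required = LEVELS[FIELD_LEVELS.get(field, "sensitive")]
    return LEVELS[requester_level] >= required
```

Under this sketch, a component cleared for "internal" data could read a user's preferred language (e.g., to pick a conversational language) but not their date of birth.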
- the task automation system 24 in this example also includes the task automation user interface 50 described above, which provides the chat UI 52 .
- the conversation automation module 60 and API module 56 are also shown in FIG. 5 which, as described above, can be used to generate a response to a request entered in the chat UI 52 and communicate with endpoints within the performance engineering environment, such as the application testing environment 10 or application development environment 12 , to implement a task such as initiating a test or obtaining status updates, etc.
- the logic system 54 as well as the task automation system 24 can be considered one or more devices having a processor 70 , memory and a communications module 72 configured to work with, or as part of, the computing environment 8 , to perform the operations described herein. It can be appreciated that the various elements of the task automation system 24 and logic system 54 are shown delineated as such in FIG. 5 for illustrative purposes and clarity of description and could be provided using other configurations and distribution of functionality and responsibilities.
- the task automation system 24 may also include the enterprise system interface module 87 to provide a graphical user interface (GUI) or API connectivity to communicate with an enterprise system 90 (see FIG. 6 ) to obtain client data 98 for a certain user interacting with the task automation system 24 .
- the enterprise system interface module 87 may also provide a web browser-based interface, an application or “app” interface, a machine language interface, etc.
- Turning now to FIG. 6 , an example configuration of an enterprise system 90 is shown.
- the enterprise system 90 includes a communications module 92 that enables the enterprise system 90 to communicate with one or more other components of the computing environment 8 , such as the application testing environment 10 , application development environment 12 , or task automation system 24 , via a bus or other communication network, such as the communication network 14 .
- the enterprise system 90 includes at least one memory or memory device that can include a tangible and non-transitory computer-readable medium having stored therein computer programs, sets of instructions, code, or data to be executed by one or more processors (not shown for clarity of illustration).
- FIG. 6 illustrates examples of servers and datastores/databases operable within the enterprise system 90 .
- the enterprise system 90 includes one or more servers to provide access to client data 98 , e.g., to assist in determining an intent from a request input to the chat UI 52 or for development or testing purposes.
- Exemplary servers include a mobile application server 94 , a web application server 96 and a data server 100 .
- the enterprise system 90 may also include a cryptographic server for performing cryptographic operations and providing cryptographic services.
- the cryptographic server can also be configured to communicate and operate with a cryptographic infrastructure.
- the enterprise system 90 may also include one or more data storage elements for storing and providing data for use in such services, such as data storage for storing client data 98 .
- Mobile application server 94 supports interactions with a mobile application installed on client device 26 (which may be similar or the same as a test device 22 ). Mobile application server 94 can access other resources of the enterprise system 90 to carry out requests made by, and to provide content and data to, a mobile application on client device 26 . In certain example embodiments, mobile application server 94 supports a mobile banking application to provide payments from one or more accounts of a user, among other things.
- Web application server 96 supports interactions using a website accessed by a web browser application running on the client device. It can be appreciated that the mobile application server 94 and the web application server 96 can provide different front ends for the same application, that is, the mobile (app) and web (browser) versions of the same application.
- the enterprise system 90 may provide a banking application that can be accessed via a smartphone or tablet app while also being accessible via a browser on any browser-enabled device.
- the client data 98 can include, in an example embodiment, financial data that is associated with users of the client devices (e.g., customers of the financial institution).
- the financial data may include any data related to or derived from financial values or metrics associated with customers of a financial institution system (i.e., the enterprise system 90 in this example), for example, account balances, transaction histories, line of credit available, credit scores, mortgage balances, affordability metrics, investment account balances, investment values and types, among many others.
- Other metrics can be associated with the financial data, such as financial health data that is indicative of the financial health of the users of the client devices 26 .
- An application deployment module 102 is also shown in the example configuration of FIG. 6 to illustrate that the enterprise system 90 can provide its own mechanism to deploy the developed and tested applications onto client devices 26 within the enterprise. It can be appreciated that the application deployment module 102 can be utilized in conjunction with a third-party deployment environment such as an app store to have tested applications deployed to employees and customers/clients.
- Turning now to FIG. 7 , an example configuration of a test device 22 is shown. It can be appreciated that the test device 22 shown in FIG. 7 can correspond to an actual device or represent a simulation of such a device 22 .
- the test device 22 may include one or more processors 110 , a communications module 112 , and a data store 124 storing device data 126 and application data 128 .
- Communications module 112 enables the test device 22 to communicate with one or more other components of the computing environment 8 via a bus or other communication network, such as the communication network 14 . While not delineated in FIG. 7 , the test device 22 includes at least one memory or memory device that can include a tangible and non-transitory computer-readable medium having stored therein computer programs, sets of instructions, code, or data to be executed by processor 110 .
- FIG. 7 illustrates examples of modules and applications stored in memory on the test device 22 and operated by the processor 110 . It can be appreciated that any of the modules and applications shown in FIG. 7 may also be hosted externally and be available to the test device 22 , e.g., via the communications module 112 .
- the test device 22 includes a display module 114 for rendering GUIs and other visual outputs on a display device such as a display screen, and an input module 116 for processing user or other inputs received at the test device 22 , e.g., via a touchscreen, input button, transceiver, microphone, keyboard, etc.
- the test device 22 may also include an application 118 to be tested that includes the latest application build data 18 to be tested using the test device 22 , e.g., by executing tests.
- the test device 22 may include a host interface module 120 to enable the test device 22 to interface with a testing host for loading an application build.
- the test device 22 in this example embodiment also includes a test execution interface module 122 for interfacing the application 118 with the testing execution module 42 .
- the data store 124 may be used to store device data 126 , such as, but not limited to, an IP address or a MAC address that uniquely identifies test device 22 .
- the data store 124 may also be used to store application data 128 , such as, but not limited to, login credentials, user preferences, cryptographic data (e.g., cryptographic keys), etc.
- the client device 26 may include one or more processors 130 , a communications module 132 , and a data store 144 storing device data 146 and application data 148 .
- Communications module 132 enables the client device 26 to communicate with one or more other components of the computing environment 8 , such as the task automation system 24 , via a bus or other communication network, such as the communication network 14 .
- the client device 26 includes at least one memory or memory device that can include a tangible and non-transitory computer-readable medium having stored therein computer programs, sets of instructions, code, or data to be executed by processor 130 .
- FIG. 8 illustrates examples of modules and applications stored in memory on the client device 26 and operated by the processor 130 . It can be appreciated that any of the modules and applications shown in FIG. 8 may also be hosted externally and be available to the client device 26 , e.g., via the communications module 132 .
- the client device 26 includes a display module 134 for rendering GUIs and other visual outputs on a display device such as a display screen, and an input module 136 for processing user or other inputs received at the client device 26 , e.g., via a touchscreen, input button, transceiver, microphone, keyboard, etc.
- the client device 26 may also include a chat application 138 , which may take the form of a customized app, plug-in, widget, or software component provided by the task automation system 24 for use by the client device 26 to use the chat UI 52 .
- the client device 26 may include an enterprise system application 142 provided by their enterprise system 90 .
- the client device 26 in this example embodiment also includes a web browser application 140 for accessing Internet-based content, e.g., via a mobile or traditional website.
- the data store 144 may be used to store device data 146 , such as, but not limited to, an IP address or a MAC address that uniquely identifies client device 26 within environment 8 .
- the data store 144 may also be used to store application data 148 , such as, but not limited to, login credentials, user preferences, cryptographic data (e.g., cryptographic keys), etc.
- It can be appreciated that only certain modules and components are shown in FIGS. 2 to 8 for ease of illustration, and various other components would be provided and utilized by the application testing environment 10 , application development environment 12 , task automation system 24 , test device 22 , enterprise system 90 , and client device 26 , as is known in the art.
- any module or component exemplified herein that executes instructions may include or otherwise have access to computer readable media such as storage media, computer storage media, or data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape.
- Computer storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
- Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information, and which can be accessed by an application, module, or both. Any such computer storage media may be part of any of the servers or other devices in the application testing environment 10 , application development environment 12 , task automation system 24 , enterprise system 90 , client device 26 , or test device 22 , or accessible or connectable thereto. Any application or module herein described may be implemented using computer readable/executable instructions that may be stored or otherwise held by such computer readable media.
- the task automation system 24 receives a request to implement a task within the computing environment 8, from an input to the chat UI 52.
- the task automation system 24 communicates with the logic system 54 to determine or infer an intent from the request. This can include communicating with the language server 58 and/or referencing the trained model 84. Moreover, this can also include translating the text of the request from one spoken language into another that can be interpreted by the language server 58 or the logic system 54.
- the logic system 54 determines executable instructions to implement one or more operations associated with the task based on the intent. That is, the logic system 54 uses the intent inferred from the request to determine which endpoints, if any, should be communicated with to satisfy the request. For example, the request may involve fetching a latest application build, executing one or more tests, and returning a status of the initiated test(s), which would require determining which endpoints require instructions and the associated API calls to generate. This can also include generating executable instructions in a format that is understandable to that endpoint.
- the logic system 54 uses the API module 56 to communicate, via the internal APIs 62, with the endpoints (e.g., modules 64, 66, 68, 69 in FIG. 4).
- the task automation system 24 receives data associated with the execution of the one or more operations, from the endpoints. As illustrated in FIG. 4, this can include receiving multiple replies to the conversation automation module 60 to be formatted or otherwise rendered as conversational response(s) based on or including the received data, at block 160. The conversational response(s) can then be rendered in the chat UI 52 at block 162.
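The request flow above can be sketched in Python as follows. This is an illustrative assumption of how the pieces fit together: the names (`infer_intent`, `INTENT_TO_OPERATIONS`, `handle_chat_request`) and the simple keyword rules are hypothetical stand-ins for the logic system 54, language server 58, and internal APIs 62, not the actual implementation.

```python
def infer_intent(text: str) -> str:
    """Hypothetical stand-in for the logic system 54 / language server 58."""
    text = text.lower()
    if "test" in text and "status" in text:
        return "check_status"
    if "test" in text:
        return "run_test"
    if "build" in text:
        return "fetch_build"
    return "unknown"


# Each inferred intent maps to one or more endpoint operations, analogous
# to instructing the modules reachable via the internal APIs 62.
INTENT_TO_OPERATIONS = {
    "fetch_build": ["build_service.download_latest"],
    "run_test": ["build_service.download_latest", "test_service.start_ui_test"],
    "check_status": ["test_service.get_status"],
}


def handle_chat_request(text: str, call_endpoint) -> str:
    """Request -> intent -> operations -> conversational reply."""
    intent = infer_intent(text)
    operations = INTENT_TO_OPERATIONS.get(intent, [])
    replies = [call_endpoint(op) for op in operations]
    if not replies:
        return "Sorry, I did not understand that request."
    return " ".join(replies)
```

A caller would supply a `call_endpoint` function that issues the actual internal API calls; for illustration, any callable that maps an operation name to a reply string suffices.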
- a screen shot 200 of the chat UI 52 is shown.
- a first user message 204 includes a textual request 206, namely: "Please initiate a UI test on the latest mobile release" that can be entered via a message input box 202.
- the chat UI 52 in this example replies immediately with a status message 210, namely: "Starting that for you . . . ".
- the chat UI 52 can display a first response message 212 with response text 214 and a link 216 to further content.
- the response text 214 indicates: "The test has been initiated and is in progress", and the link 216 allows the user to click through to status information on that test. By selecting this link 216, the user can be navigated to another UI or a dashboard, or additional data can be inserted within the chat UI screen.
- in FIG. 11, the screen shot 200 of the chat UI 52 is shown at a subsequent stage of the conversation.
- a further status message 210 is displayed, in this case indicating: “Getting that for you . . . ”, with “that” referring to the status information associated with the link 216 .
- a further response message 212 is then displayed, which includes a visual representation 218 of the mobile UI test status.
- this message can include the status itself if space permits or, as shown in FIG. 11, the user can select the visual representation 218 to navigate to another user interface such as a statistics or monitoring dashboard.
- the chat UI 52 can provide a central point for the user to issue commands, obtain data, and navigate to other programs without the need to have specialized knowledge or training with respect to these other programs or interfaces.
- in FIG. 12, an example embodiment of computer executable instructions is shown for using the chat UI 52 to initiate periodic testing based on acquiring a latest build for an application.
- the process flow proceeds from block 156 (see FIG. 9) to the periodic testing execution and returns a result to the chat UI 52 at block 158 (see FIG. 9).
- FIG. 12 illustrates an underlying workflow that can be initiated by the user via the chat UI 52 , without requiring knowledge of that workflow.
- at block 300, one or more tests are run for a current build.
- at block 302, test results data are acquired.
- the current test results data acquired at block 302 can be compared with previous test results data at block 304, e.g., to determine whether feedback provided to the application development environment 12 in a previous iteration has led to an improvement in the application. As shown in FIG. 12, this comparison can be used to provide additional feedback with the test results data sent to the application development environment 12 at block 306. After a period of time, e.g., one day, one week, etc., the application testing environment 10 detects the next testing cycle at block 308 and requests a new build from the application development environment 12 at block 310. It can be appreciated that blocks 306, 308 and 310 can be coordinated and/or executed by the testing environment interface 50. The process can be repeated by returning to block 300, wherein new testing is performed using the latest build files that have been acquired, installed, and provisioned on the test devices 22.
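As a rough illustration, one iteration of this build-on-build cycle might look like the following sketch, where `fetch_build`, `run_tests`, and `send_feedback` are hypothetical callables standing in for blocks 310, 300 to 302, and 306 respectively, and a single launch-time metric stands in for the full test results data.

```python
def build_on_build_cycle(fetch_build, run_tests, previous_results, send_feedback):
    """One iteration: test the current build, compare against the prior
    run, and report whether the new build improved."""
    build = fetch_build()                    # acquire the latest build (block 310)
    current = run_tests(build)               # run tests, acquire results (300-302)
    improved = None
    if previous_results is not None:         # compare with previous results (304)
        improved = current["launch_ms"] < previous_results["launch_ms"]
    send_feedback(build, current, improved)  # feed results back to development (306)
    return current                           # becomes "previous" on the next cycle
```

The return value is what a coordinating loop would hold onto as the "previous" results for the next cycle's comparison.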
- in this way, the chat UI 52 can provide a familiar and user-friendly front-end experience for a user to interface with a more complex process via the backend functionality provided by the task automation system 24.
Abstract
A system and method are provided for executing operations in a performance engineering environment. The method includes receiving a request to implement a task within the environment, from an input to a conversational chat user interface; communicating with a logic system to determine an intent from the request; determining one or more executable instructions to implement one or more operations associated with the task, based on the determined intent; communicating via the communications module, with at least one endpoint to trigger execution of the one or more operations using the one or more executable instructions; receiving via the communications module, data from the at least one endpoint, the data associated with execution of the one or more operations; generating a conversational response to the request based on or including the data received from the at least one endpoint; and having the conversational response rendered in the chat user interface.
Description
- The following relates generally to executing operations in a performance engineering environment, such as in an application testing and/or application development environment.
- As the number of mobile users increases, so too does the importance of measuring performance metrics on mobile devices. For example, it is found that users expect applications (also referred to herein as "apps") to load within a short amount of time, e.g., about two seconds. Because of this, some feel that native app load times should be as fast as possible. Additionally, poor app performance can impact an organization in other ways, for example, by increasing the number of technical service requests or calls, as well as negatively impacting ratings or rankings in application marketplaces (e.g., app stores), or more generally reviews or reputation. These negative effects can also hurt customer retention and uptake, particularly for younger generations who value their ability to perform many tasks remotely and with mobility.
- Mobile performance testing typically measures key performance indicators (KPIs) from three perspectives, namely the end-user perspective, the network perspective, and the server perspective. The end-user perspective looks at installation, launch, transition, navigation, and uninstallation processes. The network perspective looks at network performance on different network types. The server perspective looks at transaction response times, throughput, bandwidth, and latency. This type of testing is performed in order to identify root causes of application performance bottlenecks to fix performance issues, lower the risk of deploying systems that do not meet business requirements, reduce hardware and software costs by improving overall system performance, and support individual, project-based testing and centers of excellence.
- In addition to the above technical challenges, performance engineers are typically faced with several different testing and monitoring platforms and often require specialized knowledge or training in order to use the platforms and associated applications. This can limit which persons are able to execute the performance engineering tasks involved in implementing application testing and application development. For example, performance engineering tasks can include interacting with various systems to load builds, initiate and execute tests, gather test results, analyze test results, and package or provide the results to interested parties. However, these various tasks may require access to several different systems and can require a technical understanding of how these systems work and what output they provide. Moreover, many performance engineering tasks, such as data gathering and data analysis, can involve significant manual effort, consuming additional time and resources. With several systems being used, it can also be difficult for a single user to manage all of the tasks required.
- Embodiments will now be described with reference to the appended drawings wherein:
- FIG. 1 is a schematic diagram of an example computing environment.
- FIG. 2 is a block diagram of an example configuration of an application development environment.
- FIG. 3 is a block diagram of an example configuration of an application testing environment.
- FIG. 4 is a schematic diagram of an example of a task automation system integrated with application development and testing environments.
- FIG. 5 is a block diagram of an example configuration of a task automation system.
- FIG. 6 is a block diagram of an example configuration of an enterprise system.
- FIG. 7 is a block diagram of an example configuration of a test device used to test an application build in the application testing environment.
- FIG. 8 is a block diagram of an example configuration of a client device used to interface with, for example, the task automation system.
- FIG. 9 is a flow diagram of an example of computer executable instructions for executing tasks in a performance engineering environment such as an application testing or development environment.
- FIG. 10 is a screen shot of an example of a graphical user interface (GUI) for a chat user interface.
- FIG. 11 is a screen shot of the chat user interface of FIG. 10 after subsequent messaging.
- FIG. 12 is a flow diagram of an example of computer executable instructions for initiating periodic testing based on acquiring a latest build for an application.
- It will be appreciated that for simplicity and clarity of illustration, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the example embodiments described herein. However, it will be understood by those of ordinary skill in the art that the example embodiments described herein may be practiced without these specific details. In other instances, well-known methods, procedures and components have not been described in detail so as not to obscure the example embodiments described herein. Also, the description is not to be considered as limiting the scope of the example embodiments described herein.
- The following generally relates to a task automation system that can be integrated with or within a performance engineering environment, such as a computing environment which includes an application testing environment and/or an application development environment, to enable operations and instructions to be requested and executed using a conversational chat user interface (UI). This can simplify interactions with such an environment, as well as the gathering and consumption of data generated through performance engineering tasks and operations.
- The task automation system described herein provides both chat UI and background functionality to automate certain performance engineering tasks, in order to enable, for example, non-technical persons to execute technical or specialized work items, to enable more efficient operations within these environments, and to decrease time, effort and costs associated with manual efforts normally associated with performance engineering execution and monitoring.
- A conversational-based logic system is also provided that can be implemented with third party or built-in natural language processing (NLP) and/or natural language understanding (NLU), a user friendly GUI, and a conversation automation module or “dialog manager”. The logic system and GUI can allow a non-technical user to initiate tests, analyze results and trigger actions through a conversational exchange with a chatbot that is tied into a background system to automate as much of the desired process as possible. With this logic system in place, users can execute mobile device management, such as build download and installation for “build on build” performance engineering. Users can also be provided with a test execution assistant via the chatbot, to allow the user to request and initiate any type of test, such as performance tests, UI tests (for both browser and mobile versions), etc. The chatbot can also provide a conversational tool to allow users to conduct basic conversations about the performance testing, fetch or generate test results or status updates, determine what the results mean, what the user may be looking for, etc.
- As used herein a “build” may refer to the process of creating an application program for a software release, by taking all the relevant source code files and compiling them and then creating build artifacts, such as binaries or executable program(s), etc. “Build data” may therefore refer to any files or other data associated with a build. The terms “build” and “build data” (or “build file”) may also be used interchangeably to commonly refer to a version or other manifestation of an application, or otherwise the code or program associated with an application that can be tested for performance related metrics.
- It will be appreciated that while examples provided herein may be primarily directed to automated testing of mobile applications, the principles discussed herein equally apply to applications deployed on or otherwise used by other devices, such as desktop or laptop computers, e.g., to be run on a web browser or locally installed instance of an application. Similarly, the principles described herein can also be adapted to any performance engineering environment in which executable tasks are implemented, whether they include development, testing, implementation, production, quality assurance, etc.
- Certain example systems and methods described herein are able to execute operations in a performance engineering environment. In one aspect, there is provided a device for executing operations in such a performance engineering environment. The device includes a processor, a communications module coupled to the processor, and a memory coupled to the processor. The memory stores computer executable instructions that when executed by the processor cause the processor to receive via the communications module, a request to implement a task within the environment, from an input to a chat user interface. The computer executable instructions, when executed, also cause the processor to communicate with a logic system to determine an intent from the request; determine one or more executable instructions to implement one or more operations associated with the task, based on the determined intent; and communicate via the communications module, with at least one endpoint to trigger execution of the one or more operations using the one or more executable instructions. The computer executable instructions, when executed, also cause the processor to receive via the communications module, data from the at least one endpoint, the data associated with execution of the one or more operations; generate a conversational response to the request based on or including the data received from the at least one endpoint; and have the conversational response rendered in the chat user interface.
- In another aspect, there is provided a method of executing operations in a performance engineering environment. The method is executed by a device having a communications module. The method includes receiving via the communications module, a request to implement a task within the environment, from an input to a chat user interface; communicating with a logic system to determine an intent from the request; and determining one or more executable instructions to implement one or more operations associated with the task, based on the determined intent. The method also includes communicating via the communications module, with at least one endpoint to trigger execution of the one or more operations using the one or more executable instructions; receiving via the communications module, data from the at least one endpoint, the data associated with execution of the one or more operations; generating a conversational response to the request based on or including the data received from the at least one endpoint; and having the conversational response rendered in the chat user interface.
- In another aspect, there is provided non-transitory computer readable medium for executing operations in a performance engineering environment. The computer readable medium includes computer executable instructions for receiving via a communications module, a request to implement a task within the environment, from an input to a chat user interface. The computer readable medium also includes instructions for communicating with a logic system to determine an intent from the request; determining one or more executable instructions to implement one or more operations associated with the task, based on the determined intent; and communicating via the communications module, with at least one endpoint to trigger execution of the one or more operations using the one or more executable instructions. The computer readable medium also includes instructions for receiving via the communications module, data from the at least one endpoint, the data associated with execution of the one or more operations; generating a conversational response to the request based on or including the data received from the at least one endpoint; and having the conversational response rendered in the chat user interface.
- In certain example embodiments, the device can further have the logic system access a language server to determine the intent from the request. This may include having the logic system use the language server to apply a natural language understanding (NLU) process to correlate the request to the one or more operations associated with the task.
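As a loose illustration of such a correlation step, a minimal NLU stand-in might score a request against example utterances for each known operation by token overlap. The intents and utterances below are assumptions for illustration only; an actual language server would apply a trained model rather than this keyword heuristic.

```python
# Example utterances per operation; a real deployment would learn these
# from training data rather than hard-coding them.
EXAMPLE_UTTERANCES = {
    "initiate_test": [
        "run a performance test",
        "start a ui test on the latest build",
    ],
    "get_status": [
        "what is the status of my test",
        "show me the test results",
    ],
}


def correlate(request: str) -> str:
    """Pick the operation whose example utterance shares the most tokens
    with the request; return "unknown" when nothing overlaps."""
    words = set(request.lower().split())
    best_intent, best_score = "unknown", 0
    for intent, examples in EXAMPLE_UTTERANCES.items():
        score = max(len(words & set(example.split())) for example in examples)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent
```

The "unknown" fallback is what would prompt the conversation automation module to ask the user for clarification rather than triggering an operation.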
- In certain example embodiments, the device can generate the executable instructions to implement the one or more operations in a format that is understandable to the at least one endpoint, wherein the device is configured to generate executable instructions in formats understandable to a plurality of disparate systems.
- In certain example embodiments, the device can have the logic system communicate with the at least one endpoint via a respective application programming interface (API) exposed by a respective endpoint, to trigger execution of the one or more operations.
- In certain example embodiments, the one or more operations can include initiating an application build download process, wherein the corresponding endpoint returns an initiation status.
- In certain example embodiments, the one or more operations can include initiating a performance test, wherein the corresponding endpoint returns test data.
- In certain example embodiments, the one or more operations can include checking the status of an executed or currently executing performance test, wherein the corresponding endpoint returns a status indicator.
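The three kinds of operations above might be modeled, purely as a sketch, by endpoint functions that return the described reply types. The dictionary-based return format and field names are illustrative assumptions, not the actual endpoint APIs.

```python
def endpoint_download_build(version: str = "latest") -> dict:
    """Would trigger the build download process and return an initiation status."""
    return {"operation": "build_download", "version": version, "initiated": True}


def endpoint_run_performance_test(test_id: str) -> dict:
    """Would initiate a performance test and return test data."""
    return {
        "operation": "performance_test",
        "test_id": test_id,
        "data": {"launch_ms": 1900, "passed": True},
    }


def endpoint_test_status(test_id: str) -> dict:
    """Would report a status indicator for a running or completed test."""
    return {"operation": "status_check", "test_id": test_id, "status": "in_progress"}
```

In each case the returned structure is what the conversation automation module would format into a conversational response for the chat user interface.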
- In certain example embodiments, the request can be translated from a first spoken language to a second spoken language processed by the logic system.
- In certain example embodiments, the device can have the logic system access a model generated by a machine learning system. The machine learning system can use messages exchanged via the chat user interface to build and/or refine the model over time.
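One way such refinement over time could work, sketched here under the assumption that the model is a simple per-intent set of example utterances (the storage format and names are hypothetical), is to fold user-confirmed (utterance, intent) pairs from the chat exchanges back into the model:

```python
def refine_model(model: dict, utterance: str, confirmed_intent: str) -> dict:
    """Return a copy of the model with a user-confirmed utterance added
    to the example set for its intent; the original model is untouched."""
    updated = {intent: list(examples) for intent, examples in model.items()}
    updated.setdefault(confirmed_intent, [])
    normalized = utterance.lower().strip()
    if normalized not in updated[confirmed_intent]:
        updated[confirmed_intent].append(normalized)
    return updated
```

A machine learning system would typically retrain on such accumulated pairs in batches rather than mutate the serving model per message; the copy-and-return style above simply keeps the sketch side-effect free.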
- In certain example embodiments, the performance engineering environment includes an application testing environment and an application development environment. The device can be coupled to both the application testing and application development environments to provide a centrally placed chat user interface to execute operations and receive data and status information from a plurality of disparate systems.
- FIG. 1 illustrates an exemplary computing environment 8. In this example, the computing environment 8 may include an application testing environment 10, an application development environment 12, and a communications network 14 connecting one or more components of the computing environment 8. The computing environment 8 may also include or otherwise be connected to an application deployment environment 16, which provides a platform, service, or other entity responsible for posting or providing access to applications that are ready for use by client devices. The computing environment 8 may also include or otherwise be connected to a task automation system 24, which provides both chat UI and background functionality to automate certain performance engineering tasks, in order to enable, for example, non-technical workforce to execute technical or specialized work items, to enable more efficient operations within these environments, and to decrease time, effort and costs associated with manual efforts normally associated with performance engineering execution and monitoring. The application development environment 12 includes or is otherwise coupled to one or more repositories or other data storage elements for storing application build data 18. The application build data 18 can include any computer code and related data and information for an application to be deployed, e.g., for testing, execution or other uses. - In this example, the
application build data 18 can be provided via one or more repositories and include the data and code required to perform application testing on a device or simulator. It can be appreciated that while FIG. 1 illustrates a number of test devices 22 that resemble a mobile communication device, such testing devices 22 can also include simulators, simulation devices or simulation processes, all of which may be collectively referred to herein as "test devices 22" for ease of illustration. The application testing environment 10 may include or otherwise have access to one or more repositories or other data storage elements for storing application test data 20, which includes any files, reports, information, results, metadata or other data associated with and/or generated during a test implemented within the application testing environment 10. Also shown in FIG. 1 is a client device 26, which may represent any electronic device that can be operated by a user to interact or otherwise use the task automation system 24 as herein described. - The
computing environment 8 may be part of an enterprise or other organization that both develops and tests applications. In such cases, the communication network 14 may not be required to provide connectivity between the application development environment 12, the task automation system 24, and the application testing environment 10, wherein such connectivity is provided by an internal network. The application development environment 12, task automation system 24, and application testing environment 10 may also be integrated into the same enterprise environment as subsets thereof. That is, the configuration shown in FIG. 1 is illustrative only. Moreover, the computing environment 8 can include multiple enterprises or organizations, e.g., wherein separate organizations are configured to, and responsible for, application testing and application development. For example, an organization may contract a third-party to develop an app for their organization but perform testing internally to meet proprietary or regulatory requirements. Similarly, an organization that develops an app may outsource the testing stages, particularly when testing is performed infrequently. The application deployment environment 16 may likewise be implemented in several different ways. For example, the deployment environment 16 may include an internal deployment channel for employee devices, may include a public marketplace such as an app store, or may include any other channel that can make the app available to clients, consumers or other users. - One example of the
computing environment 8 may include a financial institution system (e.g., a commercial bank) that provides financial services accounts to users and processes financial transactions associated with those financial service accounts. Such a financial institution system may provide to its customers various browser-based and mobile applications, e.g., for mobile banking, mobile investing, mortgage management, etc. -
Test devices 22 can be, or be simulators for, client communication devices that would normally be associated with one or more users. Users may be referred to herein as customers, clients, correspondents, or other entities that interact with the enterprise or organization associated with the computing environment 8 via one or more apps. Such customer communication devices are not shown in FIG. 1 since such devices would typically be used outside of the computing environment 8 in which the development and testing occurs. Client device 26 shown in FIG. 1 may be a similar type of device as a customer communication device and is shown to illustrate a manner in which an individual can interact with the task automation system 24. However, it may be noted that such customer communication devices and/or client device 26 may be connectable to the application deployment environment 16, e.g., to download newly developed apps, to update existing apps, etc. - In certain embodiments, a user may operate the customer communication devices such that the customer device performs one or more processes consistent with what is being tested in the disclosed embodiments. For example, the user may use the customer device to engage and interface with a mobile or web-based banking application which has been developed and tested within the
computing environment 8 as herein described. In certain aspects, test devices 22, customer devices, and client device 26 can include, but are not limited to, a personal computer, a laptop computer, a tablet computer, a notebook computer, a hand-held computer, a personal digital assistant, a portable navigation device, a mobile phone, a wearable device, a gaming device, an embedded device, a smart phone, a virtual reality device, an augmented reality device, third party portals, an automated teller machine (ATM), and any additional or alternate computing device, and may be operable to transmit and receive data across communication networks such as the communication network 14 shown by way of example in FIG. 1. -
Communication network 14 may include a telephone network, cellular, and/or data communication network to connect different types of electronic devices. For example, the communication network 14 may include a private or public switched telephone network (PSTN), mobile network (e.g., code division multiple access (CDMA) network, global system for mobile communications (GSM) network, and/or any 3G, 4G, or 5G wireless carrier network, etc.), WiFi or other similar wireless network, and a private and/or public wide area network (e.g., the Internet). - Referring back to
FIG. 1, the computing environment 8 may also include a cryptographic server (not shown) for performing cryptographic operations and providing cryptographic services (e.g., authentication (via digital signatures), data protection (via encryption), etc.) to provide a secure interaction channel and interaction session, etc. Such a cryptographic server can also be configured to communicate and operate with a cryptographic infrastructure, such as a public key infrastructure (PKI), certificate authority (CA), certificate revocation service, signing authority, key server, etc. The cryptographic server and cryptographic infrastructure can be used to protect the various data communications described herein, to secure communication channels therefor, authenticate parties, manage digital certificates for such parties, manage keys (e.g., public and private keys in a PKI), and perform other cryptographic operations that are required or desired for particular applications of the application development environment 12, task automation system 24, and/or application testing environment 10. The cryptographic server may be used to protect data within the computing environment 8 (including the application build data 18 and/or application test data 20) by way of encryption for data protection, digital signatures or message digests for data integrity, and by using digital certificates to authenticate the identity of the users and entity devices with which the application development environment 12, task automation system 24, and application testing environment 10 communicate to inhibit data breaches by adversaries. It can be appreciated that various cryptographic mechanisms and protocols can be chosen and implemented to suit the constraints and requirements of the particular deployment of the application development environment 12 and application testing environment 10 as is known in the art. - In
FIG. 2, an example configuration of the application development environment 12 is shown. It can be appreciated that the configuration shown in FIG. 2 has been simplified for ease of illustration. In certain example embodiments, the application development environment 12 may include an editor module 30, a version and access control manager 32, one or more libraries 34, and a compiler 36, which would be typical components utilized in application development. In this example, the application development environment 12 also includes the application build data 18, which, while shown within the environment 12, may also be a separate entity (e.g., repository) used to store and provide access to the stored build files. The application development environment 12 also includes or is provided with (e.g., via an application programming interface (API)), a development environment interface 38. The development environment interface 38 provides communication and data transfer capabilities between the application development environment 12 and the application testing environment 10 from the perspective of the application development environment 12. As shown in FIG. 2, the development environment interface 38 can connect to the communication network 14 to send/receive data and communications to/from the application testing environment 10, including instructions or commands initiated by/from the task automation system 24, as discussed further below. - The
editor module 30 can be used by a developer/programmer to create and edit program code associated with an application being developed. This can include interacting with the version and access control manager 32 to control access to current build files and libraries 34 while honoring permissions and version controls. The compiler 36 may then be used to compile an application build file and other data to be stored with the application build data 18. It can be appreciated that a typical application or software development environment 12 may include other functionality, modules, and systems, details of which are omitted for brevity and ease of illustration. It can also be appreciated that the application development environment 12 may include modules, accounts, and access controls for enabling multiple developers to participate in developing an application, and modules for enabling an application to be developed for multiple platforms. For example, a mobile application may be developed by multiple teams, each team potentially having multiple programmers. Also, each team may be responsible for developing the application on a different platform, such as Apple iOS or Google Android for mobile versions, and Google Chrome or Microsoft Edge for web browser versions. Similarly, applications may be developed for deployment on different device types, even with the same underlying operating system. - By having build files stored for all of the various operating systems, device types, and versions that are currently compatible and being used, and providing access via the
development environment interface 38, the application testing environment 10 can automatically obtain and deploy the latest builds to perform application testing in different scenarios. Such scenarios can include not only different device types, operating systems, and versions, but also the same build under different operating conditions. - While not shown in
FIG. 2 for clarity of illustration, in example embodiments, the application development environment 12 may be implemented using one or more computing devices such as terminals, servers, and/or databases, having one or more processors, communications modules, and database interfaces. Such communications modules may include the development environment interface 38, which enables the application development environment 12 to communicate with one or more other components of the computing environment 8, such as the application testing environment 10, via a bus or other communication network, such as the communication network 14. While not delineated in FIG. 2, the application development environment 12 (and any of its devices, servers, databases, etc.) includes at least one memory or memory device that can include a tangible and non-transitory computer-readable medium having stored therein computer programs, sets of instructions, code, or data to be executed by the one or more processors. FIG. 2 illustrates examples of modules, tools, and engines stored in memory within the application development environment 12. It can be appreciated that any of the modules, tools, and engines shown in FIG. 2 may also be hosted externally and be available to the application development environment 12, e.g., via communications modules such as the development environment interface 38. - Turning now to
FIG. 3, an example configuration of the application testing environment 10 is shown. The application testing environment 10 includes a testing environment interface 40, which is coupled to the development environment interface 38 in the application development environment 12, a testing execution module 42, and one or more testing hosts 44. The testing environment interface 40 can provide a UI for personnel or administrators in the application testing environment 10 to coordinate an automated build management process as herein described and to initiate or manage a test execution process as herein described. The testing environment interface 40 can also include, as illustrated in FIG. 3, the task automation system 24 to provide such a UI for personnel or administrators, e.g., via a chat UI as described in greater detail below. - The
testing environment interface 40 can provide a platform on which the task automation system 24 can operate to instruct the development environment interface 38, e.g., by sending a message or command via the communication network 14, to access the application build data 18 to obtain the latest application build(s) based on the number and types of devices being tested by the testing host(s) 44. The latest application builds are then returned to the application testing environment 10 by the development environment interface 38 to execute an automated build retrieval operation. As shown in FIG. 3, the application build data 18 can be sent directly to the testing host(s) 44, and thus the testing host(s) 44 can also be coupled to the communication network 14. It can be appreciated that the application build data 18 can also be provided to the testing host(s) 44 via the testing environment interface 40, e.g., through messages handled by the task automation system 24 via the chat UI 52 (see also FIG. 4). The host(s) 44 in this example have access to a number of test devices 22 which, as discussed above, can be actual devices or simulators for certain devices. The testing host(s) 44 are also scalable, allowing for additional test devices 22 to be incorporated into the application testing environment 10. For example, a new test device 22 may be added when a new device type is released and will be capable of using the application being tested. Upon installation, the application on each test device 22 can be configured to point to the appropriate environment under test, and other settings can be selected/deselected. - The
test devices 22 are also coupled to the testing execution module 42 to allow the testing execution module 42 to coordinate tests 46 to evaluate metrics, for example, by executing tests for application traffic monitoring, determining UI response times, examining device logs, and determining resource utilization metrics (with Test 1, Test 2, . . . , Test N shown in FIG. 3 for illustrative purposes). The tests 46 can generate data logs, reports, and other outputs, stored as application test data 20, which can be made available to various entities or components, such as a dashboard 48. The framework shown in FIG. 3 enables the application testing environment 10 to download the latest builds from the respective repositories for the respective device/OS platform(s) and run a UI flow on all test devices 22 to configure the environment, disable system pop-ups, and set feature flags. In this way, the framework can automate the build download and installation process. The framework shown in FIG. 3 can also enable tests 46 to be initiated, status updates for such tests 46 to be obtained, and other information gathered concerning the tests 46 and/or test data 20, through inputs interpreted by a chat UI of the task automation system 24. - It can be appreciated that while the
testing environment interface 40, the testing host(s) 44, and the testing execution module 42 are shown as separate modules in FIG. 3, such modules may be combined in other configurations, and thus the delineations shown in FIG. 3 are for illustrative purposes. - Referring now to
FIG. 4, a schematic diagram of the task automation system 24, integrated with the application development environment 12 and application testing environment 10, is shown. The configuration shown in FIG. 4 provides a backend system that can be implemented with a conversational-style message exchange user interface, also referred to herein as a "chat" user interface (UI) 52. In this configuration, a task automation user interface 50 is provided, which includes the chat UI 52, namely an application or front-end for users of client devices 26 to interact with the task automation system 24. The task automation user interface 50 is coupled to a logic system 54, which is used to determine an intent or instruction from a request made to the chat UI 52 via a chat message. In this way, a less technically inclined or trained user can interact with the environments, e.g., to access the application build data 18, to initiate tests, etc. The logic system 54 can be implemented using, for example, the open source Botpress™ conversational AI platform. - The
logic system 54 is coupled to a language server 58 in this example to access and leverage NLP and NLU modules/processes to assist in determining the intent from the request. It can be appreciated that while the language server 58 is provided as a separate component in FIG. 4, the logic system 54 can also adopt or otherwise include or provide the functionalities of the language server 58. The language server 58 can also be used to allow spoken language translation, e.g., to allow users to input messages in one spoken language that can be translated and interpreted without having the user perform any translation at his/her end. In this way, the task automation system 24 can provide a convenient way for users to interact in a language with which they are comfortable, which can make interpreting the requests more accurate. - The
logic system 54 includes an API module 56 that is configured to make API calls to the different endpoints that can/should be reached based on the intent of the user's request. The API module 56 can therefore maintain the appropriate API calls for the various endpoints in the computing environment 8 that can be interacted with or controlled via the chat UI 52. The API module 56 is therefore coupled to one or more internal APIs 62 that connect into the various endpoints. In addition, the API module 56 can interface with a conversation automation module 60, which can be an internal component or third-party component that can initiate a conversation in the chat UI 52 and collect responses as if it was a regular or routine conversation. - The
internal APIs 62 in this example enable the logic system 54 to communicate with an application build download module 64, a mobile UI testing module 66, a reporting and logging module 68, and a performance testing module 69, by way of example. It can be appreciated that other types of endpoints with API access can be plugged into the configuration shown in FIG. 4. - Referring to the encircled stage markers in
FIG. 4, the user can, e.g., via their client device 26, initiate a request at stage 1. At stage 2, the task automation user interface 50 calls the logic system 54 to process the request, e.g., by inferring an intent from the text in a message input to the chat UI 52. The logic system 54 may return data at stage 3, either in response to the call at stage 2 or later, if additional data such as statistics, dashboards, logs, etc., is required. At stage 4, the logic system 54 can interface and communicate with the language server 58 to apply NLP/NLU processes to determine an intent from the request that can be translated into an actionable set of commands or executable instructions issued via one or more API calls to the internal APIs 62 at stage 5. The API module 56 can also be executed at stage 6 to initiate a conversational response that incorporates any feedback from the API calls. Stage 7 in this example includes four sub-stages, one for each endpoint module. For example, a command can be sent to the application build download module 64 to have the application testing environment 10 request the latest build from the application development environment 12. In the other examples shown in FIG. 4, testing execution or status updates can be initiated/fetched via the other sub-stages, and performance testing can be initiated at sub-stage 7c. -
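The staged flow described above can be sketched end to end in a few lines. The following Python sketch is illustrative only: the function names and the fake components passed in are assumptions standing in for the logic system 54, internal APIs 62, and conversation automation module 60, not APIs disclosed herein.

```python
# Toy walk through stages 1-9: a chat request comes in, an intent is
# inferred, the matching endpoint is called, and a conversational reply
# is produced. The injected callables are stand-ins for real components.
def handle_chat_request(message, infer_intent, call_endpoint, compose_reply):
    intent = infer_intent(message)          # stages 2-4: logic system + language server
    if intent is None:
        return compose_reply("Sorry, I could not understand that request.")
    result = call_endpoint(intent)          # stages 5-7: API module -> internal APIs
    return compose_reply(f"{intent} finished: {result}")  # stages 8-9

# Minimal fakes to exercise the flow:
reply = handle_chat_request(
    "run the login test",
    infer_intent=lambda m: "run_test" if "test" in m else None,
    call_endpoint=lambda i: "PASS",
    compose_reply=lambda text: {"chat_response": text},
)
```

In this sketch, the dependencies are passed in as arguments so that each stage can be swapped for a real implementation independently.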
Sub-stages can return data or feedback from the respective modules, which the conversation automation module 60 can use at stage 8 to generate one or more responses to the request that can be rendered in the chat UI 52 at stage 9. This allows the user of the client device 26 to view or otherwise observe such response(s) at stage 10, e.g., using a chat application 138 (see FIG. 8) on the client device 26. - The
task automation system 24 thus provides a backend platform on which the chat UI 52 can sit to enable the user of a client device 26 to interact with the performance engineering environment (i.e., the computing environment 8 in this example) to execute, initiate, request, or perform various operations, tasks, or routines, without necessarily requiring knowledge or expertise of the underlying system(s) that are required to perform these operations. The task automation system 24 also leverages a logic system 54 and language server 58 to provide a seamless conversational experience for the user while implementing largely technical background tasks, by inferring the intent of the user's request from the messages exchanged with the chat UI 52. That is, the chat UI 52 can provide both a front-end messaging-based user interface as well as an embedded or background "chatbot" with which to communicate. - In
FIG. 5, an example configuration of the task automation system 24 is shown. In certain embodiments, the task automation system 24 may include one or more processors 70, a communications module 72, and a database interface module 74 for interfacing with the datastores for the build data 18 and test data 20 to retrieve, modify, and store (e.g., add) data. Communications module 72 enables the task automation system 24 to communicate with one or more other components of the computing environment 8, such as client device 26 (or one of its components), via a bus or other communication network, such as the communication network 14. While not delineated in FIG. 5, the task automation system 24 includes at least one memory or memory device that can include a tangible and non-transitory computer-readable medium having stored therein computer programs, sets of instructions, code, or data to be executed by processor 70. FIG. 5 illustrates examples of modules, tools, and engines stored in memory on the task automation system 24 and operated by the processor 70. It can be appreciated that any of the modules, tools, and engines shown in FIG. 5 may also be hosted externally and be available to the task automation system 24, e.g., via the communications module 72. In the example embodiment shown in FIG. 5, the task automation system 24 includes the logic system 54, which includes a recommendation engine 76, a machine learning engine 78, a classification module 80, a training module 82, and a trained model 84. The logic system 54 also includes or has access to a language server interface module 88 to interface and/or communicate with the language server 58 as described above. The task automation system 24 also includes an access control module 86 and the task automation user interface 50. The task automation user interface 50 includes or has access to the chat UI 52 as shown in FIG. 4.
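To make the role of the logic system 54 concrete, the following is a minimal, keyword-based sketch of intent inference. A production system would rely on trained NLP/NLU models (e.g., via the language server 58); the regular-expression patterns and intent names here are hypothetical and do not come from this disclosure.

```python
import re

# Hypothetical patterns mapping chat phrasing to task intents; a real
# logic system would use NLU models rather than regular expressions.
INTENT_PATTERNS = {
    "download_build": re.compile(r"\b(download|fetch|get)\b.*\bbuild\b", re.I),
    "run_test": re.compile(r"\b(run|start|initiate)\b.*\btest\b", re.I),
    "status_update": re.compile(r"\bstatus\b", re.I),
}

def infer_intent(message: str) -> str:
    """Return the first matching intent for a chat message, or 'unknown'."""
    for intent, pattern in INTENT_PATTERNS.items():
        if pattern.search(message):
            return intent
    return "unknown"
```

An unrecognized message falls through to "unknown", which would correspond to the chatbot asking the user to rephrase.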
The task automation system 24 also includes the conversation automation module 60, the API module 56 (which may instead be part of the logic system 54), and an enterprise system interface module 87. - The
recommendation engine 76 is used by the logic system 54 of the task automation system 24 to generate one or more recommendations for the task automation system 24 and/or a client device 26 that is/are related to an association between inputs (requests) to the chat UI 52 and responses to these requests. For example, the logic system 54 can obtain a textual input from a user of the client device 26 requesting that a test be initiated, a status update be obtained, or other data be gathered with respect to a performance engineering task. As discussed above, this can include accessing the language server 58 via the language server interface module 88 in order to apply NLP/NLU processes. It may be noted that a recommendation as used herein may refer to a prediction, suggestion, inference, association, or other recommended identifier that can be used to generate a suggestion, notification, command, instruction, or other data that can be viewed, used, or consumed by the task automation system 24, the testing environment interface 40, and/or the client devices 26 interacting with same. The recommendation engine 76 can access chat data (not shown) stored for/by the chat UI 52 and apply one or more inference processes to generate the recommendation(s). The recommendation engine 76 may utilize or otherwise interface with the machine learning engine 78 both to classify data currently being analyzed to generate a suggestion or recommendation, and to train classifiers using data that is continually being processed and accumulated by the task automation system 24. That is, the recommendation engine 76 can learn request- or response-related preferences and revise and refine classifications, rules, or other analytics-related parameters over time.
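One simple way to associate a new request with previously handled requests, in the spirit of the recommendation engine 76, is word-overlap scoring against historical chat data. This is only an illustrative stand-in for the trained classifiers described here; the history table and action names are fabricated example data.

```python
from collections import Counter

# Fabricated history of previously classified requests per action.
HISTORY = {
    "run_test": ["run the login test", "start the checkout test on android"],
    "status_update": ["what is the status of the nightly run"],
}

def recommend_action(request: str) -> str:
    """Return the action whose past requests best overlap the new request."""
    words = set(request.lower().split())
    scores = Counter()
    for action, examples in HISTORY.items():
        for example in examples:
            scores[action] += len(words & set(example.lower().split()))
    best, score = scores.most_common(1)[0]
    return best if score > 0 else "unknown"
```

As more chat data accumulates, the history grows and the associations improve, which is the same feedback loop the engine's training process exploits, only with a learned model in place of raw overlap counts.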
For example, the logic system 54 can be used to update and refine the trained model 84 using the training module 82 as client devices 26 interact with the chat UI 52 during various interactions, to improve the NLP/NLU parameters and the understanding of how users interact with the performance engineering environment. - The
machine learning engine 78 may also perform operations that classify the chat data in accordance with corresponding classification parameters, e.g., based on an application of one or more machine learning algorithms to the data or groups of the data (also referred to herein as "chat content", "conversation content", "user requests" or "user intent"). The machine learning algorithms may include, but are not limited to, a one-dimensional, convolutional neural network model (e.g., implemented using a corresponding neural network library, such as Keras®), and the one or more machine learning algorithms may be trained against, and adaptively improved using, elements of previously classified profile content identifying suitable matches between content identified and potential actions to be executed. Subsequent to classifying the event- or workflow-related content or content being analyzed, the recommendation engine 76 may further process each element of the content to identify, and extract, a value characterizing the corresponding one of the classification parameters, e.g., based on an application of one or more additional machine learning algorithms to each of the elements of the chat-related content. By way of example, the additional machine learning algorithms may include, but are not limited to, an adaptive NLP algorithm that, among other things, predicts starting and ending indices of a candidate parameter value within each element of the content, extracts the candidate parameter value in accordance with the predicted indices, and computes a confidence score for the candidate parameter value that reflects a probability that the candidate parameter value accurately represents the corresponding classification parameter. As described herein, the one or more additional machine learning algorithms may be trained against, and adaptively improved using, the locally maintained elements of previously classified content.
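The start/end-index prediction with a confidence score can be illustrated with a toy extractor. Here a dictionary lookup over an assumed device vocabulary stands in for the adaptive NLP model; the vocabulary, the fixed confidence value, and the function name are all hypothetical.

```python
# Assumed vocabulary standing in for a trained span-prediction model.
KNOWN_DEVICES = ["iphone 12", "pixel 5", "galaxy s21"]

def extract_device(message: str):
    """Return (value, start, end, confidence) or None if no candidate is found."""
    lowered = message.lower()
    for device in KNOWN_DEVICES:
        start = lowered.find(device)
        if start != -1:
            end = start + len(device)
            # Crude stand-in for a model's confidence score: exact
            # dictionary hits get a fixed high value.
            return message[start:end], start, end, 0.95
    return None
```

A trained model would instead emit per-token probabilities for the span boundaries and derive the confidence from them, but the output shape, value plus indices plus score, is the same.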
Classification parameters may be stored and maintained using the classification module 80, and training data may be stored and maintained using the training module 82. - The trained
model 84 may also be created, stored, refined, updated, re-trained, and referenced by the task automation system 24 (e.g., by way of the logic system 54) to determine associations between request messages and suitable responses or actions, and/or content related thereto. Such associations can be used to generate recommendations or suggestions for improving the conversational exchange and understanding of the users' intents via the text or other information input to the chat UI 52. - In some instances, classification data stored in the
classification module 80 may identify one or more parameters, e.g., "classification" parameters, that facilitate a classification of corresponding elements or groups of recognized content based on any of the exemplary machine learning algorithms or processes described herein. The one or more classification parameters may correspond to parameters that can indicate an affinity or compatibility between the request and response (chat) data, and certain potential actions. For example, a request to initiate a test can include recognition of the message as being a request and the parsing of the message to determine a suitable endpoint and instruction, command, or request to forward along to that endpoint. - In some instances, the additional, or alternate, machine learning algorithms may include one or more adaptive NLP algorithms capable of parsing each of the classified portions of the content and predicting a starting and ending index of the candidate parameter value within each of the classified portions. Examples of the adaptive NLP algorithms include, but are not limited to, NLP models that leverage machine learning processes or artificial neural network processes, such as a named entity recognition model implemented using a SpaCy® library.
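The step of forwarding a recognized request to a suitable endpoint can be sketched as a dispatch table keyed by intent. The endpoint and command names below are illustrative assumptions, not identifiers from this disclosure.

```python
# Hypothetical mapping from a recognized intent to an endpoint and command.
DISPATCH_TABLE = {
    "run_test": ("testing_execution_module", "initiate_test"),
    "download_build": ("build_download_module", "fetch_latest_build"),
    "status_update": ("reporting_module", "fetch_status"),
}

def dispatch(intent: str, args: dict) -> dict:
    """Package a recognized intent as a message for its endpoint."""
    if intent not in DISPATCH_TABLE:
        raise KeyError(f"unrecognized intent: {intent!r}")
    endpoint, command = DISPATCH_TABLE[intent]
    return {"endpoint": endpoint, "command": command, "args": args}
```

Keeping the table separate from the parsing logic means new endpoints (such as additional internal APIs) can be registered without touching the classification code.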
- Examples of these adaptive machine learning processes include, but are not limited to, one or more artificial neural network models, such as a one-dimensional, convolutional neural network model, e.g., implemented using a corresponding neural network library, such as Keras®. In some instances, the one-dimensional, convolutional neural network model may implement one or more classifier functions or processes, such as a Softmax® classifier, capable of predicting an association between an element of event data (e.g., a value or type of data being augmented with an event or workflow) and a single classification parameter and, additionally or alternatively, multiple classification parameters.
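The softmax step of such a classifier is worth spelling out: it converts raw per-class scores into a probability for each classification parameter. The sketch below is generic stdlib Python, not Keras code, and the example scores are made up.

```python
import math

def softmax(scores):
    """Convert raw class scores into probabilities that sum to 1."""
    # Subtract the max score before exponentiating, for numerical stability.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# E.g., raw scores for three candidate classification parameters:
probs = softmax([2.0, 1.0, 0.1])
```

The class with the highest score always receives the highest probability, so an argmax over the output selects the predicted classification parameter while the probability itself can serve as a confidence score.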
- Based on the output of the one or more machine learning algorithms or processes, such as the one-dimensional, convolutional neural network model described herein,
machine learning engine 78 may perform operations that classify each of the discrete elements of event- or workflow-related content as a corresponding one of the classification parameters, e.g., as obtained from classification data stored by the classification module 80. - The outputs of the machine learning algorithms or processes may then be used by the
recommendation engine 76 to generate one or more suggested recommendations, instructions, commands, notifications, rules, or other instructional or observational elements that can be presented to the client device 26 via the chat UI 52. - Referring again to
FIG. 5, the access control module 86 may be used to apply a hierarchy of permission levels or otherwise apply predetermined criteria to determine what chat data or other client/user, financial, or transactional data can be shared with which entity in the computing environment 8. For example, the task automation system 24 may have been granted access to certain sensitive user profile data for a user, which is associated with a certain client device 26 in the computing environment 8. Similarly, certain client data may include potentially sensitive information such as age, date of birth, or nationality, which may not necessarily be needed by the task automation system 24 to execute certain actions (e.g., to more accurately determine the spoken language or conversational style of that user). As such, the access control module 86 can be used to control the sharing of certain client data or chat data in accordance with a permission or preference, or any other restriction imposed by the computing environment 8 or application in which the task automation system 24 is used. - The
task automation system 24 in this example also includes the task automation user interface 50 described above, which provides the chat UI 52. The conversation automation module 60 and API module 56 are also shown in FIG. 5, which, as described above, can be used to generate a response to a request entered in the chat UI 52 and communicate with endpoints within the performance engineering environment, such as the application testing environment 10 or application development environment 12, to implement a task such as initiating a test or obtaining status updates, etc. - As illustrated in
FIG. 5, the logic system 54 as well as the task automation system 24 can be considered one or more devices having a processor 70, memory, and a communications module 72 configured to work with, or as part of, the computing environment 8, to perform the operations described herein. It can be appreciated that the various elements of the task automation system 24 and logic system 54 are shown delineated as such in FIG. 5 for illustrative purposes and clarity of description and could be provided using other configurations and distributions of functionality and responsibilities. - The
task automation system 24 may also include the enterprise system interface module 87 to provide a graphical user interface (GUI) or API connectivity to communicate with an enterprise system 90 (see FIG. 6) to obtain client data 98 for a certain user interacting with the task automation system 24. It can be appreciated that the enterprise system interface module 87 may also provide a web browser-based interface, an application or "app" interface, a machine language interface, etc. - In
FIG. 6, an example configuration of an enterprise system 90 is shown. The enterprise system 90 includes a communications module 92 that enables the enterprise system 90 to communicate with one or more other components of the computing environment 8, such as the application testing environment 10, application development environment 12, or task automation system 24, via a bus or other communication network, such as the communication network 14. While not delineated in FIG. 6, the enterprise system 90 includes at least one memory or memory device that can include a tangible and non-transitory computer-readable medium having stored therein computer programs, sets of instructions, code, or data to be executed by one or more processors (not shown for clarity of illustration). FIG. 6 illustrates examples of servers and datastores/databases operable within the enterprise system 90. It can be appreciated that any of the components shown in FIG. 6 may also be hosted externally and be available to the enterprise system 90, e.g., via the communications module 92. In the example embodiment shown in FIG. 6, the enterprise system 90 includes one or more servers to provide access to client data 98, e.g., to assist in determining an intent from a request input to the chat UI 52 or for development or testing purposes. Exemplary servers include a mobile application server 94, a web application server 96, and a data server 100. Although not shown in FIG. 6, the enterprise system 90 may also include a cryptographic server for performing cryptographic operations and providing cryptographic services. The cryptographic server can also be configured to communicate and operate with a cryptographic infrastructure. The enterprise system 90 may also include one or more data storage elements for storing and providing data for use in such services, such as data storage for storing client data 98. -
Mobile application server 94 supports interactions with a mobile application installed on client device 26 (which may be similar to or the same as a test device 22). Mobile application server 94 can access other resources of the enterprise system 90 to carry out requests made by, and to provide content and data to, a mobile application on client device 26. In certain example embodiments, mobile application server 94 supports a mobile banking application to provide payments from one or more accounts of a user, among other things. -
Web application server 96 supports interactions using a website accessed by a web browser application running on the client device. It can be appreciated that the mobile application server 94 and the web application server 96 can provide different front ends for the same application, that is, the mobile (app) and web (browser) versions of the same application. For example, the enterprise system 90 may provide a banking application that can be accessed via a smartphone or tablet app while also being accessible via a browser on any browser-enabled device. - The
client data 98 can include, in an example embodiment, financial data that is associated with users of the client devices (e.g., customers of the financial institution). The financial data may include any data related to or derived from financial values or metrics associated with customers of a financial institution system (i.e., the enterprise system 90 in this example), for example, account balances, transaction histories, lines of credit available, credit scores, mortgage balances, affordability metrics, investment account balances, investment values and types, among many others. Other metrics can be associated with the financial data, such as financial health data that is indicative of the financial health of the users of the client devices 26. - An
application deployment module 102 is also shown in the example configuration of FIG. 6 to illustrate that the enterprise system 90 can provide its own mechanism to deploy the developed and tested applications onto client devices 26 within the enterprise. It can be appreciated that the application deployment module 102 can be utilized in conjunction with a third-party deployment environment such as an app store to have tested applications deployed to employees and customers/clients. - In
FIG. 7, an example configuration of a test device 22 is shown. It can be appreciated that the test device 22 shown in FIG. 7 can correspond to an actual device or represent a simulation of such a device 22. In certain embodiments, the test device 22 may include one or more processors 110, a communications module 112, and a data store 124 storing device data 126 and application data 128. Communications module 112 enables the test device 22 to communicate with one or more other components of the computing environment 8 via a bus or other communication network, such as the communication network 14. While not delineated in FIG. 7, the test device 22 includes at least one memory or memory device that can include a tangible and non-transitory computer-readable medium having stored therein computer programs, sets of instructions, code, or data to be executed by processor 110. FIG. 7 illustrates examples of modules and applications stored in memory on the test device 22 and operated by the processor 110. It can be appreciated that any of the modules and applications shown in FIG. 7 may also be hosted externally and be available to the test device 22, e.g., via the communications module 112. - In the example embodiment shown in
FIG. 7, the test device 22 includes a display module 114 for rendering GUIs and other visual outputs on a display device such as a display screen, and an input module 116 for processing user or other inputs received at the test device 22, e.g., via a touchscreen, input button, transceiver, microphone, keyboard, etc. The test device 22 may also include an application 118 to be tested, which includes the latest application build data 18 to be tested using the test device 22, e.g., by executing tests. The test device 22 may include a host interface module 120 to enable the test device 22 to interface with a testing host for loading an application build. The test device 22 in this example embodiment also includes a test execution interface module 122 for interfacing the application 118 with the testing execution module. The data store 124 may be used to store device data 126, such as, but not limited to, an IP address or a MAC address that uniquely identifies test device 22. The data store 124 may also be used to store application data 128, such as, but not limited to, login credentials, user preferences, cryptographic data (e.g., cryptographic keys), etc. - In
FIG. 8, an example configuration of the client device 26 is shown. In certain embodiments, the client device 26 may include one or more processors 130, a communications module 132, and a data store 144 storing device data 146 and application data 148. Communications module 132 enables the client device 26 to communicate with one or more other components of the computing environment 8, such as the task automation system 24, via a bus or other communication network, such as the communication network 14. While not delineated in FIG. 8, the client device 26 includes at least one memory or memory device that can include a tangible and non-transitory computer-readable medium having stored therein computer programs, sets of instructions, code, or data to be executed by processor 130. FIG. 8 illustrates examples of modules and applications stored in memory on the client device 26 and operated by the processor 130. It can be appreciated that any of the modules and applications shown in FIG. 8 may also be hosted externally and be available to the client device 26, e.g., via the communications module 132. - In the example embodiment shown in
FIG. 8, the client device 26 includes a display module 134 for rendering GUIs and other visual outputs on a display device such as a display screen, and an input module 136 for processing user or other inputs received at the client device 26, e.g., via a touchscreen, input button, transceiver, microphone, keyboard, etc. The client device 26 may also include a chat application 138, which may take the form of a customized app, plug-in, widget, or software component provided by the task automation system 24 for use by the client device 26 to use the chat UI 52. Similarly, the client device 26 may include an enterprise system application 142 provided by the enterprise system 90. The client device 26 in this example embodiment also includes a web browser application 140 for accessing Internet-based content, e.g., via a mobile or traditional website. The data store 144 may be used to store device data 146, such as, but not limited to, an IP address or a MAC address that uniquely identifies client device 26 within environment 8. The data store 144 may also be used to store application data 148, such as, but not limited to, login credentials, user preferences, cryptographic data (e.g., cryptographic keys), etc. - It will be appreciated that only certain modules, applications, tools, and engines are shown in
FIGS. 2 to 8 for ease of illustration, and various other components would be provided and utilized by the application testing environment 10, application development environment 12, task automation system 24, test device 22, enterprise system 90, and client device 26, as is known in the art. - It will also be appreciated that any module or component exemplified herein that executes instructions may include or otherwise have access to computer readable media such as storage media, computer storage media, or data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Computer storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information, and which can be accessed by an application, module, or both. Any such computer storage media may be part of any of the servers or other devices in the
application testing environment 10, application development environment 12, task automation system 24, enterprise system 90, client device 26, or test device 22, or accessible or connectable thereto. Any application or module herein described may be implemented using computer readable/executable instructions that may be stored or otherwise held by such computer readable media. - Referring to
FIG. 9, an example embodiment of computer executable instructions for executing tasks in a performance engineering environment, such as the application testing environment 10 or application development environment 12 of the computing environment 8, is shown. At block 150, the task automation system 24 receives a request to implement a task within the computing environment 8, from an input to the chat UI 52. At block 152, the task automation system 24 communicates with the logic system 54 to determine or infer an intent from the request. This can include communicating with the language server 58 and/or referencing the trained model 84. Moreover, this can also include translating the text of the request from one spoken language into another that can be interpreted by the language server 58 or the logic system 54. - At
block 154, the logic system 54 determines executable instructions to implement one or more operations associated with the task, based on the intent. That is, the logic system 54 uses the intent inferred from the request to determine which endpoints, if any, should be communicated with to satisfy the request. For example, the request may involve fetching the latest application build, executing one or more tests, and returning a status of the initiated test(s), which would require determining which endpoints require instructions and which associated API calls to generate. This can also include generating executable instructions in a format that is understandable to each such endpoint. At block 156, the logic system 54 uses the API module 56 to communicate, via the internal APIs 62, with the endpoints (e.g., the modules shown in FIG. 4). At block 158, the task automation system 24 receives data associated with the execution of the one or more operations from the endpoints. As illustrated in FIG. 4, this can include receiving multiple replies at the conversation automation module 60 to be formatted or otherwise rendered as conversational response(s) based on or including the received data, at block 160. The conversational response(s) can then be rendered in the chat UI 52 at block 162. - Turning now to
FIG. 10, a screen shot 200 of the chat UI 52 is shown. In this example, a first user message 204 includes a textual request 206, namely: "Please initiate a UI test on the latest mobile release", which can be entered via a message input box 202. The chat UI 52 in this example replies immediately with a status message 210, namely: "Starting that for you . . . ". Once communication has been established with the appropriate endpoint, the chat UI 52 can display a first response message 212 with response text 214 and a link 216 to further content. In this example, the response text 214 indicates: "The test has been initiated and is in progress", and the link 216 allows the user to click through to status information on that test. By selecting this link 216, the user can be navigated to another UI or a dashboard, or additional data can be inserted within the chat UI screen. - Turning now to
FIG. 11, the screen shot 200 of the chat UI 52 is shown at a subsequent stage of the conversation. In this example, after selecting the link 216 as illustrated in FIG. 10, a further status message 210 is displayed, in this case indicating: "Getting that for you . . . ", with "that" referring to the status information associated with the link 216. A further response message 212 is then displayed, which includes a visual representation 218 of the mobile UI test status. It can be appreciated that this message can include the status itself if space permits or, as shown in FIG. 11, the user can select the visual representation 218 to navigate to another user interface such as a statistics or monitoring dashboard. As such, it can be seen that the chat UI 52 can provide a central point for the user to issue commands, obtain data, and navigate to other programs, without the need for specialized knowledge or training with respect to those other programs or interfaces. - Referring to
FIG. 12, an example embodiment of computer executable instructions for using the chat UI 52 to initiate periodic testing based on acquiring the latest build for an application is shown. In this example embodiment, the process flow proceeds from block 156 (see FIG. 9) to the periodic testing execution and returns a result to the chat UI 52 at block 158 (see FIG. 9). That is, FIG. 12 illustrates an underlying workflow that can be initiated by the user via the chat UI 52, without requiring knowledge of that workflow. At block 300, one or more tests are run for a current build. At block 302, test results data are acquired. It can be appreciated that the current test results data acquired at block 302 can be compared with previous test results data at block 304, e.g., to determine whether feedback provided to the application development environment 12 in a previous iteration has led to an improvement in the application. As shown in FIG. 12, this comparison can be used to provide additional feedback with the test results data sent to the application development environment 12 at block 306. After a period of time, e.g., one day, one week, etc., the application testing environment 10 detects the next testing cycle at block 308 and requests a new build from the application development environment 12 at block 310. It can be appreciated that blocks 306, 308 and 310 can be coordinated and/or executed by the testing environment interface 50. The process can be repeated by returning to block 300, wherein new testing is performed using the latest build files that have been acquired, installed, and provisioned on the test devices 22. - From
FIG. 12 it can be seen that the chat UI 52 can provide a familiar and user-friendly front-end experience for a user to interface with a more complex process via the back-end functionality provided by the task automation system 24. - It will be appreciated that the examples and corresponding diagrams used herein are for illustrative purposes only. Different configurations and terminology can be used without departing from the principles expressed herein. For instance, components and modules can be added, deleted, modified, or arranged with differing connections without departing from these principles.
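By way of illustration, the request-to-response pipeline of FIG. 9 (blocks 150 through 162) can be expressed as a minimal sketch. All function, class, and endpoint names below are hypothetical stand-ins for the logic system 54, language server 58, and internal API endpoints described above; the sketch shows only the shape of the request → intent → endpoint-operations → conversational-response flow.

```python
# Minimal sketch of the FIG. 9 flow (blocks 150-162). All names here are
# hypothetical stand-ins for the logic system, language server, and endpoints.
from dataclasses import dataclass

@dataclass
class Operation:
    endpoint: str   # which internal API endpoint to call
    payload: dict   # instructions in a format that endpoint understands

# Hypothetical phrase-level translations (the optional translation step of block 152).
TRANSLATIONS = {"lancer un test": "initiate a test"}

# Hypothetical routing from an inferred intent to endpoint operations (block 154).
INTENT_ROUTES = {
    "run_ui_test": [
        Operation("build_service", {"action": "fetch_latest_build"}),
        Operation("test_runner", {"action": "start_ui_test"}),
        Operation("test_runner", {"action": "get_status"}),
    ],
}

def infer_intent(request_text: str) -> str:
    """Block 152: translate if needed, then infer an intent
    (stand-in for the language server / trained model)."""
    text = request_text.lower()
    for phrase, english in TRANSLATIONS.items():
        text = text.replace(phrase, english)
    return "run_ui_test" if "test" in text else "unknown"

def call_endpoint(op: Operation) -> dict:
    """Block 156: stand-in for one internal API call to an endpoint."""
    return {"action": op.payload["action"], "status": "ok"}

def handle_request(request_text: str) -> str:
    intent = infer_intent(request_text)                  # block 152
    operations = INTENT_ROUTES.get(intent, [])           # block 154
    replies = [call_endpoint(op) for op in operations]   # blocks 156-158
    # Blocks 160-162: fold the endpoint replies into one conversational response.
    if not replies:
        return "Sorry, I did not understand that request."
    return "; ".join(f"{r['action']}: {r['status']}" for r in replies)

print(handle_request("Please initiate a UI test on the latest mobile release"))
```

A production implementation would replace the keyword matching with the NLU processing described for the language server 58 and issue real calls over the internal APIs 62; the routing-table structure, however, mirrors how an inferred intent can fan out into several endpoint operations.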
- The steps or operations in the flow charts and diagrams described herein are just for example. There may be many variations to these steps or operations without departing from the principles discussed above. For instance, the steps may be performed in a differing order, or steps may be added, deleted, or modified.
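As one such variation-tolerant rendering, the periodic-testing loop of FIG. 12 (blocks 300 through 310) can be sketched as follows. The score metric, build names, and function names are hypothetical stand-ins for the test results data and feedback exchanged between the application testing environment 10 and the application development environment 12.

```python
# Illustrative sketch of the FIG. 12 periodic-testing loop (blocks 300-310).
# The score metric and function names are hypothetical stand-ins for the test
# results data exchanged between the testing and development environments.
from typing import Optional

def run_tests(build: str) -> dict:
    """Blocks 300-302: run tests for the current build and acquire results."""
    return {"build": build, "score": 90 + len(build)}  # hypothetical metric

def compare(current: dict, previous: Optional[dict]) -> str:
    """Block 304: did feedback from the previous iteration improve the app?"""
    if previous is None:
        return "baseline"
    return "improved" if current["score"] > previous["score"] else "regressed"

def periodic_testing(builds: list) -> list:
    """Blocks 306-310: send feedback after each cycle, then fetch the next build."""
    previous, feedback_log = None, []
    for build in builds:  # blocks 308-310: each new cycle acquires a new build
        current = run_tests(build)
        feedback_log.append(f"{build}: {compare(current, previous)}")  # block 306
        previous = current
    return feedback_log

print(periodic_testing(["v1", "v1.1", "v1.1.2"]))
```

The comparison step corresponds to block 304's check of whether feedback from a previous iteration improved the application; in practice the "score" would be derived from the performance metrics gathered by the test devices 22.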
- Although the above principles have been described with reference to certain specific examples, various modifications thereof will be apparent to those skilled in the art as outlined in the appended claims.
Claims (22)
1. A device for executing operations in a performance engineering environment, the device comprising:
a processor;
a communications module coupled to the processor; and
a memory coupled to the processor, the memory storing computer executable instructions that when executed by the processor cause the processor to:
receive via the communications module, a request to implement automated testing within the performance engineering environment, from an input to a centrally integrated chat user interface connected to and between both an application testing environment and an application development environment of the performance engineering environment to coordinate testing operations among disparate systems;
generate and transmit, via the communications module to the centrally integrated chat user interface, one or more instructional elements that suggest modifications to the request;
receive, via the communications module, an updated request from another input to the centrally integrated chat user interface;
communicate with a logic system to determine an intent from the updated request;
determine one or more executable instructions to implement one or more operations associated with automated testing, based on the determined intent, wherein the one or more operations of the automated testing are implemented within one or more of the application testing environment and the application development environment;
communicate via the communications module, with at least one endpoint to trigger execution of the one or more operations using the one or more executable instructions;
receive via the communications module, data from the at least one endpoint, the data associated with execution of the one or more operations;
generate a conversational response to the updated request based on or including the data received from the at least one endpoint; and
have the conversational response rendered in the centrally integrated chat user interface.
2. The device of claim 1 , wherein the computer executable instructions further cause the processor to:
have the logic system access a language server to determine the intent from the updated request.
3. The device of claim 2 , wherein the computer executable instructions further cause the processor to:
have the logic system use the language server to apply a natural language understanding (NLU) process to correlate the updated request to the one or more operations associated with the automated testing.
4. The device of claim 1 , wherein the computer executable instructions further cause the processor to:
generate the executable instructions to implement the one or more operations in a format that is understandable to the at least one endpoint, wherein the device is configured to generate executable instructions in formats understandable to a plurality of disparate systems.
5. The device of claim 1 , wherein the computer executable instructions further cause the processor to:
have the logic system communicate with the at least one endpoint via a respective application programming interface (API) exposed by a respective endpoint, to trigger execution of the one or more operations.
6. The device of claim 1 , wherein the one or more operations comprises initiating an application build download process, and wherein the corresponding endpoint returns an initiation status.
7. The device of claim 1 , wherein the one or more operations comprises initiating a performance test, and wherein the corresponding endpoint returns test data which includes at least one of application traffic monitoring, determining UI response times, examining device logs, and determining device resource utilization metrics.
8. The device of claim 1 , wherein the one or more operations comprises checking the status of an executed or currently executing performance test, and wherein the corresponding endpoint returns a status indicator.
9. The device of claim 1 , wherein the updated request is translated from a first spoken language to a second spoken language processed by the logic system.
10. The device of claim 1 , wherein the computer executable instructions further cause the processor to:
have the logic system access a model generated by a machine learning system, the machine learning system using messages exchanged via the centrally integrated chat user interface to build and/or refine the model over time.
11. (canceled)
12. A method of executing operations in a performance engineering environment, the method executed by a device having a communications module and comprising:
receiving via the communications module, a request to implement automated testing within the performance engineering environment, from an input to a centrally integrated chat user interface connected to and between both an application testing environment and an application development environment of the performance engineering environment to coordinate testing operations among disparate systems;
generating and transmitting, via the communications module to the centrally integrated chat user interface, one or more instructional elements that suggest modifications to the request;
receiving, via the communications module, an updated request from another input to the centrally integrated chat user interface;
communicating with a logic system to determine an intent from the updated request;
determining one or more executable instructions to implement one or more operations associated with automated testing, based on the determined intent, wherein the one or more operations of the automated testing are implemented within one or more of the application testing environment and the application development environment;
communicating via the communications module, with at least one endpoint to trigger execution of the one or more operations using the one or more executable instructions;
receiving via the communications module, data from the at least one endpoint, the data associated with execution of the one or more operations;
generating a conversational response to the updated request based on or including the data received from the at least one endpoint; and
having the conversational response rendered in the centrally integrated chat user interface.
13. The method of claim 12 , further comprising:
generating the executable instructions to implement the one or more operations in a format that is understandable to the at least one endpoint, wherein the device is configured to generate executable instructions in formats understandable to a plurality of disparate systems.
14. The method of claim 12 , further comprising:
having the logic system communicate with the at least one endpoint via a respective application programming interface (API) exposed by a respective endpoint, to trigger execution of the one or more operations.
15. The method of claim 12 , wherein the one or more operations comprises initiating an application build download process, and wherein the corresponding endpoint returns an initiation status.
16. The method of claim 12 , wherein the one or more operations comprises initiating a performance test, and wherein the corresponding endpoint returns test data which includes at least one of application traffic monitoring, determining UI response times, examining device logs, and determining resource utilization metrics.
17. The method of claim 12 , wherein the one or more operations comprises checking the status of an executed or currently executing performance test, and wherein the corresponding endpoint returns a status indicator.
18. The method of claim 12 , further comprising:
having the logic system access a model generated by a machine learning system, the machine learning system using messages exchanged via the centrally integrated chat user interface to build and/or refine the model over time.
19. (canceled)
20. A non-transitory computer readable medium for executing operations in a performance engineering environment, the computer readable medium comprising computer executable instructions for:
receiving via the communications module, a request to implement automated testing within the performance engineering environment, from an input to a centrally integrated chat user interface connected to and between both an application testing environment and an application development environment of the performance engineering environment to coordinate testing operations among disparate systems;
generating and transmitting, via the communications module to the centrally integrated chat user interface, one or more instructional elements that suggest modifications to the request;
receiving, via the communications module, an updated request from another input to the centrally integrated chat user interface;
communicating with a logic system to determine an intent from the updated request;
determining one or more executable instructions to implement one or more operations associated with automated testing, based on the determined intent, wherein the one or more operations of the automated testing are implemented within one or more of the application testing environment and the application development environment;
communicating via the communications module, with at least one endpoint to trigger execution of the one or more operations using the one or more executable instructions;
receiving via the communications module, data from the at least one endpoint, the data associated with execution of the one or more operations;
generating a conversational response to the updated request based on or including the data received from the at least one endpoint; and
having the conversational response rendered in the centrally integrated chat user interface.
21. The method of claim 12 , further comprising:
having the logic system access a language server to determine the intent from the updated request.
22. The method of claim 21 , further comprising:
having the logic system use the language server to apply a natural language understanding (NLU) process to correlate the updated request to the one or more operations associated with the automated testing.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/248,461 US11394668B1 (en) | 2021-01-26 | 2021-01-26 | System and method for executing operations in a performance engineering environment |
Publications (2)
Publication Number | Publication Date |
---|---|
US11394668B1 US11394668B1 (en) | 2022-07-19 |
US20220239609A1 true US20220239609A1 (en) | 2022-07-28 |
Family
ID=82385202
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/248,461 Active US11394668B1 (en) | 2021-01-26 | 2021-01-26 | System and method for executing operations in a performance engineering environment |
Country Status (1)
Country | Link |
---|---|
US (1) | US11394668B1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116545852B (en) * | 2023-07-03 | 2023-09-01 | 深圳市江元科技(集团)有限公司 | Intelligent control method, system and medium for online release of mobile equipment |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10268571B2 (en) | 2009-12-22 | 2019-04-23 | Cyara Solutions Pty Ltd | System and method for automated thin client contact center agent desktop testing |
US10469418B2 (en) | 2009-12-22 | 2019-11-05 | Cyara Solutions Pty Ltd | Automated contact center customer mobile device client infrastructure testing |
KR101132560B1 (en) | 2010-06-09 | 2012-04-03 | 강원대학교산학협력단 | System and method for automatic interface testing based on simulation for robot software components |
US9038026B2 (en) | 2011-10-17 | 2015-05-19 | International Business Machines Corporation | System and method for automating test automation |
US20150314454A1 (en) | 2013-03-15 | 2015-11-05 | JIBO, Inc. | Apparatus and methods for providing a persistent companion device |
US9823811B2 (en) | 2013-12-31 | 2017-11-21 | Next It Corporation | Virtual assistant team identification |
US10645034B2 (en) | 2016-04-22 | 2020-05-05 | Smartbothub, Inc. | System and method for facilitating computer generated conversations with the aid of a digital computer |
US10474561B2 (en) | 2017-08-11 | 2019-11-12 | Wipro Limited | Method and system for automated testing of human machine interface (HMI) applications associated with vehicles |
US11249734B2 (en) * | 2018-02-07 | 2022-02-15 | Sangeeta Patni | Tri-affinity model driven method and platform for authoring, realizing, and analyzing a cross-platform application |
US11030412B2 (en) * | 2018-04-10 | 2021-06-08 | Verizon Patent And Licensing Inc. | System and method for chatbot conversation construction and management |
US11157244B2 (en) * | 2018-11-21 | 2021-10-26 | Kony, Inc. | System and method for delivering interactive tutorial platform and related processes |
US10776082B2 (en) * | 2018-11-28 | 2020-09-15 | International Business Machines Corporation | Programming environment augment with automated dialog system assistance |
US10983789B2 (en) * | 2019-01-25 | 2021-04-20 | Allstate Insurance Company | Systems and methods for automating and monitoring software development operations |
US10629191B1 (en) | 2019-06-16 | 2020-04-21 | Linc Global, Inc. | Methods and systems for deploying and managing scalable multi-service virtual assistant platform |
US11328181B2 (en) * | 2019-08-26 | 2022-05-10 | International Business Machines Corporation | Knowledge graph-based query in artificial intelligence chatbot with base query element detection and graph path generation |
US11709998B2 (en) * | 2019-09-06 | 2023-07-25 | Accenture Global Solutions Limited | Dynamic and unscripted virtual agent systems and methods |
- 2021-01-26: US application 17/248,461 granted as US11394668B1 (status: Active)
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |