WO2018022030A1 - System automation tools - Google Patents

System automation tools

Info

Publication number
WO2018022030A1
Authority
WO
WIPO (PCT)
Prior art keywords
command
workflow
instructions
user interface
available
Prior art date
Application number
PCT/US2016/044154
Other languages
French (fr)
Inventor
Nikita SHARAEV
Muthu Kamaran MUTHU PANDIAN
Original Assignee
Schlumberger Technology Corporation
Schlumberger Canada Limited
Services Petroliers Schlumberger
Geoquest Systems B.V.
Priority date
Filing date
Publication date
Application filed by Schlumberger Technology Corporation, Schlumberger Canada Limited, Services Petroliers Schlumberger, Geoquest Systems B.V. filed Critical Schlumberger Technology Corporation
Priority to PCT/US2016/044154 priority Critical patent/WO2018022030A1/en
Publication of WO2018022030A1 publication Critical patent/WO2018022030A1/en


Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B17/00 - Systems involving the use of models or simulators of said systems
    • G05B17/02 - Systems involving the use of models or simulators of said systems electric
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 - Administration; Management
    • G06Q10/10 - Office automation; Time management

Definitions

  • an exploration and production sector (E&P) software system allows users to interpret seismic data, perform well correlation, build reservoir models suitable for simulation, submit and visualize simulation results, calculate volumes, produce maps, and design development strategies to maximize reservoir exploitation.
  • Certain software systems may provide automation tools that allow users to create automatic workflows to perform multiple iterations of a test with different parameters, which can be used to compare results.
  • automation tools are generally limited in functionality, and many routines, functions, processes, and commands of the software system may not be available for automation because the automation tool may only allow specific and/or pre-programmed processes to be automated.
  • the automation tools may not be able to simulate user interactions with a user interface (UI), and, thus, may not be able to perform automations where user input is required for different iterations.
  • Systems, apparatus, computer-readable media, and methods are disclosed, of which the methods include creating a workflow for an automated test, receiving instructions to get a status of a first command and a first command identifier, determining whether the first command is available using the first command identifier, determining whether prerequisites for the first command are met using the first command identifier, receiving instructions to execute a second command and a second command identifier, receiving instructions to simulate a user interface interaction, and running the workflow by executing the second command associated with the second command identifier and simulating the user interface interaction.
  • the methods can include generating results based on running the workflow, where the results include a three-dimensional model, a graph, and/or a numerical result.
  • running the workflow can include running multiple iterations of the workflow and each iteration can include executing the second command and simulating the user interface interaction.
  • the list of available commands can include commands that are not pre-programmed for automation.
  • the methods can include receiving instructions to display available commands and displaying a list of available commands.
  • the methods can be performed using a software system, and receiving the instructions to display the available commands, displaying the list of available commands, receiving the instructions to get the status of the first command and the first command identifier, determining whether the first command is available using the first command identifier, determining whether the prerequisites for the first command are met using the first command identifier, receiving the instructions to execute the second command and the second command identifier, and receiving the instructions to simulate the user interface interaction are performed using a plug-in application to the software system provided by a party that created the software system, distributed the software system, and/or maintains the software system.
  • the instructions to execute the second command can include a parameter and a status variable.
  • the methods can include displaying an indication that the first command is disabled based on determining that the prerequisites for the first command are not met and/or determining that the first command is not available.
  • the methods can include displaying an indication that the first command is enabled based on determining that the prerequisites for the first command are met and determining that the first command is available.
  • simulating the user interface interaction can include selecting a user interface object, activating a pane, sending keyboard strokes, simulating a mouse event, getting a screen resolution, checking a dialog box status, interacting with a user interface object, interacting with a tree item, and/or showing a message.
  • receiving the instructions to simulate the user interface interaction can include receiving parameters via a dialog box.
  • Systems and apparatus include a processor and a memory system with non-transitory, computer-readable media storing instructions that, when executed by the processor, cause the systems and apparatus to perform operations that include creating a workflow for an automated test, receiving instructions to get a status of a first command and a first command identifier, determining whether the first command is available using the first command identifier, determining whether prerequisites for the first command are met using the first command identifier, receiving instructions to execute a second command and a second command identifier, receiving instructions to simulate a user interface interaction, and running the workflow by executing the second command associated with the second command identifier and simulating the user interface interaction.
  • Non-transitory, computer-readable media are also disclosed that store instructions that, when executed by a processor of a computing system, cause the computing system to perform operations that include creating a workflow for an automated test, receiving instructions to get a status of a first command and a first command identifier, determining whether the first command is available using the first command identifier, determining whether prerequisites for the first command are met using the first command identifier, receiving instructions to execute a second command and a second command identifier, receiving instructions to simulate a user interface interaction, and running the workflow by executing the second command associated with the second command identifier and simulating the user interface interaction.
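For illustration only, the following is a minimal Python sketch of how the disclosed steps could fit together, assuming hypothetical names such as Workflow and CommandStatus; it is not the E&P software system's actual API.

from dataclasses import dataclass, field

@dataclass
class CommandStatus:
    available: bool            # e.g., licensed and installed
    prerequisites_met: bool    # e.g., required model/window/log state exists

@dataclass
class Workflow:
    steps: list = field(default_factory=list)

    def add(self, step):
        self.steps.append(step)

    def run(self, iterations=1):
        for _ in range(iterations):        # each iteration repeats every step
            for step in self.steps:
                step()

wf = Workflow()
# first command: check its status by identifier before relying on it
status = CommandStatus(available=True, prerequisites_met=True)
if status.available and status.prerequisites_met:
    # second command: execute by identifier, then simulate a UI interaction
    wf.add(lambda: print("execute command 'CreateModel' with parameter 'GridA'"))
    wf.add(lambda: print("simulate UI interaction: send TAB keystroke"))
wf.run(iterations=2)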
  • Figure 1 illustrates an example of a system that includes various management components to manage various aspects of a geologic environment, according to an embodiment.
  • Figure 2 illustrates an example of a method for creating a new workflow, according to an embodiment.
  • Figure 3 illustrates an example of a method for displaying a list of available commands, according to an embodiment.
  • Figure 4 illustrates an example of a method for getting a command status, according to an embodiment.
  • Figure 5 illustrates an example of a method for executing a command, according to an embodiment.
  • Figure 6 illustrates an example of a method for simulating user interface interactions, according to an embodiment.
  • Figure 7 illustrates an example of a user interface for selecting workflow processes, according to an embodiment.
  • Figure 8 illustrates example dialog boxes for simulating user interface interactions, according to an embodiment.
  • Figure 9 illustrates example dialog boxes for testing tree items, according to an embodiment.
  • Figure 10 illustrates an example computing system that may execute methods of the present disclosure, according to an embodiment.
  • Although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are used to distinguish one element from another. For example, a first object or step could be termed a second object or step, and, similarly, a second object or step could be termed a first object or step, without departing from the scope of the disclosure. The first object or step and the second object or step are both objects or steps, respectively, but they are not to be considered the same object or step. The terminology used in the description herein is for the purpose of describing particular embodiments and is not intended to be limiting.
  • Figure 1 illustrates an example of a system 100 that includes various management components 110 to manage various aspects of a geologic environment 150 (e.g., an environment that includes a sedimentary basin, a reservoir 151, one or more faults 153-1, one or more geobodies 153-2, etc.).
  • the management components 110 may allow for direct or indirect management of sensing, drilling, injecting, extracting, etc., with respect to the geologic environment 150.
  • further information about the geologic environment 150 may become available as feedback 160 (e.g., optionally as input to one or more of the management components 110).
  • the management components 110 include a seismic data component 112, an additional information component 114 (e.g., well/logging data), a processing component 116, a simulation component 120, an attribute component 130, an analysis/visualization component 142, and a workflow component 144.
  • seismic data and other information provided per the components 112 and 114 may be input to the simulation component 120.
  • the simulation component 120 may rely on entities 122.
  • Entities 122 may include earth entities or geological objects such as wells, surfaces, bodies, reservoirs, etc.
  • the entities 122 can include virtual representations of actual physical entities that are reconstructed for purposes of simulation.
  • the entities 122 may include entities based on data acquired via sensing, observation, etc. (e.g., the seismic data 112 and other information 114).
  • An entity may be characterized by one or more properties (e.g., a geometrical pillar grid entity of an earth model may be characterized by a porosity property). Such properties may represent one or more measurements (e.g., acquired data), calculations, etc.
  • the simulation component 120 may operate in conjunction with a software framework such as an object-based framework.
  • entities may include entities based on pre-defined classes to facilitate modeling and simulation.
  • An example of an object-based framework is the MICROSOFT ® .NET ® framework (Redmond, Washington), which provides a set of extensible object classes.
  • In the .NET ® framework, an object class encapsulates a module of reusable code and associated data structures.
  • Object classes can be used to instantiate object instances for use by a program, script, etc.
  • borehole classes may define objects for representing boreholes based on well data.
  • the simulation component 120 may process information to conform to one or more attributes specified by the attribute component 130, which may include a library of attributes. Such processing may occur prior to input to the simulation component 120 (e.g., consider the processing component 116). As an example, the simulation component 120 may perform operations on input information based on one or more attributes specified by the attribute component 130. In an example embodiment, the simulation component 120 may construct one or more models of the geologic environment 150, which may be relied on to simulate behavior of the geologic environment 150 (e.g., responsive to one or more acts, whether natural or artificial). In the example of Figure 1, the analysis/visualization component 142 may allow for interaction with a model or model-based results (e.g., simulation results, etc.). As an example, output from the simulation component 120 may be input to one or more other workflows, as indicated by a workflow component 144.
  • the simulation component 120 may include one or more features of a simulator such as the ECLIPSE™ reservoir simulator (Schlumberger Limited, Houston, Texas), the INTERSECT™ reservoir simulator (Schlumberger Limited, Houston, Texas), etc.
  • a simulation component, a simulator, etc. may include features to implement one or more meshless techniques (e.g., to solve one or more equations, etc.).
  • a reservoir or reservoirs may be simulated with respect to one or more enhanced recovery techniques (e.g., consider a thermal process such as SAGD, etc.).
  • the management components 110 may include features of a commercially available framework such as the PETREL ® seismic to simulation software framework (Schlumberger Limited, Houston, Texas).
  • the PETREL ® framework provides components that allow for optimization of exploration and development operations.
  • the PETREL ® framework includes seismic to simulation software components that can output information for use in increasing reservoir performance, for example, by improving asset team productivity.
  • various professionals (e.g., geophysicists, geologists, and reservoir engineers)
  • Such a framework may be considered an application and may be considered a data-driven application (e.g., where data is input for purposes of modeling, simulating, etc.).
  • various aspects of the management components 110 may include add-ons or plug-ins that operate according to specifications of a framework environment.
  • a framework environment, e.g., the commercially available OCEAN ® framework environment (Schlumberger Limited, Houston, Texas), allows for integration of add-ons (or plug-ins) into a PETREL ® framework workflow.
  • the OCEAN ® framework environment leverages .NET ® tools (Microsoft Corporation, Redmond, Washington) and offers stable, user-friendly interfaces for efficient development.
  • various components may be implemented as add-ons (or plug-ins) that conform to and operate according to specifications of a framework environment (e.g., according to application programming interface (API) specifications, etc.).
  • Figure 1 also shows an example of a framework 170 that includes a model simulation layer 180 along with a framework services layer 190, a framework core layer 195 and a modules layer 175.
  • the framework 170 may include the commercially available OCEAN ® framework where the model simulation layer 180 is the commercially available PETREL ® model-centric software package that hosts OCEAN ® framework applications.
  • the PETREL ® software may be considered a data-driven application.
  • the PETREL ® software can include a framework for model building and visualization.
  • a framework may include features for implementing one or more mesh generation techniques.
  • a framework may include an input component for receipt of information from interpretation of seismic data, one or more attributes based at least in part on seismic data, log data, image data, etc.
  • Such a framework may include a mesh generation component that processes input information, optionally in conjunction with other information, to generate a mesh.
  • the model simulation layer 180 may provide domain objects 182, act as a data source 184, provide for rendering 186 and provide for various user interfaces 188.
  • Rendering 186 may provide a graphical environment in which applications can display their data while the user interfaces 188 may provide a common look and feel for application user interface components.
  • the domain objects 182 can include entity objects, property objects and optionally other objects.
  • Entity objects may be used to geometrically represent wells, surfaces, bodies, reservoirs, etc.
  • property objects may be used to provide property values as well as data versions and display parameters.
  • an entity object may represent a well where a property object provides log information as well as version information and display information (e.g., to display the well as part of a model).
  • data may be stored in one or more data sources (or data stores, generally physical data storage devices), which may be at the same or different physical sites and accessible via one or more networks.
  • the model simulation layer 180 may be configured to model projects. As such, a particular project may be stored where stored project information may include inputs, models, results and cases. Thus, upon completion of a modeling session, a user may store a project. At a later time, the project can be accessed and restored using the model simulation layer 180, which can recreate instances of the relevant domain objects.
  • the geologic environment 150 may include layers (e.g., stratification) that include a reservoir 151 and one or more other features such as the fault 153-1, the geobody 153-2, etc.
  • the geologic environment 150 may be outfitted with any of a variety of sensors, detectors, actuators, etc.
  • equipment 152 may include communication circuitry to receive and to transmit information with respect to one or more networks 155.
  • Such information may include information associated with downhole equipment 154, which may be equipment to acquire information, to assist with resource recovery, etc.
  • Other equipment 156 may be located remote from a well site and include sensing, detecting, emitting or other circuitry.
  • Such equipment may include storage and communication circuitry to store and to communicate data, instructions, etc.
  • one or more satellites may be provided for purposes of communications, data acquisition, etc.
  • Figure 1 shows a satellite in communication with the network 155 that may be configured for communications, noting that the satellite may also include circuitry for imagery (e.g., spatial, spectral, temporal, radiometric, etc.).
  • Figure 1 also shows the geologic environment 150 as optionally including equipment 157 and 158 associated with a well that includes a substantially horizontal portion that may intersect with one or more fractures 159.
  • a well in a shale formation may include natural fractures, artificial fractures (e.g., hydraulic fractures) or a combination of natural and artificial fractures.
  • a well may be drilled for a reservoir that is laterally extensive.
  • lateral variations in properties, stresses, etc. may exist where an assessment of such variations may assist with planning, operations, etc. to develop a laterally extensive reservoir (e.g., via fracturing, injecting, extracting, etc.).
  • the equipment 157 and/or 158 may include components, a system, systems, etc. for fracturing, seismic sensing, analysis of seismic data, assessment of one or more fractures, etc.
  • a workflow may be a process that includes a number of worksteps.
  • a workstep may operate on data, for example, to create new data, to update existing data, etc.
  • a workstep may operate on one or more inputs and create one or more results, for example, based on one or more algorithms.
  • a system may include a workflow editor for creation, editing, executing, etc. of a workflow. In such an example, the workflow editor may provide for selection of one or more pre-defined worksteps, one or more customized worksteps, etc.
  • a workflow may be a workflow implementable in the PETREL ® software, for example, that operates on seismic data, seismic attribute(s), etc.
  • a workflow may be a process implementable in the OCEAN ® framework.
  • a workflow may include one or more worksteps that access a module such as a plug-in (e.g., external executable code, etc.).
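As a hedged illustration of the workflow/workstep notion described above, the following Python sketch chains worksteps that each operate on inputs and produce results; the Workstep and SimpleWorkflow names are assumptions and do not represent the PETREL ® or OCEAN ® APIs.

class Workstep:
    """A workstep operates on one or more inputs and creates one or more results."""
    def __init__(self, name, algorithm):
        self.name = name
        self.algorithm = algorithm          # callable implementing the workstep

    def run(self, data):
        return self.algorithm(data)

class SimpleWorkflow:
    """A workflow is a process made up of ordered worksteps."""
    def __init__(self, worksteps):
        self.worksteps = worksteps

    def run(self, data):
        for step in self.worksteps:         # each workstep may create or update data
            data = step.run(data)
        return data

flow = SimpleWorkflow([
    Workstep("scale", lambda d: [x * 2.0 for x in d]),      # pre-defined workstep
    Workstep("clip", lambda d: [min(x, 5.0) for x in d]),   # customized workstep
])
print(flow.run([1.0, 2.0, 4.0]))   # -> [2.0, 4.0, 5.0]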
  • system 100 may include an E&P software system that allows users to interpret seismic data, perform well correlation, build reservoir models suitable for simulation, submit and visualize simulation results, calculate volumes, produce maps, and design development strategies to maximize reservoir exploitation.
  • the E&P software system may provide automation tools that allow users to create automatic workflows to perform multiple iterations of a test using different parameters, which can be used to compare results.
  • one or more of the automation tools, including the automation tools discussed below, may be an "out-of-the-box" feature of the software system that is included in an unmodified version of the software system.
  • one or more of the automation tools, including the automation tools discussed below, may be installed after the initial installation of the software system, such as, for example, in an update and/or as a plug-in application of the software system.
  • a plug-in application can be a software component that adds one or more specific features to a software system.
  • the plug-in application can be provided by a party that created, distributed, and/or maintains the software system.
  • the plug-in application can be provided by a third-party entity that is not affiliated with the party or parties that created, distributed, and/or maintains the software system.
  • automation tools provided by the software system and/or by plug-ins to the software system (hereinafter, “automation tool”) can allow users to generate automated tests using routines, functions, or processes of the software system (hereinafter, “commands”), which may not have been specifically selected or pre-programmed to be automated.
  • automation tool may allow users to perform automated tests using an entire library of commands available to the software system (e.g., including out-of-the-box commands of the software system, commands included in updates to the software system, commands of installed plug-ins, etc.).
  • the automation tool can simulate user interactions with a UI, and, thus, can perform automated tests where user input is used for different iterations.
  • Figure 2 illustrates an example of a method for creating a new workflow, according to an embodiment.
  • the new workflow can be used to automate multiple iterations of tests in an E&P software system.
  • the example method illustrated in Figure 2 can be performed using a computing device that includes the framework (e.g., framework 170) and the management components (e.g., management components 110) described above with reference to Figure 1.
  • the example method can begin in 200 by creating a new instance of a workflow.
  • the new instance of the workflow can be created based on instructions from a user of the software system (e.g., via a UI of the software system).
  • the workflow can be initialized with default parameters (e.g., workflow name, author name, description, etc.) and may not include any operations, utilities, or processes upon initialization.
  • the computing device can add operations to the workflow.
  • the computing device can display a list of one or more operations via a UI of the software system and one or more operations can be added based on instructions from a user via the UI.
  • the operations can be, for example, operations that are pre-programmed in the software system and/or operations that are added to the software system via one or more plug-ins.
  • the operations can be operations related to seismic simulations, three-dimensional (3D) map simulations, map-based volume calculations, model extractions, map making, obtaining properties, obtaining values from properties, calculations, filters, well simulations, well attribute simulations, points with well attribute simulations, sector modeling, arithmetic operations, etc.
  • the computing device can add utilities to the workflow.
  • the computing device can display a list of one or more utilities via a UI of the software system and one or more utilities can be added based on instructions from a user via the UI.
  • the utilities can be, for example, utilities that are pre-programmed in the software system and/or utilities that are added to the software system via one or more plug-ins.
  • the utilities can be utilities related to programing and/or logical statements (e.g., while loops, for loops, for all functions, if statements, stop statements, run statements, etc.), or utilities related to variables (e.g., set reference, set reference list, numeric expression, string expression, data expression, get name, etc.).
  • the computing device can add processes to the workflow.
  • the computing device can display a list of one or more processes via a UI of the software system and one or more processes can be added based on instructions from a user via the UI.
  • the processes can be, for example, processes that are pre-programmed in the software system and/or processes that are added to the software system via one or more plug-ins.
  • the processes can be processes related to getting, printing a list of, or executing commands for automated testing, as discussed in further detail below.
  • the processes can be processes related to simulating UI interactions for automated testing, as discussed in further detail below.
  • the computing system can run the workflow and generate results.
  • the computing system can compile and run the workflow based on instructions from a user of the software system (e.g., via a UI of the software system). Accordingly, the computing system can run the workflow using the order of operations, utilities, and/or processes in the workflow to perform one or more iterations of an automated test. The computing system can then generate and output results for one or more iterations of the automated test.
  • the results can be generated 3D models (e.g., of a well, of a reservoir, etc.), generated graphs, generated numerical results, etc. that represent the one or more iterations of the automated test.
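A hypothetical Python sketch of the Figure 2 flow (create a workflow, add operations, utilities, and processes, then run it and collect results) is shown below; the AutomationWorkflow class and its methods are illustrative assumptions, not the software system's interface.

class AutomationWorkflow:
    def __init__(self, name, author="", description=""):
        # 200: a new instance starts with default parameters and no steps
        self.name, self.author, self.description = name, author, description
        self.operations, self.utilities, self.processes = [], [], []

    def add_operation(self, op): self.operations.append(op)      # 210: add operations
    def add_utility(self, util): self.utilities.append(util)     # 220: add utilities
    def add_process(self, proc): self.processes.append(proc)     # 230: add processes

    def run(self, iterations=1):                                 # 240: run and collect results
        results = []
        for i in range(iterations):
            for step in self.operations + self.utilities + self.processes:
                results.append((i, step()))
        return results   # e.g., stand-ins for 3D models, graphs, or numerical results

wf = AutomationWorkflow("volume-test", author="user")
wf.add_operation(lambda: "map-based volume calculation")
wf.add_utility(lambda: "for loop over a parameter set")
wf.add_process(lambda: "execute command / simulate UI interaction")
print(wf.run(iterations=2))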
  • Figure 3 illustrates an example of a method for displaying a list of available commands, according to an embodiment.
  • the example method illustrated in Figure 3 can be performed using a computing device that includes the framework (e.g., framework 170) and the management components (e.g., management components 110) described above with reference to Figure 1.
  • the example method can begin in 300, when the computing device displays a list of available processes for use in a workflow.
  • the computing device can display the list of available processes based on instructions from a user to create a new workflow and/or based on instructions from a user to view the list of available processes.
  • the list of available processes can be displayed in a window of a dialog box for a workflow editor.
  • the computing device can receive instructions to print all command identifiers ("IDs"). For example, a user can select a print all command IDs process (e.g., via the window of the dialog box for the workflow editor). The computing device can then execute the process or can execute the process based on instructions from the user to compile and/or run the workflow.
  • the computing device can determine a list of available commands.
  • the print all command IDs process can be a process provided by an automation tool.
  • the automation tool can be part of an out-of-the-box feature of the software system or may be part of a plug-in provided by a party that created, distributed, and/or maintains the software system. Accordingly, the automation tool may have access to an entire library of commands available to the software system (e.g., including commands of installed plug-ins), including commands that are not available to third-party plug-ins.
  • the computing device can display command IDs for each command in the list of available commands. For example, the computing device can display command IDs for each command in the library of commands available to the software system. In some embodiments, the command IDs can be displayed in a dialog box of the software system. In further embodiments, the computing device can additionally display parameters associated with the commands with the corresponding command ID.
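The following sketch, under the assumption of a simple dictionary-based command registry, illustrates how a print all command IDs process could enumerate every command available to the software system, including commands that were not pre-programmed for automation; the registry layout and command names are invented.

COMMAND_REGISTRY = {
    "Seismic.InterpretHorizon": {"params": ["cube", "horizon"], "source": "out-of-the-box"},
    "Wells.Correlate": {"params": ["well_set"], "source": "update"},
    "PluginX.CustomExport": {"params": ["model", "path"], "source": "plug-in"},
}

def print_all_command_ids(registry):
    for command_id, info in sorted(registry.items()):
        # display each command ID together with its associated parameters
        print(f"{command_id}  params={info['params']}  ({info['source']})")

print_all_command_ids(COMMAND_REGISTRY)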
  • Figure 4 illustrates an example of a method for getting a command status, according to an embodiment.
  • the example method illustrated in Figure 4 can be performed using a computing device that includes the framework (e.g., framework 170) and the management components (e.g., management components 110) described above with reference to Figure 1.
  • the example method can begin in 400, when the computing device displays a list of available processes for use in a workflow.
  • the computing device can display the list of available processes based on instructions from a user to create a new workflow and/or based on instructions from a user to view the list of available processes.
  • the list of available processes can be displayed in a window of a dialog box for a workflow editor.
  • the computing device can receive instructions to get a command status. For example, a user can select a get command status process (e.g., via the window of the dialog box for the workflow editor).
  • the computing device can receive a command ID.
  • the computing device can prompt a user for the command ID in response to receiving instructions to get a command status.
  • the computing device can then execute the get command status process or can execute the process based on instructions from the user to compile and/or run the workflow.
  • the computing device can determine if the command is available for use. For example, certain commands may be associated with one or more specific licenses associated with the software system. A user running the software system may not have certain licenses associated with the software system. Accordingly, some features and/or commands may not be available to the user. Thus, the computing device can determine the licenses the user or the instance of the system has, determine which licenses the command is available with, and determine whether those licenses allow use of the command. Additionally, for example, certain commands may not be installed on each instance of the software system. Accordingly, some features and/or commands may not be available to the user unless plug-ins, updates, etc. are installed. Thus, the computing device can determine whether a plug-in, update, etc. associated with the command is installed.
  • the computing device can, in 450, display an indication that the command is disabled.
  • the disabled status can be displayed in a dialog box of the software system.
  • the computing device can additionally display why the command is disabled (e.g., associated with a specific license the user does not have, not installed, etc.).
  • the computing device can, in 460, determine if prerequisites for the command are met.
  • a command may utilize other functions, processes, models, features, logs, windows, etc. in order to perform its function.
  • the command may not have been specifically designed for testing and may be designed to run in certain situations where certain events are assumed to have occurred.
  • a command can be associated with a 3D model and may normally be run when a 3D model is presented in an open 3D window for visualization.
  • the command may utilize certain petrophysical logs.
  • prerequisites for running the command may be that a 3D model is created, a 3D window is open, and the petrophysical logs are populated.
  • the computing device can, in 450, display an indication that the command is disabled.
  • the disabled status can be displayed in a dialog box of the software system.
  • the computing device can additionally display why the command is disabled (e.g., the prerequisites are not met, which prerequisites are not met, instructions on how to meet the prerequisites, etc.).
  • the computing device can, in 480, display an indication that the command is enabled.
  • the enabled status can be displayed in a dialog box of the software system.
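A hedged Python sketch of the Figure 4 checks is given below: availability is gated by installation and licenses, prerequisites by project state, and a disabled status carries the reason. The data structures and names are assumptions for illustration.

def get_command_status(command_id, registry, user_licenses, project_state):
    cmd = registry.get(command_id)
    if cmd is None or not cmd["installed"]:
        return "disabled", "command not installed"                    # 440 -> 450
    if not cmd["licenses"] <= user_licenses:
        return "disabled", "missing license(s): %s" % (cmd["licenses"] - user_licenses)
    missing = [p for p in cmd["prerequisites"] if not project_state.get(p)]
    if missing:
        return "disabled", "prerequisites not met: %s" % missing      # 460 -> 450
    return "enabled", ""                                              # 480

registry = {"Model.Show3D": {"installed": True,
                             "licenses": {"modeling"},
                             "prerequisites": ["3d_model_created", "3d_window_open"]}}
state = {"3d_model_created": True, "3d_window_open": False}
print(get_command_status("Model.Show3D", registry, {"modeling"}, state))
# -> ('disabled', "prerequisites not met: ['3d_window_open']")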
  • Figure 5 illustrates an example of a method for executing a command, according to an embodiment.
  • the example method illustrated in Figure 5 can be performed using a computing device that includes the framework (e.g., framework 170) and the management components (e.g., management components 110) described above with reference to Figure 1.
  • the example method can begin in 500, when the computing device displays a list of available processes for use in a workflow.
  • the computing device can display the list of available processes based on instructions from a user to create a new workflow and/or based on instructions from a user to view the list of available processes.
  • the list of available processes can be displayed in a window of a dialog box for a workflow editor.
  • the computing device can receive instructions to execute a command. For example, a user can add an execute command process to a workflow (e.g., via the window of the dialog box for the workflow editor).
  • the computing device can display an indication of the execute command process in a dialog box of the software system as an item in a workflow.
  • the indication of the execute command process can include text input boxes for a command ID, a parameter of the command, and a status variable of the command.
  • the computing device can receive a command ID associated with the instructions to execute the command.
  • the user can enter the command ID using the indication of the execute command process displayed in the dialog box.
  • the user can retrieve the command ID using the print all command IDs process, discussed above.
  • the user can retrieve the command ID using a configuration designer tool.
  • the computing device can receive a parameter associated with the instructions to execute the command.
  • the user can enter the parameter using the indication of the execute command process displayed in the dialog box.
  • the user can retrieve the parameter using the print all command IDs process, discussed above.
  • the user can retrieve the parameter using a configuration designer tool.
  • the parameter can be, for example, a function, a model, a map, a simulation, a property, an attribute, etc.
  • the command associated with a command ID may not be associated with a parameter. Accordingly, in such embodiments, no parameter may be entered in 530.
  • the computing device can receive a status variable associated with the instructions to execute the command. In some embodiments, the user can enter a name of the status variable using the indication of the execute command process displayed in the dialog box.
  • the computing device can execute the command.
  • the computing device can execute the command based on instructions from the user to compile and/or run the workflow.
  • the computing device can run the workflow using the order of operations, utilities, and/or processes (including the execute command process) in the workflow to perform one or more iterations of an automated test.
  • the computing device can retrieve the parameter specified by the user, use the parameter to execute the command, and output a status of the command using the status variable.
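For illustration, a minimal sketch of the Figure 5 execute command process follows: a command ID, an optional parameter, and a status variable name are collected, the command is run with the parameter, and its status is written back under the status variable. The function and dictionary names are hypothetical.

def execute_command(command_id, commands, variables, parameter=None, status_variable=None):
    try:
        result = commands[command_id](parameter)   # retrieve the parameter and run the command
        status = "succeeded"
    except Exception as exc:                       # the command itself may fail at run time
        result, status = None, f"failed: {exc}"
    if status_variable:                            # output the status via the status variable
        variables[status_variable] = status
    return result

commands = {"Map.CalculateVolume": lambda surface: f"volume computed for {surface}"}
variables = {}
print(execute_command("Map.CalculateVolume", commands, variables,
                      parameter="TopReservoir", status_variable="vol_status"))
print(variables)   # {'vol_status': 'succeeded'}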
  • Figure 6 illustrates an example of a method for simulating user interface interactions, according to an embodiment.
  • the example method illustrated in Figure 6 can be performed using a computing device that includes the framework (e.g., framework 170) and the management components (e.g., management components 110) described above with reference to Figure 1.
  • the example method can begin in 600, when the computing device displays a list of available processes for use in a workflow.
  • the computing device can display the list of available processes based on instructions from a user to create a new workflow and/or based on instructions from a user to view the list of available processes.
  • the list of available processes can be displayed in a window of a dialog box for a workflow editor.
  • the computing device can select a UI object based on received instructions. For example, a user can add a select object process to a workflow (e.g., via the window of the dialog box for the workflow editor).
  • the computing device can display an indication of the select object process in a dialog box of the software system as an item in a workflow.
  • the indication of the select object process can include an input box for the object to be selected and a text input box for a status variable of the object to be selected.
  • the object can be a seismic object such as, for example, lines, polylines, intersection, seismic cubes, 3D volumes, attributes, horizon and fault interpretations, geoprobes, geobodies, fault models, etc.
  • the object can be a general object, such as, for example, 3D grid properties, annotations, cross plots, control points, filters, functions, intersections, targets, histograms, grids, points, polygons, surfaces, logs, plans, etc.
  • the user can enter a name of the status variable using the indication of the select object process displayed in the dialog box.
  • a status of the object can be output to the status variable after running the select object process.
  • the computing device can activate a pane based on received instructions. For example, a user can add an activate pane process to a workflow (e.g., via the window of the dialog box for the workflow editor).
  • the computing device can display an indication of the activate pane process in a dialog box of the software system as an item in a workflow.
  • the indication of the activate pane process can include an input text box for a name of the pane to activate.
  • the pane can be a UI element such as a window or screen that can include, for example, data or controls.
  • an activate pane process can be used to list data that results from one or more tests, and/or the data in the pane can be used as input for other tests/processes.
  • the computing device can send keyboard strokes based on received instructions. For example, a user can add a send keyboard strokes process to a workflow (e.g., via the window of the dialog box for the workflow editor).
  • the computing device can display an indication of the send keyboard strokes process in a dialog box of the software system as an item in a workflow.
  • the indication of the send keyboard strokes process can include an input box for a keyboard symbol of the keyboard stroke to send, a numerical input box for a delay timer (e.g., in seconds), and a checkbox for indicating whether to wait for a keyboard process command to finish before proceeding.
  • the keyboard stroke can be used to simulate a user interacting with the UI. For example, to simulate a user entering values into multiple text boxes, a keyboard stroke for the TAB key can be used to move the cursor to the next text box in sequence, and the keyboard stroke can be delayed one second to ensure that the cursor has switched to the next text box before entering a value.
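A small sketch of the send keyboard strokes step, assuming a stand-in for the actual key-sending mechanism (a real implementation would rely on an OS-level or UI-automation facility), is shown below.

import time

def send_keyboard_stroke(key, delay_seconds=0.0, wait_for_completion=True):
    if delay_seconds:
        time.sleep(delay_seconds)     # e.g., wait 1 s so focus has moved to the next text box
    print(f"sending key: {key}")      # stand-in for an actual OS-level key event
    if wait_for_completion:
        time.sleep(0.05)              # crude stand-in for waiting on the key to be handled

# Simulate filling two text boxes: type a value, then TAB to the next box with a delay.
for value in ("0.25", "0.30"):
    for character in value:
        send_keyboard_stroke(character)
    send_keyboard_stroke("TAB", delay_seconds=1.0)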
  • the computing device can simulate a mouse event based on received instructions. For example, a user can add a simulate mouse event process to a workflow (e.g., via the window of the dialog box for the workflow editor).
  • the computing device can display an indication of the simulate mouse event process in a dialog box of the software system as an item in a workflow.
  • the indication of the simulate mouse event process can include an input box for an X coordinate and for a Y coordinate of the mouse event.
  • the indication of the simulate mouse event process can include an input box for a drag stop X coordinate and for a drag stop Y coordinate of the mouse event.
  • the mouse event can be used to simulate a mouse click at the specified X and Y coordinates (e.g., in pixels). Accordingly, a user can fill in an X coordinate and a Y coordinate of the mouse event and may not fill in a drag stop X coordinate or a drag stop Y coordinate of the mouse event.
  • the mouse event can be used to simulate a mouse drag from a first position to a second position. Accordingly, a user can fill in an X coordinate and a Y coordinate of the mouse event (i.e., the drag start position) and fill in a drag stop X coordinate and a drag stop Y coordinate of the mouse event (i.e., the drag stop position).
  • the computing device can get a screen resolution based on received instructions. For example, a user can add a get screen resolution process to a workflow (e.g., via the window of the dialog box for the workflow editor).
  • the computing device can display an indication of the get screen resolution process in a dialog box of the software system as an item in a workflow.
  • the indication of the get screen resolution process can include a text input box for a width variable and a text input box for a height variable of the screen resolution.
  • the get screen resolution process can be used to obtain the screen resolution of the user's device (e.g., in pixels).
  • the determined height and width can be output as the selected height and width variables. Accordingly, a user can use the screen resolution to determine pixel positions of a mouse event (e.g., in 640).
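The following hypothetical sketch combines the get screen resolution and simulate mouse event steps: the resolution (hard-coded here as a placeholder) is used to compute pixel positions, and a drag is simulated only when drag stop coordinates are supplied.

def get_screen_resolution():
    return 1920, 1080                 # placeholder; a real tool would query the operating system

def simulate_mouse_event(x, y, drag_stop_x=None, drag_stop_y=None):
    if drag_stop_x is None or drag_stop_y is None:
        print(f"click at ({x}, {y})")                                     # plain mouse click
    else:
        print(f"drag from ({x}, {y}) to ({drag_stop_x}, {drag_stop_y})")  # mouse drag

width, height = get_screen_resolution()
simulate_mouse_event(width // 2, height // 2)                      # click the screen centre
simulate_mouse_event(100, 100, drag_stop_x=400, drag_stop_y=300)   # drag gesture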
  • the computing device can check a dialog box status based on received instructions. For example, a user can add a check dialog status process to a workflow (e.g., via the window of the dialog box for the workflow editor). In some embodiments, the computing device can display an indication of the check dialog status process in a dialog box of the software system as an item in a workflow. In further embodiments, the indication of the check dialog status process can include a text input box for a name of the dialog box, an input box for a status variable, and/or an input box for the dialog box object. In some embodiments, the check dialog status process can be used to determine if a dialog box exists and/or if a dialog box is open when the check dialog status process is run.
  • the computing device can search for a dialog box associated with the name of the dialog box entered by a user and determine if the dialog box exists and/or is open. As an additional example, the computing device can determine if a dialog box, corresponding to the dialog box object that was dragged into the input box, exists and/or is open.
  • the user can enter a name of the status variable using the indication of the check dialog status process displayed in the dialog box.
  • a status of the object can be output to the status variable after running the check dialog status process.
  • Example statuses include "does not exist," "exists and not open," and "exists and open."
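A minimal sketch of the check dialog status process, using an invented lookup table in place of the software system's real dialog state, could return the three example statuses as follows.

OPEN_DIALOGS = {"Settings": True, "Export": False}   # dialog name -> is currently open

def check_dialog_status(dialog_name, variables, status_variable):
    if dialog_name not in OPEN_DIALOGS:
        status = "does not exist"
    elif OPEN_DIALOGS[dialog_name]:
        status = "exists and open"
    else:
        status = "exists and not open"
    variables[status_variable] = status              # output the status to the status variable
    return status

variables = {}
check_dialog_status("Export", variables, "export_dialog_status")
print(variables)   # {'export_dialog_status': 'exists and not open'}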
  • the computing device can interact with UI objects in a dialog box based on received instructions. For example, a user can add a dialog process to a workflow (e.g., via the window of the dialog box for the workflow editor).
  • the computing device can display an indication of the dialog process in a dialog box of the software system as an item in a workflow.
  • the indication of the dialog process can include a text input box for a name of the dialog box and a text input box for a name of the UI object.
  • the indication of the dialog process can also include an input box for an action object, a checkbox to indicate whether the UI object name is an automation ID, an input box for an input/output variable, and an input box for a status variable.
  • the dialog process can be used to interact with the specified UI object in the specified dialog box.
  • Inputs and outputs of the UI object and/or the dialog box can be entered and returned, respectively, via the input/output variable.
  • a UI object can be a text box, and inputting of text can be simulated by entering in text from the input/output variable and a result of entering the text can be added to the input/output variable.
  • a status of the UI object and/or dialog box can be output to the status variable after running the dialog process.
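For illustration, the dialog process described above might look like the following sketch, in which a named UI object in a named dialog box is acted on and input/output passes through an input/output variable; all structures here are invented placeholders rather than the product's dialog model.

DIALOGS = {"WellSettings": {"WellNameBox": {"type": "textbox", "text": ""}}}

def dialog_process(dialog_name, object_name, action, variables, io_variable, status_variable):
    obj = DIALOGS.get(dialog_name, {}).get(object_name)
    if obj is None:
        variables[status_variable] = "object not found"
        return
    if action == "set_text" and obj["type"] == "textbox":
        obj["text"] = variables.get(io_variable, "")   # simulate typing the input value
        variables[io_variable] = obj["text"]           # echo the result back via the variable
        variables[status_variable] = "ok"

variables = {"well_name": "Well-A7", "status": ""}
dialog_process("WellSettings", "WellNameBox", "set_text", variables, "well_name", "status")
print(variables)   # {'well_name': 'Well-A7', 'status': 'ok'}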
  • the computing device can interact with tree items based on received instructions. For example, a user can add a tree process to a workflow (e.g., via the window of the dialog box for the workflow editor).
  • the computing device can display an indication of the tree process in a dialog box of the software system as an item in a workflow.
  • the indication of the tree process can include a text input box for a name of a pane and/or tree and checkboxes or radio buttons to indicate whether the input text is a pane name and/or an automation ID.
  • the indication of the tree process can also include an input box for a tree item object, an action object, an input box for an output variable, and an input box for a status variable.
  • the tree process can be used to select tree items for use.
  • tree items can be input data, models, results, cases, templates, workflows, or processes that are represented in a tree structure in the UI.
  • the tree process can be used to find a tree based on the pane/tree name (e.g., Workflows), select a tree item (e.g., Variable A), and perform an action (e.g., double click).
  • a status of the tree and/or tree item can be output to the status variable after running the tree process.
  • an output from performing the action can be added to the output variable for output to the user, use by other commands, etc.
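A hedged sketch of the tree process follows: a tree is located by pane name, a tree item is selected, an action is performed, and the status and output variables are populated. The nested dictionary standing in for the UI tree is an assumption.

TREES = {"Workflows": {"Variable A": 3.14, "Variable B": 2.72}}

def tree_process(pane_name, item_name, action, variables, output_variable, status_variable):
    tree = TREES.get(pane_name)
    if tree is None or item_name not in tree:
        variables[status_variable] = "not found"
        return
    if action == "double_click":                      # e.g., open the tree item
        variables[output_variable] = tree[item_name]  # add the action's output to the output variable
    variables[status_variable] = "ok"

variables = {}
tree_process("Workflows", "Variable A", "double_click", variables, "value", "tree_status")
print(variables)   # {'value': 3.14, 'tree_status': 'ok'}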
  • the computing device can show a message based on received instructions. For example, a user can add a show message process to a workflow (e.g., via the window of the dialog box for the workflow editor).
  • the computing device can display an indication of the show message process in a dialog box of the software system as an item in a workflow.
  • the indication of the show message process can include a text input box for a message caption, a text input box for a message, and a numerical input box for a duration.
  • the show message process can be used to display a message (e.g., in a separate dialog box) for a specific duration.
  • a message can be displayed with the input caption and the specified message, and the message can timeout after the indicated duration (e.g., in seconds).
  • Figure 7 illustrates an example of a user interface for selecting workflow processes, according to an embodiment.
  • UI 700 can be a pane that includes a tree of automation processes.
  • a computing device can display the tree when, for example, a user instructs the computing device to create a new workflow and/or the user instructs the computing device to display workflow processes from a selection that includes workflow utilities, workflow operations, and workflow processes.
  • UI 700 can show different command manager testing processes that can be selected, including get command status, print all command IDs, and execute command.
  • UI 700 can show different UI interaction testing processes that can be selected, including select object, activate pane, send keys (e.g., send keyboard strokes, discussed above), mouse event (e.g., simulate mouse event, discussed above), screen resolution (e.g., get screen resolution, discussed above), find dialog (e.g., check dialog box status, discussed above), dialog (e.g., interact with UI objects in a dialog box, discussed above), tree (e.g., interact with tree items, discussed above), and show message.
  • UI 700 can show: a players testing process, time player; different performance testing processes, create frame per second (FPS) counter and get FPS measurement; different service testing processes, open reference project and close reference project; different data generation testing processes, create seismic cube and create well; a meta data testing process, meta data; etc.
  • Figure 8 illustrates example dialog boxes for simulating user interface interactions, according to an embodiment.
  • a dialog box 800 can be used to interact with UI objects in a selected dialog box for testing.
  • a user can enter parameters such as a dialog box name, an action, an object name or automation ID, and/or an input/output variable using the indication of the dialog process in the workflow editor, as discussed above.
  • the user can double click on the indication, and the computing device can display dialog box 800 for entering parameters.
  • In dialog box 800, the user can enter the dialog box name, select an action from a drop-down list (shown in dialog box 810), input an object name, indicate whether the object name is an automation ID, etc.
  • the user can use the object inspector button to create an object inspector dialog box that can be used to find objects, dialog box names, automation IDs, etc.
  • the object inspector button can invoke a user interface inspection tool, such as the Inspect WINDOWS ® -based tool provided by the Microsoft Corporation.
  • Figure 9 illustrates example dialog boxes for testing tree items, according to an embodiment.
  • a dialog box 900 can be used to interact with tree items for testing.
  • a user can enter parameters, such as a tree or pane name, an indication of whether the name is a pane name or an automation ID, input tree items, and an action using the indication of the tree process in the workflow editor, as discussed above.
  • the user can double click on the indication, and the computing device can display dialog box 900 for entering parameters.
  • In dialog box 900, the user can enter a tree or pane name, indicate whether the name is a pane name or an automation ID (using the checkboxes), input tree items, select an action from a drop-down list (shown in dialog box 910), etc.
  • the user can use the object inspector button to create an object inspector dialog box that can be used to find trees, panes, pane names, automation IDs, etc.
  • the methods of the present disclosure may be executed by a computing system.
  • Figure 10 illustrates an example of such a computing system 1000, in accordance with some embodiments.
  • the computing system 1000 may include a computer or computer system 1001-1, which may be an individual computer system 1001-1 or an arrangement of distributed computer systems.
  • the computer system 1001-1 includes one or more analysis modules 1002 that are configured to perform various tasks according to some embodiments, such as one or more methods disclosed herein. To perform these various tasks, the analysis module 1002 executes independently, or in coordination with, one or more processors 1004, which is (or are) connected to one or more storage media 1006.
  • the processor(s) 1004 is (or are) also connected to a network interface 1007 to allow the computer system 1001-1 to communicate over a data network 1009 with one or more additional computer systems and/or computing systems, such as 1001-2, 1001-3, and/or 1001-4 (note that computer systems 1001-2, 1001-3, and/or 1001-4 may or may not share the same architecture as computer system 1001-1, and may be located in different physical locations, e.g., computer systems 1001-1 and 1001-2 may be located in a processing facility, while in communication with one or more computer systems such as 1001-3 and/or 1001-4 that are located in one or more data centers, and/or located in varying countries on different continents).
  • a processor may include a microprocessor, microcontroller, processor module or subsystem, programmable integrated circuit, programmable gate array, or another control or computing device.
  • the storage media 1006 may be implemented as one or more computer-readable or machine-readable storage media. Note that while in the example embodiment of Figure 10 storage media 1006 is depicted as within computer system 1001-1, in some embodiments, storage media 1006 may be distributed within and/or across multiple internal and/or external enclosures of computer system 1001-1 and/or additional computing systems.
  • Storage media 1006 may include one or more different forms of memory including semiconductor memory devices such as dynamic or static random access memories (DRAMs or SRAMs), erasable and programmable read-only memories (EPROMs), electrically erasable and programmable read-only memories (EEPROMs) and flash memories, magnetic disks such as fixed, floppy and removable disks, other magnetic media including tape, optical media such as compact disks (CDs) or digital video disks (DVDs), BLURAY ® disks, or other types of optical storage, or other types of storage devices.
  • Such computer-readable or machine-readable storage medium or media is (are) considered to be part of an article (or article of manufacture).
  • An article or article of manufacture may refer to any manufactured single component or multiple components.
  • the storage medium or media may be located either in the machine running the machine-readable instructions, or located at a remote site from which machine-readable instructions may be downloaded over a network for execution.
  • computing system 1000 contains automation module(s) 1008 for creating workflows, displaying available commands, getting command statuses, executing commands, simulating user interface interactions, generating user interfaces, generating dialog boxes, etc.
  • computer system 1001-1 includes the automation module 1008.
  • an automation module may be used to perform aspects of one or more embodiments of the methods disclosed herein.
  • a plurality of automation modules may be used to perform aspects of methods disclosed herein.
  • computing system 1000 is one example of a computing system, and computing system 1000 may have more or fewer components than shown, may include additional components not depicted in the example embodiment of Figure 10, and/or computing system 1000 may have a different configuration or arrangement of the components depicted in Figure 10.
  • the various components shown in Figure 10 may be implemented in hardware, software, or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • the steps in the processing methods described herein may be implemented by running one or more functional modules in information processing apparatus such as general purpose processors or application specific chips, such as ASICs, FPGAs, PLDs, or other appropriate devices. These modules, combinations of these modules, and/or their combination with general hardware are included within the scope of protection of the disclosure.
  • Geologic interpretations, models, and/or other interpretation aids may be refined in an iterative fashion; this concept is applicable to the methods discussed herein. This may include use of feedback loops executed on an algorithmic basis, such as at a computing device (e.g., computing system 1000, Figure 10), and/or through manual control by a user who may make determinations regarding whether a given step, action, template, model, or set of curves has become sufficiently accurate for the evaluation of the subsurface three-dimensional geologic formation under consideration.
  • a computing device e.g., computing system 1000, Figure 10

Abstract

Systems and methods for creating a workflow for an automated test, of which the method includes receiving instructions to get a status of a first command and a first command identifier, determining whether the first command is available using the first command identifier, determining whether prerequisites for the first command are met using the first command identifier, receiving instructions to execute a second command and a second command identifier, receiving instructions to simulate a user interface interaction, and running the workflow by executing the second command associated with the second command identifier and simulating the user interface interaction.

Description

SYSTEM AUTOMATION TOOLS
Background
[0001] Workers in various organizations utilize and often rely on software systems to perform their work. For example, in the oil and gas industry, an exploration and production sector (E&P) software system allows users to interpret seismic data, perform well correlation, build reservoir models suitable for simulation, submit and visualize simulation results, calculate volumes, produce maps, and design development strategies to maximize reservoir exploitation.
[0002] Certain software systems may provide automation tools that allow users to create automatic workflows to perform multiple iterations of a test with different parameters, which can be used to compare results. However, such automation tools are generally limited in functionality, and many routines, functions, processes, and commands of the software system may not be available for automation because the automation tool may just allow specific and/or preprogrammed processes to be automated. Additionally, the automation tools may not be able to simulate user interactions with a user interface (UI), and, thus, may not be able to perform automations where user input is required for different iterations.
Summary
[0003] Systems, apparatus, computer-readable media, and methods are disclosed, of which the methods include creating a workflow for an automated test, receiving instructions to get a status of a first command and a first command identifier, determining whether the first command is available using the first command identifier, determining whether prerequisites for the first command are met using the first command identifier, receiving instructions to execute a second command and a second command identifier, receiving instructions to simulate a user interface interaction, and running the workflow by executing the second command associated with the second command identifier and simulating the user interface interaction.
[0004] In some embodiments, the methods can include generating results based on running the workflow, where the results include a three-dimensional model, a graph, and/or a numerical result.
[0005] In further embodiments, running the workflow can include running multiple iterations of the workflow and each iteration can include executing the second command and simulating the user interface interaction.
[0006] In other embodiments, the list of available commands can include commands that are not pre-programmed for automation.
[0007] In still further embodiments, the methods can include receiving instructions to display available commands and displaying a list of available commands.
[0008] In additional embodiments, the methods can be performed using a software system, and receiving the instructions to display the available commands, displaying the list of available commands, receiving the instructions to get the status of the first command and the first command identifier, determining whether the first command is available using the first command identifier, determining whether the prerequisites for the first command are met using the first command identifier, receiving the instructions to execute the second command and the second command identifier, and receiving the instructions to simulate the user interface interaction are performed using a plug-in application to the software system provided by a party that created the software system, distributed the software system, and/or maintains the software system.
[0009] In some embodiments, the instructions to execute the second command can include a parameter and a status variable.
[0010] In further embodiments, the methods can include displaying an indication that the first command is disabled based on determining that the prerequisites for the first command are not met and/or determining that the first command is not available.
[0011] In other embodiments, the methods can include displaying an indication that the first command is enabled based on determining that the prerequisites for the first command are met and determining that the first command is available.
[0012] In additional embodiments, simulating the user interface interaction can include selecting a user interface object, activating a pane, sending keyboard strokes, simulating a mouse event, getting a screen resolution, checking a dialog box status, interacting with a user interface object, interacting with a tree item, and/or showing a message.
[0013] In further embodiments, receiving the instructions to simulate the user interface interaction can include receiving parameters via a dialog box.
[0014] Systems and apparatus are also disclosed that include a processor and a memory system with non-transitory, computer-readable media storing instructions that, when executed by the processor, cause the systems and apparatus to perform operations that include creating a workflow for an automated test, receiving instructions to get a status of a first command and a first command identifier, determining whether the first command is available using the first command identifier, determining whether prerequisites for the first command are met using the first command identifier, receiving instructions to execute a second command and a second command identifier, receiving instructions to simulate a user interface interaction, and running the workflow by executing the second command associated with the second command identifier and simulating the user interface interaction.
[0015] Non-transitory, computer-readable media are also disclosed that store instructions that, when executed by a processor of a computing system, cause the computing system to perform operations that include creating a workflow for an automated test, receiving instructions to get a status of a first command and a first command identifier, determining whether the first command is available using the first command identifier, determining whether prerequisites for the first command are met using the first command identifier, receiving instructions to execute a second command and a second command identifier, receiving instructions to simulate a user interface interaction, and running the workflow by executing the second command associated with the second command identifier and simulating the user interface interaction.
[0016] The foregoing summary is intended merely to introduce a subset of the aspects of the present disclosure, and is not intended to be exhaustive or in any way identify any particular elements as being more relevant than any others. This summary, therefore, should not be considered limiting on the present disclosure or the appended claims.
Brief Description of the Drawings
[0017] The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the present teachings and together with the description, serve to explain the principles of the present teachings. In the figures:
[0018] Figure 1 illustrates an example of a system that includes various management components to manage various aspects of a geologic environment, according to an embodiment.
[0019] Figure 2 illustrates an example of a method for creating a new workflow, according to an embodiment.
[0020] Figure 3 illustrates an example of a method for displaying a list of available commands, according to an embodiment.
[0021] Figure 4 illustrates an example of a method for getting a command status, according to an embodiment.
[0022] Figure 5 illustrates an example of a method for executing a command, according to an embodiment.
[0023] Figure 6 illustrates an example of a method for simulating user interface interactions, according to an embodiment.
[0024] Figure 7 illustrates an example of a user interface for selecting workflow processes, according to an embodiment.
[0025] Figure 8 illustrates example dialog boxes for simulating user interface interactions, according to an embodiment.
[0026] Figure 9 illustrates example dialog boxes for testing tree items, according to an embodiment.
[0027] Figure 10 illustrates an example computing system that may execute methods of the present disclosure, according to an embodiment.
Detailed Description
[0028] Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings and figures. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the disclosure. However, it will be apparent to one of ordinary skill in the art that certain embodiments of the disclosure may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
[0029] It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are used to distinguish one element from another. For example, a first object or step could be termed a second object or step, and, similarly, a second object or step could be termed a first object or step, without departing from the scope of the disclosure. The first object or step, and the second object or step, are both objects or steps, respectively, but they are not to be considered the same object or step.
[0030] The terminology used in the description herein is for the purpose of describing particular embodiments and is not intended to be limiting. As used in the description and the appended claims, the singular forms "a," "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any possible combinations of one or more of the associated listed items. It will be further understood that the terms "includes," "including," "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Further, as used herein, the term "if" may be construed to mean "when" or "upon" or "in response to determining" or "in response to detecting," depending on the context.
[0031] Attention is now directed to processing procedures, methods, techniques, and workflows that are in accordance with some embodiments. Some operations in the processing procedures, methods, techniques, and workflows disclosed herein may be combined and/or the order of some operations may be changed.
[0032] Figure 1 illustrates an example of a system 100 that includes various management components 110 to manage various aspects of a geologic environment 150 (e.g., an environment that includes a sedimentary basin, a reservoir 151, one or more faults 153-1, one or more geobodies 153-2, etc.). For example, the management components 110 may allow for direct or indirect management of sensing, drilling, injecting, extracting, etc., with respect to the geologic environment 150. In turn, further information about the geologic environment 150 may become available as feedback 160 (e.g., optionally as input to one or more of the management components 110).
[0033] In the example of Figure 1, the management components 110 include a seismic data component 112, an additional information component 114 (e.g., well/logging data), a processing component 116, a simulation component 120, an attribute component 130, an analysis/visualization component 142, and a workflow component 144. In operation, seismic data and other information provided per the components 112 and 114 may be input to the simulation component 120.
[0034] In an example embodiment, the simulation component 120 may rely on entities 122. Entities 122 may include earth entities or geological objects such as wells, surfaces, bodies, reservoirs, etc. In the system 100, the entities 122 can include virtual representations of actual physical entities that are reconstructed for purposes of simulation. The entities 122 may include entities based on data acquired via sensing, observation, etc. (e.g., the seismic data 112 and other information 114). An entity may be characterized by one or more properties (e.g., a geometrical pillar grid entity of an earth model may be characterized by a porosity property). Such properties may represent one or more measurements (e.g., acquired data), calculations, etc.
[0035] In an example embodiment, the simulation component 120 may operate in conjunction with a software framework such as an object-based framework. In such a framework, entities may include entities based on pre-defined classes to facilitate modeling and simulation. A commercially available example of an object-based framework is the MICROSOFT® .NET® framework (Redmond, Washington), which provides a set of extensible object classes. In the .NET® framework, an object class encapsulates a module of reusable code and associated data structures. Object classes can be used to instantiate object instances for use by a program, script, etc. For example, borehole classes may define objects for representing boreholes based on well data.
[0036] In the example of Figure 1, the simulation component 120 may process information to conform to one or more attributes specified by the attribute component 130, which may include a library of attributes. Such processing may occur prior to input to the simulation component 120 (e.g., consider the processing component 116). As an example, the simulation component 120 may perform operations on input information based on one or more attributes specified by the attribute component 130. In an example embodiment, the simulation component 120 may construct one or more models of the geologic environment 150, which may be relied on to simulate behavior of the geologic environment 150 (e.g., responsive to one or more acts, whether natural or artificial). In the example of Figure 1, the analysis/visualization component 142 may allow for interaction with a model or model-based results (e.g., simulation results, etc.). As an example, output from the simulation component 120 may be input to one or more other workflows, as indicated by a workflow component 144.
[0037] As an example, the simulation component 120 may include one or more features of a simulator such as the ECLIPSE™ reservoir simulator (Schlumberger Limited, Houston, Texas), the INTERSECT™ reservoir simulator (Schlumberger Limited, Houston, Texas), etc. As an example, a simulation component, a simulator, etc. may include features to implement one or more meshless techniques (e.g., to solve one or more equations, etc.). As an example, a reservoir or reservoirs may be simulated with respect to one or more enhanced recovery techniques (e.g., consider a thermal process such as SAGD, etc.).
[0038] In an example embodiment, the management components 110 may include features of a commercially available framework such as the PETREL® seismic to simulation software framework (Schlumberger Limited, Houston, Texas). The PETREL® framework provides components that allow for optimization of exploration and development operations. The PETREL® framework includes seismic to simulation software components that can output information for use in increasing reservoir performance, for example, by improving asset team productivity. Through use of such a framework, various professionals (e.g., geophysicists, geologists, and reservoir engineers) can develop collaborative workflows and integrate operations to streamline processes. Such a framework may be considered an application and may be considered a data-driven application (e.g., where data is input for purposes of modeling, simulating, etc.).
[0039] In an example embodiment, various aspects of the management components 110 may include add-ons or plug-ins that operate according to specifications of a framework environment. For example, a commercially available framework environment marketed as the OCEAN® framework environment (Schlumberger Limited, Houston, Texas) allows for integration of add-ons (or plug-ins) into a PETREL® framework workflow. The OCEAN® framework environment leverages .NET® tools (Microsoft Corporation, Redmond, Washington) and offers stable, user-friendly interfaces for efficient development. In an example embodiment, various components may be implemented as add-ons (or plug-ins) that conform to and operate according to specifications of a framework environment (e.g., according to application programming interface (API) specifications, etc.).
[0040] Figure 1 also shows an example of a framework 170 that includes a model simulation layer 180 along with a framework services layer 190, a framework core layer 195 and a modules layer 175. The framework 170 may include the commercially available OCEAN® framework where the model simulation layer 180 is the commercially available PETREL® model-centric software package that hosts OCEAN® framework applications. In an example embodiment, the PETREL® software may be considered a data-driven application. The PETREL® software can include a framework for model building and visualization.
[0041] As an example, a framework may include features for implementing one or more mesh generation techniques. For example, a framework may include an input component for receipt of information from interpretation of seismic data, one or more attributes based at least in part on seismic data, log data, image data, etc. Such a framework may include a mesh generation component that processes input information, optionally in conjunction with other information, to generate a mesh.
[0042] In the example of Figure 1, the model simulation layer 180 may provide domain objects 182, act as a data source 184, provide for rendering 186 and provide for various user interfaces 188. Rendering 186 may provide a graphical environment in which applications can display their data while the user interfaces 188 may provide a common look and feel for application user interface components.
[0043] As an example, the domain objects 182 can include entity objects, property objects and optionally other objects. Entity objects may be used to geometrically represent wells, surfaces, bodies, reservoirs, etc., while property objects may be used to provide property values as well as data versions and display parameters. For example, an entity object may represent a well where a property object provides log information as well as version information and display information (e.g., to display the well as part of a model).
[0044] In the example of Figure 1, data may be stored in one or more data sources (or data stores, generally physical data storage devices), which may be at the same or different physical sites and accessible via one or more networks. The model simulation layer 180 may be configured to model projects. As such, a particular project may be stored where stored project information may include inputs, models, results and cases. Thus, upon completion of a modeling session, a user may store a project. At a later time, the project can be accessed and restored using the model simulation layer 180, which can recreate instances of the relevant domain objects.
[0045] In the example of Figure 1, the geologic environment 150 may include layers (e.g., stratification) that include a reservoir 151 and one or more other features such as the fault 153-1, the geobody 153-2, etc. As an example, the geologic environment 150 may be outfitted with any of a variety of sensors, detectors, actuators, etc. For example, equipment 152 may include communication circuitry to receive and to transmit information with respect to one or more networks 155. Such information may include information associated with downhole equipment 154, which may be equipment to acquire information, to assist with resource recovery, etc. Other equipment 156 may be located remote from a well site and include sensing, detecting, emitting or other circuitry. Such equipment may include storage and communication circuitry to store and to communicate data, instructions, etc. As an example, one or more satellites may be provided for purposes of communications, data acquisition, etc. For example, Figure 1 shows a satellite in communication with the network 155 that may be configured for communications, noting that the satellite may also include circuitry for imagery (e.g., spatial, spectral, temporal, radiometric, etc.).
[0046] Figure 1 also shows the geologic environment 150 as optionally including equipment 157 and 158 associated with a well that includes a substantially horizontal portion that may intersect with one or more fractures 159. For example, consider a well in a shale formation that may include natural fractures, artificial fractures (e.g., hydraulic fractures) or a combination of natural and artificial fractures. As an example, a well may be drilled for a reservoir that is laterally extensive. In such an example, lateral variations in properties, stresses, etc. may exist where an assessment of such variations may assist with planning, operations, etc. to develop a laterally extensive reservoir (e.g., via fracturing, injecting, extracting, etc.). As an example, the equipment 157 and/or 158 may include components, a system, systems, etc. for fracturing, seismic sensing, analysis of seismic data, assessment of one or more fractures, etc.
[0047] As mentioned, the system 100 may be used to perform one or more workflows. A workflow may be a process that includes a number of worksteps. A workstep may operate on data, for example, to create new data, to update existing data, etc. As an example, a workstep may operate on one or more inputs and create one or more results, for example, based on one or more algorithms. As an example, a system may include a workflow editor for creation, editing, executing, etc. of a workflow. In such an example, the workflow editor may provide for selection of one or more pre-defined worksteps, one or more customized worksteps, etc. As an example, a workflow may be a workflow implementable in the PETREL® software, for example, that operates on seismic data, seismic attribute(s), etc. As an example, a workflow may be a process implementable in the OCEAN® framework. As an example, a workflow may include one or more worksteps that access a module such as a plug-in (e.g., external executable code, etc.).
[0048] In some embodiments, system 100 may include an E&P software system that allows users to interpret seismic data, perform well correlation, build reservoir models suitable for simulation, submit and visualize simulation results, calculate volumes, produce maps, and design development strategies to maximize reservoir exploitation. The E&P software system may provide automation tools that allow users to create automatic workflows to perform multiple iterations of a test using different parameters, which can be used to compare results. In some embodiments, one or more of the automation tools, including the automation tools discussed below, may be an "out-of-the-box" feature of the software system that is included in an unmodified version of the software system. In other embodiments, one or more of the automation tools, including the automation tools discussed below, may be installed after the initial installation of the software system, such as, for example, in an update and/or as a plug-in application of the software system.
[0049] In some implementations, a plug-in application can be a software component that adds one or more specific features to a software system. In some embodiments, the plug-in application can be provided by a party that created, distributed, and/or maintains the software system. In other embodiments, the plug-in application can be provided by a third-party entity that is not affiliated with the party or parties that created, distributed, and/or maintains the software system.
[0050] As discussed in further detail below, automation tools provided by the software system and/or by plug-ins to the software system (hereinafter, "automation tool") can allow users to generate automated tests using routines, functions, or processes of the software system (hereinafter, "commands"), which may not have been specifically selected or pre-programmed to be automated. For example, the automation tool may allow users to perform automated tests using an entire library of commands available to the software system (e.g., including out-of-the-box commands of the software system, commands included in updates to the software system, commands of installed plug-ins, etc.). Additionally, the automation tool can simulate user interactions with a UI, and, thus, can perform automated tests where user input is used for different iterations.
[0051] Figure 2 illustrates an example of a method for creating a new workflow, according to an embodiment. In various embodiments, the new workflow can be used to automate multiple iterations of tests in an E&P software system. In some embodiments, the example method illustrated in Figure 2 can be performed using a computing device that includes the framework (e.g., framework 170) and the management components (e.g., management components 110) described above with reference to Figure 1.
[0052] The example method can begin in 200 by creating a new instance of a workflow. In some embodiments, the new instance of the workflow can be created based on instructions from a user of the software system (e.g., via a UI of the software system). Additionally, in some implementations the workflow can be initialized with default parameters (e.g., workflow name, author name, description, etc.) and may not include any operations, utilities, or processes upon initialization.
[0053] In 210, the computing device can add operations to the workflow. In some embodiments, the computing device can display a list of one or more operations via a UI of the software system and one or more operations can be added based on instructions from a user via the UI. The operations can be, for example, operations that are pre-programmed in the software system and/or operations that are added to the software system via one or more plug-ins. For example, the operations can be operations related to seismic simulations, three-dimensional (3D) map simulations, map-based volume calculations, model extractions, map making, obtaining properties, obtaining values from properties, calculations, filters, well simulations, well attribute simulations, points with well attribute simulations, sector modeling, arithmetic operations, etc.
[0054] In 220, the computing device can add utilities to the workflow. In some embodiments, the computing device can display a list of one or more utilities via a UI of the software system and one or more utilities can be added based on instructions from a user via the UI. The utilities can be, for example, utilities that are pre-programmed in the software system and/or utilities that are added to the software system via one or more plug-ins. For example, the utilities can be utilities related to programing and/or logical statements (e.g., while loops, for loops, for all functions, if statements, stop statements, run statements, etc.), or utilities related to variables (e.g., set reference, set reference list, numeric expression, string expression, data expression, get name, etc.).
[0055] In 230, the computing device can add processes to the workflow. In some embodiments, the computing device can display a list of one or more processes via a UI of the software system and one or more processes can be added based on instructions from a user via the UI. The processes can be, for example, processes that are pre-programmed in the software system and/or processes that are added to the software system via one or more plug-ins. For example, the processes can be processes related to getting, printing a list of, or executing commands for automated testing, as discussed in further detail below. As an additional example, the processes can be processes related to simulating UI interactions for automated testing, as discussed in further detail below.
[0056] In 240, the computing system can run the workflow and generate results. In some embodiments, the computing system can compile and run the workflow based on instructions from a user of the software system (e.g., via a UI of the software system). Accordingly, the computing system can run the workflow using the order of operations, utilities, and/or processes in the workflow to perform one or more iterations of an automated test. The computing system can then generate and output results for one or more iterations of the automated test. For example, the results can be generated 3D models (e.g., of a well, of a reservoir, etc.), generated graphs, generated numerical results, etc. that represent the one or more iterations of the automated test.
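For illustration only, the following Python sketch models the flow described above for Figure 2: a workflow is created, steps are added, and the workflow is run for several iterations. The class, method, and variable names (Workflow, add_step, etc.) are hypothetical and do not correspond to any particular software framework or API.

```python
# Minimal sketch of the Figure 2 flow: create a workflow, add steps
# (operations, utilities, processes), then run it for several iterations.
# All names are hypothetical and illustrative only.

class Workflow:
    def __init__(self, name, author="", description=""):
        # 200: initialize with default parameters and no steps
        self.name = name
        self.author = author
        self.description = description
        self.steps = []          # ordered operations, utilities, and processes

    def add_step(self, step):
        # 210/220/230: operations, utilities, and processes are all callables here
        self.steps.append(step)

    def run(self, iterations=1):
        # 240: run the workflow and collect results for each iteration
        results = []
        for i in range(iterations):
            context = {"iteration": i}
            for step in self.steps:
                step(context)
            results.append(context)
        return results

# Example use: two trivial steps standing in for an operation and a process.
wf = Workflow("porosity_sweep", author="user")
wf.add_step(lambda ctx: ctx.update(porosity=0.10 + 0.01 * ctx["iteration"]))
wf.add_step(lambda ctx: ctx.update(volume=1000.0 * ctx["porosity"]))
for result in wf.run(iterations=3):
    print(result["iteration"], result["volume"])
```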
[0057] Figure 3 illustrates an example of a method for displaying a list of available commands, according to an embodiment. In some embodiments, the example method illustrated in Figure 3 can be performed using a computing device that includes the framework (e.g., framework 170) and the management components (e.g., management components 110) described above with reference to Figure 1.
[0058] The example method can begin in 300, when the computing device displays a list of available processes for use in a workflow. In some embodiments, the computing device can display the list of available processes based on instructions from a user to create a new workflow and/or based on instructions from a user to view the list of available processes. In some implementations, the list of available processes can be displayed in a window of a dialog box for a workflow editor.
[0059] In 310, the computing device can receive instructions to print all command identifiers ("IDs"). For example, a user can select a print all command IDs process (e.g., via the window of the dialog box for the workflow editor). The computing device can then execute the process or can execute the process based on instructions from the user to compile and/or run the workflow.
[0060] In 320, the computing device can determine a list of available commands. In some embodiments, the print all command IDs process can be a process provided by an automation tool. The automation tool can be part of an out-of-the-box feature of the software system or may be part of a plug-in provided by a party that created, distributed, and/or maintains the software system. Accordingly, the automation tool may have access to an entire library of commands available to the software system (e.g., including commands of installed plug-ins), including commands that are not available to third-party plug-ins.
[0061] In 330, the computing device can display command IDs for each command in the list of available commands. For example, the computing device can display command IDs for each command in the library of commands available to the software system. In some embodiments, the command IDs can be displayed in a dialog box of the software system. In further embodiments, the computing device can additionally display parameters associated with the commands with the corresponding command ID.
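As a non-limiting sketch of the print all command IDs process described above, the following Python example iterates over a command registry and displays each command ID with its associated parameters. The registry contents and field names are hypothetical.

```python
# Sketch of a "print all command IDs" process over a command registry.
# The registry contents and field names are hypothetical examples.

COMMAND_REGISTRY = {
    "CreateSurface": {"parameters": ["horizon"], "plugin": None},
    "RunSimulation": {"parameters": ["case"], "plugin": "SimulatorPlugin"},
    "ExportMap":     {"parameters": ["map", "path"], "plugin": None},
}

def print_all_command_ids(registry):
    # 320/330: determine the list of available commands and display each
    # command ID together with its declared parameters.
    for command_id, info in sorted(registry.items()):
        params = ", ".join(info["parameters"]) or "<none>"
        print(f"{command_id}: parameters = {params}")

print_all_command_ids(COMMAND_REGISTRY)
```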
[0062] Figure 4 illustrates an example of a method for getting a command status, according to an embodiment. In some embodiments, the example method illustrated in Figure 4 can be performed using a computing device that includes the framework (e.g., framework 170) and the management components (e.g., management components 110) described above with reference to Figure 1.
[0063] The example method can begin in 400, when the computing device displays a list of available processes for use in a workflow. In some embodiments, the computing device can display the list of available processes based on instructions from a user to create a new workflow and/or based on instructions from a user to view the list of available processes. In some implementations, the list of available processes can be displayed in a window of a dialog box for a workflow editor.
[0064] In 410, the computing device can receive instructions to get a command status. For example, a user can select a get command status process (e.g., via the window of the dialog box for the workflow editor).
[0065] In 420, the computing device can receive a command ID. In some embodiments, the computing device can prompt a user for the command ID in response to receiving instructions to get a command status.
[0066] The computing device can then execute the get command status process or can execute the process based on instructions from the user to compile and/or run the workflow.
[0067] In 430, the computing device can determine if the command is available for use. For example, certain commands may be associated with one or more specific licenses associated with the software system. A user running the software system may not have certain licenses associated with the software system. Accordingly, some features and/or commands may not be available to the user. Thus, the computing device can determine which licenses the user or the instance of the software system holds, determine which licenses make the command available, and determine whether the held licenses allow use of the command.
[0068] Additionally, for example, certain commands may not be installed on each instance of the software system. Accordingly, some features and/or commands may not be available to the user unless plug-ins, updates, etc. are installed. Thus, the computing device can determine whether a plug-in, update, etc. associated with the command is installed.
[0069] If, in 440, the command is not available, the computing device can, in 450, display an indication that the command is disabled. In some embodiments, the disabled status can be displayed in a dialog box of the software system. In further embodiments, the computing device can additionally display why the command is disabled (e.g., associated with a specific license the user does not have, not installed, etc.).
[0070] If, in 440, the command is available, the computing device can, in 460, determine if prerequisites for the command are met. In some embodiments, a command may utilize other functions, processes, models, features, logs, windows, etc. in order to perform its function. In some implementations, because a command can be from an entire library of commands associated with the software system, the command may not have been specifically designed for testing and may be designed to run in certain situations where certain events are assumed to have occurred. As an example, a command can be associated with a 3D model and may normally be run when a 3D model is presented in an open 3D window for visualization. Additionally, the command may utilize certain petrophysical logs. Thus, prerequisites for running the command may be that a 3D model is created, a 3D window is open, and the petrophysical logs are populated.
[0071] If, in 470, the prerequisites are not met, the computing device can, in 450, display an indication that the command is disabled. In some embodiments, the disabled status can be displayed in a dialog box of the software system. In further embodiments, the computing device can additionally display why the command is disabled (e.g., the prerequisites are not met, which prerequisites are not met, instructions on how to meet the prerequisites, etc.).
[0072] If, in 470, the prerequisites are met, the computing device can, in 480, display an indication that the command is enabled. In some embodiments, the enabled status can be displayed in a dialog box of the software system.
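The availability and prerequisite checks of Figure 4 can be sketched as follows; the command table, license names, and session-state keys are hypothetical examples and are not drawn from any specific software system.

```python
# Sketch of the Figure 4 status check: a command is "enabled" only when it is
# installed, covered by one of the user's licenses, and its prerequisites are met.
# The data structures and license names are hypothetical.

COMMANDS = {
    "ShowWellSection": {
        "installed": True,
        "licenses": {"core", "wells"},        # licenses that unlock the command
        "prerequisites": ["3d_model", "3d_window", "petrophysical_logs"],
    },
}

def get_command_status(command_id, user_licenses, session_state):
    info = COMMANDS.get(command_id)
    # 430/440: availability check (installed and licensed)
    if info is None or not info["installed"]:
        return "disabled: not installed"
    if not (info["licenses"] & user_licenses):
        return "disabled: no valid license"
    # 460/470: prerequisite check against the current session state
    missing = [p for p in info["prerequisites"] if not session_state.get(p)]
    if missing:
        return "disabled: prerequisites not met (" + ", ".join(missing) + ")"
    return "enabled"

state = {"3d_model": True, "3d_window": True, "petrophysical_logs": False}
print(get_command_status("ShowWellSection", {"core", "wells"}, state))
```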
[0073] Figure 5 illustrates an example of a method for executing a command, according to an embodiment. In some embodiments, the example method illustrated in Figure 5 can be performed using a computing device that includes the framework (e.g., framework 170) and the management components (e.g., management components 110) described above with reference to Figure 1.
[0074] The example method can begin in 500, when the computing device displays a list of available processes for use in a workflow. In some embodiments, the computing device can display the list of available processes based on instructions from a user to create a new workflow and/or based on instructions from a user to view the list of available processes. In some implementations, the list of available processes can be displayed in a window of a dialog box for a workflow editor.
[0075] In 510, the computing device can receive instructions to execute a command. For example, a user can add an execute command process to a workflow (e.g., via the window of the dialog box for the workflow editor). In some embodiments, the computing device can display an indication of the execute command process in a dialog box of the software system as an item in a workflow. In further embodiments, the indication of the execute command process can include text input boxes for a command ID, a parameter of the command, and a status variable of the command.
[0076] In 520, the computing device can receive a command ID associated with the instructions to execute the command. In some embodiments, the user can enter the command ID using the indication of the execute command process displayed in the dialog box. In some implementations, the user can retrieve the command ID using the print all command IDs process, discussed above. In other implementations, the user can retrieve the command ID using a configuration designer tool.
[0077] In 530, the computing device can receive a parameter associated with the instructions to execute the command. In some embodiments, the user can enter the parameter using the indication of the execute command process displayed in the dialog box. In some implementations, the user can retrieve the parameter using the print all command IDs process, discussed above. In other implementations, the user can retrieve the parameter using a configuration designer tool.
[0078] In some embodiments, the parameter can be, for example, a function, a model, a map, a simulation, a property, an attribute, etc.
[0079] In other embodiments, the command associated with a command ID may not be associated with a parameter. Accordingly, in such embodiments, no parameter may be entered in 530.
[0080] In 540, the computing device can receive a status variable associated with the instructions to execute the command. In some embodiments, the user can enter a name of the status variable using the indication of the execute command process displayed in the dialog box.
[0081] In 550, the computing device can execute the command. In some embodiments, the computing device can execute the command based on instructions from the user to compile and/or run the workflow. The computing device can run the workflow using the order of operations, utilities, and/or processes (including the execute command process) in the workflow to perform one or more iterations of an automated test.
[0082] When the computing device runs the execute command process, the computing device can retrieve the parameter specified by the user, use the parameter to execute the command, and output a status of the command using the status variable.
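A minimal sketch of the execute command process is shown below, assuming a hypothetical dispatch table and command body; the status of the execution is written to the named status variable, as described above.

```python
# Sketch of the Figure 5 execute-command process: the workflow item carries a
# command ID, an optional parameter, and the name of a status variable that
# receives the outcome. The dispatcher and command bodies are hypothetical.

def export_map(parameter):
    # Stand-in command body; a real command would act on project data.
    return f"exported {parameter}"

DISPATCH = {"ExportMap": export_map}

def execute_command(command_id, parameter, status_variable, variables):
    # 550: retrieve the parameter, execute the command, and write the
    # resulting status into the named status variable.
    command = DISPATCH.get(command_id)
    if command is None:
        variables[status_variable] = "failed: unknown command"
        return
    try:
        variables[status_variable] = "ok: " + command(parameter)
    except Exception as error:
        variables[status_variable] = f"failed: {error}"   # report any failure

workflow_variables = {}
execute_command("ExportMap", "TopReservoirMap", "export_status", workflow_variables)
print(workflow_variables["export_status"])
```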
[0083] Figure 6 illustrates an example of a method for simulating user interface interactions, according to an embodiment. In some embodiments, the example method illustrated in Figure 6 can be performed using a computing device that includes the framework (e.g., framework 170) and the management components (e.g., management components 110) described above with reference to Figure 1.
[0084] The example method can begin in 600, when the computing device displays a list of available processes for use in a workflow. In some embodiments, the computing device can display the list of available processes based on instructions from a user to create a new workflow and/or based on instructions from a user to view the list of available processes. In some implementations, the list of available processes can be displayed in a window of a dialog box for a workflow editor.
[0085] In 610, the computing device can select a UI object based on received instructions. For example, a user can add a select object process to a workflow (e.g., via the window of the dialog box for the workflow editor). In some embodiments, the computing device can display an indication of the select object process in a dialog box of the software system as an item in a workflow. In further embodiments, the indication of the select object process can include an input box for the object to be selected and a text input box for a status variable of the object to be selected.
[0086] In some embodiments, the object can be a seismic object such as, for example, lines, polylines, intersections, seismic cubes, 3D volumes, attributes, horizon and fault interpretations, geoprobes, geobodies, fault models, etc. In further embodiments, the object can be a general object, such as, for example, 3D grid properties, annotations, cross plots, control points, filters, functions, intersections, targets, histograms, grids, points, polygons, surfaces, logs, plans, etc.
[0087] In some implementations, the user can enter a name of the status variable using the indication of the select object process displayed in the dialog box. In further implementations, a status of the object can be output to the status variable after running the select object process.
[0088] In 620, the computing device can activate a pane based on received instructions. For example, a user can add an activate pane process to a workflow (e.g., via the window of the dialog box for the workflow editor). In some embodiments, the computing device can display an indication of the activate pane process in a dialog box of the software system as an item in a workflow. In further embodiments, the indication of the activate pane process can include an input text box for a name of the pane to activate.
[0089] In some embodiments, the pane can be a UI element such as a window or screen that can include, for example, data or controls. For example, an activate pane process can be used to display a pane that lists the results from one or more tests, and/or the data in the pane can be used as input for other tests/processes.
[0090] In 630, the computing device can send keyboard strokes based on received instructions. For example, a user can add a send keyboard strokes process to a workflow (e.g., via the window of the dialog box for the workflow editor). In some embodiments, the computing device can display an indication of the send keyboard strokes process in a dialog box of the software system as an item in a workflow. In further embodiments, the indication of the send keyboard strokes process can include an input box for a keyboard symbol of the keyboard stroke to send, a numerical input box for a delay timer (e.g., in seconds), and a checkbox for indicating whether to wait for a keyboard process command to finish before proceeding.
[0091] In some embodiments, the keyboard stroke can be used to simulate a user interacting with the UI. For example, to simulate a user entering values into multiple text boxes, a keyboard stroke for the TAB key can be used to move the cursor to the next text box in sequence, and the keyboard stroke can be delayed one second to ensure that the cursor has switched to the next text box before entering a value.
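The send keyboard strokes process can be sketched as follows; send_key is a stub standing in for a platform input-injection call (the actual mechanism is not specified in this disclosure), and the per-stroke delay keeps the target UI in sync, as in the TAB example above.

```python
# Sketch of a send-keystrokes step with a per-stroke delay, as in 630.
# send_key is a stub; a real implementation would call the platform's
# input-injection API.

import time

def send_key(symbol):
    print(f"(stub) key sent: {symbol}")   # placeholder for a real key event

def send_keyboard_strokes(symbols, delay_seconds=1.0, wait_for_finish=True):
    for symbol in symbols:
        send_key(symbol)
        # Delay so the target UI has time to react (e.g., focus moves on TAB)
        time.sleep(delay_seconds)
    if wait_for_finish:
        print("(stub) waiting for the keyboard process to finish")

# Fill two text boxes: type a value, TAB to the next box, type the next value.
send_keyboard_strokes(["1", "2", "TAB", "3", "4"], delay_seconds=0.1)
```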
[0092] In 640, the computing device can simulate a mouse event based on received instructions. For example, a user can add a simulate mouse event process to a workflow (e.g., via the window of the dialog box for the workflow editor). In some embodiments, the computing device can display an indication of the simulate mouse event process in a dialog box of the software system as an item in a workflow. In further embodiments, the indication of the simulate mouse event process can include an input box for an X coordinate and for a Y coordinate of the mouse event. Additionally, the indication of the simulate mouse event process can include an input box for a drag stop X coordinate and for a drag stop Y coordinate of the mouse event.
[0093] In some embodiments, the mouse event can be used to simulate a mouse click at the specified X and Y coordinates (e.g., in pixels). Accordingly, a user can fill in an X coordinate and a Y coordinate of the mouse event and may not fill in a drag stop X coordinate or a drag stop Y coordinate of the mouse event. In further embodiments, the mouse event can be used to simulate a mouse drag from a first position to a second position. Accordingly, a user can fill in an X coordinate and a Y coordinate of the mouse event (i.e., the drag start position) and fill in a drag stop X coordinate and a drag stop Y coordinate of the mouse event (i.e., the drag stop position).
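A sketch of the simulate mouse event process is shown below; the click and drag functions are stubs for platform mouse APIs, and the click-versus-drag decision follows the coordinate convention described above.

```python
# Sketch of the mouse-event step from 640: a click when only the start
# coordinates are given, a drag when stop coordinates are also given.
# The event functions are stubs standing in for platform mouse APIs.

def mouse_click(x, y):
    print(f"(stub) click at ({x}, {y})")

def mouse_drag(x, y, stop_x, stop_y):
    print(f"(stub) drag from ({x}, {y}) to ({stop_x}, {stop_y})")

def simulate_mouse_event(x, y, drag_stop_x=None, drag_stop_y=None):
    if drag_stop_x is None or drag_stop_y is None:
        mouse_click(x, y)                                  # plain click
    else:
        mouse_drag(x, y, drag_stop_x, drag_stop_y)         # click-and-drag

simulate_mouse_event(120, 340)                  # click
simulate_mouse_event(120, 340, 480, 340)        # horizontal drag
```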
[0094] In 650, the computing device can get a screen resolution based on received instructions. For example, a user can add a get screen resolution process to a workflow (e.g., via the window of the dialog box for the workflow editor). In some embodiments, the computing device can display an indication of the get screen resolution process in a dialog box of the software system as an item in a workflow. In further embodiments, the indication of the get screen resolution process can include a text input box for a width variable and a text input box for a height variable of the screen resolution.
[0095] In some embodiments, the get screen resolution process can be used to obtain the screen resolution of the user's device (e.g., in pixels). The determined height and width can be output as the selected height and width variables. Accordingly, a user can use the screen resolution to determine pixel positions of a mouse event (e.g., in 640).
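The get screen resolution process and its use for computing pixel positions for a later mouse event can be sketched as follows; the stubbed resolution values and variable names are illustrative only.

```python
# Sketch of 650: read the screen resolution into width/height variables and
# use it to convert fractional positions into pixel coordinates for 640.
# get_screen_resolution is a stub; a real implementation would query the OS.

def get_screen_resolution():
    return 1920, 1080                       # stubbed width and height in pixels

def to_pixels(fraction_x, fraction_y, width, height):
    # e.g., (0.5, 0.5) is the center of the screen regardless of resolution
    return round(fraction_x * width), round(fraction_y * height)

variables = {}
variables["screen_width"], variables["screen_height"] = get_screen_resolution()
x, y = to_pixels(0.5, 0.25, variables["screen_width"], variables["screen_height"])
print(x, y)   # pixel position for a later simulated mouse event
```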
[0096] In 660, the computing device can check a dialog box status based on received instructions. For example, a user can add a check dialog status process to a workflow (e.g., via the window of the dialog box for the workflow editor). In some embodiments, the computing device can display an indication of the check dialog status process in a dialog box of the software system as an item in a workflow. In further embodiments, the indication of the check dialog status process can include a text input box for a name of the dialog box, an input box for a status variable, and/or an input box for the dialog box object.
[0097] In some embodiments, the check dialog status process can be used to determine if a dialog box exists and/or if a dialog box is open when the check dialog status process is run. For example, the computing device can search for a dialog box associated with the name of the dialog box entered by a user and determine if the dialog box exists and/or is open. As an additional example, the computing device can determine if a dialog box, corresponding to the dialog box object that was dragged into the input box, exists and/or is open.
[0098] In some implementations, the user can enter a name of the status variable using the indication of the check dialog status process displayed in the dialog box. In further implementations, a status of the object can be output to the status variable after running the check dialog status process. Example statuses include: does not exist, exists but not open, and exists and open.
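A sketch of the check dialog status process is shown below; the sets of known and open dialog boxes are hypothetical, and the three example statuses match those listed above.

```python
# Sketch of 660: look up a dialog box by name and report one of the three
# example statuses. The known/open dialog sets are hypothetical.

KNOWN_DIALOGS = {"Make Surface", "Import File"}
OPEN_DIALOGS = {"Import File"}

def check_dialog_status(dialog_name, status_variable, variables):
    if dialog_name not in KNOWN_DIALOGS:
        status = "does not exist"
    elif dialog_name not in OPEN_DIALOGS:
        status = "exists but not open"
    else:
        status = "exists and open"
    variables[status_variable] = status      # output via the status variable
    return status

workflow_variables = {}
print(check_dialog_status("Make Surface", "dialog_status", workflow_variables))
```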
[0099] In 670, the computing device can interact with UI objects in a dialog box based on received instructions. For example, a user can add a dialog process to a workflow (e.g., via the window of the dialog box for the workflow editor). In some embodiments, the computing device can display an indication of the dialog process in a dialog box of the software system as an item in a workflow. In further embodiments, the indication of the dialog process can include a text input box for a name of the dialog box and a text input box for a name of the UI object. In some implementations, the indication of the dialog process can also include an input box for an action object, a checkbox to indicate whether the UI object name is an automation ID, an input box for an input/output variable, and an input box for a status variable.
[0100] In some embodiments, the dialog process can be used to interact with the specified UI object in the specified dialog box. Inputs and outputs of the UI object and/or the dialog box can be entered and returned, respectively, via the input/output variable. For example, a UI object can be a text box, and inputting of text can be simulated by entering in text from the input/output variable and a result of entering the text can be added to the input/output variable.
[0101] In some implementations, a status of the UI object and/or dialog box can be output to the status variable after running the dialog process.
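For illustration, the dialog process can be sketched as follows, with a dictionary standing in for a dialog box and its UI objects; data is passed in and out through the input/output variable and a status is reported, as described above. All names are hypothetical.

```python
# Sketch of 670: find a UI object inside a named dialog box and apply an
# action to it, passing data in and out through an input/output variable.
# The dialog model here is a plain dictionary and is purely illustrative.

DIALOGS = {
    "Make Surface": {
        "Name": {"type": "textbox", "value": ""},
        "OK":   {"type": "button"},
    },
}

def dialog_interact(dialog_name, object_name, action, io_variable, variables):
    dialog = DIALOGS.get(dialog_name)
    if dialog is None or object_name not in dialog:
        variables["dialog_status"] = "failed: object not found"
        return
    ui_object = dialog[object_name]
    if action == "set_text" and ui_object["type"] == "textbox":
        ui_object["value"] = variables.get(io_variable, "")   # input via variable
    elif action == "get_text" and ui_object["type"] == "textbox":
        variables[io_variable] = ui_object["value"]           # output via variable
    variables["dialog_status"] = "ok"

vars_ = {"surface_name": "Top Reservoir"}
dialog_interact("Make Surface", "Name", "set_text", "surface_name", vars_)
dialog_interact("Make Surface", "Name", "get_text", "echoed_name", vars_)
print(vars_["echoed_name"], vars_["dialog_status"])
```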
[0102] In 680, the computing device can interact with tree items based on received instructions. For example, a user can add a tree process to a workflow (e.g., via the window of the dialog box for the workflow editor). In some embodiments, the computing device can display an indication of the tree process in a dialog box of the software system as an item in a workflow. In further embodiments, the indication of the tree process can include a text input box for a name of a pane and/or tree and checkboxes or radio buttons to indicate whether the input text is a pane name and/or an automation ID. In some implementations, the indication of the tree process can also include an input box for a tree item object, an action object, an input box for an output variable, and an input box for a status variable.
[0103] In some embodiments, the tree process can be used to select tree items for use. For example, tree items can be input data, models, results, cases, templates, workflows, or processes that are represented in a tree structure in the UI. As an example, the tree process can be used to find a tree based on the pane/tree name (e.g., Workflows), select a tree item (e.g., Variable A), and perform an action (e.g., double click).
[0104] In some implementations, a status of the tree and/or tree item can be output to the status variable after running the tree process. In further implementations, an output from performing the action can be added to the output variable for output to the user, use by other commands, etc.
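A sketch of the tree process is shown below; the tree contents, pane name, and action are hypothetical, and the output and status variables are populated as described above.

```python
# Sketch of 680: locate a tree by pane name, walk a path of tree items, and
# perform an action on the final item. The tree contents are hypothetical.

TREES = {
    "Workflows": {
        "My Workflow": {"Variable A": 42, "Variable B": 7},
    },
}

def tree_interact(pane_name, item_path, action, variables,
                  output_variable="tree_output", status_variable="tree_status"):
    node = TREES.get(pane_name)
    if node is None:
        variables[status_variable] = "failed: pane not found"
        return
    for item in item_path:                     # descend the tree item by item
        if not isinstance(node, dict) or item not in node:
            variables[status_variable] = "failed: item not found"
            return
        node = node[item]
    if action == "double_click":
        variables[output_variable] = node      # e.g., open/read the leaf value
    variables[status_variable] = "ok"

vars_ = {}
tree_interact("Workflows", ["My Workflow", "Variable A"], "double_click", vars_)
print(vars_["tree_output"], vars_["tree_status"])
```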
[0105] In 690, the computing device can show a message based on received instructions. For example, a user can add a show message process to a workflow (e.g., via the window of the dialog box for the workflow editor). In some embodiments, the computing device can display an indication of the show message process in a dialog box of the software system as an item in a workflow. In further embodiments, the indication of the show message process can include a text input box for a message caption, a text input box for a message, and a numerical input box for a duration.
[0106] In some embodiments, the show message process can be used to display a message (e.g., in a separate dialog box) for a specific duration. For example, a message can be displayed with the input caption and the specified message, and the message can timeout after the indicated duration (e.g., in seconds).
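The show message process can be sketched as follows; printing stands in for the separate dialog box, and the duration controls how long the message remains visible.

```python
# Sketch of 690: display a captioned message that times out after a given
# duration. Printing stands in for the separate dialog box described above.

import time

def show_message(caption, message, duration_seconds):
    print(f"[{caption}] {message}")
    time.sleep(duration_seconds)      # keep the message visible for the duration
    print(f"[{caption}] (message closed after {duration_seconds} s)")

show_message("Iteration complete", "Results written to the output pane.", 0.5)
```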
[0107] Figure 7 illustrates an example of a user interface for selecting workflow processes, according to an embodiment. As shown in Figure 7, UI 700 can be a pane that includes a tree of automation processes. A computing device can display the tree when, for example, a user instructs the computing device to create a new workflow and/or the user instructs the computing device to display workflow processes from a selection that includes workflow utilities, workflow operations, and workflow processes.
[0108] In some embodiments, UI 700 can show different command manager testing processes that can be selected, including get command status, print all command IDs, and execute command.
[0109] In other embodiments, UI 700 can show different UI interaction testing processes that can be selected, including select object, activate pane, send keys (e.g., send keyboard strokes, discussed above), mouse event (e.g., simulate mouse event, discussed above), screen resolution (e.g., get screen resolution, discussed above), find dialog (e.g., check dialog box status, discussed above), dialog (e.g., interact with UI objects in a dialog box, discussed above), tree (e.g., interact with tree items, discussed above), and show message.
[0110] In further embodiments, UI 700 can show: a players testing process, time player; different performance testing processes, create frame per second (FPS) counter and get FPS measurement; different service testing processes, open reference project and close reference project; different data generation testing processes, create seismic cube and create well; a meta data testing process, meta data; etc.
[0111] Figure 8 illustrates example dialog boxes for simulating user interface interactions, according to an embodiment. As shown in Figure 8, a dialog box 800 can be used to interact with UI objects in a selected dialog box for testing. In some embodiments, a user can enter parameters, such as a dialog box name, an action, an object name or automation ID, and/or an input/output variable using the indication of the dialog process in the workflow editor, as discussed above. In other embodiments, the user can double click on the indication, and the computing device can display dialog box 800 for entering parameters.
[0112] Using dialog box 800, the user can enter the dialog box name, select an action from a drop down list (shown in dialog box 810), input an object name, indicate whether the object name is an automation ID, etc. In some embodiments, the user can use the object inspector button to create an object inspector dialog box that can be used to find objects, dialog box names, automation IDs, etc. For example, the object inspector button can invoke a user interface inspection tool, such as the Inspect WINDOWS®-based tool provided by the Microsoft Corporation.
[0113] Figure 9 illustrates example dialog boxes for testing tree items, according to an embodiment. As shown in Figure 9, a dialog box 900 can be used to interact with tree items for testing. In some embodiments, a user can enter parameters, such as a tree or pane name, an indication of whether the name is a pane name or an automation ID, input tree items, and an action using the indication of the tree process in the workflow editor, as discussed above. In other embodiments, the user can double click on the indication, and the computing device can display dialog box 900 for entering parameters.
[0114] Using dialog box 900, the user can enter a tree or pane name, indicate whether the name is a pane name or an automation ID (using the checkboxes), input tree items, select an action from a drop down list (shown in dialog box 910), etc. In some embodiments, the user can use the object inspector button to create an object inspector dialog box that can be used to find trees, panes, pane names, automation IDs, etc.
[0115] In some embodiments, the methods of the present disclosure may be executed by a computing system. Figure 10 illustrates an example of such a computing system 1000, in accordance with some embodiments. The computing system 1000 may include a computer or computer system 1001-1, which may be an individual computer system 1001-1 or an arrangement of distributed computer systems. The computer system 1001-1 includes one or more analysis modules 1002 that are configured to perform various tasks according to some embodiments, such as one or more methods disclosed herein. To perform these various tasks, the analysis module 1002 executes independently, or in coordination with, one or more processors 1004, which is (or are) connected to one or more storage media 1006. The processor(s) 1004 is (or are) also connected to a network interface 1007 to allow the computer system 1001-1 to communicate over a data network 1009 with one or more additional computer systems and/or computing systems, such as 1001-2, 1001-3, and/or 1001-4 (note that computer systems 1001-2, 1001-3, and/or 1001-4 may or may not share the same architecture as computer system 1001-1, and may be located in different physical locations, e.g., computer systems 1001-1 and 1001-2 may be located in a processing facility, while in communication with one or more computer systems such as 1001-3 and/or 1001-4 that are located in one or more data centers, and/or located in varying countries on different continents).
[0116] A processor may include a microprocessor, microcontroller, processor module or subsystem, programmable integrated circuit, programmable gate array, or another control or computing device.
[0117] The storage media 1006 may be implemented as one or more computer-readable or machine-readable storage media. Note that while in the example embodiment of Figure 10 storage media 1006 is depicted as within computer system 1001-1, in some embodiments, storage media 1006 may be distributed within and/or across multiple internal and/or external enclosures of computing system 1001-1 and/or additional computing systems. Storage media 1006 may include one or more different forms of memory including semiconductor memory devices such as dynamic or static random access memories (DRAMs or SRAMs), erasable and programmable read-only memories (EPROMs), electrically erasable and programmable read-only memories (EEPROMs) and flash memories, magnetic disks such as fixed, floppy and removable disks, other magnetic media including tape, optical media such as compact disks (CDs) or digital video disks (DVDs), BLURAY® disks, or other types of optical storage, or other types of storage devices. Note that the instructions discussed above may be provided on one computer-readable or machine-readable storage medium, or may be provided on multiple computer-readable or machine-readable storage media distributed in a large system having possibly plural nodes. Such computer-readable or machine-readable storage medium or media is (are) considered to be part of an article (or article of manufacture). An article or article of manufacture may refer to any manufactured single component or multiple components. The storage medium or media may be located either in the machine running the machine-readable instructions, or located at a remote site from which machine-readable instructions may be downloaded over a network for execution.
[0118] In some embodiments, computing system 1000 contains automation module(s) 1008 for creating workflows, displaying available commands, getting command statuses, executing commands, simulating user interface interactions, generating user interfaces, generating dialog boxes, etc. In the example of computing system 1000, computer system 1001-1 includes the automation module 1008. In some embodiments, an automation module may be used to perform aspects of one or more embodiments of the methods disclosed herein. In alternate embodiments, a plurality of automation modules may be used to perform aspects of methods disclosed herein.
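As a minimal sketch of how such an automation module might run a workflow that combines command execution and simulated user interface interactions over multiple iterations, consider the following; the step functions below are placeholders rather than the module's actual implementation.

```python
# Placeholder step implementations standing in for the automation module's
# real command-execution and UI-simulation routines.
def execute_command_step(command_id, **params):
    print(f"execute command {command_id} {params}")


def simulate_ui_step(process, **params):
    print(f"simulate UI interaction {process} {params}")


def run_workflow(steps, iterations=1):
    # Run every step of the workflow once per iteration, in order.
    for i in range(iterations):
        print(f"--- iteration {i + 1} ---")
        for step in steps:
            step()


workflow = [
    lambda: execute_command_step("Project.CreateWell", name="Well-1"),
    lambda: simulate_ui_step("send_keys", keys="Well-1{ENTER}"),
]
run_workflow(workflow, iterations=2)
```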
[0119] It should be appreciated that computing system 1000 is one example of a computing system, and that computing system 1000 may have more or fewer components than shown, may combine additional components not depicted in the example embodiment of Figure 10, and/or computing system 1000 may have a different configuration or arrangement of the components depicted in Figure 10. The various components shown in Figure 10 may be implemented in hardware, software, or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.
[0120] Further, the steps in the processing methods described herein may be implemented by running one or more functional modules in information processing apparatus such as general purpose processors or application specific chips, such as ASICs, FPGAs, PLDs, or other appropriate devices. These modules, combinations of these modules, and/or their combination with general hardware are included within the scope of protection of the disclosure.
[0121] Geologic interpretations, models, and/or other interpretation aids may be refined in an iterative fashion; this concept is applicable to the methods discussed herein. This may include use of feedback loops executed on an algorithmic basis, such as at a computing device (e.g., computing system 1000, Figure 10), and/or through manual control by a user who may make determinations regarding whether a given step, action, template, model, or set of curves has become sufficiently accurate for the evaluation of the subsurface three-dimensional geologic formation under consideration.
[0122] The foregoing description, for purposes of explanation, has been provided with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or limited to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. Moreover, the order in which the elements of the methods described herein are illustrated and described may be re-arranged, and/or two or more elements may occur simultaneously. The embodiments were chosen and described in order to explain principles of the disclosure and practical applications, to thereby enable others skilled in the art to utilize the disclosure and various embodiments with various modifications as are suited to the particular use contemplated.

Claims

Claims What is claimed is:
1. A method, comprising:
creating a workflow for an automated test;
receiving instructions to get a status of a first command comprising a first command identifier;
determining whether the first command is available using the first command identifier;
determining whether prerequisites for the first command are met using the first command identifier;
receiving instructions to execute a second command comprising a second command identifier;
receiving instructions to simulate a user interface interaction; and
running the workflow, wherein running the workflow comprises executing the second command associated with the second command identifier and simulating the user interface interaction.
2. The method of claim 1, further comprising generating results based on running the workflow, wherein the results comprise one or more of a three-dimensional model, a graph, or a numerical result.
3. The method of claim 1, wherein:
running the workflow comprises running a plurality of iterations of the workflow; and
each iteration comprises executing the second command and simulating the user interface interaction.
4. The method of claim 1, wherein the list of available commands comprises commands that are not pre-programmed for automation.
5. The method of claim 1, further comprising:
receiving instructions to display available commands; and
displaying a list of available commands.
6. The method of claim 5, wherein:
the method is performed using a software system; and
receiving the instructions to display the available commands, displaying the list of available commands, receiving the instructions to get the status of the first command comprising the first command identifier, determining whether the first command is available using the first command identifier, determining whether the prerequisites for the first command are met using the first command identifier, receiving the instructions to execute the second command comprising the second command identifier, and receiving the instructions to simulate the user interface interaction are performed using a plug-in application to the software system provided by a party that at least one of created the software system, distributed the software system, or maintains the software system.
7. The method of claim 1, wherein the instructions to execute the second command further comprise at least one of a parameter and a status variable.
8. The method of claim 1, further comprising displaying an indication that the first command is disabled based on at least one of determining that the prerequisites for the first command are not met or determining that the first command is not available.
9. The method of claim 1, further comprising displaying an indication that the first command is enabled based on determining that the prerequisites for the first command are met and determining that the first command is available.
10. The method of claim 1, wherein simulating the user interface interaction comprises one or more of selecting a user interface object, activating a pane, sending keyboard strokes, simulating a mouse event, getting a screen resolution, checking a dialog box status, interacting with a user interface object, interacting with a tree item, or showing a message.
11. The method of claim 10, wherein receiving the instructions to simulate the user interface interaction comprises receiving one or more parameters via a dialog box.
12. A computing system comprising:
one or more processors; and
a memory system comprising one or more non-transitory, computer-readable media storing instructions that, when executed by at least one of the one or more processors, cause the computing system to perform operations, the operations comprising:
creating a workflow for an automated test;
receiving instructions to get a status of a first command comprising a first command identifier;
determining whether the first command is available using the first command identifier;
determining whether prerequisites for the first command are met using the first command identifier;
receiving instructions to execute a second command comprising a second command identifier;
receiving instructions to simulate a user interface interaction; and
running the workflow, wherein running the workflow comprises executing the second command associated with the second command identifier and simulating the user interface interaction.
13. The system of claim 12, wherein the operations further comprise generating results based on running the workflow, wherein the results comprise one or more of a three-dimensional model, a graph, or a numerical result.
14. The system of claim 12, wherein:
running the workflow comprises running a plurality of iterations of the workflow; and
each iteration comprises executing the second command and simulating the user interface interaction.
15. The system of claim 12, wherein the list of available commands comprises commands that are not pre-programmed for automation.
16. The system of claim 12, wherein the operations further comprise:
receiving instructions to display available commands; and
displaying a list of available commands.
17. The system of claim 16, wherein:
the instructions comprise a software system; and
receiving the instructions to display the available commands, displaying the list of available commands, receiving the instructions to get the status of the first command comprising the first command identifier, determining whether the first command is available using the first command identifier, determining whether the prerequisites for the first command are met using the first command identifier, receiving the instructions to execute the second command comprising the second command identifier, and receiving the instructions to simulate the user interface interaction are performed using a plug-in application to the software system provided by a party that at least one of created the software system, distributed the software system, or maintains the software system.
18. The system of claim 12, wherein the operations further comprise displaying an indication that the first command is disabled based on at least one of determining that the prerequisites for the first command are not met or determining that the first command is not available.
19. The system of claim 12, wherein simulating the user interface interaction comprises one or more of selecting a user interface object, activating a pane, sending keyboard strokes, simulating a mouse event, getting a screen resolution, checking a dialog box status, interacting with a user interface object, interacting with a tree item, or showing a message.
20. A non-transitory, computer-readable medium storing instructions that, when executed by one or more processors of a computing system, cause the computing system to perform operations, the operations comprising:
creating a workflow for an automated test;
receiving instructions to get a status of a first command comprising a first command identifier;
determining whether the first command is available using the first command identifier;
determining whether prerequisites for the first command are met using the first command identifier;
receiving instructions to execute a second command comprising a second command identifier;
receiving instructions to simulate a user interface interaction; and
running the workflow, wherein running the workflow comprises executing the second command associated with the second command identifier and simulating the user interface interaction.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2016/044154 WO2018022030A1 (en) 2016-07-27 2016-07-27 System automation tools

Publications (1)

Publication Number Publication Date
WO2018022030A1 true WO2018022030A1 (en) 2018-02-01

Family

ID=61017133

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/044154 WO2018022030A1 (en) 2016-07-27 2016-07-27 System automation tools

Country Status (1)

Country Link
WO (1) WO2018022030A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110275628A (en) * 2019-06-26 2019-09-24 西南民族大学 A kind of full-automatic mouse action device of electromechanical based on machine vision

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008083343A1 (en) * 2006-12-29 2008-07-10 Schlumberger Canada Limited Oilfield management system and method
US20150149142A1 (en) * 2013-11-25 2015-05-28 Schlumberger Technology Corporation Geologic feature splitting
WO2015156792A1 (en) * 2014-04-09 2015-10-15 Landmark Graphics Corporation Parameter measurement refinement in oil exploration operations
US9175547B2 (en) * 2007-06-05 2015-11-03 Schlumberger Technology Corporation System and method for performing oilfield production operations
WO2016093794A1 (en) * 2014-12-08 2016-06-16 Landmark Graphics Corporation Defining non-linear petrofacies for a reservoir simulation model

Similar Documents

Publication Publication Date Title
WO2017206182A1 (en) Detecting events in well reports
US20200019882A1 (en) Systems and Methods for Generating, Deploying, Discovering, and Managing Machine Learning Model Packages
US11371333B2 (en) Visualizations of reservoir simulations with fracture networks
US8483852B2 (en) Representing geological objects specified through time in a spatial geology modeling framework
US9708897B2 (en) Oilfield application framework
US20180119523A1 (en) Oilfield Reservoir Saturation and Permeability Modeling
US11294095B2 (en) Reservoir simulations with fracture networks
US20200224531A1 (en) Well planning using geomechanics nudge
US11402540B2 (en) Coupled reservoir-geomechanical models using compaction tables
WO2017030725A1 (en) Reservoir simulations with fracture networks
US9542064B2 (en) Information pinning for contexual and task status awareness
WO2017206159A1 (en) Fracture network extraction by microseismic events clustering analysis
CN113874864A (en) Training machine learning system using hard constraints and soft constraints
WO2017206157A1 (en) Systems, methods, and computer readable media for enchanced simulation of drilling dynamics
WO2018022030A1 (en) System automation tools
US20220316320A1 (en) Event detection from pump data
US20200333505A1 (en) Pore Pressure Prediction
US20240094433A1 (en) Integrated autonomous operations for injection-production analysis and parameter selection
US20230376404A1 (en) Enriched automatic on-cloud integrated validations for client customizations
WO2024063797A1 (en) Automated machine learning fault modeling with grouping
EP4264554A1 (en) Predictive geological drawing system and method
CA3201448A1 (en) Multi-agent drilling decision system and method
WO2022026988A1 (en) Well correlation using global and local machine learning models

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16910705

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16910705

Country of ref document: EP

Kind code of ref document: A1