EP1247238A1 - Petroleum reservoir simulation and characterization system and method - Google Patents
Petroleum reservoir simulation and characterization system and method
- Publication number
- EP1247238A1 (application EP00970940A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- data
- reservoir
- applications
- characterization
- computational
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
- 238000000034 method Methods 0.000 title claims abstract description 73
- 238000012512 characterization method Methods 0.000 title claims abstract description 43
- 239000003208 petroleum Substances 0.000 title claims abstract description 29
- 238000004088 simulation Methods 0.000 title claims description 17
- 238000004458 analytical method Methods 0.000 claims abstract description 22
- 238000005457 optimization Methods 0.000 claims description 32
- 239000012530 fluid Substances 0.000 claims description 24
- 238000004422 calculation algorithm Methods 0.000 claims description 16
- 230000002085 persistent effect Effects 0.000 claims description 15
- 238000004891 communication Methods 0.000 claims description 12
- 238000012800 visualization Methods 0.000 claims description 10
- 230000006870 function Effects 0.000 claims description 9
- 238000012544 monitoring process Methods 0.000 claims description 9
- 230000008859 change Effects 0.000 claims description 7
- 230000002688 persistence Effects 0.000 claims description 7
- 238000006243 chemical reaction Methods 0.000 claims description 6
- 238000003491 array Methods 0.000 claims description 5
- 239000000126 substance Substances 0.000 claims description 5
- 238000003860 storage Methods 0.000 claims description 4
- 239000011435 rock Substances 0.000 claims description 3
- 238000010205 computational analysis Methods 0.000 claims 3
- 230000003292 diminished effect Effects 0.000 claims 1
- 238000004883 computer application Methods 0.000 abstract description 5
- XLYOFNOQVPJJNP-UHFFFAOYSA-N water Substances O XLYOFNOQVPJJNP-UHFFFAOYSA-N 0.000 abstract description 5
- 238000000605 extraction Methods 0.000 abstract description 3
- 230000002452 interceptive effect Effects 0.000 abstract description 3
- 230000000977 initiatory effect Effects 0.000 abstract description 2
- 230000008569 process Effects 0.000 description 16
- 238000013461 design Methods 0.000 description 15
- 239000008186 active pharmaceutical agent Substances 0.000 description 12
- 241000010972 Ballerus ballerus Species 0.000 description 11
- 238000012545 processing Methods 0.000 description 11
- 239000007789 gas Substances 0.000 description 9
- 238000004519 manufacturing process Methods 0.000 description 9
- 238000013515 script Methods 0.000 description 9
- 239000003921 oil Substances 0.000 description 8
- 238000011156 evaluation Methods 0.000 description 6
- 230000010354 integration Effects 0.000 description 5
- 238000007726 management method Methods 0.000 description 5
- 230000007246 mechanism Effects 0.000 description 4
- 230000008901 benefit Effects 0.000 description 3
- 238000005266 casting Methods 0.000 description 3
- 238000010276 construction Methods 0.000 description 3
- 238000011161 development Methods 0.000 description 3
- 230000000694 effects Effects 0.000 description 3
- 238000005516 engineering process Methods 0.000 description 3
- 238000002474 experimental method Methods 0.000 description 3
- 230000006872 improvement Effects 0.000 description 3
- 238000011835 investigation Methods 0.000 description 3
- 239000011159 matrix material Substances 0.000 description 3
- 230000032258 transport Effects 0.000 description 3
- 238000003339 best practice Methods 0.000 description 2
- 235000014510 cooky Nutrition 0.000 description 2
- 239000010779 crude oil Substances 0.000 description 2
- 238000007405 data analysis Methods 0.000 description 2
- 238000010586 diagram Methods 0.000 description 2
- 239000007788 liquid Substances 0.000 description 2
- VNWKTOKETHGBQD-UHFFFAOYSA-N methane Chemical compound C VNWKTOKETHGBQD-UHFFFAOYSA-N 0.000 description 2
- 238000011002 quantification Methods 0.000 description 2
- JJLJMEJHUUYSSY-UHFFFAOYSA-L Copper hydroxide Chemical compound [OH-].[OH-].[Cu+2] JJLJMEJHUUYSSY-UHFFFAOYSA-L 0.000 description 1
- 230000001133 acceleration Effects 0.000 description 1
- 238000013459 approach Methods 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 239000003086 colorant Substances 0.000 description 1
- 238000010835 comparative analysis Methods 0.000 description 1
- 230000000295 complement effect Effects 0.000 description 1
- 238000005094 computer simulation Methods 0.000 description 1
- 239000000470 constituent Substances 0.000 description 1
- 238000013523 data management Methods 0.000 description 1
- 238000013499 data model Methods 0.000 description 1
- 238000013500 data storage Methods 0.000 description 1
- 230000007423 decrease Effects 0.000 description 1
- 238000004090 dissolution Methods 0.000 description 1
- 238000012854 evaluation process Methods 0.000 description 1
- 238000001914 filtration Methods 0.000 description 1
- 230000002068 genetic effect Effects 0.000 description 1
- 230000008676 import Effects 0.000 description 1
- 229910052500 inorganic mineral Inorganic materials 0.000 description 1
- 230000001788 irregular Effects 0.000 description 1
- 238000012804 iterative process Methods 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 238000005259 measurement Methods 0.000 description 1
- 238000013508 migration Methods 0.000 description 1
- 230000005012 migration Effects 0.000 description 1
- 239000011707 mineral Substances 0.000 description 1
- 239000003345 natural gas Substances 0.000 description 1
- 230000006855 networking Effects 0.000 description 1
- 238000010606 normalization Methods 0.000 description 1
- 230000036961 partial effect Effects 0.000 description 1
- 238000005192 partition Methods 0.000 description 1
- 230000001575 pathological effect Effects 0.000 description 1
- 230000000704 physical effect Effects 0.000 description 1
- 238000002360 preparation method Methods 0.000 description 1
- -1 pressure decline Substances 0.000 description 1
- 230000000644 propagated effect Effects 0.000 description 1
- 238000005086 pumping Methods 0.000 description 1
- 238000009877 rendering Methods 0.000 description 1
- 239000000344 soap Substances 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
- 230000001360 synchronised effect Effects 0.000 description 1
- 230000036962 time dependent Effects 0.000 description 1
- 238000012546 transfer Methods 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
- 238000000844 transformation Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01V—GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
- G01V1/00—Seismology; Seismic or acoustic prospecting or detecting
- G01V1/28—Processing seismic data, e.g. for interpretation or for event detection
- G01V1/30—Analysis
Definitions
- the present invention relates to systems and methods for the evaluation and modeling of mineral reservoirs, and in particular to methods and systems for providing, in a single remotely-accessible computer, middleware and an interface for the efficient movement of disparate data and processing outputs among and between several software applications used for the comparative analysis and prediction regarding the characteristics over time of withdrawal of petroleum reservoir fluid (crude oil, gas, and water) using any or all among seismic, well logs, production data and other geological, geophysical, and petroleum engineering data and analyses that might be relevant.
- petroleum reservoir fluid crude oil, gas, and water
- the goals of evaluating reservoirs are manifold and begin with the earliest stages of speculative exploration activity (at a point when it is not necessarily known whether a geologic region or structure contains accessible petroleum in commercially marketable quantities), through the production lifetime of an identified reservoir (when it may be important, for example, to evaluate and/or vary the best sites for placing wells to tap the reservoir, or the optimal rate at which petroleum may be removed from a reservoir during ongoing pumping). Because companies in the petroleum industry invest very large sums of money in exploration, development, and exploitation of potential or known petroleum reservoirs, it is important that the evaluation and assessment of reservoir characteristics be accomplished with the most efficient and accurate use of a wide range of data regarding the reservoir.
- geologists, geophysicists, and petroleum engineers have developed numerous methodologies for assessing petroleum reservoirs (as to such parameters as total reserves, location of petroleum, pressure decline, water encroachment, gas dissolution, etc.). These methodologies have relied upon a wide range of software applications, to which are input data variables regarding the geological character of the reservoir and how these data variables vary over the time of production.
- seismic data i.e., data obtained by analysis of the characteristics of sound waves travelling through and reflected from underground geological structures.
- seismic analysis of sound waves caused to travel through a petroleum reservoir can be used to characterize that reservoir in terms of its heterogeneous constituent parts; e.g., the various solid, liquid, etc. regions or components of the reservoir and their respective locations within the reservoir can be mapped as they change over time.
- SeisRes may be used generally throughout herein to refer to data and processes for characterizing petroleum reservoirs (typically by seismic means), though it must be understood that input physical data other than, or in addition to strictly seismic data can be and is used in evaluating, analyzing, and forming characterizations of subsurface regions and petroleum reservoirs.
- the present invention is particularly useful for, but is not limited to, characterization techniques focusing in large part on seismic data; it can also manage and optimize characterization using other types of data inputs.
- Reservoir evaluation and characterization, generically including but not limited to that using seismic, non-seismic, and hybrid data analysis
- SeisRes OF The middleware and interface system we call the operating framework, henceforth abbreviated (OF).
- E&P Exploration & Production
- the plurality of computer applications may be designed for disparate operating systems, may apply different processing algorithms or analytic assumptions, and may use input data and/or supply processed output data in formats (e.g., data input/output formats using different measurement units, benchmarks, time frames, or terminology, sets of variables) that are not consistent with the respective data formats used by another computer analysis application being applied to the same reservoir.
- formats e.g., data input/output formats using different measurement units, benchmarks, time frames, or terminology, sets of variables
- the prior art has not contained a satisfactory system and method for creating and managing an automated workflow process to integrate many analytic computer applications and their handling, processing, and output of data relative to a reservoir in such fashion that: (a) data handling is largely automated; (b) disparate analytic applications are integrated or "wrapped" in a common user interface, which may be made remotely available; (c) workflow or processing hierarchies among the applications may be adjusted readily; (d) characterization data from the multiple applications is optimized, on an automated basis, to allow access to an accurate aggregate OF; and (e) the history of not only the data and the OF, but of the analytic workflow applied to reach such characterization, is stored and made available for immediate retrieval so that historical profiles of not only the approximated reservoir traits, but of the assumptions used to reach such approximation, are readily available and may be updated, rerun, and re-evaluated using different assumptions or analytic metrics.
- the invention disclosed and claimed herein is a system and method for managing and optimizing handling and analysis over a period of time of data relative to a characterization of the state, location, and quantity of fluids within a subterranean petroleum reservoir.
- the invention disclosed herein allows for seamless integration of a number (potentially a large number) of disparate computer analytical tools for performing complementary or overlapping analytic tasks on reservoir data or subsets of the data (including on data that originated as the intermediate output of another of the plurality of analytic applications). Additionally, conflicts (whether in data formatting or handling regimes, or in characterization-related conclusions) among the various analytic applications may be minimized by an iterative process of optimizing the data and outputs associated with each of the different applications.
- the present invention provides a networked operating framework ("OF") that sources, then integrates multi-vendor scientific, business and engineering applications and data sets.
- OF networked operating framework
- Our OF manages, versions, and coordinates execution of multiple applications. It handles the trafficking of data between applications, updates geospatially aligned earth and reservoir models, and pushes the outcomes through optimization loops.
- the field user interfaces with our platform and other members of the interdisciplinary asset team through a World Wide Web-based "dashboard."
- the clients have access to real-time versions of multiple projects allowing 24-hour-by-7-day processing by virtual teams using distributed resources.
- "Versioning" in this context refers to the software techniques for keeping track of, accounting for, and/or recording changes over time in the state of a set or sets of parameters, data, and data analysis outputs such that changes in the set or sets (or subsets thereof) can be traced longitudinally, reconstructed, and mapped for archival and analytic purposes.
- FIGURES
- Figure 1 provides a process overview of the four-dimensional (4D), i.e., time-dependent, processing loop for integration of analytical applications handling reservoir-related data, and for optimization of characterization results therefrom.
- 4D four-dimensional
- Figure 2 provides an illustrative system architecture framework for illustrating the layers of input/output, processing, and optimization of OF-related data.
- seismic and fluid flow models and data must be computed and recomputed until they converge
- One highly useful embodiment of our invention is illustrated in Figure 1 and consists of the following:
- A. 4D seismic workflow for non-linear inversion of two 3D seismic volumes acquired at different times during the production history of a field, and their time-depth conversion, normalization and differencing (as will be more fully understood in light of the disclosure of U.S. Patents No. 5,798,982 and 5,586,082, which are incorporated herein in full by reference) is used to compute seismic differences over time;
- This embodiment consists of C++ OF system code, and of scripts, wrappers, and implementation code.
- the architecture of the OF (Operating Framework) contains the following major components:
- MultiMesh, a meshing system from IBM that is topological, so that whatever the gridding requirements of an application are, we can quickly deliver them.
- a web User Interface Layer uses a model of serving whole web based applications (called products) to users as opposed to individual web pages.
- Zope uses an object database with a web object publisher that does its task. Best practice in the execution of science and engineering computations is to keep a notebook with a record of the experiment and its trials and errors (the modern version of the researcher's notebook).
- a "notebook” will necessarily have structure to it — broken down into sections describing the multiple tasks involved in the workflow and sub-investigations done along the way to complete the overall task.
- Such a network may include a private network, virtual private network, or a public network such as the Internet or world wide web.
- public network such as the Internet or world wide web.
- field users can have instantaneous, on-demand, and persistent interactive access to the reservoir characterization functionality of the invention.
- a variety of known networking protocols may be used in connection with the user interface. For instance, techniques for establishing encrypted secure remote access to host servers (over the internet or intranets) are well-established and could readily be used to establish data connection for the user interface.
- the latter will be formed for the duration of a specific project and will build on the unique strengths of the user's physical location worldwide.
- the Active Notebook is an integrated set of web pages generated by a web application server, Zope, that monitors and records OF tasks as they are being performed by the user. Past work in OF can be reactivated and investigations can be renewed using the notebook.
- the client supports computation orchestrated through the web browser.
- the Active Notebook is based on using VTK as a tclet plugin.
- the browser has an interface to OF repository objects directly within the browser's scripting environments. These include, but are not limited to, tcl, Javascript, or JPython.
- the scripting environment in the browser has access to the browser's document object model (DOM). One such access is Netscape's LiveWire.
- DOM document object model
- the "thin" client runs in trusted mode when executing the visualization as well as when interfacing to the repository and event objects, since OF will be used in an intranet situation, initially.
- XML is used to transport structured data between the browser and the OF server.
- the browser makes use of embedded viewers that parse XML for viewing or parse the XML directly in their scripting.
- Embedded viewers can be script based (i.e. tclet), Java based, or pre-built as plugins.
- the client notebook supports an interface to authoring and status. We use the Zope authoring interface for this.
- 1.1.2 Notebook Server
- the server tracks the workflow progress of a user and dynamically constructs new web content including client-side scripts based on user initiative, OF objects, and metadata about the state of workflow of the on-going OF experiment. Changes in state in the client are tracked with forms submission (http POST), cookies, and direct plugin communication with the server (for example, the ability for a tclet plugin to do http POST).
- the ultimate store of persistent data is the OF metadata store (in Zope and the OF data repository) — cookies are
- Zope has a persistent object system so these
- the main functionality of the server-side scripting is to construct client-side scripts, for example a VTK pipeline network or a user interface on the not-so-thin client.
- Client-side Scripting and Applet/Plugins access Events and Repository objects directly using the Event and Repository servers. They communicate with Zope metadata using http POST and indirectly with server side scripting.
- the Optimization Tool Kit is designed to be a set of tools that can be deployed at any time and any place within the OF to provide parameter estimation services. It is implemented as a loosely-coupled component because the need for parameter estimation varies from application to application.
- the principle underlying this choice of design is that it allows a selection of options, including hybrid options combining algorithms from different categories, to produce the most appropriate procedure.
- the technical goal is to quickly implement sub-optimization loops to facilitate the entire optimization process for the seismic reservoir simulation.
- the optimizer consists of three components: optimization solvers, forward simulation wrappers, and simulation data converters.
- the forward simulation solver and simulation data converters are developed separately for reservoir property characterizer, reservoir simulator, petrophysical property characterizer, and 3D finite-difference simulator.
- the Optimization Laboratory is implemented according to the workflow illustrated below.
- the wrappers are developed to aim for the smooth execution of each individual sub-problem.
- the sub-problems are illustrated as different colors in the diagram. It is clear that each sub-problem involves one or more forward simulation processes that generate the predicted data from the optimization model parameters.
- Each sub-problem also involves solving an optimization problem.
- the optimization component consists of several optimization algorithms in the form of executable programs. Each of these optimization wrappers offers the following functionality:
- GENOCOP III: The GENOCOP III itself is often considered a heuristic, hybrid solution to constrained optimization problems.
- the GLS wrapper uses a defined data I/O format in conjunction with the forward simulation wrappers and simulation data converters.
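- For illustration only, the following C++ sketch shows one way the three optimizer components described above (an optimization solver, a forward simulation wrapper, and a simulation data converter) could be composed around a least-squares misfit; the class and method names are hypothetical and do not reproduce the actual OF code.

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

// Illustrative composition of the three optimizer components named above.
// All class and method names are hypothetical, not the actual OF code.

// Converts between optimizer parameter vectors and a simulator's native data.
class SimDataConverter {
public:
    virtual ~SimDataConverter() {}
    virtual void writeSimulatorInput(const std::vector<double>& params) = 0;
    virtual std::vector<double> readSimulatorOutput() = 0;
};

// Runs one forward simulation (reservoir, petrophysical, finite-difference, ...).
class ForwardSimWrapper {
public:
    virtual ~ForwardSimWrapper() {}
    virtual void run() = 0;
};

// Least-squares misfit between predicted and observed data; a solver such as
// GENOCOP III or GLS would minimize this by repeated forward simulations.
double evaluateMisfit(const std::vector<double>& params,
                      const std::vector<double>& observed,
                      SimDataConverter& conv, ForwardSimWrapper& sim)
{
    conv.writeSimulatorInput(params);          // model -> simulator format
    sim.run();                                 // forward simulation
    std::vector<double> predicted = conv.readSimulatorOutput();

    double sum = 0.0;
    for (std::size_t i = 0; i < predicted.size() && i < observed.size(); ++i) {
        double d = predicted[i] - observed[i];
        sum += d * d;
    }
    return sum;
}

// Trivial stand-ins so the sketch runs; a real wrapper would launch executables.
class DummyConverter : public SimDataConverter {
public:
    void writeSimulatorInput(const std::vector<double>& params) { last_ = params; }
    std::vector<double> readSimulatorOutput() { return last_; }  // echoes input
private:
    std::vector<double> last_;
};
class DummySim : public ForwardSimWrapper {
public:
    void run() {}
};

int main()
{
    std::vector<double> params(3, 1.0), observed(3, 2.0);
    DummyConverter conv;
    DummySim sim;
    std::printf("misfit = %g\n", evaluateMisfit(params, observed, conv, sim));
    return 0;
}
```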
- the OF 3D Data Viewer has been designed to 1) display a variety of geoscience data types registered in real-world coordinates in a common scene on the Web, 2) use state-of-the-art rendering methods, 3) run on all popular workstations, and 4) be easily extendable by other developers.
- the prototype displays seismic binned data (stack, migrated, acoustic impedance volume, etc.), surfaces and well logs. It is integrated into the OF Data Repository.
- the reservoir is characterized by multiple sequential seismic surveys; seismic attribute volumes; many well logs of different types and vintages; geostatistically-derived data volumes on regular and stratigraphic grids; fluid saturation volumes; four-dimensional fluid-flow maps; fluid-interfaces, horizon, and fault surfaces and possibly other data types. Being able to view all these data — spatially registered with respect to one another in the local real-world coordinate system and rendered in a variety of modes so that the interrelationships can be perceived — is a great help and may be a necessity for understanding spatially complex reservoirs over time.
- the OF uses the Visualization Toolkit (vtk). Vtk is freeware; its source code is available to anyone via internet download.
- the SDV has been developed to the point of visualizing binned seismic data (stacks, migrated volumes, attribute volumes, etc.), well logs and a computer-graphics ASCII file format known as the BYU format. Seismic data can be converted from SEGY and logs can be converted from one of the SigmaView formats.
- the vtk and all the SDV code are portable to Windows. The development has been done on both SGI and Sun workstations with no problems other than some makefile and environment variable peculiarities. Many people use vtk on NT or Linux on PCs. Because the top-level code is written in Tcl/Tk, it can be invoked from a web browser.
- the SDV is designed as a central framework and data-specific pipelines.
- a pipeline is a concept inherent in vtk. All data is processed by a pipeline consisting of the serial connection of a reader or source object to import the data in its native form, various filters to convert it into graphical form, a mapper to generate the graphics primitives, an actor to associate 3D transformations, colors, lights and other graphics properties with the data, and a renderer to draw it all.
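- As an illustration of this serial connection, a minimal C++ pipeline in the classic (pre-VTK-5) style is sketched below; the input file name and iso-surface value are hypothetical.

```cpp
#include "vtkStructuredPointsReader.h"
#include "vtkContourFilter.h"
#include "vtkPolyDataMapper.h"
#include "vtkActor.h"
#include "vtkRenderer.h"
#include "vtkRenderWindow.h"
#include "vtkRenderWindowInteractor.h"

int main()
{
    // Reader/source: import the data in its native form (file name hypothetical).
    vtkStructuredPointsReader *reader = vtkStructuredPointsReader::New();
    reader->SetFileName("impedance_volume.vtk");

    // Filter: convert the volume into graphical form (an iso-surface here).
    vtkContourFilter *contour = vtkContourFilter::New();
    contour->SetInput(reader->GetOutput());   // classic SetInput/GetOutput connection
    contour->SetValue(0, 1500.0);             // illustrative iso-value

    // Mapper: generate graphics primitives from the filter output.
    vtkPolyDataMapper *mapper = vtkPolyDataMapper::New();
    mapper->SetInput(contour->GetOutput());

    // Actor: associate 3D transformations, colors and other properties.
    vtkActor *actor = vtkActor::New();
    actor->SetMapper(mapper);

    // Renderer and window: draw it all.
    vtkRenderer *ren = vtkRenderer::New();
    ren->AddActor(actor);
    vtkRenderWindow *renWin = vtkRenderWindow::New();
    renWin->AddRenderer(ren);
    vtkRenderWindowInteractor *iren = vtkRenderWindowInteractor::New();
    iren->SetRenderWindow(renWin);

    renWin->Render();
    iren->Start();

    // Manual reference counting in classic VTK style.
    iren->Delete(); renWin->Delete(); ren->Delete();
    actor->Delete(); mapper->Delete(); contour->Delete(); reader->Delete();
    return 0;
}
```

- In the SDV the top-level pipeline assembly is done in Tcl/Tk rather than C++, but the stages are the same.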
- the framework can operate with any one or more of the pipelines, and pipelines can be developed without access to the framework source code.
- the center of attention in the SDV is a single viewing window enclosed in a Tcl/Tk top-level window. All 3D objects are displayed in real-world coordinates here.
- the main window has a typical menu bar across the top.
- the File, Edit and View buttons were placed on the menu bar by the SDV framework.
- the BYU button was placed on the menu bar by the BYU pipeline. Since the GUI is written in Tcl script it is easy for pipelines to add objects to it without modifying the framework code.
- the second main element of the framework GUI is a graphical data object tree.
- This window shows the objects that are loaded into the SDV.
- the SDV organizes data objects in a tree hierarchy. At the top level is the single instance of the SDV.
- Second in the hierarchy is a project.
- the data objects are grouped by data type.
- the tree structure below the data-type level is determined by the data-specific pipeline.
- the hierarchy of the well logs and seismic views are different.
- the right-hand frame of this window is available for displaying information or GUI widgets associated with a single selected component of the tree. At present, the framework only prints the name of the object. A variety of data regarding various components can be associated with this selection.
- the framework consists of the viewing window, a graphical data-object tree and the GUI widgets common to all data types. It is not modified by any of the data-specific developers. In fact, only the C++ header files, the shared libraries, and the main tcl script are needed for developing new features. A new pipeline is added by adding one or a few lines to the .sdv_resource file, and informing the operating system where to find the tcl scripts and libraries containing the new pipeline code.
- Vtk is distributed in source-code form so that it can be built on most common computers: most Unixes including Sun, SGI, HP, and AIX, Linux and Windows NT. It uses a hardware implementation of OpenGL if one is available on the host computer (Unix or NT), or software implementations of OpenGL, or a Windows-specific graphics language. It has some facilities for multiple graphics pipes such as are found in CAVE environments.
- Vtk is maintained by Kitware, Inc. and is distributed from an ftp server at Rensselaer Polytechnic Institute. It requires a C++ compiler to "make" an executable version. Many examples are provided to allow the user to see how 3D objects can be visualized and to illustrate how the various classes can be used.
- Vtk has APIs for Python and Java. We elected not to use Java since the Java version uses Java3D for its underlying graphics support. Java3D does not perform nearly as well as OpenGL at present. It may be an option in the future. Python is a much better-structured scripting language than is Tcl/Tk but is far less widely used. We have probably avoided many bugs by using Tcl/Tk. The decision to use Tcl/Tk needs to be reviewed periodically. A change from Tcl/Tk to Python would be straightforward. A change to Java would likely entail a complete recoding of the Tcl portion of the SDV framework: less than ten pages of code at present.
- the OF persistent storage package (pio) is an important piece of the OF software because geological, geophysical and other data must be persistent in the OF loop. This persistence requires that data objects can be restored to their original form at any time.
- the OF data object repository functions like an object database which stores and retrieves C++ objects.
- the storage for the OF data repository is the unix file system.
- a unix directory is a physical repository.
- a repository directory has to have two index files: one called the object index file and another called the repository index.
- the pio object repository manages two other managers: the object index manager and the sub-repository index manager.
- the object index manager is responsible for adding, removing, renaming, and retrieving object descriptors and ensuring that the index file is consistent and persistent.
- the repository index manager is responsible for adding, removing, and retrieving a repository index object and making sure the repository index file is consistent and persistent.
- the data repository client-server is implemented on top of the system socket layer. The TCP/IP protocol is used for communication; that is, a point-to-point connection is guaranteed for each client.
- the data repository client-server has two high-level interface classes pioClient and pioServer.
- the class pioClient is the interface class for all applications and pioServer class is the interface class to pioRepositoryManager, which implements all functionality of the pio server.
- the communication protocol is TCP/IP. Data transferred is either a fixed-size or a variable-size byte stream.
- TCP/IP Transmission Control Protocol/IP
- On the client side it creates a socket, binds the socket to the server address, then calls connect to make a point-to-point connection to the server.
- On the server side it creates a socket, binds the socket to the IP address of the host, then listens for the client connection. As a client request comes in, it calls accept to create a temporary socket for that particular client. Data will be received through the socket returned from the accept call.
- a client requests a service by sending a message.
- a request message consists of three parts: the first part is the request code, the second part is the client information which includes user name, machine name, process id, time stamp, and unique client id, and the third part is the parameters related to the request.
- An exemplary sequence of communication on the client side is:
- the acknowledge message will be an integer which tells the client whether the request was accepted.
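- The following sketch illustrates such a client-side sequence with plain BSD sockets, assuming a simple delimiter-separated message layout; the host, port, and layout shown are illustrative assumptions, not the actual pio wire format.

```cpp
#include <cstdio>
#include <cstring>
#include <string>
#include <unistd.h>
#include <netdb.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>

// Illustrative client-side sequence: create a socket, connect to the server,
// send a request (code, client information, parameters), read an integer ack.
int main()
{
    // 1. Create a socket and connect it to the repository server.
    int sock = socket(AF_INET, SOCK_STREAM, 0);
    if (sock < 0) { perror("socket"); return 1; }

    sockaddr_in addr;
    std::memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port   = htons(7070);                    // hypothetical port
    inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);  // hypothetical host

    if (connect(sock, (sockaddr*)&addr, sizeof(addr)) < 0) {
        perror("connect"); close(sock); return 1;
    }

    // 2. Send a request: request code, client information, then parameters.
    int requestCode = 1;                              // hypothetical "store object" code
    std::string clientInfo = "user=jdoe;host=wks01;pid=1234;ts=971456000;id=42";
    std::string params     = "project=fieldA;object=horizon_top";
    std::string msg = std::to_string(requestCode) + "|" + clientInfo + "|" + params;
    send(sock, msg.c_str(), msg.size(), 0);

    // 3. Receive the integer acknowledgment from the server.
    // A real client and server would agree on byte order (e.g. htonl/ntohl).
    int ack = 0;
    recv(sock, &ack, sizeof(ack), 0);
    std::printf("server acknowledged with %d\n", ack);

    close(sock);
    return 0;
}
```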
- the pioServer uses pioObjRepositoryManager to do all work requested by a client.
- the pioObjRepositoryManager uses two index managers to manage the directories and object files.
- pioRepositoryIndexManager manages directories
- pioObjIndexManager manages object files.
- the OF Data Repository has four high-level components: the Repository Manager, the Object Descriptor Manager, the Persistence I/O Handler, and the XDR Streamer.
- the repository stores objects with the assistance of the Object Descriptor Manager, which catalogs the objects stored in the repository.
- the Persistence I/O Handler is responsible for the construction and casting of objects. It uses the XDR Streamer to serialize objects to files in XDR format, which are then stored in the repository. Note that the repository is a set of files stored in NFS. This concept is analogous to the repository of a source code versioning system such as CVS or RCS. The difference is that we store machine-independent binary files (XDR format) representing serialized versions of objects.
- pioRepositoryManager
- a project repository is simply a Unix directory.
- the Repository Manager adds and removes objects into/out of the repository. It uses the Object Descriptor Manager to catalog the objects into the repository.
- Each project directory has one file, which contains a collection of descriptors (pioObjDescriptor) objects. These object descriptors have information about all objects stored in the repository.
- the Object Descriptor Manager class (pioObjDescriptorManager) adds, retrieves and deletes any object descriptor from the repository.
- each object is a single file in ASCII or XDR format.
- the Object Descriptor contains relevant information associated with the objects to be stored such as file format (ASCII or XDR), name, type, project name, id number, owner, time-stamp, object description and xdr version string.
- the Persistence I/O Handler is responsible for registration, construction, initialization, and proper casting of the stored objects.
- the Handler is a placeholder for the types that we want to store in the repository.
- the Handler uses XDR to serialize and write the objects into the repository. Reading the serialized objects from the repository is more complicated, because the Repository Manager does not know the type of the object it is going to read.
- the Repository Manager only has a string containing the object name. Therefore, it utilizes the string-to-pointer function map to locate the proper method to construct the object and return it as a pioObjBase pointer.
- This pointer is then cast (narrowed) by the user using the objCast method, which is essentially a dynamic cast check combined with a lookup of the object registry for the repository. It is not possible to retrieve unknown object types from the repository, and if the user tries to retrieve an unregistered type then an invalid pointer (nil) is returned. If the object is not registered at all an exception is thrown.
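- A minimal sketch of this string-to-constructor map and narrowing cast is given below; the registered type and the details of objCast are illustrative and simplified relative to the real pio classes.

```cpp
#include <map>
#include <string>
#include <stdexcept>

// Minimal sketch of the string-to-constructor map and the narrowing cast
// described above. pioObjBase and objCast follow the text loosely; the
// registered type and helper functions are illustrative, not the real API.

class pioObjBase {                       // common base of all storable objects
public:
    virtual ~pioObjBase() {}
};

class SeismicVolume : public pioObjBase {};   // hypothetical registered type

typedef pioObjBase* (*Creator)();             // factory function signature

static std::map<std::string, Creator>& registry() {
    static std::map<std::string, Creator> r;  // string -> constructor map
    return r;
}

static pioObjBase* createSeismicVolume() { return new SeismicVolume; }

// Registration would normally happen in the Persistence I/O Handler.
struct Register {
    Register(const std::string& name, Creator c) { registry()[name] = c; }
};
static Register regSeismic("SeismicVolume", &createSeismicVolume);

// Construct an object by type name; throws if the type was never registered.
pioObjBase* constructByName(const std::string& typeName) {
    std::map<std::string, Creator>::iterator it = registry().find(typeName);
    if (it == registry().end())
        throw std::runtime_error("unregistered type: " + typeName);
    return it->second();
}

// Narrow the base pointer to the requested type; returns nil on a mismatch,
// mirroring the objCast behaviour described in the text.
template <class T>
T* objCast(pioObjBase* base) {
    return dynamic_cast<T*>(base);       // checked downcast
}

int main() {
    pioObjBase* obj = constructByName("SeismicVolume");
    SeismicVolume* vol = objCast<SeismicVolume>(obj);   // non-nil here
    delete vol;
    return 0;
}
```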
- XDR Streamer
- the xdrStream class wraps the XDR serialization functions for the fundamental built-in types in C++.
- the XDR was created by Sun Microsystems, Inc and is freely available. It is normally built in the libc of Unix systems for remote procedure calls.
- XDR provides a conventional way for converting between built-in data types and an external bit-string representation. These XDR routines are used to help implement a type encode/decode routine for each user-defined type.
- the XDR handle contains an operation field which indicates which of the operations (ENCODE, DECODE or FREE) is to be performed.
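- The example below sketches the kind of encode/decode routine that a class like xdrStream would wrap, using the standard Sun XDR calls available on Unix systems; the sample fields and buffer size are illustrative.

```cpp
#include <cstdio>
#include <rpc/xdr.h>    // Sun XDR routines, normally part of libc on Unix

// A symmetric encode/decode routine: the operation field of the XDR handle
// (XDR_ENCODE or XDR_DECODE) decides which direction the same call performs.
static bool_t xdr_sample(XDR* xdrs, int* count, double* depth)
{
    return xdr_int(xdrs, count) && xdr_double(xdrs, depth);
}

int main()
{
    char buffer[64];

    // Encode built-in types into a memory buffer.
    XDR enc;
    int count = 3;
    double depth = 2345.5;
    xdrmem_create(&enc, buffer, sizeof(buffer), XDR_ENCODE);
    xdr_sample(&enc, &count, &depth);
    xdr_destroy(&enc);

    // Decode the same machine-independent byte string.
    XDR dec;
    int count2 = 0;
    double depth2 = 0.0;
    xdrmem_create(&dec, buffer, sizeof(buffer), XDR_DECODE);
    xdr_sample(&dec, &count2, &depth2);
    xdr_destroy(&dec);

    std::printf("decoded: %d %f\n", count2, depth2);
    return 0;
}
```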
- the OF is a computational system which involves many software applications from vendors as well as proprietary legacy codes from Western Geophysical.
- the OF workflow may involve many different asset team members working on many different applications which may be distributed on different machines in different countries that are connected through the network. Making trafficking and versioning among the many applications in a workflow efficient, uniform, and synchronized is what this component wrapper does.
- an OF wrapper is like a black box that contains an application within. There are pipes connected to both sides of the box: one side is the input and the other is the output. One box can be connected to another box by connecting outputs of one to the input of another, as long as the data types in and out of the pipes are the same. Each pipe of the box is a port that is identified by name. There is only one type of data that is allowed to flow through the pipe, and that is defined by each application.
- Each wrapper box has the following functionality. First it can execute the application as soon as inputs required by the application are all satisfied. Second, it sends events about the status of the execution to the event server. Third, it checks data types to match those coming through the pipe to those needed by the specific application. If the data type coming from another pipe is different from the data type required, then the wrapper invokes the appropriate formatting program to convert the data to the proper type, if there is a formatting program in the registry for the type of conversion required. Each wrapper is implemented in C++, and then compiled and tested for unix systems running on Sun, SGI and Linux operating systems.
- To wrap a new vendor application, the application must be registered. This registration creates an application specification that is in the form of an appSpec object. This application spec object is stored in the data repository so that in the future, the wrapper can obtain information about this application. If the application reads and writes files, then information about these files must also be registered. This process creates a file specification in the form of a fileSpec object. This fileSpec object will also be stored in the data repository for the wrapper to use to obtain information about the kind of data needed in the pipe.
- the srWrapper class is designed to be an automatic wrapper box. There are two kinds of I/O ports, the file port and the parameter port.
- the file port indicates a file will be attached to the port.
- the parameter port means that port holds a parameter value such as a string, integer, float etc.
- the srWrapper class provides mechanisms to create a box, add input and output ports, set values for the port, connect ports from one wrapper to another, and execute the application. It also provides the query mechanism to allow the user to ask the box to obtain information about what is going on inside the box.
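- The sketch below illustrates how such a wrapper box with named file and parameter ports might be used; the methods shown are hypothetical stand-ins for the srWrapper API, and the real class would also invoke format converters and send status events.

```cpp
#include <iostream>
#include <map>
#include <string>

// Illustrative wrapper "box" with named ports. The srWrapper class name comes
// from the text; the methods shown here are hypothetical and only mirror the
// capabilities listed above.
class srWrapper {
public:
    explicit srWrapper(const std::string& app) : appName_(app) {}

    void addFilePort(const std::string& name)      { filePorts_[name] = ""; }
    void addParameterPort(const std::string& name) { paramPorts_[name] = ""; }

    void setPort(const std::string& name, const std::string& value) {
        if (filePorts_.count(name)) filePorts_[name] = value;
        else                        paramPorts_[name] = value;
    }

    // Connect one of our output ports to an input port of another wrapper.
    void connect(const std::string& outPort,
                 srWrapper& downstream, const std::string& inPort) {
        downstream.setPort(inPort, filePorts_[outPort]);
    }

    void execute() {
        // A real wrapper would check that all inputs are satisfied, run any
        // registered format converters, launch the application, and send
        // status events to the event server.
        std::cout << "running " << appName_ << std::endl;
    }

    std::string status() const { return "idle"; }   // query mechanism

private:
    std::string appName_;
    std::map<std::string, std::string> filePorts_;
    std::map<std::string, std::string> paramPorts_;
};

int main() {
    srWrapper inversion("seismic_inversion");       // hypothetical applications
    srWrapper simulator("reservoir_simulator");

    inversion.addFilePort("impedance_out");
    simulator.addFilePort("impedance_in");
    simulator.addParameterPort("timestep");

    inversion.setPort("impedance_out", "/data/fieldA/impedance.xdr");
    simulator.setPort("timestep", "30");
    inversion.connect("impedance_out", simulator, "impedance_in");

    inversion.execute();
    simulator.execute();
    return 0;
}
```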
- the OF (Operating Framework) software is an integrated, distributed system that seamlessly connects (or includes) many vendor applications and codes.
- a typical OF job involves many vendor programs running at any particular time.
- In traditional computer applications, such as sequential batch processing, the user of each is responsible for monitoring the status of his job. There is normally no communication among individual application programs.
- an OF event is a piece of information generated from the client application. This information is delegated to interested parties who are expecting such information. For example, if one wants to visualize intermediate results when running a simulation, then the OF simulator can send an event to the visualizer. Also, the data object can be delivered to the other application.
- the Event Handler keeps books on all information vital to the end-users and synchronizes multiple executions. The synchronization is achieved through the Event Handling Service by utilizing a centralized messaging system that allows all job processes to communicate with each other and report their status and exceptions.
- the Event Handler is implemented as a centralized server which uses sockets to communicate with clients. The event handler server can register clients as either event producers or event consumers.
- the EE client is for applications to send and receive events.
- the EE server serves all EE clients and manages events.
- the event manager provides APIs to the event server.
- the producer manager provides APIs for the event manager to manage producers, and the consumer manager provides APIs for the event manager to manage consumers.
- the event client is designed for applications to communicate with the event server through the TCP/IP connection.
- This client provides all necessary APIs for an application to send and receive events and to do queries.
- a client first has to register itself as a producer or consumer. To produce events, the client must be registered as a producer; it is then allowed to add events to the server. To consume events, the client must be registered as a consumer; the client can then poll the server about events it is interested in. The client can also tell the event server what events it is interested in or which producers it is expecting events from. The server can then push events back to the registered consumers.
- the Event Server is a service provider that serves event clients. It is responsible for registering clients, managing events, answering clients' queries, pushing events back to registered clients, etc. The server depends on the event manager to do all the work.
- the event manager then further manages two other managers, the producer manager and the consumer manager.
- the producer manager is responsible for the addition of events from the client to the event queue, and retrieval of events for consumers.
- the producer manager manages a list of producers, and each producer will then manage an event queue and a consumer queue. Events produced by this producer are queued to the event queue, which has a priority protocol of first-in, first-out (FIFO). All consumers who are interested in this producer are queued on the consumer queue.
- the consumer manager manages a list of consumers registered on the server.
- Each registered consumer then manages its own event type queue and producer queue.
- the event type queue stores all event types this consumer is interested in, and the producer queue stores the producers this consumer is interested in.
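- The following in-process sketch mirrors this producer/consumer routing with a FIFO event queue per producer; the class and event names are illustrative, and the real EE client and server communicate over TCP/IP rather than sharing memory.

```cpp
#include <deque>
#include <iostream>
#include <map>
#include <string>
#include <vector>

// In-process sketch of the event routing described above: producers add
// events to a FIFO queue, consumers register interest and poll for them.
struct Event {
    std::string producer;
    std::string type;
    std::string payload;
};

class EventServerSketch {
public:
    void registerProducer(const std::string& name) { queues_[name]; }

    void registerConsumer(const std::string& consumer,
                          const std::string& producer) {
        interests_[consumer].push_back(producer);
    }

    // Producer side: events are queued first-in, first-out.
    void addEvent(const Event& e) { queues_[e.producer].push_back(e); }

    // Consumer side: poll events from every producer this consumer follows.
    std::vector<Event> poll(const std::string& consumer) {
        std::vector<Event> out;
        for (const std::string& p : interests_[consumer]) {
            std::deque<Event>& q = queues_[p];
            while (!q.empty()) { out.push_back(q.front()); q.pop_front(); }
        }
        return out;
    }

private:
    std::map<std::string, std::deque<Event> > queues_;      // FIFO per producer
    std::map<std::string, std::vector<std::string> > interests_;
};

int main() {
    EventServerSketch server;
    server.registerProducer("reservoir_simulator");
    server.registerConsumer("visualizer", "reservoir_simulator");

    Event e = { "reservoir_simulator", "TIMESTEP_DONE", "step=12" };
    server.addEvent(e);

    for (const Event& ev : server.poll("visualizer"))
        std::cout << "visualizer got " << ev.type << " " << ev.payload << "\n";
    return 0;
}
```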
- The EE server is implemented in two forms: single-threaded and multi-threaded. The latter is designed to handle multiple clients at the same time.
- FC OF Foundation Classes
- the util package provides a set of C++ classes categorized into data containers, such as arrays, algorithm classes related to containers, and utility classes for string, system and resource information, unix file and directory manipulation, and pattern matching.
- Data containers are arrays of up to 6 dimensions.
- Matrices and base array classes are generic numerical arrays and are derived from generic arrays.
- the same design rule applies to matrix in both 2D and 3D.
- Our string class is a subset of the standard string class provided by the C++ language. However, ours has some special string manipulation methods widely used by all OF packages.
- the pattern matching class does pattern matching.
- the class SystemInfo allows the application to obtain system information such as time, login name, system resource information, etc.
- the class FileInfo allows applications to get information about a unix file.
- the unixDirUtil class is used to generate file name tree structures of a unix directory.
- Algorithm classes are related to each data container.
- One design rule for OF util classes is that we separate data containers from the algorithms that operate on them.
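- A small sketch of this container/algorithm separation is shown below; Array2D and ArrayStats are illustrative stand-ins, not the actual OF util classes.

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

// Sketch of the container/algorithm separation suggested above: a plain
// generic container that only stores data, and a separate algorithm class
// that operates on it.
template <class T>
class Array2D {                           // data container: storage and access only
public:
    Array2D(std::size_t ni, std::size_t nj)
        : ni_(ni), nj_(nj), data_(ni * nj) {}

    T&       operator()(std::size_t i, std::size_t j)       { return data_[i * nj_ + j]; }
    const T& operator()(std::size_t i, std::size_t j) const { return data_[i * nj_ + j]; }
    std::size_t ni() const { return ni_; }
    std::size_t nj() const { return nj_; }

private:
    std::size_t ni_, nj_;
    std::vector<T> data_;
};

template <class T>
struct ArrayStats {                       // algorithm class kept separate
    static T mean(const Array2D<T>& a) {
        T sum = T();
        for (std::size_t i = 0; i < a.ni(); ++i)
            for (std::size_t j = 0; j < a.nj(); ++j)
                sum += a(i, j);
        return sum / static_cast<T>(a.ni() * a.nj());
    }
};

int main() {
    Array2D<double> porosity(2, 3);
    for (std::size_t i = 0; i < 2; ++i)
        for (std::size_t j = 0; j < 3; ++j)
            porosity(i, j) = 0.1 * (i + j);
    std::cout << "mean porosity = " << ArrayStats<double>::mean(porosity) << "\n";
    return 0;
}
```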
- the SRFC package contains the foundation classes that implement a set of data types. The data commonly used by geological and geophysical software are volumetric data (3D seismic, 3D velocity, etc.), and well data including well culture data, well bore (well path geometry), well logs, well picks, zones, etc.
- the mms provides containers for all different kinds of geometry objects and meshes needed by applications such as point, polyline, surface, polygons, tetrahedra, bounding box and more.
- a set of field classes is implemented in this package. These fields are generally designed for 2D and 3D structured and non-structured datasets.
- the 2D structured fields are mapped horizons and faults; 2D non-structured fields are triangulated horizons and faults.
- 3D structured fields include regular, rectilinear and curvilinear fields.
- the 3D non-structured field is an irregular mesh.
- These field containers are like srfc containers. They are objects to store geometry and attributes. Available APIs are set and get methods only. Each field class has methods to encode and decode for overloaded xdr input and output streams.
- the high-level architecture of our modeling framework is a layered architectural software pattern in which each layer has a distinct role in the framework.
- a topological representation based on the Radial Edge Data Structure (REDS), which is used to represent complex non-manifold topologies.
- REDS explicitly stores the two uses (sides) of a face by the two regions that share the same face. Each face use is bounded by one or more loop uses, which in turn are composed of an alternating sequence of edge uses and vertex uses.
- the REDS is general and can represent non-manifold topology. We make extensive use of high level topological operators for building earth models because topological data structures are in general too complex to be manipulated directly.
- Edges of REDS may represent well paths, a set of faces or a shell may represent the surface of a fault or seismic horizons, and a set of regions may represent geological layers and fault zones.
- the associated meshing and remeshing of these geological objects is based on the connectivity and spatial subdivision information stored in REDS.
- the REDS is the component that stores the topological and geometrical representation of an earth model.
- MMS is the layer that generates and manages numerical meshes associated with earth model sub-regions. It is important to note that meshes are treated as attributes of geological entities such as blocks, horizons, layers and faults. Hence, a mesh is not the model, but only one possible realization of a model or a sub-region of the model.
- the meshing operators can provide multiple mesh representations with multiple resolutions of a given earth model.
- One particularly important application of these operators is in the area of OF, where it is commonly necessary to upscale geological grids to a resolution at which the flow simulation can be executed on available computers. Operations between coarse and fine resolution grids are greatly facilitated in this framework.
- CGC shared earth model builder
- An earth model is built from a set of polygonal surfaces defining the boundaries of geological structures.
- CGC has various geometrical operators built-in to facilitate the creation of proper 3D representations of geological entities such as faulted reservoirs.
- the structural seismic interpretation of the reservoir provides the geometrical elements (set of polygonal surfaces) necessary to create a reservoir earth model and its spatial subdivision.
- the geometrical and topological description of an earth model is obtained incrementally by adding polygonal surfaces sequentially to the model.
- the resulting earth model contains the space partitions (regions of space) defined by these surfaces. Meshes can be generated for the entire earth model as well as for each individual region of the model.
- Each region can have multiple meshes with various resolutions associated with it (below). These region meshes are treated as attributes of the model's region similarly to other physical attributes such as lithology, density and velocity. In the current implementation, a region maintains a list of the name (String) of the mesh objects associated with it. These meshes are stored in the OF repository and can be easily queried and retrieved by name.
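- The sketch below illustrates the idea of meshes as named attributes of a region, with a simple map standing in for the OF repository; the classes and names are illustrative only.

```cpp
#include <iostream>
#include <map>
#include <string>
#include <vector>

// Illustrative "mesh as a named attribute of a region" sketch. In the
// framework the meshes live in the OF data repository and are fetched by name.
struct Mesh {
    std::string name;
    int resolution;     // e.g. cells along one axis; purely illustrative
};

class Region {
public:
    explicit Region(const std::string& name) : name_(name) {}

    // A region stores only the names of its mesh realizations.
    void addMeshName(const std::string& meshName) { meshNames_.push_back(meshName); }
    const std::vector<std::string>& meshNames() const { return meshNames_; }
    const std::string& name() const { return name_; }

private:
    std::string name_;
    std::vector<std::string> meshNames_;
};

int main() {
    // Stand-in for the repository: mesh objects keyed by name.
    std::map<std::string, Mesh> repository;
    repository["layerB_coarse"] = Mesh{ "layerB_coarse", 20 };
    repository["layerB_fine"]   = Mesh{ "layerB_fine", 200 };

    Region layerB("layerB");
    layerB.addMeshName("layerB_coarse");    // multiple realizations, one region
    layerB.addMeshName("layerB_fine");

    for (const std::string& n : layerB.meshNames()) {
        const Mesh& m = repository[n];      // retrieve by name, as in the text
        std::cout << layerB.name() << " mesh " << m.name
                  << " resolution " << m.resolution << "\n";
    }
    return 0;
}
```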
- MultiMesh System MMS
- MultiMesh System Meshes necessary as input for some of the OF applications are generated automatically by the MultiMesh System. This system was designed to integrate and transfer information in numerical meshes among applications that require distinct mesh representations.
- the meshes are discrete realizations of the earth model. This is analogous to the OF process in which each reservoir realization is just a possible representation of the reservoir. A particular mesh (regular, curvilinear or tetrahedral) is just a possible representation of the earth model.
- Multimesh is able to generate structured (regular and rectilinear) and non-structured (tetrahedral) meshes. It can manipulate all meshes necessary to integrate applications (Eclipse, FDM, EarthGM). IBM has contributed to building reservoir classes (SRFC) on top of some MultiMesh classes, and final work will focus on the integration of Multimesh with other applications.
- SRFC reservoir classes
- Sources of the OF data are many different legacy software packages such as traditional interpretation applications (e.g., Landmark, GeoQuest), complex seismic data processing software (e.g., OMEGA), OF software (e.g., EarthGM 3D), fluid simulation software (e.g., VIP, ECLIPS), visualization software and many others.
- traditional interpretation applications e.g., Landmark, GeoQuest
- complex seismic data processing software e.g., OMEGA
- OF software e.g., EarthGM 3D
- fluid simulation software e.g., VIP, ECLIPS
- the SRIO package consists of a set of classes that define public APIs to all applications, and derived classes for each different application software package. Every class has two APIs: read and write.
- the read method reads client data and converts to SRFC or MMS objects.
- the write method converts SRFC or MMS objects into client data format.
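- The pattern can be sketched as an abstract base class with read and write methods and one derived converter per application format, as below; SrfcObject and the grid converter are hypothetical placeholders.

```cpp
#include <iostream>
#include <string>

// Sketch of the SRIO pattern described above: an abstract base class with
// read and write APIs, and one derived converter per application data format.
struct SrfcObject {                 // stand-in for an SRFC/MMS data object
    std::string name;
    std::string payload;
};

class srioBase {
public:
    virtual ~srioBase() {}
    // read: client (application-native) data -> SRFC/MMS object
    virtual SrfcObject read(const std::string& clientFile) = 0;
    // write: SRFC/MMS object -> client data format
    virtual void write(const SrfcObject& obj, const std::string& clientFile) = 0;
};

// Hypothetical converter for one application's grid file format.
class srioClientGrid : public srioBase {
public:
    SrfcObject read(const std::string& clientFile) {
        std::cout << "parsing " << clientFile << " into an SRFC grid\n";
        return SrfcObject{ "grid", "cells..." };
    }
    void write(const SrfcObject& obj, const std::string& clientFile) {
        std::cout << "writing SRFC object " << obj.name
                  << " to " << clientFile << "\n";
    }
};

int main() {
    srioClientGrid conv;
    SrfcObject grid = conv.read("fieldA_grid.dat");      // hypothetical file
    conv.write(grid, "fieldA_grid_copy.dat");
    return 0;
}
```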
- SRFC Simple SRFC
- well bore data usually comes with (x, y) coordinates plus vertical depth and measured depth, but a well bore with two-way travel time is often used when comparing with seismic data.
- a time-depth conversion table and algorithm are needed.
- mapping data to a horizon: this process involves two different data objects, the horizon and the volumetric data.
- An interpolation algorithm has to be implemented to obtain data for each point of the horizon.
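- One simple way to implement such a conversion is linear interpolation in a (two-way time, depth) table, as sketched below with illustrative values.

```cpp
#include <cstdio>
#include <vector>

// Minimal sketch of a time-depth conversion by linear interpolation in a
// (two-way time, depth) table; the sample table values are illustrative only.
struct TdPair { double twt_ms; double depth_m; };   // two-way time, depth

// Interpolate depth for a given two-way time; the table must be sorted by time.
double timeToDepth(const std::vector<TdPair>& table, double twt_ms)
{
    if (table.empty()) return 0.0;
    if (twt_ms <= table.front().twt_ms) return table.front().depth_m;
    if (twt_ms >= table.back().twt_ms)  return table.back().depth_m;

    for (std::size_t i = 1; i < table.size(); ++i) {
        if (twt_ms <= table[i].twt_ms) {
            const TdPair& a = table[i - 1];
            const TdPair& b = table[i];
            double t = (twt_ms - a.twt_ms) / (b.twt_ms - a.twt_ms);
            return a.depth_m + t * (b.depth_m - a.depth_m);
        }
    }
    return table.back().depth_m;        // not reached
}

int main()
{
    // Illustrative checkshot-style table.
    std::vector<TdPair> table;
    table.push_back(TdPair{ 0.0,    0.0 });
    table.push_back(TdPair{ 500.0,  600.0 });
    table.push_back(TdPair{ 1000.0, 1350.0 });
    table.push_back(TdPair{ 1500.0, 2200.0 });

    std::printf("depth at 1200 ms: %.1f m\n", timeToDepth(table, 1200.0));
    return 0;
}
```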
- the filter package provides a set of classes to filter specific data from OF container objects algorithmically to satisfy the above described requirements.
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Acoustics & Sound (AREA)
- Environmental & Geological Engineering (AREA)
- Geology (AREA)
- General Life Sciences & Earth Sciences (AREA)
- General Physics & Mathematics (AREA)
- Geophysics (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15925299P | 1999-10-13 | 1999-10-13 | |
US159252P | 1999-10-13 | ||
PCT/US2000/028564 WO2001027858A1 (en) | 1999-10-13 | 2000-10-13 | Petroleum reservoir simulation and characterization system and method |
Publications (1)
Publication Number | Publication Date |
---|---|
EP1247238A1 true EP1247238A1 (en) | 2002-10-09 |
Family
ID=22571747
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP00970940A Withdrawn EP1247238A1 (en) | 1999-10-13 | 2000-10-13 | Petroleum reservoir simulation and characterization system and method |
Country Status (6)
Country | Link |
---|---|
EP (1) | EP1247238A1 (no) |
AU (1) | AU8025000A (no) |
CA (1) | CA2383664A1 (no) |
MX (1) | MXPA02003683A (no) |
NO (1) | NO20021739L (no) |
WO (1) | WO2001027858A1 (no) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1299127C (zh) * | 2004-03-26 | 2007-02-07 | 中国石油天然气集团公司 | 地震观测系统优化设计的层状介质双聚焦方法及其应用 |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6675101B1 (en) | 2002-11-14 | 2004-01-06 | Schlumberger Technology Corporation | Method and system for supplying well log data to a customer |
US7496488B2 (en) | 2003-03-06 | 2009-02-24 | Schlumberger Technology Company | Multi-scale finite-volume method for use in subsurface flow simulation |
US6823297B2 (en) | 2003-03-06 | 2004-11-23 | Chevron U.S.A. Inc. | Multi-scale finite-volume method for use in subsurface flow simulation |
CN100590637C (zh) | 2003-09-30 | 2010-02-17 | 埃克森美孚上游研究公司 | 使用最小阻力路径来特征化储层模型中的连通性 |
WO2007149766A2 (en) | 2006-06-18 | 2007-12-27 | Chevron U.S.A. Inc. | Reservoir simulation using a multi-scale finite volume including black oil modeling |
CA2702965C (en) | 2007-12-13 | 2014-04-01 | Exxonmobil Upstream Research Company | Parallel adaptive data partitioning on a reservoir simulation using an unstructured grid |
CA2724002C (en) | 2008-05-16 | 2016-11-01 | Chevron U.S.A. Inc. | Multi-scale method for multi-phase flow in porous media |
WO2010003004A2 (en) | 2008-07-03 | 2010-01-07 | Chevron U.S.A. Inc. | Multi-scale finite volume method for reservoir simulation |
EA201170550A1 (ru) | 2008-10-09 | 2011-12-30 | Шеврон Ю.Эс.Эй. Инк. | Итеративный многомасштабный способ для потока в пористой среде |
US8650016B2 (en) | 2009-10-28 | 2014-02-11 | Chevron U.S.A. Inc. | Multiscale finite volume method for reservoir simulation |
WO2011126585A1 (en) | 2010-04-06 | 2011-10-13 | Exxonmobil Upstream Research Company | Hierarchical modeling of physical systems and their uncertainties |
CN109388843B (zh) * | 2018-08-18 | 2023-04-18 | 西安电子科技大学 | 一种基于vtk的桁架天线的可视化系统及方法、终端 |
CN112835657A (zh) * | 2019-11-05 | 2021-05-25 | 中国石油天然气集团有限公司 | 插件式地球物理图件绘制装置及方法 |
CN112862302A (zh) * | 2021-02-03 | 2021-05-28 | 北京侏罗纪软件股份有限公司 | 一种石油数据模型建模方法及工具 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4969130A (en) * | 1989-09-29 | 1990-11-06 | Scientific Software Intercomp, Inc. | System for monitoring the changes in fluid content of a petroleum reservoir |
US5345586A (en) * | 1992-08-25 | 1994-09-06 | International Business Machines Corporation | Method and system for manipulation of distributed heterogeneous data in a data processing system |
US5959547A (en) * | 1995-02-09 | 1999-09-28 | Baker Hughes Incorporated | Well control systems employing downhole network |
US5873049A (en) * | 1997-02-21 | 1999-02-16 | Atlantic Richfield Company | Abstraction of multiple-format geological and geophysical data for oil and gas exploration and production analysis |
-
2000
- 2000-10-13 WO PCT/US2000/028564 patent/WO2001027858A1/en not_active Application Discontinuation
- 2000-10-13 CA CA002383664A patent/CA2383664A1/en not_active Abandoned
- 2000-10-13 MX MXPA02003683A patent/MXPA02003683A/es unknown
- 2000-10-13 EP EP00970940A patent/EP1247238A1/en not_active Withdrawn
- 2000-10-13 AU AU80250/00A patent/AU8025000A/en not_active Abandoned
-
2002
- 2002-04-12 NO NO20021739A patent/NO20021739L/no unknown
Non-Patent Citations (1)
Title |
---|
See references of WO0127858A1 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1299127C (zh) * | 2004-03-26 | 2007-02-07 | 中国石油天然气集团公司 | 地震观测系统优化设计的层状介质双聚焦方法及其应用 |
Also Published As
Publication number | Publication date |
---|---|
MXPA02003683A (es) | 2002-08-30 |
NO20021739D0 (no) | 2002-04-12 |
AU8025000A (en) | 2001-04-23 |
CA2383664A1 (en) | 2001-04-19 |
NO20021739L (no) | 2002-06-12 |
WO2001027858A1 (en) | 2001-04-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6826483B1 (en) | Petroleum reservoir simulation and characterization system and method | |
Small et al. | The SCEC unified community velocity model software framework | |
US6950786B1 (en) | Method and apparatus for generating a cross plot in attribute space from a plurality of attribute data sets and generating a class data set from the cross plot | |
US11815651B2 (en) | Geologic model and property visualization system | |
US20020082811A1 (en) | Optimization apparatus, system, and method of use and doing business | |
WO2001027858A1 (en) | Petroleum reservoir simulation and characterization system and method | |
US20140068448A1 (en) | Production data management system utility | |
Bello et al. | Next generation downhole big data platform for dynamic data-driven well and reservoir management | |
King et al. | Reservoir modeling: From rescue to resqml | |
US8942960B2 (en) | Scenario analyzer plug-in framework | |
US20190265375A1 (en) | Cloud Framework System | |
US20230325369A1 (en) | Multiple source data change journal system | |
Parkhonyuk et al. | Cloud-based solution for advanced real-time fracturing evaluation | |
Morandini et al. | Using RESQML for Shared Earth Model Data Exchanges between Commercial Modelling Applications and In-House Developments, Demonstrated on Actual Subsurface Data | |
McGaughey | The common earth model: A revolution in mineral exploration data integration | |
Apel | A 3d geoscience information system framework | |
Steiner et al. | formikoj: A flexible library for data management and processing in geophysics—Application for seismic refraction data | |
CA2912776A1 (en) | System, method and computer program product for smart grouping of seismic interpretation data in inventory trees based on processing history | |
US12130400B2 (en) | Geologic model and property visualization system | |
Gawith et al. | Integrating geoscience and engineering for improved field management and appraisal | |
Kolmakov et al. | Design and development of relational geospatial database aimed at gathering and systematization of wide range of geological and geophysical data | |
Chubak | Software framework for geophysical data processing, visualization and code development | |
Zhao | Subsurface Digital Twin and Emergence | |
Guide et al. | Release notes | |
Jaeger-Frank et al. | A three tier architecture applied to LiDAR processing and monitoring |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20020328 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE |
|
AX | Request for extension of the european patent |
Free format text: AL;LT;LV;MK;RO;SI |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN |
|
18W | Application withdrawn |
Effective date: 20070104 |