CN111624967A - Supervisory engine for process control

Supervisory engine for process control

Info

Publication number
CN111624967A
CN111624967A (application CN202010503158.6A)
Authority
CN
China
Prior art keywords: data, work item, task, user, person
Prior art date
Legal status
Pending
Application number
CN202010503158.6A
Other languages
Chinese (zh)
Inventor
M. J. Nixon
K. Beoughter
D. D. Christensen
Current Assignee
Fisher Rosemount Systems Inc
Original Assignee
Fisher Rosemount Systems Inc
Priority date
Filing date
Publication date
Priority claimed from US14/028,972 (US11112925B2)
Application filed by Fisher Rosemount Systems Inc
Publication of CN111624967A

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 - Programme-control systems
    • G05B 19/02 - Programme-control systems electric
    • G05B 19/418 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G05B 19/41865 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM] characterised by job scheduling, process planning, material flow
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 23/00 - Testing or monitoring of control systems or parts thereof
    • G05B 23/02 - Electric testing or monitoring
    • G05B 23/0205 - Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults
    • G05B 23/0208 - Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterized by the configuration of the monitoring system
    • G05B 23/0216 - Human interface functionality, e.g. monitoring system providing help to the user in the selection of tests or in its configuration
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 - Programme-control systems
    • G05B 19/02 - Programme-control systems electric
    • G05B 19/04 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B 19/042 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B 19/0428 - Safety, monitoring
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 - Administration; Management
    • G06Q 10/06 - Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 - Program-control systems
    • G05B 2219/10 - Plc systems
    • G05B 2219/15 - Plc structure of the system
    • G05B 2219/15045 - Portable display unit

Abstract

A supervisory engine cooperates with an expert system in a process control environment to automatically generate, distribute, track, and manage work items. The supervisory engine creates a work item based on data received from the expert system, selects available personnel to execute the work item, sends the work item to the selected personnel, schedules the execution of the work item, and creates and stores permissions that allow the designated personnel to complete a target function of the work item within a designated time. The supervisory engine determines a skill set, role, certification, and/or credential associated with the work item and selects a person to execute the work item based on a personnel profile specifying the skill set, role, certification, and/or credential associated with that person. Alternatively or additionally, the supervisory engine assigns work items according to personnel presence at or near the target equipment.

Description

Supervisory engine for process control
This application is a divisional application of original application No. 201410097873.9, filed on March 14, 2014, and entitled "Supervisory Engine for Process Control."
Technical Field
The present disclosure relates generally to process plants and process control systems and, more particularly, to the use of mobile user interface devices in process plants and process control systems.
Background
Distributed process control systems, such as those used in chemical, petroleum or other process plants, typically include one or more process controllers communicatively coupled to one or more field devices via analog, digital or combined analog/digital buses or via wireless communication links or networks. The field devices, which may be, for example, valves, valve positioners, switches and transmitters (e.g., temperature, pressure, level and flow rate sensors), are located in the process environment and typically perform physical or process control functions (e.g., opening or closing valves, measuring process parameters, etc.) to control one or more processes performed within the process plant or system. Smart field devices (e.g., field devices conforming to the well-known Fieldbus protocol) may also perform control calculations, alarm functions, and other control functions typically implemented in controllers. Process controllers, also typically located in plant environments, receive signals indicative of process measurements made by sensors and/or field devices and/or other information related to the field devices, and execute controller applications that, for example, run different control modules that make process control decisions, generate control signals based on the received information, and coordinate with the control modules or blocks executing within the field devices (e.g., HART®, WirelessHART®, and FOUNDATION® Fieldbus field devices). A control module in the controller sends control signals to the field devices over communication lines or links to control the operation of at least a portion of the process plant or system.
Information from the field devices and controllers is typically made available through a data highway to one or more other hardware devices, such as operator workstations, personal computers or computing devices, data historians, report generators, central databases, or central management computing devices, which are typically located in control rooms or other locations remote from the harsher plant environment. Each of these hardware devices is typically centralized across the process plant or across a portion of the process plant. These hardware devices run applications that, for example, may enable an operator to perform functions related to controlling a process and/or operating a process plant (e.g., changing settings of a process control routine, modifying the operation of a controller or a control module in a field device, viewing the current state of a process, viewing alarms generated by a field device and a controller, simulating the operation of a process for the purpose of training personnel or testing process control software, maintaining and updating a configuration database, etc.). The data highway used by the hardware devices, controllers, and field devices may include wired communication paths, wireless communication paths, or a combination of wired and wireless communication paths.
For example, the DeltaV™ control system, marketed by Emerson Process Management, includes a plurality of applications stored in and executed by different devices located at various locations in a process plant. Each of these applications provides a user interface (UI) to allow a user (e.g., a configuration engineer, an operator, a maintenance technician, etc.) to view and/or modify various aspects of the process plant operation and configuration. In this specification, the phrase "user interface" or "UI" is used to refer to an application or screen that allows a user to view or modify the configuration, operation, or status of a process plant. Similarly, the phrase "user interface device" or "UI device" is used to refer to a device on which a user interface operates, regardless of whether the device is stationary (e.g., a workstation, a wall-mounted display, a process control device display, etc.) or mobile (e.g., a laptop, a tablet, a smartphone, etc.). A configuration application located on one or more operator workstations or computing devices enables a user to create or change process control modules and download those process control modules to dedicated distributed controllers via a data highway. Typically, these control modules are made up of communicatively interconnected function blocks, which are objects in an object-oriented programming protocol that perform functions in the control scheme based on inputs thereto and provide outputs to other function blocks in the control scheme. The configuration application may also allow a configuration designer to create or change operator interfaces used by a viewing application to display data to an operator and to enable the operator to change settings, such as set points, in the process control routines. Each dedicated controller (and, in some cases, one or more field devices) stores and executes a respective controller application that runs the control modules assigned and downloaded thereto to implement actual process control functionality. A viewing application, which may be executed on one or more operator workstations (or on one or more remote computing devices communicatively connected to the operator workstations and the data highway), receives data from the controller application via the data highway and displays the data to designers, operators, or users of the process control system using the UI, and may provide any of a number of different views (e.g., an operator's view, an engineer's view, a technician's view, etc.). A data historian application is typically stored in and executed by a data historian device that collects and stores some or all of the data provided across the data highway, while a configuration database application may run on yet another computer connected to the data highway to store the current process control routine configuration and data associated therewith. Alternatively, the configuration database may be located in the same workstation as the configuration application.
The architecture of process control plants and process control systems is strongly influenced by limited controller and device memory, communication bandwidth, and controller and device processor capability. For example, the use of dynamic and static non-volatile memory in a controller is usually minimized or, at least, carefully managed. As a result, during system configuration (e.g., a priori), a user typically must choose which data in the controller is to be archived or saved, the frequency at which it will be saved, and whether or not compression is used, and the controller is accordingly configured with this limited set of data rules. Consequently, data that could be useful in troubleshooting and process analysis is often not archived, and if it is collected, the useful information may have been lost due to data compression.
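For illustration only, a minimal Python sketch of the kind of a-priori data-collection rule set this paragraph describes might look as follows; the tag names, fields, and rule structure are hypothetical, not any actual controller configuration:

```python
from dataclasses import dataclass

@dataclass
class ArchiveRule:
    """One a-priori data-collection rule: which parameter is historized,
    how often it is sampled, and whether lossy compression is applied."""
    parameter: str          # hypothetical tag name, e.g. "FIC-101/PID1/PV"
    sample_period_s: float  # seconds between saved samples
    compress: bool          # compression saves memory but can lose detail

# A memory-constrained controller holds only a small, fixed rule set;
# any parameter not listed here is simply never archived.
controller_rules = [
    ArchiveRule("FIC-101/PID1/PV", sample_period_s=5.0, compress=True),
    ArchiveRule("TIC-205/PID1/OUT", sample_period_s=60.0, compress=True),
]

def is_archived(parameter: str) -> bool:
    return any(rule.parameter == parameter for rule in controller_rules)

# Data that later matters for troubleshooting may be missing by design:
print(is_archived("LIC-300/PID1/PV"))  # -> False
```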
Further, to minimize controller memory usage in currently known process control systems, data that is to be archived or saved is reported to a workstation or computing device for storage at, for example, an appropriate historian or data silo. Current techniques for reporting the data make poor use of communication resources and cause unnecessary controller loading. Moreover, due to the time delays in communication and sampling at the historian or silo, data collection and time stamping are often out of sync with the actual process.
Similarly, in batch process control systems, to minimize controller memory usage, snapshots of batch recipes and controller configurations are typically kept stored at a central administrative computing device or location (e.g., at a data silo or historian) and are transferred to a controller only when needed. Such a strategy introduces significant burst loads in the controller and in the communication channels between the workstation or central administrative computing device and the controller.
Furthermore, the capacity and performance limitations of the relational databases of process control systems, combined with the high cost of disk storage, have played a large part in structuring application data into independent entities or silos to meet the objectives of specific applications. For example, in a DeltaV™ system, process models, continuous historical data, and batch and event data are stored and/or archived in three different application databases or data silos. Each silo has a different interface to access the data stored therein.
Structuring data in this manner creates a barrier to the way the historical data is accessed and used. For example, the root cause of a variation in product quality may be associated with data in more than one of these data silos. However, because of the different structures of the silos, no tools are provided that allow this data to be quickly and easily accessed for analysis. Moreover, audit or synchronization functions must be performed to ensure that data across the different silos is consistent.
The above-described limitations of process plants and process control systems, as well as other limitations, may undesirably manifest themselves in the operation and optimization of process plants or process control systems, for example, during plant operation, troubleshooting, and/or predictive modeling. Such limitations force cumbersome and lengthy workflows to be performed in order to obtain data for troubleshooting and for generating updated models. Additionally, the data obtained may be inaccurate due to data compression, insufficient bandwidth, or shifted timestamps.
The background description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in this background section, as well as aspects of the description that may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
Disclosure of Invention
In a first embodiment, a computer-implemented, automated method of assigning tasks to personnel in a process plant is executed by a supervisor module and includes receiving data from an expert system and creating a work item specifying a task based on the data received from the expert system. The method also includes selecting a person to execute the task specified in the work item, sending the work item to a device associated with the selected person, and receiving an indication that the selected person has accepted the work item.
In another embodiment, a process control system includes a plurality of process control devices, a big data facility storing sensor and parameter data of the process control system, and an expert system coupled to the big data facility and configured to analyze the data stored in the big data facility. The process control system also includes a supervisor module coupled to the expert system and configured to assign tasks to personnel within the process plant. The supervisor module is configured to receive data from the expert system and to create a work item specifying a task based on the data received from the expert system. The supervisor module may also be configured to select a person to execute the task specified in the work item, send the work item to a device associated with the selected person, and receive an indication that the selected person has accepted the work item.
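As a concrete, non-authoritative illustration of the claimed flow (receive expert-system data, create a work item specifying a task, select a person, send the work item, record acceptance), the following Python sketch shows one possible shape of such a supervisor module; the data structures, field names, and selection policy are assumptions made for this example, not the patented implementation:

```python
import itertools
from dataclasses import dataclass

@dataclass
class WorkItem:
    id: int
    task: str                  # target function, e.g. "repair valve V-17"
    equipment: str
    required_skills: frozenset
    assignee: str = ""
    accepted: bool = False

class SupervisorModule:
    """Toy supervisor: personnel profiles map a name to a skill set and an
    availability flag; the expert system hands over a 'finding' dict."""
    _ids = itertools.count(1)

    def __init__(self, personnel):
        self.personnel = personnel

    def on_expert_data(self, finding: dict) -> WorkItem:
        item = WorkItem(next(self._ids), finding["task"], finding["equipment"],
                        frozenset(finding.get("skills", ())))
        item.assignee = self._select_person(item)
        self._send(item)                    # e.g. push to the person's UI device
        return item

    def _select_person(self, item: WorkItem) -> str:
        for name, profile in self.personnel.items():
            if profile["available"] and item.required_skills <= profile["skills"]:
                return name
        raise LookupError("no qualified, available person")

    def _send(self, item: WorkItem) -> None:
        print(f"work item {item.id} ({item.task}) -> {item.assignee}")

    def on_acceptance(self, item: WorkItem) -> None:
        item.accepted = True                # indication received from the device

supervisor = SupervisorModule({"kim": {"skills": {"valve"}, "available": True}})
wi = supervisor.on_expert_data({"task": "repair valve", "equipment": "V-17",
                                "skills": ["valve"]})
supervisor.on_acceptance(wi)
```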
Drawings
FIG. 1A is a block diagram of an example process control network operating within a process control system or a process plant.
FIG. 1B is a block diagram illustrating a broader control network.
FIG. 2 is a block diagram illustrating a communication architecture including a mobile control room in accordance with the present description.
FIG. 3 is a block diagram illustrating an embodiment of a supervisor engine according to the present description.
FIG. 4 illustrates exemplary work items that may be generated by the supervisor engine of FIG. 3.
FIG. 5 is a flow diagram illustrating a method for assigning tasks to personnel within a process plant.
FIG. 6 is a flow diagram illustrating a method for managing workflows in a process plant.
FIG. 7 is a flow diagram illustrating a method for facilitating task completion in a process plant.
FIG. 8 is a block diagram of a UI device.
FIG. 9A illustrates aspects of an exemplary mobile control room.
FIG. 9B illustrates equipment in an exemplary mobile control room.
FIG. 10 illustrates an exemplary device display associated with UI synchronization between UI devices.
FIG. 11 is a flow diagram illustrating an example method for synchronizing UI devices.
FIG. 12A is a block diagram illustrating exemplary data associated with a UI device in an exemplary mobile control room.
FIG. 12B is a block diagram illustrating exemplary data associated with a UI device in another example mobile control room.
FIG. 13 is a flow diagram of an example method for providing session data to a UI device.
FIG. 14 is a flow diagram of an example method for generating a GUI configuration at a UI device.
FIG. 15 is a flow diagram illustrating an example method of direct state information transfer between two UI devices.
FIG. 16 is a flow diagram illustrating an example method for communicating state information between two UI devices coupled to a server.
FIG. 17 is a flow diagram illustrating another method for communicating status information between two UI devices.
FIG. 18 is a flow diagram that illustrates another example method for controlling a process plant using a UI device associated with a mobile control room.
FIG. 19 is a flow chart illustrating a method executed on a server for facilitating mobile control of a process plant using a UI device.
FIG. 20 is a flow diagram illustrating a method for transferring a state of a first UI device to a second UI device.
FIG. 21 is a flow diagram illustrating a method for initiating a UI session on a first UI device.
FIG. 22 is a flow chart illustrating a second method for initiating a UI session on a first UI device.
FIG. 23 illustrates a second aspect of an exemplary mobile control room.
FIG. 24 is a block diagram of an exemplary context-aware UI device.
FIG. 25 is a block diagram of another embodiment of a mobile control room in a process plant.
FIG. 26 is an illustration of another example mobile control room.
FIG. 27 is a flow diagram illustrating an example method for generating a graphical user interface.
FIG. 28 is a flow chart illustrating an example method performed by a UI device.
FIG. 29 is a flow chart illustrating a method for facilitating mobile control of a process plant.
FIG. 30 is a flow chart illustrating a method for determining the location of a mobile device within a process plant.
FIG. 31 is a flow chart illustrating a method for environmental operation of a mobile device in a process control environment.
FIG. 32 is a flow chart illustrating a method for analyzing physical phenomena in a process plant.
Detailed Description
The decentralization and mobilization of the control and maintenance functions associated with a process plant provides various significant advantages. For example, the cooperation of mobile user interface devices with stationary user interface devices frees operators, maintenance personnel, and other plant personnel from being confined to a central location, allowing personnel to move throughout the process plant without losing access to information related to the operation and status of the process plant. By implementing the "big data" concept (i.e., collecting, storing, organizing, and mining one or more collections of data that are so large or complex that traditional database management tools and/or data processing applications are unable to manage the data sets within a tolerable amount of time) in combination with expert systems, supervisory systems, and context-aware mobile user interface devices, a process plant may advantageously be managed and maintained more efficiently (e.g., with less maintenance, greater throughput, less downtime, fewer personnel, less safety risk to personnel and equipment, etc.), as described throughout this disclosure.
Generally, the context-aware mobile user interface devices cooperate with expert systems, supervisory systems, and big data systems to facilitate improved operation of the process plant. The improved operation may be achieved using one or more of the presently described concepts, which include collaboration, mobility, workflow management, personnel management, automation, accountability, verification, and diagnostics, among others. For example, the devices, systems, and methods described herein may facilitate seamless transitions from one user interface device to another (e.g., from a workstation to a tablet, or from a tablet to a mobile phone) so that a user has the same or similar information available regardless of movement from one device to another, may facilitate collaboration between multiple people viewing the same or different data at the same or different locations, and/or may facilitate the initiation or continuation of a user session regardless of the device the user happens to be operating. Mobile ones of the user interface devices may be device-aware and/or location-aware so as to automatically display relevant information (e.g., maps, procedures, diagrams, user manuals), launch applications, and the like. In addition, cooperation between the expert and supervisory systems and the user interface devices may facilitate the automatic generation, distribution, and management of work items related to the activities of operators and/or maintenance personnel. For example, as will be described in further detail below, the expert system may analyze information stored in the big data system and determine that some task should be performed and, in cooperation with the supervisory system, may create a work item, assign the work item to a person, create a checklist of items necessary for the performance of the work item's task, enable the assigned person to check off steps of the associated task as they are performed, and track the progress of the task. These and other aspects will be described throughout.
Turning first to the overall architecture of an example process plant, FIG. 1A is a block diagram of an example process control network 100 operating within a process control system or process plant 10. The process control network 100 may include a backbone network 105 that provides direct or indirect connections between various other devices. In various embodiments, the devices coupled to the backbone network 105 include the access point 72, the gateway 75 to other process plants (e.g., via an intranet or enterprise wide area network), the gateway 78 to external systems (e.g., to the internet), the UI devices 112, the server 150, the big data facility 102 (e.g., including big data historians), the big data expert system 104, the supervisor engine 106, the controller 11, the input/output (I/O) cards 26 and 28, the wired field devices 15-22, the wireless gateway 35, and the wireless communication network 70. The communication network 70 may include wireless devices 40-58 including wireless field devices 40-46, wireless adapters 52a and 52b, access points 55a and 55b, and a router 58. Wireless adapters 52a and 52b may be connected to non-wireless field devices 48 and 50, respectively. The controller 11 may include a processor 30, a memory 32, and one or more control routines 38. Although FIG. 1A shows only a single one of some of the devices connected to the backbone network 105, it should be understood that each device may have multiple instances on the backbone network 105, and in practice, the process plant 10 may include multiple backbone networks 105.
The UI device 112 may be communicatively connected to the controller 11 and the wireless gateway 35 via the backbone network 105. The controller 11 may be communicatively connected to the wired field devices 15-22 via the input/output (I/O) cards 26 and 28 and may be communicatively connected to the wireless field devices 40-46 via the backbone network 105 and the wireless gateway 35. The controller 11 may operate using at least some of the field devices 15-22 and 40-46 to implement a batch process or a continuous process. The controller 11, which may be, by way of example, the DeltaV™ controller sold by Emerson Process Management, is communicatively coupled to the process control backbone network 105. The controller 11 may also be communicatively coupled to the field devices 15-22 and 40-46 using any desired hardware and software associated with, for example, standard 4-20 mA devices, the I/O cards 26, 28, and/or any smart communication protocol (e.g., the FOUNDATION® Fieldbus protocol, the HART® protocol, the WirelessHART® protocol, etc.). In the embodiment illustrated in FIG. 1A, the controller 11, the field devices 15-22, and the I/O cards 26, 28 are wired devices, and the field devices 40-46 are wireless field devices.
In operation of UI device 112, in some embodiments, UI device 112 may execute a user interface ("UI"), allowing UI device 112 to accept input via an input interface and provide output at a display. UI device 112 may receive data (e.g., process-related data such as process parameters, log data, sensor data, and/or any other data that may be captured and stored in big data facility 102) from server 150. In other embodiments, the UI may be performed in whole or in part at the server 150, where the server 150 may transmit the display data to the UI device 112. The UI device 112 may receive UI data (which may include display data and process parameter data) from other nodes in the process control network 100 (e.g., the controller 11, the wireless gateway 35, or the server 150) via the backbone network 105. Based on the UI data received at the UI device 112, the UI device 112 provides output (i.e., visual representations or graphics) representing various aspects of a process associated with the process control network 100, allowing a user to monitor the process. The user may also affect control of the process by providing input at the UI device 112. For illustration purposes, the UI device 112 may provide graphics representing, for example, a tank filling process. In such a scenario, a user may read the tank level measurement and determine that the tank needs to be filled. The user may interact with the inlet valve graphic displayed at UI device 112 and input a command that causes the inlet valve to open.
In further operation, the UI device 112 may execute a number of routines, modules, or services in addition to the UI. In one embodiment, the UI device 112 may execute a context awareness routine, which may include, for example, various subroutines related to location awareness, equipment awareness, or schedule awareness (as shown in FIG. 27). These context routines may enable the UI device 112 to render a graphical user interface configuration ("GUI configuration") suited to the particular environment or situation in which the UI device 112 is operating. The UI device 112 may also execute a state determination routine that enables the UI device 112 to track and save the state of the UI device 112, including the state of the applications (e.g., the UI) executing at the UI device 112. By tracking the state of the applications on the UI device 112, the UI device 112 may allow a user, for example, to initiate a session on a first UI device 112 and then move to a second UI device 112 and resume the workflow of the previous session with minimal interruption.
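The state-tracking idea can be sketched minimally as follows: a server keeps the last UI state reported for each user so that a session begun on one device can resume on another. The state shape and field names here are invented for illustration:

```python
# Server-side session store: user -> last reported UI state.
ui_state_by_user: dict = {}

def save_state(user: str, state: dict) -> None:
    """Called (e.g., periodically) by the UI running on the user's current device."""
    ui_state_by_user[user] = state

def resume_session(user: str) -> dict:
    """Called when the same user authenticates on a different UI device."""
    return ui_state_by_user.get(user, {"display": "home", "zoom": None})

# A session starts at a workstation...
save_state("operator1", {"display": "tank_farm", "zoom": "TK-101",
                         "open_trends": ["LIC-101/PV"]})
# ...and a tablet later picks up exactly where the workstation left off.
print(resume_session("operator1"))
```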
The UI device 112 (or a server providing applications or screens to the UI device 112) may also execute routines related to managing plant assets. For example, some routines may be used to install, replace, maintain, calibrate, diagnose, or commission assets in the process plant. Other routines may be used to prepare or complete work orders associated with particular assets and/or to notify plant personnel (e.g., personnel located near particular equipment) of the work orders. The UI device 112 may execute routines related to monitoring the process. For example, some routines may be used to log instrument data in the field, report laboratory samples, display real-time asset parameters, and the like. The UI device 112 may further execute routines related to compliance with plant procedures and workflows. For example, some routines may provide information related to standard operating procedures (SOPs), start-up procedures, shut-down procedures, lockout procedures, work instructions, or other product/asset documentation. Still other routines may facilitate immediate delivery of work orders and immediate availability to the system of offline, manually entered data once the UI device 112 is coupled to a network. Communication routines may include email routines, text messaging routines, instant messaging routines, etc., to facilitate communication between plant personnel and/or external parties providing technical or other support.
The UI device 112 (or a server that provides applications or screens to the UI device 112) may also include routines that support and/or facilitate one or more audit processes. The audit processes may include, for example, work audits and/or regulatory audits. In embodiments, the routines may allow a user to view data and/or generate reports related to data collected, maintained, and/or verified for the purpose of satisfying regulatory requirements. For purposes of illustration, where the mobile control room is implemented in a pharmaceutical plant, the mobile control room may facilitate viewing or reporting of collected data for the purpose of meeting governmental requirements related to the safety of the plant's product output. In embodiments, the routines may allow a user to view and/or generate reports related to auditing of work orders, maintenance, or other plant processes.
In particular embodiments, the UI device 112 may implement any type of client (e.g., a thin client, a web client, or a thick client). For example, the UI device 112 may depend on other nodes, computers, or servers for the bulk of the processing necessary for the operation of the UI device 112. In such an example, the UI device 112 may communicate with the server 150, where the server 150 may communicate with one or more other nodes on the process control network 100 and may determine the display data and/or process data to transmit to the UI device 112. Furthermore, the UI device 112 may pass any data related to received user input to the server 150 so that the server 150 may process the data related to the user input and operate accordingly. In other words, the UI device 112 may do little more than render graphics and operate as a portal to one or more nodes or servers that store the data and execute the routines necessary for the operation of the UI device 112. A thin-client UI device offers the advantage of minimal hardware requirements for the UI device 112.
In other embodiments, the UI device 112 may be a web client. In such an embodiment, a user of the UI device 112 may interact with the process control system via a browser on the UI device 112. The browser enables the user to access data and resources at another node or server (e.g., the server 150) via the backbone network 105. For example, the browser may receive UI data (e.g., display data or process parameter data) from the server 150, allowing the browser to depict graphics for controlling and/or monitoring some or all of the process. The browser may also receive user input (e.g., a mouse click on a graphic). The user input may cause the browser to retrieve or access an information resource stored on the server 150. For example, the mouse click may cause the browser to retrieve (from the server 150) and display information pertaining to the clicked graphic.
In another embodiment, a large amount of processing for the UI device 112 may be performed at the UI device 112. For example, UI device 112 may execute the previously discussed UI, state determination routines, and context awareness routines. UI device 112 may also store, access, and analyze data locally.
In operation, a user may interact with the UI device 112 to monitor or control one or more devices in the process control network 100, such as any of the field devices 15-22 or the devices 40-48. The user may interact with the UI device 112, for example, to modify or change a parameter associated with a control routine stored in the controller 11. The processor 30 of the controller 11 implements or oversees one or more control routines (stored in the memory 32), which may include control loops. The processor 30 may communicate with the field devices 15-22 and 40-46 and with other nodes communicatively connected to the backbone network 105. It should be noted that any of the control routines or modules described herein (including the quality prediction and fault detection modules or function blocks) may have parts thereof implemented or executed by different processors or other devices, if so desired. Likewise, the control routines or modules described herein which are to be implemented within the process control system 10 may take any form, including software, firmware, hardware, and so forth. The control routines may be implemented in any desired software format, such as using object-oriented programming, ladder logic, sequential function charts, function block diagrams, or using any other software programming language or design paradigm. In particular, the control routines may be implemented by a user through the UI device 112. The control routines may be stored in any desired type of memory, such as random access memory (RAM) or read-only memory (ROM). Likewise, the control routines may be hard-coded into, for example, one or more EPROMs, EEPROMs, application-specific integrated circuits (ASICs), or any other hardware or firmware elements. Thus, the controller 11 may be configured (in particular embodiments, by a user using the UI device 112) to implement a control strategy or control routine in any desired manner.
In some embodiments of the UI device 112, a user may interact with the UI device 112 to implement a control strategy at the controller 11 using what are commonly referred to as function blocks, where each function block is an object or other part (e.g., a subroutine) of an overall control routine and operates in conjunction with other function blocks (via communications called links) to implement process control loops within the process control system 10. Function blocks typically perform one of an input function (e.g., a function associated with a transmitter, a sensor, or another process parameter measurement device), a control function (e.g., a function associated with a control routine that performs PID, fuzzy logic, or other control), or an output function that controls the operation of some device (e.g., a valve) to perform some physical function within the process control system. Of course, hybrid and other types of function blocks exist. The function blocks may have graphical representations provided at the UI device 112 that allow a user to easily modify the types of function blocks, the connections between the function blocks, and the inputs/outputs associated with each of the function blocks implemented in the process control system. The function blocks may be stored in and executed by the controller 11, which is typically the case when these function blocks are used for, or are associated with, standard 4-20 mA devices and some types of smart field devices (e.g., HART devices), or may be stored in and implemented by the field devices themselves, which may be the case with Fieldbus devices. The controller 11 may include one or more control routines 38 that may implement one or more control loops. Each control loop is typically referred to as a control module and may be performed by executing one or more of the function blocks.
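To make the function-block model concrete, the following is a minimal, self-contained Python sketch of an input block, a control block, and an output block wired together as one "control module" that is executed once per scan. The block interfaces and the PI-only controller are simplifications for illustration, not the actual DeltaV or Fieldbus function-block API:

```python
class AnalogInput:
    """Input function block: produces a process measurement."""
    def __init__(self, read):
        self.read = read
    def out(self) -> float:
        return self.read()

class PIControl:
    """Control function block (proportional-integral only, for brevity)."""
    def __init__(self, sp: float, kp: float, ki: float, dt: float):
        self.sp, self.kp, self.ki, self.dt = sp, kp, ki, dt
        self._integral = 0.0
    def out(self, pv: float) -> float:
        error = self.sp - pv
        self._integral += error * self.dt
        return self.kp * error + self.ki * self._integral

class AnalogOutput:
    """Output function block: drives a device such as a valve."""
    def __init__(self, write):
        self.write = write
    def set(self, value: float) -> None:
        self.write(max(0.0, min(100.0, value)))  # clamp to 0-100 %

# The "links": block outputs feeding block inputs, scanned periodically.
tank_level = 42.0
ai = AnalogInput(lambda: tank_level)
pid = PIControl(sp=50.0, kp=2.0, ki=0.1, dt=1.0)
ao = AnalogOutput(lambda pct: print(f"valve -> {pct:.1f} %"))

for _ in range(3):        # three controller scans
    ao.set(pid.out(ai.out()))
```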
In an embodiment, the UI device 112 interacts with the big data facility 102 and/or the expert system 104 and/or the supervisor engine 106. The big data facility 102 may collect and store all types of process control data from the process plant 10, including sensor data, control parameters, manually entered data (e.g., data collected by personnel as they move around the process plant 10), personnel locations and command inputs, timestamps associated with all of the data, and any other type of data available in the process plant 10. The expert system 104, communicatively coupled to the big data facility 102, may operate independently or according to specific user inputs to analyze the process plant data stored in the big data facility 102. The expert system 104 may develop and/or use models, recognize data trends and/or correlations, and alert plant personnel to predicted or actual problems, abnormal situations, and/or sub-optimal conditions that may be affecting, or will soon affect, the process plant 10. In some embodiments, the expert system 104 performs these functions without being specifically programmed to associate a particular set of data or trends with a particular problem or condition; instead, the expert system 104 recognizes that a current data trend or concurrence has previously occurred at or near in time to a previous condition (which may be a positive/desirable condition or a negative/undesirable condition). From the recognition of such a trend or concurrence of data, the expert system 104 can predict the condition ("prognostics"). The expert system 104 may also determine from the data stored in the big data facility 102 which process variables, sensor readings, and so on (i.e., which data) are most important for detecting, predicting, preventing, and/or correcting an abnormal situation in the process plant 10. For example, the expert system 104 may determine that hydrocarbons are being vented from a stack, and may automatically determine the cause of the hydrocarbon venting and/or cause (e.g., via the supervisor engine 106) a work item to be generated to correct the problem causing the venting and/or a work item to be generated to check equipment or observe/record parameters that are not available via the network. As another example, the expert system 104 may determine that a trend indicated by a series of previous data points indicates a predicted abnormal situation, a predicted maintenance concern, a predicted failure, and the like.
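As a toy stand-in for the expert system's trend recognition, the sketch below flags a reading that departs sharply from a recent window of historized values; real expert-system analytics (models, correlations, prognostics) are far richer, and this threshold rule is purely illustrative:

```python
import statistics

def abnormal_trend(history: list, window: int = 10, z: float = 3.0) -> bool:
    """Flag the newest sample if it lies more than z standard deviations
    from the mean of the preceding `window` samples."""
    if len(history) <= window:
        return False                      # not enough history yet
    baseline = history[-window - 1:-1]
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return sigma > 0 and abs(history[-1] - mu) > z * sigma

# Hypothetical historized readings for one process variable:
readings = [10.1, 10.0, 9.9, 10.2, 10.0, 10.1, 9.8, 10.0, 10.1, 10.0, 14.7]
if abnormal_trend(readings):
    print("expert system: abnormal trend detected -> notify supervisor engine")
```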
As described in detail below, the supervisor engine 106 may interact with the big data facility 102 and/or the expert system 104 to automatically perform and/or facilitate various supervisory activities. For example, the supervisor engine 106 may monitor trends identified by the expert system 104 and create work items for plant personnel. As another example, the supervisor engine 106 may monitor the calibration status of process plant resources and create work items for plant personnel. In connection with these functions, the supervisor engine 106 may also manage personnel certifications, permissions to access equipment during the performance of scheduled work items, and the timing of work item performance. The supervisor engine 106 may interact with the UI devices 112 to assign and track the execution of work items and, after a work item is complete, to verify that the indication or condition that led to the creation of the work item (e.g., an identified trend, anomaly, etc.) has been resolved. For example, the supervisor engine 106 may, from information provided by the expert system 104, determine that a valve has failed and create a work item accordingly. The supervisor engine 106 may later determine that a maintenance worker carrying a UI device 112 is in the vicinity of the failed valve and may offer the work item to that worker, who may accept the work item via the UI device 112. The supervisor engine 106 may verify that the maintenance worker has the appropriate skill set to perform the work item and may grant the worker the permissions necessary to perform the work item. Additionally, the supervisor engine 106 may reschedule process control activities so that the work item can be completed. Before or during the performance of the work item, the supervisor engine 106 may provide the worker with standard operating procedures, manuals, and other documentation. These are but a few examples of the functions of the supervisor engine 106, which are explained further below.
Still referring to FIG. 1A, the wireless field devices 40-46 communicate in a wireless network 70 using a wireless protocol, such as the WirelessHART protocol. In particular embodiments, the UI device 112 is capable of communicating with the wireless field devices 40-46 using the wireless network 70. Such wireless field devices 40-46 may communicate directly with one or more other nodes of the process control network 100 that are also configured to communicate wirelessly (e.g., using a wireless protocol). To communicate with one or more other nodes not configured for wireless communication, the wireless field devices 40-46 may utilize a wireless gateway 35 connected to a backbone network 105. Of course, the field devices 15-22 and 40-46 may conform to any other desired standard or protocol, such as any wired or wireless protocol, including any standard or protocol developed in the future.
The wireless gateway 35 is an example of a provider device 110 that may provide access to the various wireless devices 40-58 of the wireless communication network 70. In particular, the wireless gateway 35 provides communicative coupling between the wireless devices 40-58 and other nodes of the process control network 100, including the controller 11 of FIG. 1A. In some cases, the wireless gateway 35 provides this communicative coupling by means of routing, buffering, and timing services to lower layers of the wired and wireless protocol stacks (e.g., address translation, routing, packet segmentation, prioritization, etc.) while tunneling one or more shared layers of the wired and wireless protocol stacks. In other cases, the wireless gateway 35 may translate commands between wired and wireless protocols that do not share any protocol layers. In addition to protocol and command translation, the wireless gateway 35 provides synchronized timing for the time slots and superframes (sets of communication time slots spaced equally in time) of the scheduling scheme associated with the wireless protocol implemented in the wireless network 70. Furthermore, the wireless gateway 35 may provide network management and administration functions for the wireless network 70, such as resource management, performance tuning, network fault mitigation, traffic monitoring, security, and the like.
Like the wired field devices 15-22, the wireless field devices 40-46 of the wireless network 70 may perform physical control functions within the process plant 10, such as opening or closing valves or measuring process parameters. The wireless field devices 40-46, however, are configured to communicate using the wireless protocol of the network 70. As such, the wireless field devices 40-46, the wireless gateway 35, and the other wireless nodes 52-58 of the wireless network 70 are producers and consumers of wireless communication packets.
In some cases, the wireless network 70 may include non-wireless devices. For example, the field device 48 of FIG. 1A may be a legacy 4-20 mA device, while the field device 50 may be a traditional wired HART device. To communicate within the network 70, the field devices 48 and 50 may be connected to the wireless communication network 70 via a wireless adapter (WA) 52a or 52b. Additionally, the wireless adapters 52a, 52b may support other communication protocols, such as FOUNDATION® Fieldbus, PROFIBUS, DeviceNet, and so on. Furthermore, the wireless network 70 may include one or more network access points 55a, 55b, which may be separate physical devices in wired communication with the wireless gateway 35 or may be provided with the wireless gateway 35 as an integral device. The wireless network 70 may also include one or more routers 58 to forward packets from one wireless device to another within the wireless communication network 70. The wireless devices 40-46 and 52-58 may communicate with each other and with the wireless gateway 35 over the wireless links 60 of the wireless communication network 70.
Accordingly, FIG. 1A includes several examples of provider equipment that is primarily used to provide network routing functions and oversight to various networks of process control systems. For example, the wireless gateway 35, the access points 55a, 55b, and the router 58 include functionality to route wireless packets within the wireless communication network 70. The wireless gateway 35 performs traffic management and policing functions for the wireless network 70 and routes traffic to and from a wired network communicatively connected to the wireless network 70. The wireless network 70 may utilize a wireless process control protocol (e.g., WirelessHART) that specifically supports process control messages and functions.
In some embodiments, the process control network 100 may include other nodes connected to the backbone network 105 that communicate using other network protocols. For example, the process control network 100 may include one or more wireless access points 72 that use other wireless protocols, such as WiFi or other IEEE 802.11-compliant wireless local area network protocols, mobile communication protocols such as WiMAX (Worldwide Interoperability for Microwave Access), LTE (Long Term Evolution), or other ITU-R (International Telecommunication Union Radiocommunication Sector) compatible protocols, short-wavelength radio communications such as near field communication (NFC) and Bluetooth, or other wireless communication protocols. Generally, such wireless access points 72 allow handheld or other portable computing devices to communicate over a respective wireless network that is different from the wireless network 70 and that supports a different wireless protocol than the wireless network 70. In some embodiments, the UI devices 112 communicate over the process control network 100 using the wireless access points 72. In some scenarios, in addition to the portable computing devices, one or more process control devices (e.g., the controller 11, the field devices 15-22, or the wireless devices 35, 40-58) may also communicate using the wireless network supported by the access points 72.
Additionally or alternatively, the provider equipment may include one or more gateways 75, 78 to systems located outside of the immediate process control system 10. In such embodiments, the UI device 112 may be used to control, monitor, or communicate with the external systems. Generally, such systems are consumers or providers of information generated or operated on by the process control system 10. For example, the plant gateway node 75 may communicatively connect the immediate process plant 10 (having its own respective process control data backbone 105) with another process plant having its own respective backbone. In an embodiment, a single backbone network 105 may serve multiple process plants or process control environments.
In another example, the plant gateway node 75 may communicatively connect a current process plant to a legacy or prior art process plant that does not include the process control network 100 or the backbone network 105. In this example, the plant gateway node 75 may convert or translate messages between the protocol utilized by the process control big data backbone 105 of the plant 10 and a different protocol utilized by legacy systems (e.g., Ethernet, Profibus, Fieldbus, DeviceNet, etc.). In such an example, the UI device 112 may be used to control, monitor, or communicate with a system or network within the legacy or prior art process plant.
The provider equipment may include one or more external system gateway nodes 78 to communicatively connect the process control network 100 with the network of an external public or private system, such as a laboratory system (e.g., a laboratory information management system or LIMS), an operator rounds database, a materials handling system, a maintenance management system, a product inventory control system, a production scheduling system, a weather data system, a shipping and handling system, a packaging system, the Internet, another provider's process control system, or other external systems. The external system gateway node 78 may, for example, facilitate communication between the process control system and personnel outside of the process plant (e.g., personnel at home). In one such example, an operator or maintenance technician may use the UI device 112 from her home to connect to the backbone network 105 via a home network (not shown), the Internet, and the gateway 78. In another example, an operator or maintenance technician may use the UI device 112 from any location to connect to the backbone network 105 via a mobile telephone network (not shown), the Internet, and the gateway 78. The gateway node 78 may also facilitate communication between plant personnel within the process plant and entities or people outside of the process plant. For example, a technician servicing a process control device in the process plant may communicate from her UI device 112 with a support representative of the manufacturer of the process control device. In yet another example, the supervisor engine 106 may monitor weather, track incoming supply shipments, track financial data (e.g., commodity futures), and the like, to assist the supervisor engine 106 in scheduling work items, managing production schedules, and so on. Of course, any connection formed via the gateway 78 (or the gateway 75 or, in fact, between any two devices) may be a secured connection (e.g., an encrypted connection, a firewalled connection, etc.).
Although FIG. 1A illustrates a single controller 11 having a limited number of field devices 15-22 and 40-46, this is merely an illustrative and non-limiting embodiment. Any number of controllers 11 may be included in the provider equipment of the process control network 100 and any one of the controllers 11 may communicate with any number of wired or wireless field devices 15-22, 40-46 to control a process within the plant 10. Additionally, the process plant 10 may include any number of wireless gateways 35, routers 58, access points 55, wireless process control communication networks 70, access points 72, and/or gateways 75, 78.
FIG. 1B is a block diagram illustrating a broader control system 120, which may include a variety of different systems or system functions. The control system 120 includes the process plant 10, which may be, by way of non-limiting example, a crude oil refinery. The system 120 may also be coupled to sub-sea systems 122, such as drilling or exploration systems. Various safety systems 124 may likewise be included in the system 120, as may fire and gas systems 126, monitoring systems 128, and transport systems 130 (e.g., for transporting crude oil to the refinery). While FIG. 1B depicts each of the elements 10 and 122-130 as a separate aspect, it is noted that various ones of these aspects may be combined. For example, in some embodiments the process plant 10 may include the safety systems 124 and/or the fire and gas systems 126. FIG. 1B is intended to illustrate that the present description is not limited in scope to the process plant described with respect to FIG. 1A and may be applicable to other control, monitoring, and safety systems, and the like. While the present description describes embodiments in terms of the process plant 10, this convention is for convenience only and is not intended to be limiting.
The following examples illustrate a number of scenarios in which the concepts described in this specification are implemented in a process plant, such as the process plant 10, and highlight the advantages of such implementations.
Example 1
A first user assigned to a particular area of the plant may monitor the assigned plant area via a stationary workstation in the control room. The first user monitors and controls the process via a browser or other application executing on the workstation, which browser or application communicates with a routine executing at a server. The first user may decide to go out into the process plant, for example, to inspect the plant. As the user leaves the control room, the user may pick up a touch-sensitive tablet device (i.e., a second, mobile user interface device) and walk out of the control room and into the plant. The tablet, like the workstation, enables the first user to access the routine at the server via a browser or application executing on the tablet device. The first user may already be authenticated on the tablet, or the tablet may be associated with the first user. The tablet communicates with the server to establish a session unique to the first user. The server may store state information associated with the first user at the workstation and, via the browser or application operating on the tablet, provide the first user a user interface according to the stored state information. In this manner, the first user is able to resume the workflow initiated at the workstation.
In some circumstances, the routine operating on the mobile device may generate a route for the first user. The routine, which may cooperate with the expert and/or supervisory systems, may identify plant assets that need to be monitored or serviced. In some cases, a priority associated with each asset needing monitoring or service may indicate the urgency of the monitoring or service. The routine may determine a route for the first user that allows the user to efficiently visit at least some of the assets that need to be monitored or serviced.
As the first user moves through the plant, a context awareness routine executing at the tablet device receives data from various sensors and receivers in the tablet device (e.g., an NFC or RFID transceiver). The sensors and receivers detect devices, equipment, and/or tags in proximity to the tablet. In other embodiments, the tablet may have a GPS receiver for receiving location data and may upload the location data to the server so that the executing routine is aware of the user's location. In any event, the routine may identify the location of the tablet or its proximity to certain devices and cause the tablet to display, for the first user, a graphic of the overview map of the process plant zoomed to the approximate location of the first user and the tablet device. As the first user moves through the plant, the plant map display may dynamically change to focus on the area of the map corresponding to the tablet's location.
In some instances, the plant map may include navigation functionality. For example, the first user may select a particular plant area/device/asset as a destination. The routine may then use the location data (e.g., received from the GPS receiver) to provide directions to the particular plant area/device/asset.
The tablet device may also display various process data or alarms as the first user walks through the plant 10. For example, the first user may pass a pump, causing the tablet device to display operational data, graphics, and alarms associated with the pump (particularly where the pump requires attention). The tablet device may receive a unique identifier, for example, from an NFC or RFID tag on or near the pump. The tablet device may send the unique identifier to the routine via the server. The routine may receive the unique identifier and access a database that associates unique identifiers with entities within the process plant. For example, the unique identifier may be associated with pump data, such as display data, parameter data, and alarm data associated with the pump. After identifying the pump data, the routine may send the pump data to the tablet device, causing the tablet device to display the graphics, parameters, and/or alarms associated with the pump.
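By way of illustration only, the following Python sketch shows the kind of identifier-to-entity lookup described above. The tag identifier, database contents, and record layout are hypothetical.

# Illustrative sketch only; the identifier-to-entity mapping, tag IDs,
# and record layout are hypothetical.
from typing import Optional

PLANT_ENTITY_DB = {
    "rfid:0x4A21": {
        "entity": "pump_07",
        "display_data": "pump_07_faceplate",
        "parameters": ["flow_rate", "discharge_pressure"],
        "alarms": ["seal_leak", "high_vibration"],
    },
}

def resolve_identifier(unique_id: str) -> Optional[dict]:
    # Map a unique identifier received from an NFC/RFID tag to the
    # process entity with which it is associated, as the routine at
    # the server might do before sending display data to the tablet.
    return PLANT_ENTITY_DB.get(unique_id)

record = resolve_identifier("rfid:0x4A21")
if record is not None:
    print("render", record["display_data"], "alarms:", record["alarms"])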
In another scenario, the first user may recognize that the pump is malfunctioning. The first user may interact with a pump graphic or menu shown on the tablet display, and may touch the tablet display at the location of a graphic representing a shutdown command. The tablet device may detect the first user's input (e.g., a capacitive touch input) and generate corresponding input data. The tablet device may then send the input data to the server, which receives the input data and sends a shutdown signal to the controller that controls the pump. The controller receives the signal and shuts down the pump. The first user may also create a task or work item associated with the pump. For example, the work item may be a request for maintenance personnel to inspect and/or repair the pump.
The routines on the tablet device may also facilitate a lockout/tagout procedure. For example, the routine may display the appropriate lockout/tagout procedure for the particular pump. In some instances, the first user, desiring to lock out the pump for safety reasons, may interact with a task list displayed by the tablet device to indicate, for example, that a particular task in the lockout procedure has been completed. In other instances, the first user may interact with the routine to test a fail-safe condition of the pump. For example, a simulation signal may be generated to simulate a fail-safe condition, allowing the first user to observe the response of the pump.
Example 2
The first user, still carrying the tablet device, may begin walking from the process plant back toward the control room of the process plant. The first user may pass a boiler. As the first user comes into proximity with the boiler, the tablet device establishes RFID communication with a boiler context ID device. The tablet device may receive a unique identifier from the context ID device and send the unique identifier to the server. The server may identify the boiler based on the unique identifier. The server may access context data to determine that the boiler has an associated work item, and may compare a skill threshold associated with the work item to a skill level associated with the profile of the first user. Upon determining that the first user is not qualified to work on the work item associated with the boiler, the server may refrain from updating the display of the tablet device with information related to the work item.
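By way of illustration only, the following Python sketch shows the kind of skill-threshold comparison described above; the field names and values are hypothetical.

# Illustrative sketch only; profile and work item field names are hypothetical.
def should_display_work_item(work_item: dict, profile: dict) -> bool:
    # Return True only if the passing user's skill level meets the skill
    # threshold associated with the work item, as the server might check
    # before updating the tablet's display.
    return profile.get("skill_level", 0) >= work_item.get("skill_threshold", 0)

boiler_work_item = {"target": "boiler_03", "skill_threshold": 4}
first_user = {"name": "user_1", "skill_level": 2}
print(should_display_work_item(boiler_work_item, first_user))  # False: withhold the display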
The first user may continue walking through the plant, still carrying the tablet device, and may walk past a valve. As described above, the tablet device may establish communication with a valve context ID device. The tablet device may then receive the unique identifier from the device and send the unique identifier to the server. The server may identify the valve based on the unique identifier. The server may then access the context data to determine that the valve has an associated schedule indicating that the valve is currently scheduled to be taken out of service for maintenance. The server sends data to the tablet device causing the tablet device to provide information to the first user indicating that the valve is currently scheduled for maintenance.
Example 3
The first user continues walking through the plant, still holding the tablet device. A second user, located in the control room and now logged into the workstation previously occupied by the first user (or a different workstation), may notice that a critical O2 gas measurement associated with a furnace is falling. The second user creates a work item requesting assistance with the furnace. As the first user passes the furnace on his or her way back to the control room, the tablet device may automatically establish communication with a furnace context ID device, so that the tablet device receives a unique identifier associated with the furnace. The tablet device may send the unique identifier to the server, which may return information associated with the unique identifier (e.g., information about the furnace), including a notification graphic indicating that the furnace needs attention. The first user can see and select the notification graphic so that information related to the created work item is displayed. The first user may select a graphic to indicate acceptance of the work item.
The work item may request that the first user take one or more pictures of the flame at the furnace (e.g., because the flame color may indicate insufficient airflow). The picture of the flame may be sent to the server. An analysis routine operating on the server, at the big data facility, or as part of the expert system may analyze various aspects of the image, or may compare the image of the flame to other images, stored at the big data system or facility, taken at other times and/or under other conditions. The analysis routine may analyze the image (e.g., by comparing a set of previous flame images with corresponding operational data). The big data analysis routine may indicate that the airflow at the furnace is low. Based on the analysis, the expert system may direct the first user to increase the airflow to the furnace. In some embodiments, the first user may use the tablet device to retrieve and display an operational procedure for increasing the airflow to the furnace and, indeed, in some embodiments, when the expert system directs the user to increase the airflow, the tablet device may automatically display the procedure. If desired, the first user may take additional images of the flame after the adjustment and send them to the analysis routine to confirm that the furnace is operating properly.
The first user may also use the tablet device to capture audio associated with the furnace and send the audio to the server, the big data facility, or the expert system. An analysis routine operating on the expert system, for example, may compare the audio to a sound signature associated with the furnace to determine whether the furnace is operating normally. The analysis routine may also compare the captured audio to audio associated with known problems. For example, a belt or motor problem may be associated with a particular sound, and the analysis routine may detect such a problem by comparing the captured audio to that sound. Similarly, the first user may place the tablet device on or near the furnace to detect vibrations associated with the furnace. The tablet device may receive vibration data via a motion sensor and send the vibration data to the server or big data facility. An analysis routine may compare the detected vibrations to signature vibration levels associated with the furnace (or to vibration levels associated with known problems) to determine whether the furnace is operating normally. In any event, the analysis of the audio/vibration data may reveal that there are no other problems associated with the furnace and/or confirm that the furnace requires increased airflow.
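By way of illustration only, the following Python sketch shows one simple stand-in for the kind of sound-signature comparison described above, namely a normalized band-energy comparison; the actual analysis routine is not specified here, and the tolerance and sample values are hypothetical.

# A minimal sketch, assuming captured audio is available as a NumPy array;
# the band-energy comparison is one simple stand-in for signature matching.
import numpy as np

def band_energies(samples: np.ndarray, n_bands: int = 8) -> np.ndarray:
    # Summarize a clip as the normalized energy in each frequency band.
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    energies = np.array([band.sum() for band in np.array_split(spectrum, n_bands)])
    return energies / energies.sum()

def matches_signature(captured: np.ndarray, signature: np.ndarray,
                      tolerance: float = 0.1) -> bool:
    # Flag abnormal operation when the captured spectrum deviates from
    # the known-good sound signature by more than the tolerance.
    diff = band_energies(captured) - band_energies(signature)
    return float(np.abs(diff).max()) <= tolerance

# Synthetic example: a 120 Hz hum stands in for the normal signature.
t = np.linspace(0.0, 1.0, 8000, endpoint=False)
normal_hum = np.sin(2 * np.pi * 120 * t)
print(matches_signature(normal_hum, normal_hum))  # True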
As the first user increases the airflow at the furnace, the second user may query the big data system to determine whether previous users have also increased the airflow over the past several shifts. The query confirms that this is the case. The second user may generate a graph showing the airflow through the furnace, using event information indicating each airflow increase, who made the change, and so on (all of which are stored in the big data facility). The second user may share the information with the first user, for example, by requesting a shared user-interface (UI) session. The first user may receive the request for the UI session via the server. If the first user accepts the request, the server may capture state information associated with the UI displayed to the second user and may cause the display of the tablet device being used by the first user to display data according to the state information from the second user. The first and second users may together review the data relating to the furnace and may determine that the furnace frequently experiences similar problems. The second user may then query the big data system about the conditions surrounding low O2 gas measurement events at the furnace. The big data system may provide a set of events, devices, users, times, and other factors associated with the low O2 gas measurement events at the furnace. For example, the big data analysis may reveal that the low O2 gas measurements are significantly correlated with events at an associated process unit, where the associated events frequently precede the low O2 gas measurements. In another example, the analysis may reveal that a particular user is significantly associated with the low O2 gas measurement events; in other words, the analysis may reveal that the particular user is controlling the furnace in a manner that results in low O2 gas measurements. Although this example illustrates users utilizing UI devices to request analyses and display the analysis results, it should be noted that the big data system may also use data from, and collected by, the UI devices (in this scenario, the tablet device) for other analyses that may or may not be associated with the UI devices. In any event, the second user may mark the work item for future inspection and create a maintenance ticket so that someone can inspect the furnace at some point in the near future.
Example 4
At a later time, a maintenance person may inspect the furnace and find that the furnace is operating improperly because of a problem at the point at which the fuel input pipe is coupled to the furnace, and may create a work item to correct the problem. The work item may have an associated task indicating that the fuel input pipe should be welded to the furnace fuel input, may specify the target equipment (i.e., the furnace), and may indicate the skill required to perform the task (i.e., welding skill). The work item may optionally specify a time frame for execution of the work item.
The supervisor module may schedule execution of the work item. For example, the supervisor module may schedule the execution of the work item for a day on which the plant (or the plant area in which the furnace is located) is scheduled to be offline for maintenance. Alternatively or additionally, the supervisor module may schedule personnel having the required skills according to their availability. Having identified a welder with the appropriate skills, the supervisor module may assign the work item to the welder and wait for the welder to accept the assigned work item. When the welder accepts the work item, the supervisor module creates a permission token that authorizes the welder to access the required plant functions and equipment at the time the work item is to be executed.
At the designated time, the welder may arrive at the equipment room with her assigned mobile user interface device, which may remind her that she is scheduled to perform the work item (i.e., welding the furnace connection). Upon her acknowledgment of the reminder, the UI device may display a checklist, generated by the supervisor module, that is associated with the work item. The checklist may remind the welder that she needs to take with her safety equipment (e.g., a welding mask and gloves), welding equipment (e.g., welding power supply, electrodes, filler material, etc.), and any other necessary items (e.g., replacement parts) needed to perform the task. The checklist may also specify particular tasks to be performed before moving to the target equipment. When the welder confirms to the supervisor module (e.g., via the user interface device) that she has all of the items on the checklist and has performed the specified tasks on the checklist, the welder may leave the equipment room.
After sensing that the welder has left the equipment room, the UI device may switch to a map or guidance mode and display information related to the welder's location within the process plant, as well as information guiding the welder to the target equipment (in this case the furnace). When the UI device senses that the welder has arrived at the furnace, the UI device may automatically display procedures (which may be provided, for example, by the supervisor module) related to the work item task. For example, the UI device may first display safety procedures and information allowing the welder to confirm that the work item task may be performed safely, such as information about what materials are normally carried by the fuel pipe to be welded to the furnace, what material last flowed through the pipe, whether the pipe has been evacuated, whether the pipe is currently in service, and whether any residual materials are detected in the pipe. The UI device may also guide the welder, step by step and/or using a graphical display, through purging any residual material from the pipe, to confirm that the welding procedure may be performed safely (e.g., without causing an explosion). The UI device may also provide instructions for, and facilitate, shutting off and/or locking out various parts of the system, such as pipes that may allow gas to flow into the furnace, valves upstream of the igniter, and any other devices that might subject the procedure, the welder, or the process plant to unnecessary risk. The welder may then follow the instructions or other guidance (if provided by the UI device) to perform the welding procedure, release any locks, and indicate to the supervisor module via the UI device that the procedure is complete, after which the supervisor module may automatically create a work item for another person to check the weld before the plant (or the relevant portion of the plant) resumes service.
The foregoing examples illustrate some of the advantages of the systems, devices, and methods described in the remainder of this specification.
It should be appreciated that the presently described concepts integrate with systems already implemented in process control plants. That is, in embodiments, implementing these concepts does not require an entirely new process control system; rather, the concepts can be integrated seamlessly with existing software and hardware elements in the plant.
Big data network
In some embodiments, the disclosed UI devices, servers, and routines may be implemented in a process control network that supports a big data infrastructure (i.e., a big data network). The big data network may support large-scale data mining and data analytics on process data. The big data network or system may also include a plurality of big data network nodes to collect and store all (or substantially all) data generated, received, and/or observed by devices included in and associated with the process control system or plant 10. The big data network may include a big data facility (e.g., the big data facility 102), which may include a unified, logical data storage area configured to store (sometimes using a common format) multiple types of data generated by or associated with the process control system, the process plant 10, and one or more processes controlled by the process plant 10. For example, the unified, logical data storage area may store time-stamped configuration data, continuous data, event data, plant data, data indicative of user actions, network management data, and data provided to or by systems external to the process control system or plant. Such data may also include data related to personnel; data related to raw and/or processed materials; data related to personnel restrictions, qualifications, and certifications; data related to calibration and maintenance schedules; and the like. The data collected by the big data network may include, for example, data logs that track personnel and the inputs received from those personnel. Such data is helpful for improving plant operation and efficiency. For example, the log data may be mined and analyzed by the expert system to provide valuable insight into operator inputs in various situations. The results may be used to improve operator training and/or to improve responses, whether automatic or manual, in various situations. In any event, such data is in many cases required for regulatory purposes.
The word "unified" as used herein, when applied to a logical data storage area of the big data facility 102, is not intended to represent a single storage device. As is generally known, a plurality (in fact many) of storage devices of a first size (or various first sizes) may be communicatively coupled to form a storage area of a second, larger size. However, for the purposes of this description, these are also considered "unified" logical data stores. Generally, big data facility 102 is configured to receive data from big data network nodes of a big data network (e.g., via streaming and/or via some other protocol) and store the received data. As such, the process control big data facility 102 may include a unified, logical data storage area for historian or storage of data received from big data nodes, a plurality of facility data receivers for receiving data, and a plurality of facility request servers (as described in U.S. patent application 13/784,041, which is incorporated herein by reference for all purposes).
The process control big data system may automatically collect all data generated at, received by, or obtained by the nodes (as the data is generated, received, or obtained) and may cause the collected data to be delivered, with high fidelity (e.g., without using lossy data compression or any other technique that may cause loss of original information), to the process control system big data facility 102 for storage (and, optionally, to other nodes of the network). The process control system big data system may also be capable of providing sophisticated data and trend analyses on any portion of the stored data. For example, the process control big data system may be able to provide automatic data analysis across process data (which, in prior art process control systems, is contained in different database silos) without requiring any a priori configuration and without requiring any translation or conversion. Based on the analyses, the process control system big data system may be able to automatically provide in-depth knowledge discovery and may suggest changes to, or additional entities for, the process control system. Additionally or alternatively, the process control system big data system may perform actions (e.g., prescriptive, predictive, or both) based on the knowledge discovery. The process control system big data system may also enable and assist users in performing manual knowledge discovery and in planning, configuring, operating, maintaining, and optimizing the process plant and the resources associated therewith.
Expert system
The expert system 104 is a collection of routines and/or modules configured to access and analyze data collected and stored by the big data facility 102. Although illustrated and described herein as a module separate from the big data facility 102, in some embodiments the expert system 104 may be integrated within the big data facility 102. Further, the expert system 104 may include a number of modules or routines operating in different process areas and/or different process equipment. For example, expert system functionality may reside in one or more of the controllers 11, in one or more of the process control devices 15-22, and so on. In any event, the expert system 104 uses the data collected and stored by the big data facility 102 to identify trends, perform diagnostics, monitor operator inputs, improve modeling of the process plant and/or of portions of the process plant, monitor material supplies, monitor output quality and quantity, model various aspects of the operation of the plant, and perform countless other activities. The expert system 104 may perform analyses of the collected data using predefined models and/or may proactively (and possibly automatically) generate models based on analysis of the data. The expert system 104 may perform many different types of analysis, some examples of which are provided below. The examples are not intended to limit the scope of the functionality of the expert system 104, but rather to illustrate a portion of the possible functionality.
In one example, the expert system 104 monitors data collected and stored by the big data facility 102 (either in real time or after collection and storage) and performs data analysis related to a particular alarm or alarm type. The expert system 104 may be programmed to analyze process parameters, process inputs, sensor data, and any other data stored in the big data facility 102 to determine any common characteristics (trends, values, etc.) associated with a particular alarm. The association may be a temporal one, though not necessarily concurrent with the alarm. For example, the expert system 104 may analyze the data to determine whether a particular operator input occurs in a similar temporal relationship with the alarm. More specifically, the expert system 104 may determine a combination of factors that drive or predict the alarm condition, for example, determining that when the temperature in a particular tank is rising and an operator releases a quantity of a particular catalyst into the tank, the pressure in the tank rises at a particular rate and the alarm condition is generated.
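By way of illustration only, the following Python sketch shows one way such a temporal association might be counted over historized events: for each occurrence of a given alarm, the routine tallies which other events occurred within a time window preceding the alarm. The event names and window length are hypothetical.

# Illustrative sketch only: counting how often other events precede a
# given alarm within a time window, over time-ordered historized events.
from collections import Counter

def precursor_counts(events, alarm_name, window_s=300):
    # events: time-ordered (timestamp_seconds, event_name) pairs.
    # Returns how often each other event occurred in the window
    # preceding each occurrence of alarm_name.
    counts = Counter()
    for i, (t_alarm, name) in enumerate(events):
        if name != alarm_name:
            continue
        for t, other in events[:i]:
            if t_alarm - window_s <= t < t_alarm:
                counts[other] += 1
    return counts

log = [(100, "catalyst_release"), (250, "tank_pressure_high_alarm"),
       (900, "catalyst_release"), (1050, "tank_pressure_high_alarm")]
print(precursor_counts(log, "tank_pressure_high_alarm"))
# Counter({'catalyst_release': 2}) -> a candidate predictive factor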
In another example, the expert system 104 may be programmed to perform statistical analysis on the data collected and stored by the big data facility 102 to determine the strength of correlations between events and process parameters. For example, while a skilled operator may have an instinctive sense of the relationships between various process parameters, that intuition may be unreliable in comparison to rigorous data analysis; the operator may respond to a process control condition (e.g., a rising tank temperature, a dropping pressure, etc.) by making an adjustment that worsens the condition, or that at least does not correct the condition as quickly, or to the same degree, as other or additional adjustments might. Thus, the expert system 104 may improve the overall control, safety, quality, and output of the process by providing information to operators and other personnel that they might not otherwise know or understand.
In yet another example, the expert system 104 is programmed to adjust the operation of a process in the process plant 10 based on its analyses (such as the analyses described in the preceding paragraphs). The expert system 104 may identify a sub-optimal or abnormal condition and may correct the condition by changing one or more process inputs and/or set points. Additionally, the expert system 104 may be integrated with other safety systems in the process plant 10 to prevent and/or correct process conditions that could result in safety risks to equipment and/or personnel.
Supervisor engine
Implementing a mobile control room by way of the UI devices 112 facilitates the decentralization of control, maintenance, and other aspects of the process plant (or of other, similar environments). That is, operators are no longer tethered to workstations to maintain optimal control of the process plant and, accordingly, the lines between operators and maintenance personnel (who previously spent most of their time in the plant rather than in the control room) are blurred or removed. More personnel are available to move through the plant environment. At the same time, the big data facility 102 stores more complete data about every aspect of the plant environment, and the expert system 104 provides more complete analysis of the operation and conditions of the process plant. The expert system 104 and the big data facility 102 cooperate to provide information about the states of the processes operating in the plant, the statuses of equipment in the plant, the locations of, and tasks associated with, personnel in the plant, and countless other aspects related to plant management, materials management, personnel management, optimization, and the like.
The supervisor engine 106 utilizes the data and analyses provided by the expert system 104 to manage the personnel within the process plant. In particular, the supervisor engine 106 may monitor trends identified by the expert system 104 and may create work items for plant personnel. Although illustrated and described herein as a module separate from the big data facility 102 and the expert system 104, in some embodiments the supervisor engine 106 may be integrated within the big data facility 102 and/or the expert system 104. FIG. 2 is a block diagram illustrating the communication architecture between the mobile control room UI device 112, the supervisor engine 106, the expert system 104, and the big data facility 102. As described above, the expert system 104 may retrieve and analyze data stored in the big data facility 102 and, in some embodiments, may store data in the big data facility 102. For example, the expert system 104 may retrieve data related to aspects of the process control system and perform one or more analyses on the retrieved data. The analyses performed by the expert system 104 may be performed according to pre-programmed models or, in some embodiments, may be performed without a model (i.e., the expert system 104 may search the data for unknown correlations or relationships). In any event, the expert system 104 may store analytical data (e.g., regression data, correlation data, etc.) in the big data facility 102.
The supervisor engine 106 may use data received or retrieved from the expert system 104 and/or data received from the big data facility 102. For example, the supervisor engine 106 may receive data from the expert system 104 indicating that a particular parameter is most closely associated with a particular abnormal condition or a particular optimal condition. As another example, the supervisor engine 106 may receive data from the expert system 104 indicating that certain parameters should be checked, or that certain adjustments to process control devices/routines are required, to avoid an abnormal condition. As yet another example, the supervisor engine 106 may receive data from the expert system 104 indicating that the expert system 104 has identified a trend showing that maintenance is needed, or will be needed at a predictable time. Alternatively or additionally, the supervisor engine 106 may receive or retrieve data from the big data facility 102. For example, a routine executed by the supervisor engine 106 may be associated with periodically scheduled maintenance (i.e., maintenance occurring at routine, scheduled intervals, or at intervals determined by plant parameters). That is, the supervisor engine 106 may monitor parameters of the process plant or of devices within the process plant, for example, to determine how many hours a device has been in service since its last maintenance, or how many times a device (e.g., a valve) has been actuated since its last maintenance. This type of data may be stored in the big data facility 102 and retrieved by the supervisor engine 106.
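By way of illustration only, the following Python sketch shows the kind of threshold check the supervisor engine 106 might apply to hours-in-service and actuation-count data retrieved from the big data facility 102; the field names and threshold values are hypothetical.

# Illustrative sketch only; field names and thresholds are hypothetical.
def maintenance_due(device: dict) -> bool:
    # Decide whether a device is due for maintenance based on the kinds
    # of parameters the supervisor engine might retrieve from the big
    # data facility: hours in service and actuations since last service.
    hours_ok = device["hours_in_service"] < device["service_interval_hours"]
    cycles_ok = device["actuations_since_service"] < device["actuation_limit"]
    return not (hours_ok and cycles_ok)

valve = {"tag": "FV-101", "hours_in_service": 4200,
         "service_interval_hours": 4000,
         "actuations_since_service": 180, "actuation_limit": 500}
print(maintenance_due(valve))  # True: the service interval has been exceeded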
Work item creation
The supervisor engine 106 may use the received data to create work items for plant personnel and/or to cause specific actions to be taken within the process plant. FIG. 3 is a block diagram illustrating an embodiment of the supervisor engine 106. The supervisor engine 106 may include a work item manager 300. The work item manager 300 may be a set of routines and/or instructions stored on a computer-readable medium and executed by a processor, operable to create work items. Each work item may be a task or procedure to be completed by one or more process plant personnel. For example, a work item may include replacing or repairing a device, taking a parameter reading, making an adjustment to a device or parameter, inspecting equipment or product, performing a calibration procedure, programming a device, or any other action requiring a person to complete it. When a work item is generated by the work item manager 300, the work item may be stored in a work item list 302 residing in a memory associated with the supervisor engine 106. Referring to FIG. 4, an exemplary work item 400 may include various information, including: a work type or function 402 (e.g., a wiring check, equipment replacement, equipment calibration, maintenance such as lubrication, etc.); a list 404 of equipment needed to execute the work item; a target equipment field 406 identifying the device to which the work item relates; a target start time/date 408; a target completion time/date 410; a priority field 412 (e.g., "immediately," "within 12 hours," "within 24 hours," "after the current batch," "during the next shutdown," "high," "medium," "low," etc.); a required skill set field 414 and/or a required credentials field (not shown); and a target equipment type field 416. Of course, fewer or more fields may be included in the work item 400.
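By way of illustration only, the following Python sketch expresses the work item 400 as a data structure mirroring the fields 402-416 described above; the field names and example values are hypothetical.

# Illustrative sketch only, mirroring the fields 402-416 described above.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class WorkItem:
    work_function: str                                 # field 402, e.g., "wiring check"
    required_equipment: List[str] = field(default_factory=list)   # field 404
    target_device: str = ""                            # field 406
    target_start: Optional[datetime] = None            # field 408
    target_completion: Optional[datetime] = None       # field 410
    priority: str = "medium"                           # field 412
    required_skills: List[str] = field(default_factory=list)      # field 414
    target_device_type: str = ""                       # field 416

work_item_list: List[WorkItem] = []  # stands in for the work item list 302
work_item_list.append(WorkItem(
    work_function="equipment calibration",
    required_equipment=["calibrator", "hand tools"],
    target_device="FV-101",
    priority="within 24 hours",
    required_skills=["instrument technician"],
    target_device_type="valve",
))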
Referring again to FIG. 3, the supervisor engine 106 may also include a set 304 of personnel profiles 306. Each personnel profile 306 includes information relating to a particular operator, maintenance technician, or other plant worker. The information in a personnel profile 306 may include skill sets, certifications and/or credentials, roles (e.g., operator, maintenance, security, safety), working hours/schedules, tour schedules (i.e., routine and/or scheduled routes through the plant along which the person records parameter data or visually inspects aspects of the process plant), and/or any other information relevant to the performance of various duties within the process plant.
Workflow management
The work item scheduler 308 may be stored as a set of instructions on a machine-readable medium. The instructions may be executed by a processor to schedule the work items stored in the work item list 302. The work item scheduler 308 may schedule work items according to any of a variety of factors. For example, the work item scheduler 308 may schedule work items according to the priority of each work item; according to the personnel scheduled to be at a location ("the target location") proximate to the equipment ("the target equipment") associated with the work item; according to the personnel currently located at the target location proximate to the target equipment; according to the current availability of personnel (e.g., personnel who will not be changing shifts at the desired time of work item initiation/completion and/or personnel who are not assigned other tasks at the desired time of work item initiation/completion); according to the needed/required/desired skill sets, roles, certifications, and/or credentials of the personnel; according to scheduled plant maintenance and/or shutdown schedules; and so on. By way of example and not limitation, the work item scheduler 308 may track the work items in the work item list 302, noting the target location and/or target equipment associated with each work item. The work item scheduler 308 may receive information from a personnel tracking routine 310 that tracks the locations of personnel via the UI devices 112 carried by those personnel. When the personnel tracking routine 310 reports (e.g., by determining that a person is logged into or assigned to a UI device 112 having a known location) that the location of a mobile operator is proximate to a target location or target equipment, the work item scheduler 308 may query the personnel profile 306 associated with the mobile operator to determine whether the mobile operator has the skill sets and/or credentials needed to perform the task ("the target function") associated with the work item, as illustrated in the sketch below. If the mobile operator has the appropriate skill sets and/or credentials, the work item scheduler 308 may assign the work item to the mobile operator and, if the operator accepts the work item, the work item scheduler 308 may create any permissions required for the operator to perform the target function on the target equipment. Of course, it should be understood that more than one person may be assigned to a single work item, as certain tasks require multiple personnel to complete.
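By way of illustration only, the following Python sketch shows one way the work item scheduler 308 might select a qualified person near the target equipment, given locations reported by the personnel tracking routine 310; all names, distances, and data structures are hypothetical.

# Illustrative sketch only: selecting a qualified, nearby person for a
# work item. All names, fields, and distance values are hypothetical.
def select_candidate(work_item, personnel, locations, max_distance=50.0):
    # personnel: {person_id: profile dict containing a 'skills' set}
    # locations: {person_id: distance in meters from the target equipment},
    # as a personnel tracking routine might report via UI devices.
    candidates = [
        pid for pid, profile in personnel.items()
        if set(work_item["required_skills"]) <= profile["skills"]
        and locations.get(pid, float("inf")) <= max_distance
    ]
    # Prefer the closest qualified person; None if no one qualifies.
    return min(candidates, key=lambda pid: locations[pid], default=None)

item = {"target_device": "furnace_02", "required_skills": {"welding"}}
people = {"w1": {"skills": {"welding", "rigging"}}, "o1": {"skills": {"operations"}}}
dists = {"w1": 30.0, "o1": 5.0}
print(select_candidate(item, people, dists))  # 'w1': nearby and qualified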
In some embodiments, the permissions are created as tokens or entries in a database 312 stored in a memory associated with the supervisor engine 106. Each permission token defines the target function (e.g., a wiring check), the target equipment, the ID of the worker assigned to the work item, and, optionally, an expiration time and date of the token. A permission token may be required for all work items, for some work items, or for work items associated with particular equipment or equipment types, with particular target functions (i.e., work item tasks), and so on. The permission token gives the mobile worker assigned to the work item specific access rights and can be revoked by the system at any time. In some embodiments, the permissions may depend on external factors. For example, a permission token may specify that the mobile worker is authorized to perform the target function during a particular time period or during a particular plant event (e.g., during a shutdown of an area of the plant).
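By way of illustration only, the following Python sketch shows a permission token with the fields described above, including optional expiration and revocation; the names are hypothetical.

# Illustrative sketch only; the token layout follows the description above.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class PermissionToken:
    target_function: str        # e.g., "wiring check"
    target_device: str
    worker_id: str
    expires: Optional[datetime] = None  # optional expiration time/date
    revoked: bool = False

    def authorizes(self, worker_id: str, function: str, device: str) -> bool:
        # A token grants access only to the assigned worker, for the named
        # function and equipment, while unexpired and unrevoked.
        if self.revoked:
            return False
        if self.expires is not None and datetime.now() > self.expires:
            return False
        return (worker_id, function, device) == (
            self.worker_id, self.target_function, self.target_device)

token = PermissionToken("weld fuel pipe", "furnace_02", "w1",
                        expires=datetime.now() + timedelta(hours=8))
print(token.authorizes("w1", "weld fuel pipe", "furnace_02"))  # True
token.revoked = True  # the system may revoke the token at any time
print(token.authorizes("w1", "weld fuel pipe", "furnace_02"))  # False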
In addition, the supervisor engine 106, and more particularly the work item scheduler 308, may schedule work items according to external factors, particularly (but not exclusively) where a work item would result in production schedule changes or significant downtime. For example, the supervisor engine 106 may communicate, via the backbone network 105 and the gateway 78, with systems external to the process plant to obtain data related to weather, deliveries of raw materials or other supplies, deliveries of components, tools, or equipment needed to perform a work item, product shipping schedules, and the like. As a non-limiting example, the work item scheduler 308 may delay the scheduling of a work item if the work item would interfere with production and a shipment of perishable raw materials is scheduled to be received before the work item could be completed. As another example, a particular work item at an outdoor location may require dry conditions (e.g., no rain) to complete the target function (e.g., a wiring check), and the work item scheduler 308 may accordingly schedule the work item according to a weather forecast.
A method 500 of assigning tasks to personnel within a process plant is illustrated in the flow diagram of FIG. 5. The method 500 may include receiving data from an expert system (block 505) and creating a work item specifying a task according to the data received from the expert system (block 510). The method also includes selecting a person to perform the task specified in the work item (block 515), sending the work item to a device associated with the selected person (block 520), and may include receiving an indication that the selected person has accepted the work item (block 525). Receiving data from the expert system may include receiving data indicative of a predicted problem in the process plant, receiving data indicative of a trend associated with a process parameter, receiving a request to provide a parameter value to the expert system, or receiving an instruction to perform a particular action with respect to a process control device, among other things. Where receiving data includes receiving a request to provide a parameter value, creating the work item may include creating a work item in which the specified task is to observe and record a parameter value that is not automatically transmitted from the device that senses or receives the parameter. In embodiments, creating the work item may include creating a work item in which the specified task is a maintenance task, a calibration task, a replacement task, an inspection task, or a repair task. Creating the work item may also include specifying a target equipment (e.g., a device on which the specified task is to be performed) associated with the specified task. Selecting the person to perform the task may include selecting the person according to location data received from a device (e.g., a mobile user interface device, a GPS device, a proximity card device, etc.) associated with the selected person. The method 500 may also include creating and storing a permission token associated with the specified task, with the process control device associated with the specified task, or with both. The permission token may be required in order for the selected person to perform the specified task on the process control device associated with the specified task. The permission token may be an entry in a database, a discrete file, or any computer structure implemented for the purpose of authorizing a person to perform an action on or with a piece of equipment. Selecting the person to perform the task may also include selecting the person according to the task specified in the work item, the process control device associated with the specified task, or both, and according to a plurality of personnel profiles accessible by the supervisor module. In an embodiment, selecting the person according to the plurality of personnel profiles includes selecting the person according to a skill set, a role, certifications, and/or credentials. Selecting the person may also or alternatively include storing the work item in a database from which personnel select work items to perform, and/or receiving, from a device associated with a person, a request to perform the work item and comparing the profile associated with that person to the information stored in the work item to determine whether the person is qualified to perform the work item.
Receiving data from the expert system may include receiving an instruction to perform an action such as observing and recording a parameter, inspecting a process control device, calibrating a process control device, recording an audio sample, capturing an image or video, performing maintenance on a process control device, repairing a process control device, replacing a process control device, and/or adjusting a process control parameter. Creating the work item may include specifying the tools or equipment needed to perform the specified task, a priority level for the work item, a skill set needed to perform the specified task, a required start time and/or date, and/or a required completion time and/or date. The method 500 may also include scheduling execution of the work item according to a scheduled route through the process plant associated with the selected person, a scheduled delivery of input material for a process performed by the process plant, a scheduled delivery of a product produced by the process plant, a predicted weather condition, a scheduled shipping time of a product produced by the process plant, a predicted or scheduled completion time of a process of the process plant, and/or a predicted or scheduled arrival of tools, equipment, or components required to complete the specified task.
Referring again to FIG. 3, the supervisor engine 106 may also store (or have access to) documents 316 such as equipment manuals, maintenance manuals, and standard operating procedures (SOPs). The documents may be automatically provided to a mobile operator, via the UI device 112, as the mobile operator performs tasks within the process plant or tasks associated with particular work items. In an embodiment, the documents are provided to the mobile operator at the appropriate (i.e., useful) times during execution of the target function associated with a work item. For example, a person performing a wiring check function associated with a work item may be presented with the SOP for performing wiring checks. As another example, a person performing routine maintenance (e.g., lubrication, cleaning, etc.) on a valve may be presented with the SOP for each procedure and/or with the manual for the target valve. In some embodiments, at each step in the course of performing the target function, the relevant portion of a document is provided to the person. That is, a maintenance technician may first be presented (via the mobile UI device 112) with an SOP for locking out the valve and taking it out of service. Next, the maintenance technician may be presented with the pages of the valve's operating manual related to cleaning and/or lubricating the valve. Next, the maintenance technician may be presented with an SOP for returning the valve to operational service and removing the equipment lockout. Of course, these examples are non-limiting, as there are countless situations in which SOPs and manuals may be presented to personnel during the performance of tasks.
While a mobile operator or technician is performing a target task associated with a work item, the supervisor engine 106, and in particular a work item tracking module 318, may track the progress of the tasks associated with the work item. In some embodiments, the supervisor engine 106 cooperates with the mobile UI device 112 to guide the mobile operator through each step of the one or more procedures required to execute the work item. The guidance may include lockout procedures, shutdown procedures, equipment disassembly, equipment repair, maintenance steps (e.g., calibration, lubrication, etc.), verification and validation procedures, equipment reassembly, startup procedures, unlock procedures, and any other steps of the process. The work item tracking module 318 may communicate with the mobile UI device 112, e.g., receiving an indication from it as each subsequent instruction, step, or piece of guidance is requested by the mobile operator. When the work item tracking module 318 receives an indication that a subsequent instruction, step, or piece of guidance has been requested, the work item tracking module 318 may assume that the previous step has been completed, thereby tracking the progress of the execution of the work item. In an embodiment, the work item tracking module 318 may be operable to communicate with the target equipment (i.e., the equipment that is the subject of the work item), or with a device communicatively coupled or proximate to the target equipment, to verify that one or more steps have been completed. In yet another embodiment, two mobile operators may participate in a collaborative session and, as one mobile operator completes each step of a work item presented to that technician via one mobile UI device 112, a second technician may mark each step complete on another UI device 112, sending an indication to the work item tracking module 318 that each step is complete. That is, two users collaborating via respective UI devices 112 need not view the same display of information, and indeed need not view the same information at all. As another example, a first user may be viewing a standard operating procedure for executing a work item on a first UI device 112 while another user views, on a second UI device 112, real-time data related to a device associated with the work item. Upon completion of a work item, the supervisor module 106, and in some embodiments the work item tracking module 318, may mark the work item complete, remove it from the list of active work items, cause any permissions associated with the work item to be removed or invalidated, assign another work item, notify personnel that the work item is complete, and/or notify personnel associated with a dependent work item (i.e., a work item that could not begin until the completion of the previous work item) that the work item is complete.
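By way of illustration only, the following Python sketch shows how the work item tracking module 318 might infer progress from each request for the next instruction, as described above; all names are hypothetical.

# Illustrative sketch only: inferring progress from requests for each
# subsequent instruction. Names and step texts are hypothetical.
from typing import List, Optional

class WorkItemTracker:
    def __init__(self, work_item_id: str, steps: List[str]):
        self.work_item_id = work_item_id
        self.steps = steps
        self.next_index = 0  # index of the next step to deliver

    def on_next_step_requested(self) -> Optional[str]:
        # When the mobile UI device requests the next instruction, assume
        # the previously delivered step has been completed.
        if self.next_index < len(self.steps):
            step = self.steps[self.next_index]
            self.next_index += 1
            return step
        return None  # all steps delivered; the work item may be marked complete

tracker = WorkItemTracker("wi-042", ["lock out valve", "lubricate valve", "remove lockout"])
while (step := tracker.on_next_step_requested()) is not None:
    print("display:", step)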
Turning to FIG. 6, a flow diagram illustrates a method 600 for managing a workflow in a process plant. The method 600 includes creating a work item specifying a task to be performed in the process plant (block 605), determining, according to the specified task, a set of procedures for executing the work item (block 610), generating an associated display for each procedure in the set of procedures (block 615), and displaying the set of associated displays, in the order of execution of the set of procedures, on a mobile user interface device (block 620). Creating the work item may include receiving data from an expert system and/or specifying the task according to the data received from the expert system. Receiving data from the expert system may include receiving an instruction to perform a particular action on a process control device. Creating the work item may also or alternatively include specifying a maintenance task, a calibration task, a replacement task, an inspection task, and/or a repair task. Creating the work item may also include specifying a task that requires a safety lockout procedure, a shutdown procedure, and/or a startup procedure, among other things. Generating the associated displays may include generating a display presenting a set of steps for performing a procedure, generating a display including one or more images illustrating the performance of a procedure, generating a display including an image of a target device in its surroundings to assist the person in locating the target device, generating a display including a parameter entry field for recording a parameter associated with a target device, and/or generating a display including a set of standard operating procedures. The method may also include showing, on a display of the mobile user interface device, the location of a target device in the context of the process plant, which may include providing a user interface control that causes the display to zoom in on the target device and/or providing a set of user interface controls that allow the user of the mobile user interface device to navigate among the set of associated displays. In some embodiments, a procedure context pane may be displayed to indicate which procedure of the set of procedures associated with the specified task is currently being executed. The method may also include providing access to documentation relevant to a target device associated with the specified task. Further, the method may include determining a set of tools and equipment needed to perform the set of procedures, generating a checklist display including a list of the determined set of tools and equipment, and displaying the checklist. Creating the work item may include specifying a manual data collection task, and determining the set of procedures may include determining a route for collecting the manual data.
The supervisor engine 106 may also store data associated with the execution of work items. In particular, the supervisor engine 106 may store data captured by the mobile UI device 112 in connection with the execution of a work item, may store data related to the effects of the execution of the work item on the operation of the process plant (e.g., performance changes in the process plant resulting from, or related to, the execution of the work item), and so on. In embodiments, the mobile UI device 112 may capture video, audio, or vibration data as part of a diagnostic, repair, or maintenance procedure, and the mobile UI device 112 may send the captured data back to the supervisor engine 106, which stores the data as associated with the particular work item and, alternatively or additionally, may store the data in the big data facility 102.
Checklist
Referring back to FIG. 3, the supervisor engine 106 may perform other tasks related to supervising mobile personnel. As just one example, the supervisor engine 106 may include a checklist generation routine 314. The checklist generation routine 314 may generate, for a mobile worker, a checklist corresponding to a work item assigned to that worker. The checklist generated by the routine 314 may include, for example, safety equipment required for the area or procedure (e.g., gas masks, safety harnesses, safety hooks, radiation detection devices/dosimeters, etc.), tools required to perform the procedure, and components required to perform the procedure (e.g., replacement or maintenance components, such as seals or lubricants), and the like. In some embodiments, the checklist generation routine 314 may generate the checklist in advance and store it as associated with the work item. Alternatively, the checklist generation routine 314 may generate and display the checklist in real time. In either case, it is contemplated that the mobile worker will be presented with the checklist shortly before the work item is executed. For example, when the mobile operator indicates that he or she is preparing to execute the work item, the checklist may be presented to the mobile operator automatically. In other embodiments, the checklist may be requested manually by the mobile operator when the operator is ready to perform the work item. In some embodiments, the supervisor engine 106 determines that the operator is preparing to execute the work item and automatically presents the checklist to the mobile operator. For example, the supervisor engine 106 may receive an indication that the mobile operator has transferred the state of a workstation UI device 112 to a mobile UI device 112 at a time when the work item is scheduled to be executed. Upon detecting the state transfer, the supervisor engine 106 may present the checklist, prompting the mobile operator to confirm that he or she has the appropriate equipment and resources to execute the work item. Alternatively, when the mobile operator opens the work item, the mobile UI device 112 may automatically retrieve the checklist (e.g., from the supervisor engine 106), preferably before the operator enters the process plant environment. In yet another embodiment, the mobile UI device 112 may detect that it has entered a supply room or preparation area and may automatically present the checklist to the mobile operator so that the mobile operator may collect the required tools, equipment, supplies, and so on.
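By way of illustration only, the following Python sketch derives a checklist from work item fields using simple lookup tables, roughly as the checklist generation routine 314 is described to do; the tables, field names, and values are hypothetical.

# Illustrative sketch only; lookup tables and field names are hypothetical.
SAFETY_BY_FUNCTION = {
    "welding": ["welding mask", "gloves"],
    "wiring check": ["insulated gloves"],
}

def generate_checklist(work_item: dict) -> list:
    # Combine safety equipment implied by the work function with the
    # tools and components listed in the work item itself.
    items = []
    items += SAFETY_BY_FUNCTION.get(work_item["work_function"], [])
    items += work_item.get("required_equipment", [])
    items += work_item.get("replacement_parts", [])
    return items

wi = {"work_function": "welding",
      "required_equipment": ["welding power supply", "electrodes", "filler material"],
      "replacement_parts": []}
print(generate_checklist(wi))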
Turning now to FIG. 7, a flow diagram illustrates a method 700 for facilitating task completion in a process plant. The method includes receiving a selection of a work item from a plurality of work items stored in a work item database (block 705) and determining, according to the selected work item, one or more items required for the execution of the work item (block 710). A checklist of the one or more items is generated for display to the person executing the work item (block 715), and the checklist is displayed to the person executing the work item (block 720). In an embodiment, determining the one or more items according to the selected work item includes reading one or more fields of the work item, which may include reading a required equipment field, reading a required safety equipment field, reading a required tools field, and/or reading a field indicating the location of the target equipment within the process plant. Reading the one or more fields may include reading a field specifying a target task to be completed, or reading a field specifying both the target task to be completed and the target equipment or type of target equipment. Further, determining the one or more items needed to execute the work item may include determining the task to be completed, the equipment on which the task is to be performed, or both. Generating the checklist of one or more items for display to the person executing the work item may include generating a checklist including any one or combination of safety equipment, tools, a process control device, components of a process control device, maintenance materials, and the like. Generating the checklist may include obtaining information from one or more of an equipment manual associated with the process control device associated with the work item, an equipment manual associated with the tools required to execute the work item, a safety document, a standard operating procedure, and/or a document associated with the location of the process control device associated with the work item. In some embodiments, generating the checklist may also include determining a location associated with the work item, or a location that the assigned person must traverse to reach the location associated with the work item, and ascertaining the particular safety equipment and/or tools needed to access or traverse that location. Displaying the checklist to the person executing the work item may include receiving an indication that the user assigned to execute the work item has activated a mobile user interface device and displaying the checklist to the user on the activated mobile user interface device. Receiving the selection of the work item may include receiving the selection on a first user interface device, and displaying the checklist to the person executing the work item may include receiving an indication that the state of the first user interface device has been transferred to a second user interface device and displaying the checklist on the second user interface device. The method 700 may also include receiving, for each of the one or more items on the checklist, an indication that the person viewing the displayed checklist has the item.
UI device
FIG. 8 is a block diagram of a UI device 803 in the context of a mobile control room 800. The mobile control room 800 may enable the UI device 803 to transmit a UI state of operation to, and/or receive a UI state of operation from, another system or device. The mobile control room 800 includes a UI device 803a, the server 150, and a UI device 803b. Each of the UI devices 803, 803a, 803b may be any of a variety of UI device types, as described below with reference to FIG. 9B. The server 150 may include a web service or web routine 152, which may be stored in a memory at the server 150 and executed by a processor at the server 150. Each of the UI devices 803a and 803b (and any other UI device 803) includes a processor 810, a memory 815, a display 820, a network interface 825, an input interface 830, a system bus 835, and one or more transceivers 850. The UI devices 803a, 803b may also include one or more positioning devices including, for example, a Global Positioning System (GPS) (or any other satellite navigation system) receiver 832, an inertial positioning system chip 834, and/or discrete positioning components such as a compass 836, gyroscopes 838, and accelerometers 840. The memory 815 may include an operating system 880, a user interface ("UI") routine 882, a context awareness routine 884, a state determination routine 886, a browser routine 888, an image capture routine 890, a sound capture routine 892, local process control data storage 894, UI state information 896, and other data. In some embodiments, one or more of the operating system 880, the UI routine 882, the context awareness routine 884, and/or the state determination routine 886 may reside in a memory external to the UI device 803 and may be executed by a processor external to the UI device 803 (e.g., at a system or device such as the server 150). It should be understood that the mobile control room 800 described here is merely one example; other configurations are contemplated. For example, the mobile control room 800 need not include multiple UI devices and, in fact, need not include any particular number of UI devices.
In more detail, the memory 815 of the UI device 803 may include volatile and/or non-volatile memory and may be removable or non-removable memory. For example, the memory 815 may include computer storage media in the form of random access memory (RAM), read-only memory (ROM), EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information. The processor 810 is configured to fetch and execute instructions stored in the memory 815. The memory 815 may store data such as operating system data or program data.
The network interface 825 may include or be coupled to one or more antennas for wireless communication, one or more ports for wired connections, or both. In some embodiments, the network interface 825 may be coupled to the GPS receiver 832, enabling the network interface 825 to receive location or coordinate data. The network interface 825 may also or alternatively include a Bluetooth transceiver, enabling the network interface 825 to establish a personal area network with an external device or system. Additionally or alternatively, the network interface may include a near field communication ("NFC") transceiver, a radio frequency identification ("RFID") transceiver, and/or a local area network transceiver (e.g., enabling the network interface 825 to communicate using the IEEE 802.11 protocol).
The network interface 825 may communicate with the server 150 and/or one of the UI devices 803 via a network, such as the process control network 100 shown in FIG. 1A. The user may interact with the UI device 803 via the input interface 830. The input interface 830 may accept input via mechanical actuation (e.g., a keyboard or mouse). The input interface 830 may alternatively or additionally accept input via detection of an electromagnetic field, signal, or property (e.g., a resistive or capacitive touch screen). Further, the input interface 830 may accept input via detection of sound, light, or motion (e.g., voice input via a microphone 842, an image sensor or camera 844, etc.). Additionally, the input interface 830 may accept input from a Bluetooth device coupled to the network interface 825. The display 820 may provide output in the form of images or video, and may utilize any type of monitor, projector, or display technology (including CRT, LCD, plasma, LED, and OLED technologies).
In some embodiments, one or more input sources, such as the microphone 842, the image sensor or camera 844, or other sensors (e.g., an oxygen sensor, a toxic gas sensor, a motion sensor, a vibration sensor, an RFID sensor), may be located outside of the UI device 803 and coupled to the UI device 803 via a wired communication channel (e.g., via a headset port or a USB port) or a wireless communication channel (e.g., wireless USB, Bluetooth, Wi-Fi, or a proprietary protocol). For example, a user carrying the UI device 803 may also carry one or more of the input sources on a belt.
Each of the routines 880-896 may be one or more instructions, routines, modules, procedures, services, programs, and/or applications, and may be stored on a computer-readable medium, such as the memory 815. The operating system 880 may support basic functions and manage resources of the UI device 803. In particular, the operating system 880 may manage the hardware and software of the UI device 803. When executed by a processor, the UI routines 882 may cause the display 820 to display information to a user and may cause the input interface 830 to receive input from the user or from other external stimuli. The context awareness routines 884 can cause the display 820 to display information in response to contextual information received at the network interface 825, at the input interface 830, or at one or more sensors. The context awareness routines 884 may additionally or alternatively cause the UI device 803 to identify a context (e.g., a location, time, or schedule) and/or receive the context from a system or device external to the UI device 803.
The state determination routine 886 may collect information related to the operation of the UI device 803. For example, the state determination routine 886 may collect the UI state information 896 by monitoring processes executed by the processor 810 and data associated with the processes. The state determination routine 886 may identify the information shown at the display 820 and may identify the process entities associated with the shown information. In some embodiments, the state determination routine 886 may send the collected UI state information to an external node, such as the server 150 or the UI device 803b. In embodiments where the UI device 803 implements a thin client or a web client, the state determination routine 886 may be stored at a memory on the server 150, where it may be executed by a processor at the server 150.
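As a rough illustration of how such a state determination routine might gather and forward its data, consider the following TypeScript sketch. All names (the UiState shape, the /api/ui-state endpoint) are assumptions made for illustration and are not details taken from this disclosure.

```typescript
// Minimal sketch of a state determination routine (hypothetical names).
interface UiState {
  userId: string;
  displayedEntities: string[]; // e.g., tags of the tanks, pumps, valves shown
  runningApps: { name: string; state: Record<string, unknown> }[];
  capturedAt: number;          // epoch milliseconds
}

function captureUiState(
  userId: string,
  displayedEntities: string[],
  runningApps: UiState["runningApps"],
): UiState {
  return { userId, displayedEntities, runningApps, capturedAt: Date.now() };
}

// Send the collected state to an external node such as the server 150
// (endpoint name is an assumption).
async function sendUiState(state: UiState): Promise<void> {
  await fetch("/api/ui-state", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(state),
  });
}
```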
The browser routine 888 may be an application for accessing, presenting, and navigating one or more information resources. An information resource may be a web page, image, video, document, or any other content. The browser routine 888 may interact with information resources located on the UI device 803 or with information resources external to the UI device 803. For example, the UI device 803 may access information resources at other systems or devices (e.g., the server 150 or the UI device 803b) via the World Wide Web or via a network such as the process control network 100. In some embodiments, the browser routine 888 may access information associated with and/or generated by a UI routine executing at the server 150. In particular, the browser routine 888 may access the web service 152 at the server 150, where the web service 152 may correspond to a UI routine executed at the server 150. For example, the browser routine 888 may receive an address or identifier (e.g., from a user via the input interface 830), such as a uniform resource identifier or a uniform resource locator. The address or identifier may direct the browser routine 888 to the web service 152. The browser routine 888 may receive UI data (e.g., display data or process parameter data) from the UI routine 882 via the web service 152, such that the browser routine 888 may show graphics for controlling and/or monitoring some or all of the process. The browser routine 888 may also receive user input (e.g., a mouse click on a graphic) and send data representing the user input to the UI routine 882 via the web service 152. In alternative embodiments, the browser routine 888 may be a plug-in or web client application.
While the various routines 880-896 are described as being stored in the memory 815, the UI device 803 may also operate to request, retrieve, receive, and/or download additional routines (e.g., applications, applets, updates, patches, etc.) as desired via the network interface 825. As one contemplated example, the UI device 112 may request and receive information to facilitate direct (or indirect) communication between the UI device 112 and a process control device within a process plant. In any event, it should be understood that the UI device 112 is not limited to the applications, routines, and modules that reside in the memory 815 and that are described herein.
The image capture routine 890 may operate to capture images via an image sensor or camera 844. In some embodiments, the image may be transmitted via the network interface 825 to a node on the network 100, where the node may analyze the image to identify process data. For example, in one embodiment, the image capture routine 890 may cause the image sensor 844 to capture an image of a flame. The image capture routine 890 may send an image of the flame via the network 100 to a node (e.g., the server 150, the expert system 104, etc.), where the node may analyze the image to identify the color and corresponding temperature of the flame. Similarly, the sound capture routine 892 may be one or more instructions or routines for capturing sound via the microphone 842. The captured sound data may be sent to nodes on the network 100 for analysis.
For capturing sound, the microphone 842 may capture audio associated with a plant asset. The captured audio may be used to identify the plant asset or to diagnose the plant asset. For example, a pump may have an expected sound signature. In such an example, the UI device 803 may capture audio generated during operation of the plant asset and may send the audio via the network 100 to a node (e.g., the server 150, the expert system 104, etc.) to identify the asset type as, for example, a pump. In such an environment, the node may even identify the particular pump associated with the UI device 803. The UI device 803 may also include a motion sensor (e.g., the accelerometer 840) for detecting vibrations. For example, a plant asset may have an expected level of vibration during operation. A user may place the UI device 803 on or near the plant asset. The UI device 803 may use the data detected by the motion sensor to identify the current vibration level associated with the asset. If the current vibration level exceeds the expected vibration level, the user may utilize the UI device 803 to further diagnose the plant asset or to request a work order for the asset. In some instances, when the UI device 803 is placed on or near an asset, a diagnostic routine may be initiated automatically to detect vibrations associated with the asset.
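One plausible way to realize the vibration comparison just described is sketched below in TypeScript; the RMS metric, the tolerance parameter, and the sample values are illustrative assumptions, since the disclosure does not prescribe a particular vibration measure.

```typescript
// Hypothetical vibration check: compare the RMS of accelerometer samples
// against an expected RMS level for the asset.
function rmsVibration(samples: number[]): number {
  const sumSquares = samples.reduce((acc, s) => acc + s * s, 0);
  return Math.sqrt(sumSquares / samples.length);
}

function exceedsExpectedVibration(
  samples: number[],
  expectedRms: number,
  tolerance = 0.1, // allow a 10% margin before flagging
): boolean {
  return rmsVibration(samples) > expectedRms * (1 + tolerance);
}

// Example: magnitudes sampled near a pump; RMS ≈ 0.045 > 0.033, so flagged.
const flagged = exceedsExpectedVibration([0.02, 0.05, 0.04, 0.06], 0.03);
console.log(flagged); // true
```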
In some instances, the UI device 803 may include a peripheral interface (not shown) for establishing connections with other devices. The peripheral interface may be a serial interface, such as a Universal Serial Bus (USB) interface. In other embodiments, the peripheral interface may be a wireless interface for establishing a wireless connection with another device, similar to some embodiments of the network interface. For example, in some embodiments, the peripheral interface may be a short-range wireless interface compliant with a standard such as Bluetooth (operating in the 2.4 GHz band). The peripheral interface may be used to send status information to, or receive status information from, an external device as described below. In some embodiments, the peripheral interface may also be used to interact with external devices that provide context awareness to the UI device 803. For example, a context ID device, also described below, may be detected via the peripheral interface. In some embodiments, the user may save state information or process information available at the UI device 803 to an external device via the peripheral interface.
In the overall operation of the UI device 803, the processor 810 may access the memory 815 to execute the UI routine 882. When the processor 810 executes the UI routine 882, the processor 810 causes output to be provided at the display 820 that represents information related to an entity (e.g., a device, an apparatus, a network node, process data, control data, etc.) within the process plant 10. The output may be based on data stored in memory 815 (e.g., graphical data, historical data, or any previously received and stored data) or data received via network interface 825 (e.g., data received from controller 11 or database 151). Further, when input is received at the input interface 830, the input interface 830 can generate input data. Input data may be transmitted to processor 810 over system bus 835, where processor 810 may execute one or more instructions or routines in accordance with the received input. In many instances, the input data may represent user interaction with graphical output provided at display 820. For example, the input data may represent movement of a mouse, where the processor 810 operates to move a cursor displayed on the display 820 in accordance with mouse movements. The input data may also represent a selection of UI elements (e.g., windows (e.g., browser windows), device graphics (e.g., tanks, pumps, valves, tables, etc.), or operating system elements) displayed on the display 820. Further, the input data may represent control inputs. For example, a user may use a keyboard, mouse, or touch screen to enter setpoint values for a process device. When the input data represents a control input, the processor 810 may send the input data to the network interface 825 via the system bus 835, where the network interface 825 sends the input data to the process control network 100, where it may be received at another node (e.g., the controller 11 or the server 150 shown in FIG. 1A). The processor 810 may also cause any other type of input data to be sent to the process control network 100.
State awareness, transfer, and collaboration
Because mobile devices facilitate process control operation, configuration, and maintenance, it is contemplated that personnel may move from device to device and, in any case, may expect the ability to switch from one device to another without having to recreate (and/or re-navigate to) on the second device the display they were viewing, and/or the process they were engaged in, on the first device. As in the example above, an operator may wish to check the status of the process plant remotely from a mobile phone on her way to work, so that she may be prepared for whatever awaits her when she arrives on site. When she arrives on site, she may walk to her office and expect to pick up the same view at her workstation. Thereafter, she may take a tablet computer and walk out into the plant to observe a plant area or engage in various tasks. In general, personnel involved in the operation and maintenance of a process plant may expect the devices they use to include some level of state awareness and to be able to transfer state between devices, so as to facilitate mobility and/or collaboration.
In embodiments of the present disclosure, state information is transferred seamlessly from a first UI device to a second UI device, which allows a user to continue or transfer a session from the first UI device to the second UI device without any interruption in workflow. The state transfer may also allow a first user of the first UI device to collaborate with a second user of the second UI device, so that the two users may work on a task or work item in a collaborative manner. In another embodiment, a UI device may provide output according to the context in which the UI device operates. For example, the UI device may consider the location of the UI device, the type of UI device, or other considerations when determining what information to provide, or how to provide it, at the UI device display. The UI devices and mobile control rooms disclosed herein provide the benefit of "freeing" users and operators of UI devices from the physical control room. Users of such UI devices may move freely within the plant without interrupting their workflow and without losing functionality or capability with respect to monitoring and controlling the process.
Fig. 9A illustrates aspects of an exemplary mobile control room 900 a. Mobile control room 900a includes UI device 912a, UI device 912b, and UI device 912c, each of which may be used by user 901 and/or user 902.
The mobile control room 900a may enable the user 901 to synchronize the UI devices 912a and 912b by transferring the state of the UI device 912a to the UI device 912b. The UI state transfer may cause the UI device 912b to display information similar to that displayed at the UI device 912a. The state transfer may also cause the UI device 912b to execute routines or applications similar to those executing at the UI device 912a. Further, the routines or applications on the UI device 912b may execute in the same state as the routines or applications executing at the UI device 912a. By transferring the UI operating state from the UI device 912a to the UI device 912b, the user may stop using the UI device 912a and start using the UI device 912b without any loss of workflow.
Similarly, the control room 900a may enable a secure collaboration session to be established between at least two UI devices. In an embodiment, a secure collaboration session may be established automatically when two devices 912 move into proximity of each other and become aware of each other. Once the session is established, data may be synchronized between the UI devices during the collaborative work session. More specifically, the user 901 may collaborate with the user 902, where the UI device 912b may transfer state information to the UI device 912c. By transferring the state information from the UI device 912b to the UI device 912c, the UI device 912c may recognize the operating state of the UI device 912b. For example, the UI device 912c may show the same or similar information as that displayed at the UI device 912b. The UI devices 912b and 912c may also initiate communication routines that allow the users 901 and 902 to exchange information (e.g., text, video, and voice over IP) via the UI devices 912b and 912c. For example, the UI devices 912b and 912c may exchange information related to work items or tasks, enabling the users 901 and 902 to work on an item or task in a collaborative manner, even if the users 901 and 902 are not viewing the same display on the respective UI devices 912b and 912c. In one example, a user may be able to check out a device via a UI device so that another user knows that the device is being worked on.
In some embodiments, the UI devices 912a-912c may transfer state information directly between each other. The UI devices 912a-912c may use a short-range wireless technology, such as near field communication (the ISO/IEC 14443 and ISO/IEC 18092 standards), to detect proximity, and may then transfer the state information using WiFi (the IEEE 802.11 standards) or Bluetooth (the IEEE 802.15.1 standard). In other embodiments, the UI devices 912a-912c may transfer state information via a node (e.g., the server 150 shown in FIG. 1A) on the backbone network 105. In some embodiments, the UI devices 912a-912c may be thin clients, where the UI devices 912a-912c render graphics but much of the processing for the UI devices 912a-912c occurs at a node (e.g., the server 150 shown in FIG. 1A) on the process control network 100. In such embodiments, transferring state between the UI devices 912a-912c may include transferring state information between the UIs executing at the node.
FIG. 9B illustrates UI devices 112 in an exemplary mobile control room 900c. The mobile control room 900c may support the transfer of the operating state of any of the UI devices 112a-112k to or from any other of the UI devices 112a-112k, UI device synchronization, and user collaboration. The mobile control room 900c includes the server 150, the process control network 100, the user 901, and the UI devices 112a-112k. The server 150 may include a database 151, which may include display data, parameter data, historical data, context data, UI state information data, or any other process plant data. The database 151 may be stored in memory on the server 150, stored separately from the server 150, or stored across multiple devices in the process plant. Each of the UI devices 112a-112k may be any type of process control UI device 112 that provides information related to a process, or to a unit associated with the process, and receives user input regarding the process or unit. Each of the UI devices 112a-112k may execute a corresponding UI. In alternative embodiments, the UI may execute in whole or in part at the server 150 and may be provided to the UI devices 112a-112k, for example, via web pages. Each of the UI devices 112a-112k may communicate with the server 150 via the backbone 105 of the process control network 100. In the embodiment shown in FIG. 9B, the user 901 may interact with the UI device 112a through a display 920 and an input interface 930 (although the user 901 may interact with any of the UI devices 112a-112k). In this embodiment, the UI device 112a is a stationary workstation, where the input interface 930 is a keyboard and the display 920 is a monitor; the UI device 112b is a mobile device (e.g., a phone or PDA); the UI device 112c is a tablet device capable of receiving touch input from a user's hand or a stylus; the UI device 112d is a wearable device (in this case, a watch with a touch screen); the UI device 112e is a laptop computer; the UI device 112f is a wearable device (in this case, a headset with a head-mounted display); the UI device 112g is a television, which may have an associated input interface (not shown), such as a keyboard, a mouse, a touch screen (e.g., a capacitive touch screen), a motion sensor, or any other type of device capable of accepting user input; the UI device 112h is a display and user input device (e.g., a touch screen) located in the process plant environment (e.g., hung on a wall, mounted on or near a process entity, etc.); and the UI device 112j is a mobile device (e.g., a smartphone) having a built-in projector operable to project a UI onto a surface 112k (e.g., a wall within the process plant). The UI projected onto the surface 112k may include user input methods (e.g., user gestures tracked via the UI device 112j or an external device (not shown)). Of course, in various embodiments, any combination of the UI devices 112a-112k may be employed. In addition, the mobile control room 900c may include additional UI devices similar to any of the UI devices 112a-112k. Although a particular type of input is described as being associated with each of the devices 112a-112k, it should be noted that, in various embodiments, any of the devices 112 may accept input from a variety of input sources, depending at least on the intended use of the UI device 112.
As just one example, it is contemplated that the UI device 112 may accept input from a stylus, or may employ a touch-screen technology other than capacitive (e.g., resistive, surface acoustic wave, or any other type of touch-screen technology) to facilitate input from a user wearing, for example, protective gloves. Voice input may also be used with any of the UI devices 112, particularly in environments where ambient noise is not a factor.
In the mobile control room 900c, each of the UI devices 112a-112k may enable the user 901 to monitor and/or control a process, or a unit associated with a process, via the process control network 100. In embodiments, each of the UI devices 112a-112k may implement a web client or a thin client. In such embodiments, the server 150 may execute the UIs and any other routines used in the operation of one or more of the UI devices 112a-112k. The UI devices 112a-112k may communicate user input data to the server 150, where the server 150 may respond to the user input. The server 150 may transmit display data to the UI devices 112a-112k. Because, in this embodiment, the server 150 may handle much of the processing for the operation of the UI devices 112a-112k, the server 150 may track the operating state of each of the UI devices 112a-112k by monitoring the execution of routines at the server 150 and by monitoring the data received from, and transmitted to, each of the UI devices 112a-112k.
In other embodiments, the UI devices 112a-112k operate solely as data clients. For example, in an embodiment, each UI device 112 includes a web browser and routines for automatically generating dynamic HTML (or other code) to display information on the UI device 112. The routines and/or the dynamic web pages generated by the routines retrieve data from the server 150 and display the retrieved data (as well as other data, such as user input data) on a display. The routines and/or dynamic web pages may also accept user input and send data back to the server 150. In such embodiments, most of the processing occurs on the UI device 112 while only data is transmitted to and from the server 150 via the network.
In another embodiment, instructions (e.g., JavaScript instructions) located on the UI device 112 dynamically generate code (e.g., HTML5 code) that is rendered in an appropriate viewing application (e.g., an HTML5 viewer or a web browser). For example, JavaScript code may open a WebSocket connection used by the WebSocket application messaging protocol to send messages between UI device 112 and JavaScript executing on server 150.
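By way of illustration only, the client side of such a WebSocket exchange might look like the following TypeScript sketch; the URL, the message shapes, and the element ID are assumptions and are not specified by this disclosure.

```typescript
// Client-side sketch of a WebSocket exchange between a UI device and a
// server-side UI routine (runs in a browser context).
const socket = new WebSocket("wss://server150.example/ui");

socket.onopen = () => {
  // Ask the server-side UI routine for data to render.
  socket.send(JSON.stringify({ type: "subscribe", entity: "TANK-101" }));
};

socket.onmessage = (event: MessageEvent) => {
  const msg = JSON.parse(event.data as string);
  if (msg.type === "processValue") {
    // Dynamically update the generated markup with the new value.
    const el = document.getElementById("tank-level");
    if (el) el.textContent = `${msg.value} ${msg.units}`;
  }
};
```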
The server 150 may save UI state information (e.g., to the database 151) periodically or in response to a triggering event. The UI state information may represent the state of the UI device at the time of capture. The UI state information may include information regarding: the user or operator interacting with the UI device; the applications, programs, routines, or modules executing with respect to the UI device; the graphics or sound presented at the UI device; the portion of the plant associated with the displayed data; or any other information related to the operation of the UI device. When the server 150 receives a request for a state transfer, the server 150 may access the UI state information saved in the database 151 and may send the UI state information to the appropriate UI executing at the server 150. The UI, in turn, may send corresponding display data to the appropriate UI device. For example, the UI device 112b may request the state information of the UI device 112a (where, for example, the user 901 desires to switch UI devices from 112a to 112b without interrupting the workflow). In some embodiments, the UI devices 112a and 112b may each have a UI executing at the server 150. The server 150 may access the UI state information stored at the database 151 and may pass the UI state information to the UI for the UI device 112b. The UI for the UI device 112b may determine what to display at the UI device 112b based on the saved UI state information, and may transmit the display data to the UI device 112b.
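A minimal server-side sketch of this save-and-transfer flow is given below in TypeScript; the in-memory map standing in for the database 151, and all identifiers, are assumptions made for illustration.

```typescript
// Hypothetical server-side handler: look up the saved state for a user and
// hand it to the requesting UI device (the Map stands in for database 151).
interface SavedState {
  userId: string;
  state: Record<string, unknown>;
  capturedAt: number;
}

const stateDb = new Map<string, SavedState>();

function saveState(s: SavedState): void {
  stateDb.set(s.userId, s); // keep only the most recent capture per user
}

function handleStateRequest(
  requestingDeviceId: string,
  userId: string,
): SavedState | undefined {
  const saved = stateDb.get(userId);
  if (saved !== undefined) {
    console.log(`transferring state of ${userId} to ${requestingDeviceId}`);
  }
  return saved; // the UI for the requesting device renders from this state
}
```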
In some embodiments, each of the UI devices 112a-112k may capture UI state information and store the UI state information at the database 151 when the user interacts with the respective UI device. The UI device may transmit UI state information to the server 150 through the network 100. The server 150 may transmit the UI state information to any of the UI devices 112a-112k so that, for example, upon receiving a request from a particular one of the UI devices 112a-112k, the particular UI device may operate in a manner consistent with the received UI state information.
As an example, the user 901 may begin using the UI device 112a (although the following examples may also be carried out with any of the UI devices 112b-112k). As the user 901 interacts with the UI device 112a, the UI device 112a may periodically capture and save UI state information. The UI state information may relate to the user 901, for example, representing a user ID or a user role. The UI state information may also relate to the user's session, including information regarding: the programs or routines running on the UI device 112a, the time of capture, the length of the session, the configuration of the graphics displayed at the display 920 of the UI device 112a, the entities (i.e., process areas, devices, equipment, or data) monitored or controlled at the UI device 112a, and/or the type of UI device being used (in this case, a stationary workstation). After capturing and saving the UI state information, the UI device 112a may transmit the UI state information over the process control network 100 to the server 150, so that the server 150 may store the UI state information at the database 151.
The user 901 may decide to use a mobile UI device, such as any of the UI devices 112b-112f or 112i-112k. In an embodiment, the user 901 may utilize the UI device 112b, where the UI device 112b may identify the user 901. The UI device 112b may communicate with the server 150 to obtain the most recent UI state information associated with the user 901 (i.e., in this case, the UI state information most recently captured at the UI device 112a). In some embodiments, the communication may trigger the UI device 112a to capture additional state information. The UI device 112b may generate a GUI configuration based on the received UI state information, such that the display of the UI device 112b corresponds, at least in part, to the display of the UI device 112a as of the most recent state information capture. In other words, the mobile control room 900c operates to enable a state transfer, or state synchronization, between the UI device 112a and the UI device 112b (see, e.g., FIG. 10, which shows how the displays may appear during UI synchronization or state transfer). As a result of the state transfer, the user 901 experiences minimal disruption in workflow.
In some embodiments, the capture of UI state information may be automated. For example, the UI device 112a may capture state information on a predetermined, periodic basis (e.g., once every 5, 10, or 30 minutes). The UI device 112a may also capture state information in response to a triggering event or activity. The triggering event may relate to user input (e.g., capturing state information whenever user input is received, or on a schedule related to receiving user input) or to information provided at the UI device 112a (e.g., capturing state information whenever an alarm is present, or whenever a particular measurement or value reaches a specified threshold). Alternatively or additionally, the UI device 112a may capture UI state information manually, i.e., in response to user input representing a command to capture or transfer the UI state information. For example, the display 920 may provide a graphic with which the user 901 may interact to initiate a capture. The input interface 930 may likewise have a mechanism (e.g., a button, key, or trackpad) allowing the user 901 to initiate a capture. In some embodiments, a request by another UI device (e.g., one of the UI devices 112b-k) may also trigger a capture at the UI device 112a. As another example, the UI devices 112a-112k may capture and transfer state information when two UI devices touch or are brought into close proximity to each other (e.g., within 5 cm, 2 cm, or 1 cm, as may be detected via near field communication).
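The periodic and trigger-based capture policies described above could be wired together as in the sketch below (TypeScript); the interval value, threshold check, and callback names are illustrative assumptions.

```typescript
// Sketch of automated capture: a periodic timer plus a threshold trigger.
function startPeriodicCapture(
  capture: () => void,
  minutes: number,
): ReturnType<typeof setInterval> {
  return setInterval(capture, minutes * 60 * 1000);
}

function captureOnThreshold(
  capture: () => void,
  value: number,
  threshold: number,
): void {
  if (value >= threshold) capture(); // e.g., an alarm condition
}

const timer = startPeriodicCapture(() => console.log("state captured"), 5);
captureOnThreshold(() => console.log("state captured"), 98.7, 95);
```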
In further embodiments, the UI device 112b may automatically identify the user 901. For example, the user 901 may carry a unique tag (e.g., in a badge or card with an RFID chip) that identifies the user 901. In other embodiments, the tag may be any tag or device capable of providing identification information, such as an NFC device, a barcode, a Bluetooth device, or any other wireless access point. The UI device 112b may have a tag scanner or reader (e.g., an RFID scanner) that detects the unique tag. The UI device 112b may access a database to identify the user associated with the unique tag, allowing the UI device 112b to identify the user 901. The database may be located at the UI device 112b; in other embodiments, the database 151 located at the server 150 associates tags with users, and the UI device 112b may communicate with the server 150 to identify the user 901. In other embodiments, each UI device may be assigned to a particular user, such that only a single user interacts with the UI device. In such embodiments, the UI device 112b may be assigned to the user 901, such that the UI device 112b may assume that any user interacting with the UI device 112b is the user 901. Alternatively, the UI device 112b may require the user 901 to enter a user ID and password to log on to the UI device 112b, allowing the UI device 112b to identify the user 901.
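A simple tag-to-user lookup of the kind described, checking a local table first and then falling back to the server-side association, might be sketched as follows; the endpoint and tag values are hypothetical.

```typescript
// Hypothetical tag-to-user lookup for automatic user identification.
const localTagDb = new Map<string, string>([["TAG-0042", "user901"]]);

async function identifyUser(tagId: string): Promise<string | undefined> {
  const local = localTagDb.get(tagId);
  if (local !== undefined) return local; // known locally
  // Fall back to the server-side association (endpoint is an assumption).
  const resp = await fetch(`/api/tags/${encodeURIComponent(tagId)}/user`);
  return resp.ok ? (await resp.json()).userId : undefined;
}
```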
In other embodiments, the user 901 may use another UI device (e.g., any of the UI devices 112c-112k in place of the UI device 112b) to cause a state transition or state synchronization from the UI device 112a to one of the UI devices 112c-112 k. For example, the user 901 may synchronize a tablet device, such as the UI device 112c, with the state information recently captured at the UI device 112 a. In other instances, the user 901 may synchronize a watch (e.g., UI device 112d), a laptop (e.g., UI device 112e), a headset (e.g., UI device 112f), or a television (e.g., UI device 112g) to the most recently captured state information at the UI device 112 a.
Further, the state information of the UI device 112a may be saved and later restored to the same UI device 112a, allowing the user 901 to save a session on the UI device 112a and continue that session on the same UI device 112a at a later time. The UI device 112a may return to the previous UI state by accessing the state information saved at the UI device 112a or at the server 150. This stands in contrast to some prior-art systems, in which it may be difficult to resume a session, even on the same device, after some time has passed, because multiple users interact with the same console.
In still other alternative embodiments, user 901 may use any of UI devices 112b-112k in place of UI device 112 a. The respective UI device utilized by the user 901 may capture state information related to the respective UI device. The captured state information may be communicated to server 150, where it may be stored at database 151 and accessed by the same or another UI device.
In some instances, the server 150 may be a UI device similar to any of the UI devices 112a-112k (i.e., the server 150 may include a display and an input interface, and may be used as a UI device). In such instances, the state information saved at the server 150 may be accessed to provide UI information at the server 150, so that a user may utilize the server 150 as a UI device. Similarly, in some embodiments, any of the UI devices 112a-112k may operate as a server similar to the server 150.
In another embodiment, the UI devices 112a-112k may pass state information between each other over the network 100, or over some other network or communication channel, such as a personal area network (e.g., a Bluetooth network) or near field communication. In some embodiments, the receiving UI device may initiate the transfer of the UI state information, while in other embodiments the transferring UI device initiates the transfer. In still other embodiments, the state transfer may be accomplished by saving the UI state information to a memory (e.g., a memory on a USB thumb drive) and accessing that memory to retrieve the UI state information at the second UI device.
In some embodiments, the state transfer may be automatic and transparent to a user of any of the UI devices 112a-112 k. For example, the state transfer may be initiated automatically when the UI device is brought into proximity with another UI device. The UI devices may include circuitry (e.g., NFC circuitry) to allow the UI devices to detect each other. Such proximity may also be detected by location data received at, for example, a GPS receiver that may be included on one or more of the UI devices. The UI device may send location data to the server 150, where the server 150 may use the location data to determine the proximity and initiate the state transition. In some embodiments, one or more of the UI devices may display an indicator graphic indicating that the respective UI device is receiving or transmitting status information. The indicator graphic may also indicate that the UI device is cooperating with another UI device.
FIG. 10 illustrates exemplary device displays associated with UI synchronization between the UI devices 803a and 803b (e.g., as may appear during or after a state transfer). In FIG. 10, the UI device 803a may be a stationary workstation and the UI device 803b may be a mobile device (e.g., a tablet). The UI device 803a includes a display 820a and an input interface 830a. The display 820a may provide a GUI configuration 1010a that includes a tank graphic 1015a, a level indicator graphic 1016a, a pump graphic 1020a, a valve graphic 1025a, a valve graphic 1030a, a graph 1035a, a graph 1040a, and a graph 1045a. The UI device 803b includes a display 820b and an input interface 830b. The display 820b provides a GUI configuration 1010b that includes a tank graphic 1015b, a level indicator graphic 1016b, a pump graphic 1020b, a valve graphic 1030b, and a graph 1040b.
The UI device 803a may capture the UI state information 896 and send the UI state information 896 to the server 150 or to another UI device, such as the UI device 803b. In capturing the UI state information 896, the UI device 803a may determine which entities are associated with the output provided at the display 820a. For example, the UI device 803a may identify the entities associated with the graphics 1016a-1045a (the tank, the pump, the two valves, and the devices associated with the graphs 1035a-1045a) and save those entities as part of the state information 896. In addition to identifying the entities, the UI device 803a may also identify the coordinate locations associated with the graphics provided at the display 820a. As a result, the UI state information 896 may reflect, for example, that the tank graphic is located in the middle of the screen. The UI device 803a may also identify the locations of the various windows or boxes associated with any executing applications. Further, the UI device 803a may identify the programs or routines executing at the UI device 803a and may save information indicating the state of each program. For example, a browser may be executing, and the UI device 803a may identify the resources (e.g., a web page, image, video, or some other content) being accessed or used by the browser.
The UI device 803b may receive the UI state information 896 from the UI device 803a (or, in other embodiments, from the server 150). The UI device 803b provides output based on the received UI state information 896. In particular, the UI device 803b may display visual representations or graphics at the display 820b based on the received UI state information 896. Because the UI device 803b may be a different type of device, with a different display size, than the UI device 803a, the UI device 803b may provide a different GUI configuration than that provided at the UI device 803a. In particular, the UI device 803b may identify the highest-priority entities and programs from the UI state information 896 and may generate the GUI configuration 1010b accordingly. In this example, the UI device 803b identifies the entities associated with the graphics 1015b, 1016b, 1020b, 1030b, and 1040b as high priority. Due to limited screen space, the UI device 803b may not generate graphics corresponding to the graphics 1025a, 1035a, or 1045a shown at the display 820a of the UI device 803a. The UI device 803b may also position the graphics in the GUI configuration 1010b according to the relative positions of the corresponding graphics in the GUI configuration 1010a.
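The prioritize-and-fit behavior just described might be approximated by the following TypeScript sketch; the priority values and the layout rule are assumptions, as the disclosure leaves the selection policy open.

```typescript
// Sketch: keep only the highest-priority graphics that fit the target
// display, then restore their relative on-screen ordering.
interface EntityGraphic {
  entity: string;
  priority: number; // higher means more important
  x: number;        // coordinate location from the captured state
  y: number;
}

function selectForDisplay(
  graphics: EntityGraphic[],
  maxGraphics: number,
): EntityGraphic[] {
  return [...graphics]
    .sort((a, b) => b.priority - a.priority) // highest priority first
    .slice(0, maxGraphics)                   // drop what the screen cannot fit
    .sort((a, b) => a.y - b.y || a.x - b.x); // preserve relative layout
}
```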
Furthermore, because the UI device 803b has a different type of input interface (i.e., touch-based rather than keyboard-based), the UI device 803b may generate graphics of different sizes and shapes than the graphics generated at the UI device 803a. For example, the UI device 803b may generate larger graphics that are more easily manipulated via touch.
In some embodiments, particularly in embodiments where the UI device 803a and the UI device 803b are the same type of device, the GUI configuration 1010b of the UI device 803b may be the same as the GUI configuration 1010a of the UI device 803a. In other embodiments, the GUI configuration 1010b may bear little resemblance to the GUI configuration 1010a. In some instances, for example, the output provided at the display 820b of the UI device 803b may be partially or entirely text-based. Even in such embodiments, the UI device 803b may use the UI state information 896 to determine the process entities about which the UI device 803b should provide information. For example, even if the UI device 803b does not display a graphic corresponding to the tank graphic 1015a of the UI device 803a, the UI device 803b may determine that the tank is a high-priority entity and may provide the related information in text form (e.g., a textual tank level value corresponding to the level indicator graphic 1016a).
FIG. 11 is a flow diagram illustrating one example method 1100 for synchronizing UI devices 112. Other example methods are described below, and the method 1100 is not intended to be limiting. As described above, synchronizing UI devices 112 may enable a user to resume a previous session on the same or another device, and may enable two or more users to collaborate by exchanging information. The method 1100 may be implemented, in whole or in part, by one or more devices and systems, such as those shown in FIGS. 1-10. The method 1100 may be implemented as a set of instructions, routines, programs, or modules saved on the memory 815 of the UI device 112, and may be executed by the processor 810 of FIG. 8.
In method 1100, UI device 112 receives a request for UI state information 896 (block 1101). The UI device 112 identifies the UI state of the first UI device 112 (block 1105). Identifying the UI state may include identifying an output provided at a display of the first UI device 112. Identifying the output provided at the display may include identifying visual representations and graphics provided at the display of the first UI device 112 and identifying entities associated with the visual representations and graphics. Identifying the output provided at the display may further include identifying a process parameter provided at the display; identifying a GUI configuration at the display; and identifying a UI type or device type of the UI device 112.
The first UI device 112 may identify a process entity associated with output provided at a display. The process entities may include process parameter data, process plant areas, field devices, executing applications, or application states. For example, the first UI device 112 may identify a tank graphic provided at the display. Based on this identification, the first UI device 112 may identify a tank level measurement, a process plant area (e.g., a boiler area) of the tank, a field device associated with the tank (e.g., an inlet valve to the tank, a discharge pump of the tank, a temperature sensor of the tank material, etc.), an application executing on the first UI device 112 (e.g., a browser, a history and alarm management suite, etc.), and/or a status of executing the application (e.g., a resource accessed or used by the browser, a parameter used or displayed by the history, or an alarm displayed by the alarm management suite).
After identifying the UI state of the first UI device 112, the first UI device 112 may send data representing the identified UI state to the second UI device 112 (block 1110). More specifically, the first UI device 112 may transmit data representing the identified entity to the second UI device 112. In an alternative embodiment, the first UI device 112 may transmit the entity data to the server 150, where the server 150 may subsequently transmit the entity data to the second UI device 112.
After receiving the UI state information 896 at the second UI device 112, the second UI device 112 may provide output corresponding to the received UI state (and, more particularly, to the received entity data). For example, the second UI device 112 may provide, at a display, the identified process parameter data (i.e., the process parameter data provided at the first UI device 112). The second UI device 112 may also generate, at the display, a graphical overview of the identified one or more plant areas (i.e., the areas associated with the identified output at the first UI device 112). Additionally or alternatively, the second UI device 112 may generate, at the display, graphical representations of the one or more identified field devices (i.e., the devices associated with the output provided at the first UI device 112). The second UI device 112 may also launch applications corresponding to the identified applications (i.e., the applications running at the first UI device 112). Finally, the second UI device 112 may cause one or more of those applications to be placed into the identified states (i.e., the one or more application states identified at the first UI device 112).
By way of further example, and still referring to FIG. 11, the UI device 803 may capture UI state information 896 and transmit the state information to the process control network 100. UI state information 896 may represent the state of UI device 112 at the time of capture. The processor 810 may be operable to capture UI state information 896 by causing the memory 815 to store data representing a UI state. The processor 810 may retrieve the UI state information 896 from the memory 815 and send the UI state information 896 to the process control network 100 via the network interface 825. The UI state information may ultimately be received by a node (e.g., the server 150) on the process control network 100. In another embodiment, the UI status information 896 may be sent via a peripheral interface (e.g., a USB interface, WiFi interface, bluetooth interface, or NFC interface) that sends the UI status information 896 to another UI device 803.
As discussed with respect to FIG. 1A, and as discussed below with respect to FIGS. 12A and 12B, the UI state information 896 may include information or data such as profile data relating to the user or operator interacting with the UI device 803. All or some of the profile data may be received at the input interface 830 or at the network interface 825. The processor 810 may cause the input interface 830 or the network interface 825 to transmit the profile data to the memory 815 via the system bus 835. In some embodiments, the processor 810 may generate the profile data in response to data, relating to the user of the UI device 803 or of a similar UI device 803, received from the input interface 830 or the network interface 825. In other embodiments, the profile data may already exist in the memory 815, in which case the processor 810 may access the profile data, or may save the profile data in a different data structure (e.g., the processor 810 may access profile data collected during operation of the operating system 880 or of another application on the UI device 803, and may cause the profile data to be saved in a particular database used for the UI state transfer operation).
In addition to the profile data, the UI state information 896 may include session data relating to the output (i.e., graphics or sound) provided at the UI device 803, to the applications executing at the UI device 803, and to the states of those applications. In other words, in the illustrated embodiment, the processor 810 may generate session data based on the output provided at the display 820 and based on data generated or used during the operation of other applications executed by the processor 810. In addition to the user profile data and the session data, the UI state information 896 may include any other data relating to the operation or state of the UI device 803.
In another embodiment of the UI device 803, the UI device 803 may receive UI state information 896 from the process control network 100 and may be operable to place the UI device 803 in a state corresponding to the UI state information 896. In such embodiments, the UI state information 896 may represent a previously captured operating state of another UI device (a "previous UI device," e.g., the UI device 803b) or of the UI device 803 itself. In operation of such an embodiment, the UI state information 896 may be received at the network interface 825 via the process control network 100. The network interface 825 may send the UI state information 896 to the memory 815 for storage. The processor 810 may access some or all of the UI state information 896 stored in the memory 815 to place the UI device 803 in a state consistent with some or all of the UI state information 896. The UI state information 896 may indicate, for example, that the previous UI device was providing information related to the process or to a particular entity in the process control network 100. The processor 810 may cause the display 820 to display information corresponding to that same entity. The display 820 may show the information in the same or a similar GUI configuration as that used by the previous UI device 803b, but may also utilize a different GUI configuration in certain circumstances (e.g., where the UI device 803 is a different type of device than the previous UI device 803b). In some embodiments, the processor 810 may identify a point of interest (e.g., an entity of interest) based on the UI state information 896 and may cause the display 820 to provide information related to the identified point of interest.
In addition to, or instead of, indicating a process entity, the UI state information 896 may indicate the state of one or more of the applications that were running on the previous UI device 803b. The processor 810 may cause the one or more applications to launch and operate in the indicated states. For example, the UI state information 896 may indicate that a browser window was open and displaying a particular web page. In such an example, the processor 810 may cause a browser application to launch and open that same web page. In another example, the UI state information 896 may indicate that a process history viewing tool was running and that a particular process value was being accessed or displayed by the viewing tool. In such an example, the processor 810 may cause a viewing tool application to launch and to access or display the same process value.
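One way such a restore step could be expressed is sketched below in TypeScript; the application names and state fields are invented for illustration and do not come from this disclosure.

```typescript
// Hypothetical restore step: relaunch each application indicated by the
// received UI state information in its captured state.
interface AppState {
  name: string;                   // e.g., "browser", "historyViewer"
  state: Record<string, unknown>; // e.g., { url: "..." } or { parameter: "..." }
}

function restoreApplications(
  apps: AppState[],
  launch: (name: string, state: Record<string, unknown>) => void,
): void {
  for (const app of apps) {
    launch(app.name, app.state); // open the app and place it in its saved state
  }
}
```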
Turning now to fig. 12A, a block diagram illustrates exemplary data associated with UI device 112 in mobile control room 1200 a. The mobile control room 1200a may enable state transfer to one or more UI devices 112, allow users of the respective UI devices 112 to resume a workflow from a previously saved state or allow users of UI devices 112 to collaborate with users of other UI devices 112. The mobile control room 1200a includes a server 150, a process control network 100, and UI devices 112. In some embodiments, the server 150 may also serve as the UI device 112, where the server 150 includes a display 820 for displaying GUI configurations and providing process information to an operator or user. In such embodiments, the server 150 may also include an input interface 830 for receiving user input.
The server 150 includes a processor 1201, a network interface 1202, and a memory 1203. The memory 1203 stores UI state information 1240, which may include profile data 1245 and/or session data 1265. The UI state information 1240 may be stored in the database 151 shown in FIG. 9B. The server 150 may communicate over the process control network 100 using a wired or wireless communication channel. Similarly, each UI device 112 may communicate over the process control network 100 using a wired or wireless communication channel, and each UI device 112 may communicate with the server 150.
The memory 1203 of the server 150 may include volatile and/or non-volatile memory, and may be removable or non-removable memory. For example, the memory 1203 may include computer storage media in the form of: Random Access Memory (RAM), Read Only Memory (ROM), EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information. The processor 1201 is configured to fetch and execute instructions stored in the memory 1203. The memory 1203 may store data such as operating system data or program data. The network interface 1202 may include one or more antennas for wireless communication, one or more ports for wired connections, or both. In some embodiments, the network interface 1202 may include one or more GPS receivers, Bluetooth transceivers, NFC transceivers, RFID transceivers, and/or local network transceivers. The network interface 1202 may communicate with the UI devices 112 via the process control network 100.
Each UI device 112 may include data representing a user ID 1205, a session ID 1210, a client device ID 1215, and/or a UI type 1220. The user ID 1205 may correspond to a single user or operator and serve as a unique identifier for that user. Similarly, the session ID 1210 may serve as a unique identifier for a particular user session at the UI device 112. A user session is generally considered a period of use by a particular user without any extended interruption. In general, when a user stops using the UI device 112a for an extended period of time and then resumes using it, the subsequent use may represent the start of a new session (unless the session is continued, as described below). The client device ID 1215a may serve as a unique identifier for the UI device 112a. Finally, the UI type 1220a may represent the type of GUI implemented at the UI device 112a. The UI type generally corresponds to the device type of the UI device. In the preferred embodiment, there are two common UI types: a normal UI and a mobile UI. Desktops, laptops, and other UI devices with larger screens typically implement a normal UI. Mobile devices (e.g., phones, PDAs, and tablets), on the other hand, typically implement a mobile UI, which provides larger graphics and text (relative to the screen size). In many embodiments, the mobile UI may provide a different GUI configuration and different graphics, owing to the size limitations of many mobile device screens. In other embodiments, there may be other UI types, such as a phone UI, a tablet UI, or a headset UI.
The profile data 1245 may include user profiles 1250a-1250d. Each of the user profiles 1250a-1250d may correspond to a unique user or operator. The user profile 1250a may include data representing a user ID 1252, a user role 1254, and user history data 1256. The user profiles 1250b-1250d may include similar elements. The user ID 1252 may represent a unique identifier for a particular user and may correspond to the user ID 1205 at the UI device 112a. The user role 1254 may represent the responsibility, job, or role of the particular user at the process plant. For example, the user role 1254 may restrict the plant areas over which the user has control rights. The user role 1254 may also limit the degree of control the user may exercise, or the types of programs the user may access. In some embodiments, the user role 1254 may also restrict the user's rights to access and control entities in the process plant according to a schedule. For example, a user may have the right to exercise control only during his or her scheduled working hours (e.g., from 8 AM to 5 PM). Finally, the user history data 1256 may represent trends, habits, and preferences of the user associated with the user profile 1250a. The user history data 1256 may, for example, reveal particular plant areas, particular equipment or devices, or particular process parameters in which the user tends to be interested.
The session data 1265 may include sessions 1270a-1270d. The session 1270a may include data representing a session ID 1272, a user ID 1274, a client device ID 1276, a UI type 1278, application state data 1280, and session time data 1282. Each of the sessions 1270b-1270d may include similar elements. The session ID 1272 serves as a unique identifier for a particular session. The user ID 1274 may represent a unique user and may correspond to the user ID 1252 of the user profile 1250a and to the user ID 1205 of the UI device 112a. The client device ID 1276 may uniquely identify a particular UI device and may correspond to the UI device ID 1215a. Similarly, the UI type 1278 may correspond to the UI type 1220a at the UI device 112a. The application state data 1280 may represent the programs that were running at the UI device when the UI state information 1240 was captured, and may also represent the state of each such application at the time of capture. The session time data 1282 may represent time data, such as the start time of the session, the end time of the session, the length of the session, and so on.
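For concreteness, the profile and session records described in the last two paragraphs could be modeled as the following TypeScript interfaces; this is only a sketch mirroring the reference numerals above, not a schema defined by this disclosure.

```typescript
// Sketch of the described records; field comments map to reference numerals.
interface UserProfile {
  userId: string;        // user ID 1252
  userRole: string;      // user role 1254, e.g., "operator"
  userHistory: string[]; // user history data 1256: areas/devices of interest
}

interface Session {
  sessionId: string;                         // session ID 1272
  userId: string;                            // user ID 1274
  clientDeviceId: string;                    // client device ID 1276
  uiType: "normal" | "mobile";               // UI type 1278
  applicationState: Record<string, unknown>; // application state data 1280
  startTime: number;                         // session time data 1282
  endTime?: number;
}
```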
In operation, UI device 112a may capture UI state information 1240 (including profile data 1250a and session data 1270 a). When the user session has ended, the UI device 112a may send the UI state information 1240 to the server 150 for storage. The network interface 1202 may receive UI state information 1240 from the process control network 100. The processor 1201 may operate to send the UI state information 1240 to the memory 1203 for storage. In other embodiments, the UI device 112a may send all or a portion of the UI state information 1240 to the server 150 periodically or in response to a triggering event. Server 150 may then transmit all or a portion of UI state information 896 to a UI device, such as UI device 112 b.
Similar to FIG. 12A, FIG. 12B is a block diagram illustrating exemplary data associated with the UI devices 112 in a mobile control room 1200b. The mobile control room 1200b may enable state transfer from a first UI device 112a to one or more other UI devices 112b, 112c. As in the mobile control room 1200a, the mobile control room 1200b allows a user of the UI device 112a to resume and/or continue a workflow on the UI device 112b, or to collaborate with another user using the UI device 112b. The mobile control room 1200b includes the server 150, the process control network 100, and the UI devices 112a-c. In some embodiments, the server 150 may also serve as a UI device 112, where the server 150 includes a display 820 for displaying a GUI configuration and providing process information to an operator or user. In such embodiments, the server 150 may also include an input interface 830 for receiving user input.
The mobile control room 1200b differs from the mobile control room 1200a in at least one respect. Specifically, in the mobile control room 1200b, state and/or session data are transferred from the UI device 112a to the UI device 112b directly, for example, rather than via the server 150. Each of the UI devices 112 stores UI state information 1240, which may include session data 1265. The session data 1265 stored by each of the UI devices 112 may include the user ID 1205, the session ID 1210, the UI device ID 1215, the UI device type 1220, the application state data 1280, and the session time data 1282.
The user profile data 1245 described with reference to fig. 12A may be stored in the server 150 and/or in a memory of the individual UI device 112. In this manner, any user can use any of the UI devices 112, and a user profile (including the user's preferences, roles, historical data, etc.) will be available to the UI devices 112. In some embodiments, when a particular user logs into the UI device 112, the UI device 112 may download or access user profile data 1245 from the server 150. In other embodiments, a profile of all users or users who have previously used a particular UI device 112 may reside in the memory of the UI device 112.
In operation, each UI device 112 may store one or more applications, such as a display application, in the memory 815 for viewing information related to the process plant. The UI device 112 may store the state of an application in the application state data 1280 periodically and/or in response to a request to transfer the state to another UI device 112. For example, a user may be viewing process plant data using a viewing application on the UI device 112a. The viewing application may reside on the UI device 112 and may retrieve and/or receive data (e.g., process data) from the server 150. In an embodiment, the UI device 112a receives both process data and visual data from the server 150. For example, the UI device 112a may receive trend data related to a particular process parameter from the server 150, and may additionally receive rendering instructions indicating the manner in which the trend data are to be displayed (e.g., 3D plot information, table information, axis information, etc.). The presentation data may be sent as a separate entity, allowing the same data to be sent with different presentation (e.g., format) information depending on the target device. In either case, the UI device 112a maintains specific information related to the state of the applications running on the UI device 112a, including information related to what data is being displayed, what plant area or device is being displayed, what task is being performed, and so forth.
The user may desire to switch from UI device 112a to UI device 112b, for example, to move from a workstation UI device to a tablet UI device. To accomplish this, the user may initiate a state transfer from UI device 112a to UI device 112b. In a first embodiment, the user brings the UI device 112b close to the UI device 112a so that NFC devices in each UI device 112 can communicate with each other to establish and set up a connection. The NFC devices may cooperate, for example, to establish a Bluetooth or WiFi connection, so that session data 1265a may be transferred from the UI device 112a to the UI device 112b, allowing the UI device 112b to continue the session in a state similar or identical to the state of operation on the UI device 112a. In a second embodiment, the user may engage one or more menus displayed on the display 820 of the UI device 112a to select the session number displayed on the UI device 112b. Other embodiments for transferring state, which may be employed in this and other cases, are also described in this specification. The devices may then communicate via Bluetooth or WiFi, via the network 100 (and optionally the server 150) or directly between themselves, to transfer the session data 1265a from the UI device 112a to the UI device 112b. Once the mobile UI device 112b receives the session data 1265a and stores it as session data 1265b, the UI device 112b may resume the session previously operating on the UI device 112a.
In an embodiment, the state transfer from the first UI device 112 to the second UI device 112 also transfers any control rights associated with the UI device 112. For example, in some embodiments, a controller or other process device may receive input from only a single source at a time. In such an example, it is important to explicitly establish the source of the input and remove any potential conflicts. In the case where the user switches from the first UI device 112 to the second UI device 112, any such input must be explicitly associated with the second UI device 112 after the state is transferred to that device. In such a case, the server 150 may maintain tracking data (e.g., the UI device ID 1276 associated with a particular session 1265) and may reassign the UI device ID upon transfer to the second UI device. The server 150 may be able to determine that a transfer has occurred based on the most recent request for process control data (even if the transfer occurred directly between the first and second UI devices 112). For example, the server 150 may determine that UI device 112b has most recently requested data, and thus may determine that UI device 112b now has control of the session. Alternatively, once the session has been transferred, the UI device 112a may relinquish or disavow the session by sending a message to the server 150 to indicate that the UI device 112a is no longer associated with the session transferred to the UI device 112b, or the UI device 112b may send a similar message to the server 150 to affirmatively indicate that the UI device 112b is now associated with the session and that the UI device 112a no longer is. In yet another embodiment, each session may have associated with it a "session token" that is stored in memory of the UI device and passed from device to device. When a device does not have the session token for a particular session, the device will not send commands (or will at least refrain from sending a subset of commands), even if the device still maintains the session. In this manner, data associated with a particular session may continue to be displayed on the UI device 112a even after a state transfer has occurred and the session token has been passed to the UI device 112b. The session token may take any form including, for example, a secure file, a hash code, a special code or sequence of characters, and so forth.
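One possible realization of the "session token" behavior described above is sketched below. The token here is simply a random string, and the set of controlling commands that a token-less device suppresses is assumed for illustration:

```python
import secrets
from typing import Optional

class SessionEndpoint:
    """A UI device's view of a shared session (illustrative only)."""

    # Commands assumed, for illustration, to require the session token.
    CONTROLLING_COMMANDS = {"write_parameter", "acknowledge_alarm"}

    def __init__(self, device_id: str) -> None:
        self.device_id = device_id
        self.session_token: Optional[str] = None

    def grant_token(self) -> str:
        """Mint a token on the device that currently controls the session."""
        self.session_token = secrets.token_hex(16)
        return self.session_token

    def transfer_token_to(self, other: "SessionEndpoint") -> None:
        """Pass the token from device to device; the sender keeps the session
        (and may keep displaying its data) but loses the right to command."""
        other.session_token, self.session_token = self.session_token, None

    def send_command(self, command: str) -> bool:
        """Suppress controlling commands on a device without the token,
        even though the device still maintains the session."""
        if command in self.CONTROLLING_COMMANDS and self.session_token is None:
            return False  # command suppressed
        # ... transmit the command to the server/controller here ...
        return True
```

Under this sketch, the first device may keep displaying session data after calling transfer_token_to(), but its controlling commands return False until it regains the token, which mirrors the conflict-avoidance rationale above.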
Various methods related to the concepts described in the preceding paragraphs will now be described with reference to the corresponding figures.
FIG. 13 is a flow diagram of an example method 1300 for providing session data to a UI device 112. Providing session data may facilitate UI state transfer or synchronization, supporting workflow continuity or worker collaboration. Method 1300 may be implemented in whole or in part by one or more devices or systems, such as the server 150 shown in FIGS. 1, 9, and 12. The method may be stored in the memory 1203 as a set of instructions, routines, programs, or modules, and may be executed by the processor 1201.
The method 1300 begins when the server 150 receives a session request from the UI device 112 (block 1305). The server 150 may determine whether the UI device 112 provided a user ID (block 1310), and may request a user ID if one was not provided (block 1315). Once the user ID has been provided, the server 150 may identify the data associated with the user ID (block 1320). For example, there may be one or more user profiles, sessions, or UI devices 112 associated with the user ID. In an alternative embodiment, the server 150 may receive a UI device ID and identify data associated with the UI device ID (rather than the user ID).
After identifying the data associated with the provided user ID, the server 150 may determine whether the UI device 112 is requesting to resume a workflow from a previous session (block 1325). When there is no such request, the server 150 may identify default session data (i.e., data representing a new or default session) as the "target session" to be provided to the UI device (block 1330). The default session data may include data such as default GUI configuration data, default process parameter data, or default display data. For example, a default GUI configuration for a new session that does not resume a previous workflow may include an active window with a plant overview graphic. The server 150 may send the default session data to the UI device 112 (block 1350).
When the server 150 receives a request to resume a previous workflow, the server 150 may determine whether the UI device 112 has identified a particular session (block 1335). When a particular session is not identified, the server 150 may identify the most recently saved session associated with the user ID (or, in the alternative embodiment, the UI device ID) as the "target session" to be provided to the UI device 112 (block 1340). The server 150 may send that most recent session data to the UI device 112 (block 1350). When the server 150 receives an identification of a particular session with the request to resume the workflow, the server 150 may identify the stored session data for that particular session (e.g., stored in the memory 1203 of the server 150 shown in FIG. 12A) as the data for the "target session" to be provided to the UI device 112 (block 1345). The server 150 may send the particular session data to the UI device 112 (block 1350).
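The decision logic of blocks 1305-1350 might be sketched as follows; the request fields and the session store layout are assumptions for illustration, not a prescribed server design:

```python
from typing import Optional

def handle_session_request(request: dict, session_store: dict) -> dict:
    """Sketch of method 1300: choose the "target session" for a UI device.

    `request` is assumed to carry optional 'user_id', 'resume', and
    'session_id' fields; `session_store` maps a user ID to that user's
    saved sessions, most recent last.
    """
    user_id: Optional[str] = request.get("user_id")
    if user_id is None:                         # blocks 1310/1315
        return {"status": "need_user_id"}

    saved = session_store.get(user_id, [])      # block 1320

    if not request.get("resume"):               # blocks 1325/1330
        target = {"gui_config": "default", "display": "plant_overview"}
    elif request.get("session_id") is None:     # blocks 1335/1340
        target = saved[-1] if saved else {"gui_config": "default"}
    else:                                       # block 1345
        target = next((s for s in saved
                       if s.get("session_id") == request["session_id"]),
                      {"gui_config": "default"})

    return {"status": "ok", "session": target}  # block 1350
```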
In an alternative embodiment, the server 150 may be the second UI device 112, wherein the second UI device 112 receives the session request from the first UI device 112 and provides the session data to the first UI device 112.
FIG. 14 is a flow diagram of an example method 1400 for generating a GUI configuration at the UI device 112. The method 1400 may enable the UI device 112 to provide output according to information received in a UI state transfer and according to the environment and context in which the UI device 112 is used. The method 1400 may be implemented in whole or in part at one or more devices or systems, such as any of the UI devices 112 or 112a-g (FIGS. 1-10 and 12). The method 1400 may be stored in the memory 815 as a set of instructions, routines, programs, or modules, and may be executed by the processor 810.
Method 1400 begins when the UI device 112 identifies context data (block 1405). The UI device 112 may also identify an entity associated with the context data. The context data may be any context information or item. In one embodiment, the context data may represent any of the elements included in the context awareness data 1540 or the work item data 1550 described with respect to FIG. 15. The associated entity may be any area, device, equipment, or parameter associated with the context item.
The method 1400 may include receiving UI state information 896, such as the UI state information 896 shown in FIG. 12 (block 1410). The UI device 112 may receive the UI state information 896 from a device or system implementing the method 1300 shown in FIG. 13. After receiving the UI state information 896, the UI device 112 may identify an entity associated with the received UI state information 896 (block 1420). An entity may be any area, device, system, or parameter in the process. Typically, the entity associated with the UI state information 896 is also associated with the information provided at the previous UI device 112 that captured the UI state information 896.
The UI device 112 may prioritize the entities (block 1430). An entity may have a higher or lower priority depending on factors such as: the importance of the entity to stable operation of the process; time sensitivity (e.g., a batch of product may be scrapped if the entity is not attended to quickly); location (e.g., the UI device 112 is close to a location associated with the entity); status (e.g., the entity is malfunctioning or associated with a malfunction); alarm condition (e.g., the entity is associated with a parameter value outside of a normal operating range); schedule (e.g., the entity may be associated with an offline device); or work item relevance (e.g., the entity may be associated with a work item assigned to the user or the UI device 112).
The UI device 112 may generate a GUI configuration based on the prioritized entities (block 1435). Prioritization may be necessary when the UI device 112 is unable to display all of the information related to the entities identified from the context data and the received session. For example, in some embodiments the previous UI device 112 may be a workstation with a normal UI type, while the UI device 112 receiving the UI state information 896 is a tablet with a mobile UI type. Because mobile UI devices are configured for smaller screens, they typically provide less information. Thus, even after the UI device 112 has identified the entities associated with the context data, it may prioritize those entities in order to determine for which of them it should provide information.
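For instance, the prioritization of block 1430 could be a weighted score over the factors listed above; the factor names and weights below are purely illustrative:

```python
# Illustrative weights for the prioritization factors named above.
WEIGHTS = {
    "process_importance":  5.0,
    "time_sensitivity":    4.0,
    "proximity":           3.0,
    "fault_status":        4.0,
    "alarm_condition":     4.0,
    "scheduled_offline":   1.0,
    "work_item_relevance": 3.0,
}

def entity_priority(entity: dict) -> float:
    """Score one entity; each factor is assumed to be a 0..1 value."""
    return sum(WEIGHTS[f] * entity.get(f, 0.0) for f in WEIGHTS)

def prioritize(entities: list) -> list:
    """Block 1430: order entities so that a small (e.g., mobile) display
    can show only the highest-priority ones (block 1435)."""
    return sorted(entities, key=entity_priority, reverse=True)
```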
In other embodiments, the system or device providing the UI state information 896 may identify the UI type or device type of the UI device 112 receiving the UI state information 896. In such embodiments, the provisioning system may customize the UI state information 896 provided to the UI device 112. In other words, the provisioning system may provide more or less information based on the UI type or device type. The provisioning system can also provide display data formatted for the UI type or device type.
FIG. 15 is a flow diagram illustrating a method 1500 of directing state information transfer between two UI devices 112 in a process control plant 10. Method 1500 may be implemented in whole or in part at one or more devices or systems, such as any of UI devices 112. The method 1500 may be stored in the memory 815 as a set of instructions, routines, programs, or modules, and may be executed by the processor 810.
The method 1500 begins when the first UI device 112 executes one or more routines that perform a function (block 1505). The function may be a control function, an operation function, a configuration function, a maintenance function, a data analysis function, a management function, a quality control function, or a safety function. The first UI device 112 may be coupled, via a network, to a unified, logical data storage area, such as the big data facility 102. The unified, logical data storage area may be configured to store, using a common format, process data corresponding to the process plant. The process data may include multiple types of process data, including configuration data, continuous data, batch data, measurement data, and event data.
The first UI device 112 may transfer state information to the second UI device 112 (block 1510). The state information may indicate the one or more first routines operating on the first UI device 112. In some embodiments, the state information may be transferred via an Internet connection. In other embodiments, the state information may be transferred via an intermediate network. In still other embodiments, the state information may be transferred from the first UI device 112 to the second UI device 112 via a point-to-point wireless connection. In some instances, the state information may be transferred via wireless communication according to a protocol such as the Bluetooth protocol or the NFC protocol. In other instances, the state information may be transferred from the first UI device 112 to the second UI device 112 via an intermediary device (which may be the server 150). In a particular example, the first UI device 112 may transfer the state information to the second UI device 112 when the UI devices 112 mutually detect each other and the same user is logged onto both devices. In some embodiments, the state information may be transferred upon receipt, by the first UI device 112, of an instruction to transfer the state information. In some embodiments, transferring the state information may facilitate one or more of: collaboration between different users on the two UI devices 112; movement of a single user across the two UI devices 112; device awareness of the user's location in the process plant; or device awareness of the user's proximity to particular process plant equipment.
The second UI device 112 may receive the state information and execute one or more second routines (block 1515). The display of the second UI device 112 may be configured according to the stored state and according to the device type and/or UI type of the second UI device 112. The second routines may correspond to one or more of the first routines operating on the first UI device 112. In some embodiments, the second UI device 112 may receive a signal from a location awareness component and may modify the execution of the one or more second routines in accordance with the received signal. In some instances, the location awareness component may receive a signal from the second UI device 112 and, via the network, cause the second UI device 112 to modify execution of the one or more second routines accordingly. Modifying the execution of the one or more routines may include one or more of: highlighting an area of the process plant in which the second UI device 112 is located; displaying information related to a particular device within a predetermined distance of the second UI device 112; displaying an alert related to a device in the area of the process plant in which the second UI device 112 is located; or displaying work items related to devices in the area of the process plant in which the second UI device 112 is located.
In some embodiments, the second UI device 112 may receive a signal from a device awareness component and modify the execution of the one or more second routines in accordance with the received signal. In some embodiments, the device awareness component may include a transmitter that transmits a wireless signal to the second UI device 112. The wireless signal may identify the device with which the transmitter is associated.
In some embodiments, either or both of the first UI device 112 and the second UI device 112 may be mobile devices. In other embodiments, either or both of the first and second UI devices 112 may be workstations. In some embodiments, one UI device 112 may be a mobile device and the other may be a workstation. In an embodiment, the second UI device 112 may configure its display according to the state information received from the first UI device 112 and according to the device type or UI type associated with the second UI device 112.
FIG. 16 is a flow diagram illustrating an example method 1600 for transferring state information between two UI devices 112 coupled to the server 150 in the process plant 10. The method 1600 may be implemented in whole or in part at one or more networks or systems, such as the process control network 100. In particular, method 1600 may be implemented in whole or in part at one or more devices such as the server 150 or at one or more devices or systems such as any of the UI devices 112. The method 1600 may be stored as a set of instructions, routines, programs, or modules on the memory 815 or memory 1203 and may be executed by the processor 810 or processor 1201.
The method 1600 begins when the first UI device 112 executes one or more routines to perform a function in the process plant (block 1605). The first UI device 112 may track the state of the one or more first routines executing at the first UI device 112 (block 1610). In some embodiments, the server 150 may track the state of the one or more first routines executing at the first UI device 112. The first UI device 112 or the server 150 may store the tracked state of the one or more first routines (block 1615).
The first UI device 112 or the server 150 may transfer the stored state of the one or more first routines to the second UI device 112 (block 1620). In some embodiments, the state information may be transferred via an Internet connection. In other embodiments, the state information may be transferred from the first UI device 112 or the server 150 to the second UI device 112 via a point-to-point wireless connection. The state information may also be transferred from the first UI device 112 to the second UI device 112 via an intermediary device or the server 150. In some instances, the state information may be transferred via wireless communication according to a protocol such as the Bluetooth protocol or a near field communication protocol. In some embodiments, the state may be transferred to the second UI device 112 when the second UI device 112 detects the first UI device 112 or the first UI device 112 detects the second UI device 112. Transferring the stored state to the second UI device 112 may include transferring the stored state upon receiving, at the first UI device 112, an instruction that instructs the first UI device 112 to transfer the stored state to the second UI device 112.
The second UI device 112 may execute one or more second routines corresponding to the one or more first routines executed at the first UI device 112 (block 1625). In some embodiments, the second UI device 112 may receive a signal indicating that the second UI device 112 is proximate to a device or location and, after receiving the signal, may modify execution of the one or more second routines accordingly. In some embodiments, the second UI device 112 may send, to a location awareness component, a signal indicating that the second UI device 112 is proximate to the device or location. In such embodiments, the second UI device 112 may receive, from the server 150, information specific to the device or location.
In some embodiments, when proximate to the device or location, the second UI device 112 may take one or more of the following actions: highlighting an area of the process plant where the second UI device 112 is located; displaying information related to a specific device within a predetermined distance of the second UI device 112; displaying an alert related to a device in the area of the process plant where the second UI device 112 is located; displaying work items related to devices in the area of the process plant where the second UI device 112 is located; highlighting on the display a process plant device associated with the received signal; displaying information related to the particular device associated with the received signal on the second UI device 112; displaying an alert related to a device associated with the received signal; or displaying work items related to the device associated with the received signal.
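The proximity-triggered actions itemized above could be dispatched as in the following sketch; the Display interface and signal fields are assumptions for illustration:

```python
class Display:
    """Stub display interface; the method names are assumptions."""
    def highlight_area(self, area): print(f"highlight area {area}")
    def highlight_device(self, device): print(f"highlight device {device}")
    def show_device_info(self, device): print(f"info for {device}")
    def show_alerts(self, **scope): print(f"alerts for {scope}")
    def show_work_items(self, **scope): print(f"work items for {scope}")

def on_proximity_signal(display: Display, signal: dict) -> None:
    """Dispatch the proximity-triggered actions listed above."""
    area = signal.get("plant_area")
    device = signal.get("device_id")
    if area is not None:
        display.highlight_area(area)           # area the device is located in
        display.show_alerts(area=area)         # alerts for devices in that area
        display.show_work_items(area=area)     # work items for that area
    if device is not None:
        display.highlight_device(device)       # highlight the signaling device
        display.show_device_info(device)       # info for that specific device
        display.show_alerts(device=device)
        display.show_work_items(device=device)

# Example: a signal received while near device "FV-201" in area "2505".
on_proximity_signal(Display(), {"plant_area": "2505", "device_id": "FV-201"})
```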
In some embodiments, either or both of the first UI device 112 and the second UI device 112 may be mobile devices. In other embodiments, either or both of the first and second UI devices 112 may be workstations. In some embodiments, one UI device 112 may be a mobile device and the other may be a workstation. In an embodiment, the second UI device 112 may configure its display according to the state information received from the first UI device 112 and according to the device type or UI type associated with the second UI device 112.
FIG. 17 is a flow diagram illustrating another method 1700 for transferring state information between two UI devices 112 within a process control plant 10. Method 1700 may be implemented in whole or in part at one or more devices or systems, such as server 150, or at one or more devices or systems, such as any of UI devices 112. The method 1700 may be stored as a set of instructions, routines, programs, or modules on the memory 815 or memory 1203 and may be executed by the processor 810 or processor 1201 in fig. 12.
The method 1700 begins when the server 150 provides one or more functions related to process data (block 1705). In some embodiments, the process data may be stored in a unified, logical data storage area and may be stored using a common format. The process data may include multiple types of process data, including configuration data, continuous data, batch data, measurement data, and event data.
The server 150 may allow the first UI device 112 to access the process data via the server 150. The server 150 may also allow the first UI device 112 to maintain state information on the server 150 (block 1710). The state information may indicate a state of a UI executed on the first UI device 112.
The server 150 may allow the second UI device 112 to access the process data and the state information via the server 150 (block 1715). The second UI device 112 may execute a UI in accordance with the state information.
In some embodiments, either or both of the first UI device 112 and the second UI device 112 may be mobile devices. In other embodiments, either or both of the first and second UI devices 112 may be workstations. In some embodiments, one UI device 112 may be a mobile device and the other may be a workstation.
FIG. 18 is a flow diagram of an example method 1800 for operating the process control plant 10 using the UI device 112 associated with a mobile control room. Method 1800 may be implemented in whole or in part at one or more devices or systems, such as server 150, or at one or more devices or systems, such as any of UI devices 112. The method 1800 may be stored as a set of instructions, routines, programs, or modules on the memory 815 or memory 1203 and may be executed by the processor 810 or processor 1201.
The method 1800 begins when the first UI device 112 accesses the server 150 (block 1805). The server 150 may be communicatively coupled to a database storing process data. The first UI device 112 may be associated with a first user profile. The first UI device 112 may perform a function in the process plant (block 1810).
The second UI device 112 may request access to the server 150 (block 1812). The second UI device 112 may be associated with the first user profile. The server 150 may store state information, wherein the state information is associated with the state of the first UI device 112 (block 1815).
The server 150 may provide access to the second UI device 112, where the access may be according to the stored state information (block 1820). The second UI device 112 may perform a function in the process plant (block 1825).
In some embodiments, either or both of the first UI device 112 and the second UI device 112 may be mobile devices. In other embodiments, either or both of the first and second UI devices 112 may be workstations. In some embodiments, one UI device 112 may be a mobile device and the other may be a workstation.
FIG. 19 is a flow diagram illustrating an example method 1900, executed on a server, for facilitating mobile control of the process plant 10. The method 1900 may be implemented in whole or in part at one or more networks or systems, such as the process control network 100. In particular, the method 1900 may be implemented in whole or in part at one or more devices such as the server 150 or at one or more devices or systems such as any of the UI devices 112. The method 1900 may be stored as a set of instructions, routines, programs, or modules on the memory 815 or memory 1203 and may be executed by the processor 810 or processor 1201.
The method 1900 begins when the server 150 formats process data for display on the first UI device 112 (block 1905). In some examples, the formatted process data may be viewable in a web browser executing on the first UI device 112. The server 150 may format the process data according to the device type or UI type of the first UI device 112.
The server 150 may transmit the formatted process data to the first UI device 112 (block 1910). In particular, the server 150 may transmit process data to the first UI device 112 that is viewable in a multipurpose process control application executing on the first UI device 112.
The server 150 may store state information associated with the display of the process data on the first UI device 112 (block 1915). Storing the state information may include storing one or more of: the display configuration of the first UI device 112; the portion of the process plant displayed by the first UI device 112; the data of a process control device displayed by the first UI device 112; a function being performed on the first UI device 112, the function including one or more of a control function, an operation function, a configuration function, a maintenance function, a data analysis function, a quality control function, or a safety function; and the user profile active on the first UI device 112.
The server 150 may format the process data for display on the second UI device 112 according to the stored state information (block 1920). The server 150 may transmit the process data to the second UI device 112 (block 1925). In particular, the server 150 may format the process data according to the device type or UI type of the second UI device 112. In some instances, the device type of the second UI device 112 may be different from the device type of the first UI device 112. For example, the first UI device 112 may be a workstation and the second UI device 112 may be a mobile device. Alternatively, the first UI device 112 may be a mobile device and the second UI device 112 may be a workstation. In some embodiments, the server 150 may format the process data for display on the second UI device 112 to replicate the operational state of the first UI device 112 on the second UI device 112.
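By way of illustration, the per-device formatting of blocks 1905 and 1920 might resemble the following sketch; the device types and the fields of the formatted output are assumptions:

```python
def format_process_data(process_data: dict, device_type: str) -> dict:
    """Sketch of blocks 1905/1920: format the same process data
    differently for a workstation and for a mobile device."""
    if device_type == "workstation":
        return {
            "layout": "multi_window",
            "trend_points": process_data.get("trend", []),   # full history
            "tables": process_data.get("tables", []),
        }
    if device_type == "mobile":
        trend = process_data.get("trend", [])
        return {
            "layout": "single_pane",
            "trend_points": trend[-100:],                    # abbreviated trend
            "tables": [],                                    # omit bulky tables
        }
    raise ValueError(f"unknown device type: {device_type!r}")
```

To replicate the operational state of the first UI device on the second, the same stored state information would simply be rendered through the branch matching the second device's type.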
In some embodiments, the server 150 may receive a request from the second UI device 112 to provide a user interface to the second UI device 112 according to the stored state information. In response to the request, the server 150 may establish a secure communication channel between the server 150 and the second UI device 112.
Fig. 20 is a flow diagram of an example method 2000 for transferring a state of a first UI device 112 to a second UI device 112. The method 2000 may be implemented in whole or in part at one or more networks or systems, such as the process control network 100. In particular, method 2000 may be implemented in whole or in part at one or more devices such as server 150 or at one or more devices or systems such as any of UI devices 112. The method 2000 may be stored as a set of instructions, routines, programs, or modules on the memory 815 or memory 1203 and may be executed by the processor 810 or processor 1201.
The method 2000 begins when the first UI device 112 or the server 150 identifies a graphic shown at the display of the first UI device 112 (block 2005).
The first UI device 112 or the server 150 may identify process entity data associated with a graphic provided at a display of the first UI device 112 (block 2010). Identifying process entity data can include identifying one or more of: process parameter data associated with graphics provided at a display of the first UI device 112; a process plant area associated with graphics provided at a display of the first UI device 112; a field device associated with graphics provided at a display of the first UI device 112; an application executing on the first UI device 112; or the state of an application executing on the first UI device 112.
The first UI device 112 or the server 150 may transmit the identified process entity data to the second UI device 112 (block 2015). The first UI device 112 or the server 150 may also provide the identified graphic to the second UI device 112 (block 2020).
Fig. 21 is a flow diagram illustrating a method 2100 for initiating a UI session on a first UI device 112. The method 2100 may be implemented in whole or in part at one or more networks or systems, such as the process control network 100. In particular, the method 2100 may be implemented in whole or in part at one or more devices such as the server 150 or at one or more devices or systems such as any of the UI devices 112. The method 2100 may be stored as a set of instructions, routines, programs, or modules on the memory 815 or memory 1203 and may be executed by the processor 810 or processor 1201.
The method 2100 begins when the server 150 receives a session request from the first UI device 112 (block 2105).
Server 150 may identify a user profile associated with the session request (block 2110). Identifying the user profile may include receiving, from the first UI device 112, a user identifier associated with the user profile, where the user associated with the user identifier is currently logged into the first UI device 112. Identifying the user profile may alternatively include receiving, from the first UI device 112, a user identifier associated with the user profile, where the user associated with the user identifier is currently logged into the second UI device 112.
Server 150 may determine whether a previous session exists (block 2115). Making the determination may include requesting, from the first UI device 112, a session identifier associated with a previous session. In some embodiments, making the determination may include receiving the session identifier from the first UI device 112 in response to that request. In other embodiments, making the determination may include identifying a session identifier received with the session request.
When a previous session exists, the server 150 may initiate a new session based on the previous session (block 2120). Alternatively, when no previous session exists, the server 150 may initiate a new session using a default session configuration. Initiating a new session based on a previous session may include determining whether a session identifier was received with the session request. When a session identifier was received with the session request, the server 150 may initiate a session associated with that session identifier. When a session identifier was not received with the session request, the server 150 may initiate a session associated with the most recent session (e.g., the most recent session associated with the user identifier provided by the first UI device 112).
In some embodiments, the method 2100 may further include the server 150 sending, to the second UI device 112, a request to instantiate on the first UI device 112 a session corresponding to a session operating on the second UI device 112. The method 2100 may also include the server 150 receiving an acknowledgement from the second UI device 112.
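Complementing the method 1300 sketch above, the session initiation of blocks 2110-2120 might be expressed as follows; the field names are again illustrative:

```python
from typing import Optional

def initiate_session(request: dict, saved_sessions: list) -> dict:
    """Sketch of blocks 2110-2120: initiate a new session, basing it on a
    previous session when one exists. Field names are assumptions."""
    session_id: Optional[str] = request.get("session_id")

    if session_id is not None:
        # A session identifier arrived with the request: look it up.
        previous = next((s for s in saved_sessions
                         if s["session_id"] == session_id), None)
    else:
        # No identifier: fall back to the user's most recent session.
        previous = saved_sessions[-1] if saved_sessions else None

    if previous is None:
        return {"config": "default"}       # no previous session exists
    return {"config": previous["config"]}  # new session based on previous
```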
Fig. 22 is a flow diagram of a second method 2200 for instantiating a UI session on the first UI device 112. The method 2200 may be implemented in whole or in part at one or more networks or systems, such as the process control network 100. In particular, method 2200 may be implemented in whole or in part at one or more devices such as server 150 or at one or more devices or systems such as any of UI devices 112. The method 2200 may be stored as a set of instructions, routines, programs, or modules on the memory 815 or memory 1203 and may be executed by the processor 810 or processor 1201.
The method 2200 begins when the server 150 receives a session request from the first UI device 112 (block 2205). Receiving the session request may include receiving a target session identifier and a device type.
Server 150 may determine the device type associated with the session request (block 2210). Server 150 may identify a graphical user interface configuration based on the device type (block 2215). The server 150 may identify a target session associated with the session request (block 2220).
The server 150 may configure a new session for the first UI device 112 based on the identified graphical user interface configuration and the identified target session, and may send data associated with the new session to the first UI device 112 (block 2225). Configuring the new session may include identifying, as session data, one or more of: a process area, a device resource, or a set of process data monitored or controlled in the target session. Configuring the new session may also include configuring the new session according to constraints associated with the identified graphical user interface configuration, and may further include identifying context data associated with the session request.
Identifying the context data may include: identifying a location of the first UI device 112 in the process plant; identifying a user type or user identifier associated with the session request; identifying a user type or user identifier associated with the first UI device 112; identifying one or more process control devices within a predetermined distance of the first UI device 112; identifying a function being performed on a second UI device 112 associated with the target session; or identifying a user identifier associated with the second UI device 112 associated with the target session.
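A sketch of the session configuration of blocks 2210-2225, under assumed GUI constraints and context fields, might look like this:

```python
# Illustrative GUI constraints per device type (block 2215).
GUI_CONFIGS = {
    "workstation": {"max_windows": 8, "show_tables": True},
    "tablet":      {"max_windows": 2, "show_tables": True},
    "phone":       {"max_windows": 1, "show_tables": False},
}

def configure_new_session(request: dict, target_session: dict) -> dict:
    """Sketch of blocks 2210-2225: configure a new session from the device
    type, the identified target session, and the context data identified
    from the request. All field names are assumptions."""
    gui = GUI_CONFIGS.get(request.get("device_type"),
                          GUI_CONFIGS["workstation"])

    session = {
        # Carry over what the target session monitored or controlled.
        "process_areas": target_session.get("process_areas", []),
        "devices":       target_session.get("devices", []),
        "process_data":  target_session.get("process_data", []),
        "gui":           gui,
        # Context data associated with the session request.
        "context": {
            "location":       request.get("location"),
            "user_id":        request.get("user_id"),
            "nearby_devices": request.get("nearby_devices", []),
        },
    }

    # Honor the constraints of the identified GUI configuration, e.g., by
    # trimming the monitored data set to what the device can display.
    session["process_data"] = session["process_data"][: gui["max_windows"]]
    return session
```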
Context awareness
FIG. 23 illustrates a second aspect of the exemplary mobile control room 2300: context awareness. The mobile control room 2300 includes the UI device 112 and a process entity 199. The process entity 199 may be a current task, a user, process data, a device, a piece of equipment, or another UI device. The mobile control room 2300 may be responsive to one or more contexts, individually or in combination, and may respond to those contexts in various ways, as described below. Generally, the UI device 112 will obtain information specifying the content and format of the data to be displayed, and will obtain and/or display the data according to the context.
In an embodiment, information specifying the type and format of data to be displayed is included in an extended Device Description Language (DDL). A DDL is a human-readable language that provides a protocol for describing: the data available from a smart device, the meaning of the data associated with and obtained from the smart device, the methods available for implementation of the smart device, the formats for communicating with the smart device to obtain data, user interface information about the device (e.g., edit displays and menus), and data needed to handle or interpret other information related to the smart device. An extended DDL may further describe: what information should be displayed to different types of users; how to format the information displayed to different types of users; what information should be displayed on different types of displays; how to format the information displayed on different types of displays; what information should be displayed according to a target function (i.e., what information is displayed when the user is performing a particular task); how to format the information displayed to a user performing the target function; and how to merge the respective profiles when some combination of user, target function, and display type applies.
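In spirit, an extended-DDL fragment of the kind described above might resemble the following; it is expressed here as a Python structure purely for illustration (an actual DDL is a dedicated description language), and every key and value is an assumption:

```python
# Hypothetical, DDL-like display description for one device type.
MIXING_TANK_DDL = {
    "device_type": "mixing_tank",
    "parameters": ["level", "temperature", "pressure", "valve_states"],
    # What to display to different types of users...
    "by_role": {
        "operator": {"show": ["level", "temperature", "pressure",
                              "valve_states", "alarms", "recipe_status"]},
        "maintenance": {"show": ["status", "calibration_dates",
                                 "last_service", "scheduled_tasks",
                                 "lockouts"]},
    },
    # ...and how to format it on different types of displays.
    "by_display": {
        "workstation": {"format": "full_graphics", "trend_history": "30d"},
        "mobile":      {"format": "compact_list",  "trend_history": "24h"},
    },
    # What to display for a given target function (task).
    "by_task": {
        "clean_tank": {"show": ["lockout_procedure", "residual_fumes",
                                "material_in_tank"]},
    },
}
```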
When the UI device 112 is proximate to a particular process control device and/or when a user requests to display information related to a process control device, the UI device 112 may download the DDL or extended DDL for the particular device from the server 150. In some embodiments, once a DDL or extended DDL has been used, the UI device 112 may cache the DDL or extended DDL (hereinafter referred to collectively as "DDL") for future use. By caching the DDL for a device, the UI device 112 may provide display information more quickly when a particular context or display is activated or requested. Where the DDL has changed, the UI device 112 may update the DDL information in the background. The DDL information may change, for example, according to user preferences, according to process plant standards, or according to what information an expert system determines to be useful in a particular context (e.g., if the expert system determines that a particular parameter or value is important in the event of an alarm).
In an embodiment, the mobile control room 2300, and in particular the UI device 112 carried by the user, may display to the user information (e.g., status, process variables and/or parameters, etc.) related to a particular process control device in proximity to the user. The UI device 112 may determine its own location and/or may determine that it is proximate to a process control device in the manners described below. After determining that the UI device 112 is proximate to a process control device, the UI device 112 may access or retrieve the DDL specifying the device-specific data (e.g., process parameters, status, maintenance information, etc.) to display, and may then download and display the device-specific data according to the DDL. In an embodiment, the data displayed for a particular process control device may include data related to other process control devices, such as data regarding the operation or status of nearby devices, data regarding the operation of the process (e.g., the status of a batch recipe), and so forth.
In another embodiment, the UI device 112 may display information based not only on the location of the device and/or the proximity of the device to a particular process control device, but also on the user and, in particular, on the user's span of control. In process control, a span of control refers to a user's role and the tasks and devices for which the user is responsible. A user's span of control may affect various aspects of the process, such as: the process parameters the user can view, the process parameters the user can modify, the times at which the user can modify process parameters, the areas and/or devices of the process plant the user can view or modify, the alarms/alerts the user can acknowledge, the maintenance tasks the user can perform, the decisions the user may be asked or required to make, and so on. Thus, in these embodiments, the UI device 112 may obtain information related to the user's role and/or span of control from the user's profile (stored on the UI device 112 or on the server 150) and may display data specific to that role and/or span of control. For example, the data displayed may be data that the user needs, or that is useful to the user, for making control decisions under particular plant conditions. Further, the information displayed by the UI device 112 may be formatted according to the user's role or span of control. For example, when the UI device 112 is proximate to a mixing tank, a UI device 112 used by an operator may display the operating status of the tank, the capacity of the tank, the fill level of the tank, the temperature of the material in the tank, the pressure in the tank, the status of any input/output valves controlling the material flowing into or out of the tank, any alarms or alerts associated with the tank, and the status of any executing batch recipe. If the same UI device 112 is instead used by a maintenance technician in proximity to the same mixing tank, the UI device 112 may display the status of the mixing tank, the dates on which the sensors in the mixing tank were calibrated, the date on which the tank was last serviced and/or cleaned, a list of maintenance tasks scheduled for (or affecting) the mixing tank, any alarms indicating required maintenance, the material in the tank, any lockouts on the tank if the tank is out of service, the presence of any residual fumes, and the like.
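Continuing the mixing-tank example, resolving what a given user sees could amount to merging the role, display-type, and task sections of a DDL-like description such as the one sketched earlier; the merge rule below is one assumption about how that "fusing" might work:

```python
from typing import Optional

def resolve_display(ddl: dict, role: str, display_type: str,
                    task: Optional[str] = None) -> dict:
    """Merge the role, display-type, and task sections of a DDL-like
    description into a single display specification."""
    spec: dict = {}

    role_rules = ddl.get("by_role", {}).get(role, {})
    spec["show"] = list(role_rules.get("show", []))

    display_rules = ddl.get("by_display", {}).get(display_type, {})
    spec["format"] = display_rules.get("format")
    spec["trend_history"] = display_rules.get("trend_history")

    if task is not None:
        task_rules = ddl.get("by_task", {}).get(task, {})
        # Task-specific items are shown in addition to the role items.
        spec["show"] += [i for i in task_rules.get("show", [])
                         if i not in spec["show"]]
    return spec

# Example: a maintenance technician on a mobile device, about to clean a
# tank described by the hypothetical MIXING_TANK_DDL sketched earlier:
# resolve_display(MIXING_TANK_DDL, "maintenance", "mobile", task="clean_tank")
```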
In another embodiment, the UI device 112 may display information based not only on the location of the device and/or the proximity of the device to a particular process control device, but also on a target function. For example, a user may be assigned a work item (e.g., by the supervisor engine 106). The UI device 112 may sense (e.g., from the time at which the work item is scheduled to be executed, from input from the user, etc.) that the user is about to perform a task related to the work item. When the user arrives at or near the process control device associated with the work item (i.e., the target device and the target location), the UI device 112 may be provided with information related to the particular task. Referring again to the mixing tank of the example above, a maintenance technician performing a work item associated with cleaning or servicing the tank may be presented, by the UI device 112, with instructions or commands to disable the tank, take the tank out of service, lock out the tank, or begin any other procedure required for the cleaning or servicing function associated with the work item. The UI device 112 may also pull information from the supervisor engine 104, from the server 150, from the big data facility 102, or from one or more controllers to implement and/or support the maintenance function and safe operation. As described in the examples above (e.g., Example 4), the UI device 112 may pull information/data during a maintenance task to facilitate safety. Implementations of these concepts are described in the following paragraphs.
In operation, the mobile control room 2300 may enable the UI device 112 to receive information related to the environment and manner of use of the UI device 112. For example, the UI device 112 may identify its location within the process plant by receiving location data from a fixed location device 118 (e.g., a GPS device) or from a node on the process control network 100 shown in FIG. 1A. For example, the UI device 112 may execute a context awareness routine and/or a location awareness routine that tracks a user's location, schedule, skill set, and/or work item progress. In other embodiments, the server 150 shown in FIG. 1A may execute the context and/or location awareness routines and communicate with the UI device 112. Based on the tracking, the location and/or context awareness routines may enable the UI device 112 to automatically determine and/or display a plant map, equipment photos or videos, GPS coordinates, and other information corresponding to a worker's location, or to assist a mobile worker with navigation and equipment identification. Additionally or alternatively, because a user may have a particular skill set, the context awareness routine or the UI device 112 may automatically customize the appearance of the GUI configuration based on the skill set and/or the location of the UI device 112. For example, in another scenario, the context awareness routine may inform the user, in real time, of a newly opened work item or alarm that pertains to a device in his or her vicinity and that the mobile worker is qualified to address. In yet another scenario, the context awareness routine may cause one or more applications that are particularly relevant to the user's location and/or skill set to be automatically launched at the UI device 112.
The UI device 112 may identify a particular process entity, such as a field device or a piece of equipment, in its vicinity. A process entity may automatically self-identify to the UI device 112, for example, using: a wireless communication protocol such as an IEEE 802.11-compliant wireless local area network protocol; a mobile communication protocol such as WiMAX, LTE, or another ITU-R compatible protocol; a short-wavelength radio protocol such as Near Field Communication (NFC) or Bluetooth; a process control wireless protocol such as WirelessHART; or some other suitable wireless communication protocol. In some embodiments, the UI device 112 may receive a schedule or work item associated with the identified location, equipment, or field device. In an embodiment, identifying a process entity may cause the UI device 112 to automatically launch one or more applications related to the identified process entity, such as a work order, diagnostic, analysis, or other application.
In some embodiments, in operation, the UI device 112 may identify the process entity 199 via an image sensor at the UI device 112. In some instances, a user of the UI device 112 may take an image of the process entity 199, and the UI device 112 may identify the process entity 199 based on the captured image. In some embodiments, the process entity 199 may include, or be proximate to, a context ID device 198 that provides a unique tag or identifier (e.g., a barcode). The UI device 112 may capture the unique tag, allowing the UI device 112 to identify the process entity 199 or the context ID device 198. The UI device 112 may provide information related to the process entity 199 or to the context ID device 198 (e.g., via a display). In some embodiments, the UI device 112 may determine its own location by determining the location of the identified process entity 199 or context ID device 198. Once the location of the UI device 112 has been determined, the UI device 112 may provide (e.g., via a display) context information related to the determined location. The context information may relate to, for example, areas, schedules, work items, or other process entities associated with the determined location. In some embodiments, the context ID device 198 may send the context information to the UI device 112. In other embodiments, the UI device 112 may receive the context information from the server 150 in response to sending its location to the server 150.
In some implementations, the UI device 112 can identify the process entity 199 via a motion sensor or an audio sensor. For example, the audio sensor may be used to capture audio associated with the process entity 199 (e.g., via a sound capture routine). The audio may be generated by the process entity 199 during normal operation of the process entity. In other implementations, the audio may be generated by a speaker of an audio device associated with the process entity 199. In either case, the captured audio may be used to identify the process entity 199. The UI device 112 may also detect vibrations via the motion sensor to identify the process entity 199. For example, a plant asset may have an expected level of vibration during operation. A user may place the UI device 112 on or near the plant asset, and the UI device 112 may use the data detected by the motion sensor to identify the current vibration level associated with the asset. The UI device 112 may then match the current vibration level to a signature vibration associated with the process entity 199, allowing the UI device 112 to identify the process entity 199. In some instances, the motion sensor and/or audio sensor may be used in conjunction with another identified characteristic (e.g., an image, sound, vibration, or location) to identify the process entity 199. For example, based on the detected vibration level associated with a plant asset and the location of the UI device 112, the UI device 112 may identify a unique tag associated with the process entity 199, allowing the UI device 112 to identify the process entity 199.
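The vibration-based identification could amount to nearest-signature matching, as in the sketch below; the signature values and tolerance are invented for illustration:

```python
from typing import Optional

# Hypothetical vibration signatures: (dominant frequency in Hz, RMS amplitude).
SIGNATURES = {
    "pump_P101":  (29.7, 0.42),
    "motor_M205": (59.9, 0.18),
    "tank_T300":  (12.1, 0.05),
}

def identify_by_vibration(freq_hz: float, rms: float,
                          tolerance: float = 0.15) -> Optional[str]:
    """Return the process entity whose stored signature is closest to the
    measured vibration, provided the relative distance is within tolerance."""
    best, best_dist = None, float("inf")
    for entity, (f0, a0) in SIGNATURES.items():
        dist = abs(freq_hz - f0) / f0 + abs(rms - a0) / a0
        if dist < best_dist:
            best, best_dist = entity, dist
    return best if best_dist <= tolerance else None

# Example: a reading of 30.1 Hz at 0.40 RMS matches pump_P101.
# identify_by_vibration(30.1, 0.40)  ->  "pump_P101"
```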
In further operation, the UI device 112 may identify its own location by receiving location data from one or more GPS satellites 2303. After identifying its own location, the UI device 112 may communicate with a database or server to identify process entities located proximate to its location. The UI device 112 may send its location to the server 150, and the server 150 may send context information back to the UI device 112. The context information may relate to one or more process areas, devices, or pieces of equipment proximate to the UI device 112. The context information may also relate to a schedule or work item related to the location of the UI device 112. FIGS. 24-27, described below, set forth the operation of the context awareness routine in various embodiments of the present disclosure.
FIG. 24 is a block diagram of an exemplary context-aware UI device 112 in a mobile control room 2400. The context-aware mobile control room 2400 may enable the UI device 112 to provide output in response to its environment and manner of use. The context-aware mobile control room 2400 may include a context identification ("context ID") device 2402, the UI device 112, and the server 150. The UI device 112 may interact with the context ID device 2402 to identify context data or context items. In some embodiments, the context ID device 2402 may communicate with the UI device 112 over a wireless or wired channel. In some embodiments, the context ID device 2402 may transmit process parameter data and/or display data to the UI device 112. The context ID device 2402 may use image recognition technology (e.g., a barcode or QR code), audio recognition technology (e.g., emitting a unique sound signature), or radio frequency technology (e.g., RFID, NFC, Bluetooth, or Wi-Fi (IEEE 802.11 standard) technology). The UI device 112 may communicate with the server 150 via a network such as the process control network 100. In other embodiments, the context ID device 2402 may be located in the UI device 112, and another device (e.g., a PLC device) may receive a signal from the context ID device 2402 and report the location of the UI device 112 to the server 150.
In either case, the server 150 may store context data 2410. The context data may include user profile data 1245 (related to users/operators at the plant), UI device profile data 2414 (related to the UI devices registered at the plant), field device profile data 2416 (related to the field devices installed at the plant), equipment profile data 2418 (related to the equipment installed at the plant), schedule data 2420 (related to user and equipment/device schedules), and work item data 2422 (related to tasks or jobs at the plant). In some embodiments, the field device profile data 2416 may be included in the equipment profile data 2418. User profile data 1245 may include skill set data indicating the skill level or level of responsibility associated with a particular user. Work item data 2422 may include data such as: a task ID (identifying a particular task), a skill threshold (identifying the minimum skill level or role/responsibility required to work on the task), a target device (the device associated with the task), and work item progress (identifying how close the task is to completion). Each of the context items 1245 and 2414-2422 may include information such as: location or area (e.g., associated with a user, device, equipment, schedule, or work item), status, related process entities, a unique identifier/tag, and/or permission information.
In operation, the context ID device 2402 may include a unique identifier or tag that may be read, scanned, or otherwise received at the UI device 112 when the UI device 112 comes within range of the context ID device 2402. The range of the context ID device 2402 depends on the particular embodiment of the context ID device 2402, and may be as small as a few centimeters or less, as large as a kilometer or more, or any distance in between. In some embodiments, the context ID device 2402 may transmit the unique identifier to the UI device 112. In other cases, the context ID device 2402 may display or provide the unique identifier so that it may be received and/or retrieved by the UI device 112.
In either case, the UI device 112 may receive the unique identifier and identify a context item, such as an area (i.e., a location, geographic area, or region), device, piece of equipment, work item, or available schedule in the environment of the UI device 112, by associating the unique identifier with the context item. For example, the UI device 112 may access a database, table, or data structure that pairs unique identifiers with particular context items. Such a database or table may exist at the UI device 112, at the context ID device 2402, or at the server 150. When the database or table exists at the server 150, the UI device 112 may send the unique identifier to the server 150. The server 150 may access the database, table, or some other data structure to identify the context item associated with the unique identifier, and may then send data representing the context item to the UI device 112.
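The association of a unique identifier with a context item could be a simple table lookup with a server fallback, as sketched here; the table contents and the server interface are assumptions:

```python
from typing import Optional

# Local table pairing unique identifiers with context items (illustrative).
LOCAL_CONTEXT_TABLE = {
    "QR-00417": {"kind": "area",      "name": "Process Area 2505"},
    "QR-00981": {"kind": "equipment", "name": "Tank 2520"},
}

def lookup_context_item(unique_id: str, server=None) -> Optional[dict]:
    """Resolve a scanned/received unique identifier to a context item.
    Checks the device-local table first, then (if provided) asks the
    server, mirroring the alternatives described in the text."""
    item = LOCAL_CONTEXT_TABLE.get(unique_id)
    if item is not None:
        return item
    if server is not None:
        # Assumed server interface: returns a context item dict or None.
        return server.resolve_context_item(unique_id)
    return None
```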
Once the UI device 112 has identified a context item, the UI device 112 may provide output relevant to the identified context item. For example, the context item may indicate a particular area, device, piece of equipment, or alarm associated with an area. The UI device 112 may generate visual representations, sounds, or other output pertaining to the particular device, equipment, or alarm so that the user may be informed about process conditions in the process area. Likewise, there may be multiple devices or alarms associated with an identified piece of equipment, and the UI device 112 may provide information related to those devices or to alarms associated with them (according to the field device profile data 2416). Similarly, a context item may cause the UI device 112 to provide information related to equipment (provided according to the equipment profile data 2418), a schedule (provided according to the schedule data 2420), or a work item (provided according to the work item data 2422).
In some embodiments, one or more process control devices in the process plant may be context ID devices 2402. In other embodiments, the one or more process control devices may include context ID devices 2402 or be associated with nearby context ID devices 2402. For example, one or more of the field devices 15-22 and/or 40-58 shown in FIG. 1A may include or be positioned proximate to a context ID device 2402 (e.g., the context ID device 2402 may be attached to or proximate to each of the field devices, or the field devices may have internal circuitry enabling them to function as context ID devices). Similarly, the controller 11, gateway 35, UI device 112, I/O cards 26 and 28, and router 58 shown in FIG. 1A may be, may include, or may be proximate to context ID devices 2402. In such embodiments, the UI device 112 may receive the unique identifier associated with each context ID device 2402, allowing the UI device 112 to receive a context item (e.g., a location or device ID) associated with each such process control device.
In an alternative embodiment of the context-aware mobile control room 2400, the UI device 112 can include or provide a unique identifier. For example, the UI device 112 may carry a uniquely scannable image or a chip that transmits unique identification data. In another example, a user of the UI device 112 may carry a badge, card, or some other accessory that includes a similar image or chip. In such embodiments, the environment ID device 2402 may read, scan, or receive the unique identifier. The environment ID device 2402 may operate to associate the unique identifier with a particular user or UI device 112, for example by accessing a data structure stored at the environment ID device 2402. Alternatively, the environment ID device 2402 may send the unique identifier to the server 150, and the server 150 may associate the particular user or UI device with the unique identifier.
In either case, once the environment ID device 2402 identifies the UI device 112 or the user, the environment ID device 2402 may send the relevant environmental item to the UI device 112. Alternatively, the environment ID device 2402 may communicate with one or more nodes on a network (e.g., the process control network 100) to notify the one or more nodes that the user or UI device 112 is within range of the environment ID device 2402. The one or more nodes may send one or more environmental items, UI data (e.g., display data, process parameter data), or any other data to the UI device 112. The UI device 112 may operate or provide output based on the received data. For example, in some embodiments, the UI device 112 may launch a target application in response to receiving a unique identifier, environmental item, UI data, or other data from the environment ID device 2402 or from the server 150. The target application may be, for example, an application dedicated to providing process graphics and information to the user, or a mobile application operable on a phone or tablet device. In other embodiments, the target application may be the browser routine 888. In some embodiments, the browser routine 888 may be directed to a particular resource or group of resources related to the received unique identifier, environmental item, UI data, or other data.
In some embodiments, the environment ID device 2402 may be part of a permission system. For example, the permissions associated with a process entity may depend on how close the UI device 112 is to the process entity. In some embodiments, when a user or the UI device 112 is in proximity to a process entity, the UI device 112 may receive permission or authorization to modify parameters associated with the process entity. The UI device 112 may likewise be denied permission to participate in a work item or modify a parameter when the user's skill level is below an indicated skill threshold associated with the work item or parameter.
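A minimal sketch of such a proximity- and skill-gated check; the thresholds and field names are invented for illustration:

```python
# Minimal sketch of a proximity- and skill-gated permission check.
# Thresholds and field names are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class User:
    skill_level: int

@dataclass
class WorkItem:
    required_skill: int
    max_distance_m: float  # how close the UI device must be to the process entity

def may_modify(user: User, item: WorkItem, distance_m: float) -> bool:
    # Permission requires both physical proximity and a sufficient skill level.
    return distance_m <= item.max_distance_m and user.skill_level >= item.required_skill

print(may_modify(User(skill_level=3), WorkItem(required_skill=2, max_distance_m=5.0), 3.2))  # True
print(may_modify(User(skill_level=1), WorkItem(required_skill=2, max_distance_m=5.0), 3.2))  # False
```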
FIG. 25 is a block diagram of another embodiment of a mobile control room 2500 in the process plant 10. The context-aware mobile control room 2500 may enable the UI device 112 to provide output in response to its environment and manner of use. The mobile control room 2500 may include the UI device 112, communicatively coupled to the process control network 100, process areas 2505, 2510, and 2515, and a tank 2520. Area 2505 includes environment ID device 2402a; area 2510 includes environment ID device 2402b; process area 2515 includes environment ID device 2402c; and the tank 2520 includes environment ID device 2402d.
In an embodiment, the environment ID device 2402a is or includes an NFC device. The UI device 112 and the environment ID device 2402a typically operate at 13.56 MHz and may operate according to NFC standards (e.g., ISO/IEC 14443, ISO/IEC 18092, NFCIP-1, NFCIP-2, and JIS X6319-4). NFC technology supports wireless transactions and data exchange between the UI device 112 and the environment ID device 2402a. NFC technology can also be used to automatically bootstrap other communication connections. In such embodiments, the environment ID device 2402a may send instructions to the UI device 112. The UI device 112 may receive and execute the instructions, causing the UI device 112 to connect to another network. In some embodiments, the other network may be a broader network (e.g., the process control network 100) that includes other nodes. In some embodiments, the other network may be a connection between the UI device 112 and the environment ID device 2402a. For example, the other network may be a wireless ad hoc network or a personal area network (e.g., Bluetooth, per the IEEE 802.15.1 standard). In either case, the environment ID device 2402a can transmit authentication information to the UI device 112 in addition to the network connection instructions, allowing the UI device 112 to establish a connection to the network without requiring the user of the UI device 112 to manually set up the network and input the authentication information.
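A minimal sketch of this bootstrapping idea, assuming a hypothetical JSON payload format carried by the NFC tag; the payload schema and the join_network helper are illustrative and not part of any NFC standard:

```python
# Minimal sketch of NFC-style connection handover: the tag payload carries
# network instructions plus credentials so the user never types them.
# The payload format and join_network() are assumptions for illustration.
import json

def handle_nfc_payload(payload: bytes) -> None:
    record = json.loads(payload.decode("utf-8"))
    if record.get("action") == "connect":
        join_network(ssid=record["ssid"], credentials=record["psk"])

def join_network(ssid: str, credentials: str) -> None:
    """Placeholder: hand the parameters to the device's Wi-Fi/PAN stack."""
    print(f"Joining {ssid!r} with supplied credentials")

# Example payload as the environment ID device might encode it:
handle_nfc_payload(b'{"action": "connect", "ssid": "plant-net", "psk": "secret"}')
```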
In further operation of the environment ID device 2402a, the NFC tag or device at the environment ID device 2402a may also store other instructions that may be executed at the UI device 112. For example, the instructions may cause one or more applications to be launched or executed in a particular manner. In the illustrated embodiment, the instructions may cause the UI device 112 to launch a UI (e.g., the UI routine 882 in FIG. 8) or a browser (e.g., the browser routine 888 in FIG. 8), or to place the UI or browser in a particular state. The instructions may cause UI device 112 to provide a GUI configuration for providing information related to devices and apparatuses in area 2505. For example, the GUI configuration may include a window with a graphical overview of the process area 2505.
In further operation of the environment ID device 2402a, the UI device 112 may receive the unique identifier from the environment ID device 2402a via NFC communication or via a network to which the UI device 112 connects after receiving authentication via NFC communication. The unique identifier generally represents region 2505, but may represent other environmental items in some embodiments. The UI device 112 may use the unique identifier to identify an environmental item (e.g., region 2505) and provide output in accordance with the identified environmental item (e.g., provide a graphical overview of region 2505). Alternatively, the environment ID device 2402a may receive the unique identifier from the UI device 112 and identify the UI device 112 (or a user thereof), allowing the environment ID device 2402a or another node on the process control network 100 to send data, such as environment data or UI data, to the UI device 112. UI device 112 may operate or provide output based on the received data.
In embodiments of the environment ID device 2402b, the environment ID device 2402b is or includes an RFID tag. In such embodiments, the UI device 112 includes an RFID scanner and uses the RFID scanner to obtain the unique identifier. The unique identifier generally represents the area 2510, but may represent other environmental items (e.g., a particular device, apparatus, location, etc.) in some embodiments. The UI device 112 may use the unique identifier to identify the environmental item in a manner consistent with the method discussed with respect to FIG. 24. In an alternative embodiment, the environment ID device 2402b may be an RFID scanner and the UI device 112 may include an RFID tag. In such embodiments, the environment ID device 2402b identifies the UI device 112 when the UI device 112 comes within range of the environment ID device 2402b (e.g., when the user enters the area 2510). Upon identifying the UI device 112, the environment ID device 2402b may communicate with the UI device 112 (e.g., using the process control network 100; using another network such as a personal area network; or using a display) and send the unique identifier to the UI device 112 or to the server 150, which may use the unique identifier to provide environment information to the UI device 112. The UI device 112 may identify the area 2510 in a manner consistent with the method discussed with respect to FIG. 24 and operate or provide output based on the identified area 2510. In another embodiment, the environment ID device 2402b may send the environmental item (instead of the unique identifier) to the UI device 112 (using, for example, short-range wireless communication such as Bluetooth). In another embodiment, the user may carry an RFID tag in addition to or instead of the UI device 112 having an RFID tag. In any of these embodiments, both the RFID scanner and the RFID tag may be active or passive. The UI device 112 may operate or provide output based on the received data.
In operation of an embodiment of the environment ID device 2402c, the environment ID device 2402c may be a Wi-Fi access point having a range that covers the process area 2515. When the UI device 112 enters the process area 2515, the environment ID device 2402c may establish communication with the UI device 112. The environment ID device 2402c may send a unique identifier (e.g., a MAC address or device tag) to the UI device 112. The unique identifier generally represents the area 2515, but may represent other environmental items in some embodiments. The UI device 112 may use the unique identifier to identify an environmental item (e.g., data representing the area 2515) and operate or provide output in accordance with the environmental item (e.g., provide a visual representation of the area 2515) in a manner consistent with the method discussed with respect to FIG. 24. For example, a database pairing MAC addresses or device tags with particular areas may be stored on the UI device 112, may be accessible by the UI device 112, or may be stored on a node in communication with the UI device 112. Alternatively, the UI device 112 may transmit a unique identifier (e.g., the MAC address of the UI device 112) to the environment ID device 2402c. Upon receiving the unique identifier, the environment ID device 2402c may operate to determine that the UI device 112 is associated with the unique identifier. The UI device 112 may operate or provide output based on the received data.
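A minimal sketch of such a pairing, using an invented table from access-point MAC addresses (BSSIDs) to plant areas:

```python
# Minimal sketch: pairing a Wi-Fi access point's MAC address (BSSID)
# with a plant area. The table contents are invented for illustration.
AP_TO_AREA = {
    "00:1b:44:11:3a:b7": "Process area 2515",
}

def area_for_access_point(bssid: str) -> str:
    return AP_TO_AREA.get(bssid.lower(), "unknown area")

print(area_for_access_point("00:1B:44:11:3A:B7"))  # Process area 2515
```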
In embodiments of the environment ID device 2402d, the environment ID device 2402d may include a barcode. The barcode may be a matrix barcode (e.g., a QR code) or a linear barcode (e.g., a UPC barcode). The UI device 112 may include or communicate with an image sensor, which may be a camera or a dedicated barcode scanner. In operation, the UI device 112 may use the image sensor to capture the barcode at the environment ID device 2402d and decode the data encoded in the barcode ("barcode data"). The barcode data typically includes a unique identifier that represents the tank 2520 (or any other process control device or apparatus to which the barcode is attached), although in some embodiments the unique identifier may represent other environmental items. The UI device 112 may use the unique identifier to identify an environmental item (e.g., data representing the tank 2520) and operate or provide output in accordance with the environmental item (e.g., provide a visual representation of the tank 2520) in a manner consistent with the method discussed with respect to FIG. 24. In alternative embodiments, the barcode may include data or instructions that cause the UI device 112 to perform a particular action (e.g., launch a browser or UI, or cause the browser or UI to provide particular information). The particular information may relate to any of a number of process entities (e.g., process parameter data, a graphic of a particular item (e.g., the tank 2520), or alarm data for a particular device). In further embodiments, the UI device 112 or the user of the UI device 112 may alternatively or additionally carry a barcode that is captured by the environment ID device 2402d, allowing the environment ID device 2402d to identify the UI device 112 or the user. The barcode at the UI device 112 may also provide instructions to be executed at the environment ID device 2402d. For example, the barcode may cause the environment ID device 2402d to provide relevant information to the user or the UI device 112.
In some embodiments, UI device 112 may use other methods to identify the unique identifier. For example, the UI device 112 may use an audio sensor to identify the unique identifier, where the unique identifier is a sound signature associated with a plant area/asset (as described with respect to fig. 24). The sound signature may be associated with noise generated by a particular plant area/asset during operation. Alternatively, the sound signature may be an audio signal generated by an audio output device associated with the asset. UI device 112 may also use a motion sensor to identify the unique identifier. The unique identifier may be a particular vibration level associated with the plant asset. For example, a user may place the UI device 112 on a plant asset, allowing the UI device 112 to detect vibration levels. In some instances, a motion sensor may be used in conjunction with the identified image/sound/location to identify the unique identifier. For example, based on the detected vibration level associated with the plant asset and the location of the UI device 112, the UI device 112 may identify a particular tag associated with the plant asset.
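A minimal sketch of combining a sensed vibration level with device location to identify an asset; the signatures, tolerance, and distance threshold are invented for illustration:

```python
# Minimal sketch of combining a sensed vibration level with device location
# to pick the most likely plant asset. Signatures and tolerances are invented.
from dataclasses import dataclass
import math

@dataclass
class AssetSignature:
    tag: str
    vibration_hz: float   # characteristic vibration frequency
    location: tuple       # (x, y) position in plant coordinates

SIGNATURES = [
    AssetSignature("PUMP-7",  29.5, (10.0, 4.0)),
    AssetSignature("MOTOR-3", 48.0, (42.0, 9.0)),
]

def identify_asset(measured_hz: float, device_xy: tuple,
                   hz_tol: float = 2.0, max_dist: float = 5.0):
    for sig in SIGNATURES:
        close = math.dist(device_xy, sig.location) <= max_dist
        matches = abs(measured_hz - sig.vibration_hz) <= hz_tol
        if close and matches:
            return sig.tag
    return None

print(identify_asset(30.1, (11.0, 3.5)))  # PUMP-7
```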
In some embodiments, the UI device 112 may identify its location by receiving location data. The location data may be received via a network such as the process control network 100. Alternatively, the location data may be received via a GPS receiver at the UI device 112. The UI device 112 may compare its location to the locations of other process entities to identify process entities proximate to the UI device 112. The UI device 112 may transmit its location to a node on the process control network 100, such as the server 150. In some embodiments, the node may respond by sending environment information to the UI device 112. In other embodiments, the UI device 112 may send the location data to the environment ID device 2402, and the environment ID device 2402 may transmit environment data to the UI device 112 according to the received location data.
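A minimal sketch of the proximity comparison, with invented coordinates:

```python
# Minimal sketch: comparing the UI device's location to known process-entity
# locations to find the nearest one. Coordinates are invented for illustration.
import math

ENTITIES = {
    "Tank 2520":         (12.0, 30.0),
    "Process area 2515": (55.0,  8.0),
}

def nearest_entity(device_xy, max_range=20.0):
    tag, pos = min(ENTITIES.items(), key=lambda kv: math.dist(device_xy, kv[1]))
    return tag if math.dist(device_xy, pos) <= max_range else None

print(nearest_entity((10.0, 28.0)))  # Tank 2520
```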
In embodiments, the UI device 112 may cooperate with the environment ID device 2402 to provide real-time location data for the UI device 112. As the mobile operator carries the mobile UI device 112 through the environment, the UI device 112 may use location information received from the environment ID device 2402 to determine its current location in the process plant and may display a current map of the mobile operator's location in the environment. The map may display the location of the mobile operator from an overhead view or a three-dimensional view. Of course, a desired or expected route may also be displayed on the mobile UI device 112. Alternatively, the UI device 112 may use one or more accelerometers to determine the orientation and position of the device within the environment and, in cooperation with an image sensor on the UI device 112, may display an augmented-reality view of the environment. For example, the mobile operator may point the image sensor at an area of the process plant, and the UI device 112 may display a view of the equipment over the image, may display a route to a desired device (e.g., the device associated with the current work item), and may display parameters or other process data associated with that area of the process plant.
FIG. 26 is an illustration of an exemplary mobile control room 2600. The mobile control room 2600 can include a first UI device 2602a, a second UI device 2602b, and devices 2610 and 2620. The first UI device 2602a can include a display that provides a graphic 2615 representing the device 2610 or other data related to the device 2610 (e.g., current operating parameters, setpoints, alarms, errors, scheduled maintenance, calibration data, etc.). The second UI device 2602b may include a display that provides a graphic 2625 representing the device 2620 or other data related to the device 2620 (e.g., current operating parameters, setpoints, alarms, errors, scheduled maintenance, calibration data, etc.). The device 2610 may include a first environment ID device 2604a and the device 2620 may include a second environment ID device 2604b.
In operation, an operator carrying the UI device 2602a may enter an area within range of the environment ID device 2604 a. The UI device 2602a may communicate with the environment ID device 2604a or scan the environment ID device 2604a so that the UI device 2602a may receive data from the environment ID device 2604 a. The UI device 2602a may operate or provide output in response to the received data. In the illustrated embodiment, the UI device 2602a may provide a graphic 2615 that represents the device 2610. In some embodiments, the UI device 2602a may provide alternative or additional outputs, such as other graphics, process parameter values, or alarms. An operator carrying the UI device 2602b may come within range of the environment ID device 2604b, causing the UI device 2602b to provide a graphic 2625 representing the device 2620.
Fig. 27 is a flow diagram illustrating an example method 2700 for generating a graphical user interface. Method 2700 can be implemented in whole or in part at one or more devices or systems, such as any of UI devices 112. The method 2700 may be stored in the memory 815 as a set of instructions, routines, programs, or modules and may be executed by the processor 810.
Method 2700 begins with the UI device 112 identifying an external device or identifier/tag (block 2705). The identifier may be an image, a sound, or a barcode. The identifier may alternatively be a unique tag transmitted by an NFC system or an RFID system. In some embodiments, the identifier may be associated with a process entity, such as a process area, a device, an apparatus, or another UI device 112.
The UI device 112 may receive environment information based on the identified external device or identifier (block 2710). In some embodiments, the UI device 112 may receive the environment information directly from the identified external device. In other embodiments, the UI device 112 may receive the environment information from the server 150 in response to sending data representing the identifier to the server 150. The environment information may represent environmental items such as a location, device, schedule, work item, etc.
The UI device 112 may provide the information at a display of the UI device 112 (block 2715). The information may be provided based on the received environment information. For example, UI device 112 may generate information related to the received location, the identified device or equipment, the received schedule, or the received work item.
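The three blocks of method 2700 can be pictured as a simple pipeline; the helper names below (scan_identifier, fetch_context, render) are illustrative assumptions, not part of the disclosure:

```python
# Minimal sketch of the three blocks of method 2700 as one pipeline:
# identify the tag, fetch environment information for it, then render.
def scan_identifier() -> str:
    return "tag-2505"  # block 2705: e.g. decoded from NFC, RFID, or a barcode

def fetch_context(unique_id: str) -> dict:
    # block 2710: look up locally or ask the server for the environment item
    return {"type": "area", "name": "Process area 2505"}

def render(context: dict) -> None:
    # block 2715: provide output at the display based on the environment info
    print(f"Showing overview of {context['name']}")

render(fetch_context(scan_identifier()))
```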
Turning now to FIG. 28, a flow diagram illustrates an example method 2800 executed on the UI device 112 for controlling the process plant 10 using the UI device 112. The method 2800 may be implemented in whole or in part at one or more networks or systems, such as the process control network 100. In particular, method 2800 may be implemented in whole or in part at one or more devices such as server 150 or at one or more devices or systems such as any of UI devices 112. The method 2800 may be stored as a set of instructions, routines, programs, or modules on the memory 815 or memory 1203 and may be executed by the processor 810 or processor 1201.
The method begins with UI device 112, which UI device 112 may send a first request for first data from a data storage area to server 150 via a network (block 2802). The data storage area may be a unified, logical data storage area including one or more devices configured to store process data corresponding to the process plant using a common format. The process data may include multiple types of process data, such as configuration data, continuous data, batch data, measurement data, and event data.
UI device 112 may receive first data from the storage area from server 150 in response to the first request (block 2810). UI device 112 may display the first data received from server 150 (block 2815).
The UI device 112 may receive an indication that the UI device 112 is proximate to an external device (block 2820). The UI device 112 may include communication circuitry that operates to detect the proximity of external devices. The communication circuitry may include a near field communication (NFC) circuit, a radio frequency identification (RFID) circuit, a Bluetooth circuit, a circuit operating according to an IEEE 802.11 protocol, or a circuit operating according to a WirelessHART protocol. In some instances, the UI device 112 may receive an indication that the UI device 112 is proximate to another UI device 112.
UI device 112 may send a second request for second data to server 150 according to the received indication (block 2825). In some embodiments, sending the second request includes sending a request to the server 150 for state information of the other UI device 112.
UI device 112 may receive second data from server 150 in response to the second request (block 2830). In some embodiments, the second data may represent requested state information of the other UI device 112. In such embodiments, the UI device 112 may also display process control data from a storage area according to the received status information. Displaying the process control data may include replicating the display of the other UI device 112 on the display of the UI device 112. Displaying the process control data may include arranging the data displayed on the other UI device 112 on a display of the UI device 112.
In other embodiments, receiving the proximity indication (block 2820) may include receiving an indication that the UI device 112 is in proximity to a process control device. Sending the second request (block 2825) may include sending an indication to the server 150 that the UI device 112 is proximate to the process control device. In such an embodiment, receiving the second data (block 2830) may include receiving process control data associated with the process control device. Receiving process control data related to the process control device may include receiving and displaying data for one or more of: an alarm associated with the process control device; a maintenance task associated with the process control device; a graphical representation of the area of the process plant associated with the process control device; or the status of the area of the process plant associated with the process control device.
In some embodiments, receiving a proximity indication (block 2820) may include receiving an indication that a mobile device is in a particular area of a process plant. In such embodiments, sending the second request (block 2825) may include sending an indication to the server 150 that the UI device 112 is in a particular area of the plant. Additionally, receiving second data (block 2830) may include receiving second process control data associated with a particular area of the process plant. Receiving process control data related to a particular area may include receiving and displaying data for one or more of: alarms associated with particular areas of a process plant; maintenance tasks associated with a particular area of a process plant; a graphical representation of a particular area of a process plant; or the status of one or more process control devices associated with a particular area.
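A minimal sketch of how the second request of method 2800 might vary with the kind of proximity indication; the message shapes are invented for illustration:

```python
# Minimal sketch of the proximity-driven branch of method 2800: on a
# proximity indication, the device sends a second request whose payload
# depends on what it is near (blocks 2820-2825).
def second_request(indication: dict) -> dict:
    if indication["kind"] == "ui_device":
        return {"want": "state", "of": indication["id"]}   # peer-state request
    if indication["kind"] == "process_control_device":
        return {"want": "device_data", "device": indication["id"]}
    if indication["kind"] == "area":
        return {"want": "area_data", "area": indication["id"]}
    raise ValueError("unknown proximity indication")

print(second_request({"kind": "area", "id": "2515"}))
```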
In some embodiments, the UI device 112 may not communicate with the server 150, but may instead communicate with devices in a particular area. For example, the UI device 112 may be proximate to a particular process device in an area of the process plant and may be capable of communicating with one or more devices (other than the server 150) in that area, either directly or via an intermediate device (e.g., via a router or other access point that is part of a wireless network). This may be the case, for example, if the server 150 is unavailable, or if the area of the process plant is physically or logically isolated from the server 150. In either case, the UI device 112 may send and/or receive data directly to and/or from devices in the area of the process plant. For example, the UI device 112 may send a request for data directly to another device (instead of the server 150) via a network, receive data from that device in response to the request, display the received data, receive an indication that the UI device 112 is proximate to an external device, and so on.
A flow diagram illustrating a method 2900 for facilitating mobile control of a process plant is provided in FIG. 29. The method 2900 includes implementing a mobile user interface device (block 2905) and providing, in the mobile user interface device, a location-aware component operable to generate information related to the location of the mobile device (block 2910). The method 2900 also includes providing a database storing layout information for the process plant (block 2915) and implementing a first routine on the mobile user interface device (block 2920). The first routine may be operable to interpret the information generated by the location-aware component in light of the information stored in the database to determine a relationship between the location of the mobile user interface device and the layout of the process plant. The mobile user interface device may also implement a second routine operable to generate a graphic for presentation on the display based on the determined relationship between the location of the mobile device and the layout of the process plant (block 2925). In embodiments, providing a database storing layout information may include providing a database storing layout information in a plan (top-down) view or in an elevation view. The layout information may include, for each process device, a device tag, one or more device visualizations (e.g., each corresponding to a mobile user interface device type or display type), a device location, and device connection information. The location-aware component can be, for example, a GPS receiver; an RFID reader; an RFID tag and a communication channel between the mobile user interface device and a server that provides data to the mobile user interface device; a plurality of sensors (e.g., accelerometers and gyroscopes) operable to determine the movement and location of the mobile user interface device relative to an anchor point; and the like. In some embodiments, implementing the second routine includes implementing a routine operable to generate a real-time graphic of the location of the mobile user interface device within the process plant as the mobile user interface device moves within the process plant. Generating the real-time graphic of the location of the mobile user interface device may include showing the location of the mobile user interface device in a plan view on the display or in a three-dimensional view on the display.
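A minimal sketch of the first routine's core step, relating the device location to stored layout information; the layout records, field names, and radius are invented for illustration:

```python
# Minimal sketch of the first routine of method 2900: interpreting the
# location-aware component's output against stored layout information.
import math

LAYOUT = [
    {"tag": "TANK-2520", "location": (12.0, 30.0), "visualization": "tank.svg"},
    {"tag": "PUMP-7",    "location": (10.0,  4.0), "visualization": "pump.svg"},
]

def devices_near(device_xy, radius=10.0):
    """Return layout records within `radius` of the mobile device."""
    return [rec for rec in LAYOUT if math.dist(device_xy, rec["location"]) <= radius]

# A second routine could then draw each visualization at its plan-view position:
for rec in devices_near((11.0, 5.0)):
    print(rec["tag"], "->", rec["visualization"])
```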
It should now be appreciated that the UI devices 112, and in some embodiments the control network 100, may be aware of various environmental information including, importantly, the location of one or more UI devices 112 within the process plant 10. Various methods by which the control network 100 (including the server 150) and/or the UI device 112 may determine the device location have been described. For example, the UI device 112 may cooperate with the environment ID device 2402 and/or the server 150 to determine the location of the UI device 112. The UI device 112 may also include a GPS receiver 832, which allows the UI device 112 to determine its location by receiving signals from GPS satellites, as is generally known. In some embodiments, one or more of the UI devices 112 may also include an inertial positioning system (IPS) 834. The IPS 834 may take the form of a stand-alone component or an integrated circuit. In at least one embodiment, the IPS 834 is an integrated circuit including a high-precision clock circuit, three accelerometers (one in each of the x-, y-, and z-axes), and three gyroscopes (one in each of the x-, y-, and z-axes). In some embodiments, the IPS 834 also includes a compass or magnetometer.
In either case, the IPS 834 may operate to detect the movement and orientation of the UI device 112 and provide information about the distance and direction the device is moving or has moved. By combining information related to the detected movement and orientation of the UI device 112 with another source of information indicating an initial position of the UI device 112 (an "anchor point"), the UI device 112 may determine its position independently of any continuous external source of position information. For example, a UI device 112 carried by an operator may have a GPS receiver and may track the location of the UI device 112 as the operator moves through an outdoor environment toward an indoor environment. When the operator crosses the boundary between the outdoor and indoor environments, the UI device 112, and in particular the GPS receiver 832, will likely lose the GPS signal. The UI device 112 may use the last known location determined using the GPS receiver 832 as an anchor point. From the anchor point, the UI device 112 may determine the distance and direction that the UI device 112 has moved in the indoor environment. Using this information, the UI device 112, routines operating on the UI device 112, and potentially other devices (e.g., the server 150, the supervisor engine 106, etc.) may continue to track the location of the UI device 112. The UI device 112 may continue to provide the operator with an illustration of the operator's location within the indoor environment, may provide the operator with navigational directions to particular plant assets (e.g., to a particular device), may take or recommend actions based on the operator's location within the plant, and the like.
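A minimal sketch of such anchor-based dead reckoning; treating each IPS sample as a ready-made displacement is a simplification of the integration over acceleration and heading that a real IPS 834 would perform:

```python
# Minimal sketch of anchor-based dead reckoning: starting from the last
# known (anchor) position, accumulate step vectors derived from the IPS.
def track(anchor_xy, displacements):
    x, y = anchor_xy
    for dx, dy in displacements:   # per-interval movement from the IPS
        x, y = x + dx, y + dy
        yield (x, y)

# Anchor: last GPS fix before entering the building (invented numbers).
for pos in track((100.0, 50.0), [(0.8, 0.1), (0.7, 0.0), (0.0, 1.2)]):
    print(pos)
```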
It should be noted that GPS receiver 832 is not the only source of information that can provide an anchor point for use in combination with IPS 834. Any of the environment ID devices 2402 may also cooperate with the UI device 112 to determine an anchor point. For example, environment ID device 2402 (e.g., an NFC device on a door frame) at a threshold may communicate with UI device 112 to establish the location of UI device 112 and provide an anchor point when the operator crosses the boundary between outdoor and indoor environments. As another example, an operator may use the UI device 112 to scan or interact with the environment ID device 2402 (e.g., RFID tag, NFC chip, barcode, etc.) at any known fixed location in a process plant (e.g., on a process device, proximate to a particular plant area, etc.) to provide an anchor point.
The UI device 112 may use the information provided by the IPS 834 and anchor points to show the location of the UI device 112 in the process plant or other environment on the display of the UI device 112. In an embodiment, this includes showing a location on a floor plan of the process plant, showing a location on a 3D map of the process plant, showing a location on a schematic of the process plant, and so forth. Alternatively or additionally, the UI device 112 may provide navigation information to direct an operator to a desired location in the process plant (e.g., to a location associated with an assigned work item, to a selected location, to a device associated with an error or alarm, etc.). In some embodiments, the UI device 112 may provide navigation or location information to guide an operator or other personnel in the plant environment. This may be useful, for example, when attempting to locate an injured person or a person requesting task assistance.
Each UI device 112 having location data (whether provided by GPS data, IPS data, or cooperation with the environment ID device 2402) may provide the location of the UI device 112 to the control system, and in particular to the server 150 and/or the supervisor engine 106. In some embodiments, the presence of the UI device 112 in a particular area may cause the server 150, the supervisor engine 106, or the UI device 112 to disable one or more features of the UI device 112. For example, the microphone 842 and/or camera 844 may be disabled when the UI device 112 is in an area where the privacy of the operator may be important (e.g., in a restroom) or where security considerations so dictate.
Also, in some embodiments, various control aspects of the process plant may be changed based on the presence of personnel in an area. For example, a particular safety system may have a first threshold when no people are present in the area and a second (more conservative) threshold when people are present in the area. In this way, the safety of the personnel can be improved.
FIG. 30 is a flow chart illustrating a method 3000 for determining a location of a mobile device in a process control environment. The method 3000 includes acquiring data indicative of an anchor location within the process plant (block 3005) and determining the anchor location from the acquired data (block 3010). The method also includes receiving, from circuitry of the mobile device, data indicative of acceleration and orientation of the mobile device (block 3015) and determining the location of the mobile device from the received data and the anchor location (block 3020). In an embodiment, acquiring the data indicative of the anchor location comprises determining the location of the mobile device using a global satellite positioning system, such as GPS, GLONASS, or any other satellite positioning system. In some embodiments, acquiring the data indicative of the anchor location includes acquiring an image (e.g., an image of a barcode, an image of a portion of the process plant, etc.). Where an image of a portion of the process plant is acquired, for example, the captured image may be compared to a database of location-indexed images (i.e., images associated with corresponding physical locations). Acquiring the data indicative of the anchor location may also include receiving data of one or more wireless signals (e.g., signals compliant with the IEEE 802.11 specification), obtaining data from an RFID device, establishing a Bluetooth connection, or establishing a near field communication session. Acquiring the data indicative of the anchor location may also include determining a process control device in proximity to the mobile device and receiving or retrieving information associated with the location of that process control device from memory (or from a remote database).
Receiving data indicative of acceleration and orientation of the mobile device may include receiving data from one or more accelerometers, from one or more gyroscopes, and/or from a magnetometer. In various embodiments, data is received from an inertial measurement unit and/or from a device that includes three accelerometers and three gyroscopes. In some embodiments, the method further includes launching an application on the mobile device based at least in part on the determined location of the mobile device, wherein the application is operable to modify operation of the process plant.
Turning now to FIG. 31, a flow diagram illustrates a method 3100 for context-aware operation of a mobile device in a process control environment. The method includes acquiring, at the mobile device, information identifying a process entity in the process control environment (block 3105). The process entity may be any process entity within the process plant including, but not limited to, an area of the process plant, a process control device, a controller, etc. The method also includes identifying, at the mobile device, work item data associated with the process entity (block 3110). The work item data includes information related to a target function associated with the process entity. In response to the acquired information and the identified work item data, an event is automatically triggered at the mobile device to facilitate implementing the target function associated with the process entity (block 3115). The target function may be a scheduled task associated with the process entity. In an embodiment, the triggering event at the mobile device comprises at least one of: causing the mobile device to provide instructions related to performing the scheduled task; causing the mobile device to display safety information (e.g., material in the process control device, whether the process control device has been deactivated and/or locked out, whether residual material may be detected, etc.); causing the mobile device to launch an application for performing the scheduled task; or causing the mobile device to provide an interface for performing the scheduled task. In some embodiments, the target function may be a permission verification function associated with the process entity. The automatically triggered event may identify a user identification associated with a user operating the mobile device, identify a permission token associated with the process entity, determine a permission level based on the user identification and the permission token, and provide an interface for modifying a parameter associated with the process entity to the extent indicated by the permission level. The permission level may indicate the degree to which the user is allowed to modify a parameter associated with the process control entity. The target function associated with the process entity may also be an alarm-checking function, and the triggering event may include identifying an alarm and providing an indication of the alarm. The target function may be a location determination function, and the automatically triggered event may be determining a location associated with the process control entity and providing a map graphic displaying the location of the process entity within the process control environment. In an embodiment, acquiring information identifying a process entity includes acquiring one or more data tags from one or more corresponding identification devices in the process control environment, each having a fixed spatial relationship relative to the process entity and including a unique identifier. In an embodiment, the environment identification device is a barcode, and acquiring the tag data includes capturing an image of the barcode and analyzing the barcode to identify the tag data. The environment identification device may be a radio transmitter, and acquiring the tag data may include detecting a radio frequency signal transmitted by the radio transmitter and carrying the tag data. The radio transmitter may be an NFC device, an RFID device, or a personal area network device that performs short-range radio transmissions.
In an embodiment, obtaining information to identify a process entity in a process control environment includes capturing an image uniquely associated with the process entity. Acquiring information may also include capturing an audio signal and determining that the audio signal is relevant to the process entity. Similarly, acquiring information may include detecting movement patterns associated with process entities.
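A minimal sketch of the triggering step of method 3100 as an event dispatch; the event names and the work-item fields are assumptions for illustration only:

```python
# Minimal sketch of the triggering step of method 3100: given an identified
# process entity and its work item data, dispatch an event on the device.
def trigger_event(entity: str, work_item: dict) -> str:
    function = work_item.get("target_function")
    if function == "scheduled_task":
        return f"launching task UI for {entity}: {work_item['task']}"
    if function == "permission_verification":
        return f"checking permission token for {entity}"
    if function == "alarm_check":
        return f"showing alarms for {entity}"
    if function == "locate":
        return f"showing map location of {entity}"
    return "no event"

print(trigger_event("Valve 101", {"target_function": "scheduled_task",
                                  "task": "calibrate positioner"}))
```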
Analysis of physical phenomena
In embodiments, the UI device 112 may cooperate with the expert system 104 and the big data facility 102 to analyze data related to physical phenomena. Physical phenomena that can be analyzed include, without limitation, phenomena related to the visible or invisible spectrum (e.g., flame color in the visible and infrared spectra) and phenomena related to vibrations in the audible, sub-audible, and super-audible ranges (e.g., sound and other vibrations). A person carrying a UI device 112 equipped with a camera, accelerometer, microphone, or other sensor may use the device to capture and/or record data related to a physical phenomenon. The camera may, for example, sense and record images in the visible spectrum or, in some embodiments, in the infrared or other spectra. The microphone may sense and/or record audible, sub-audible, and/or super-audible vibrations propagated through the air. When the UI device 112 is placed on a piece of equipment, the accelerometer may sense and/or record its vibration. Any or all of these types of data may be sent from the UI device 112 to the expert system 104 for analysis and/or comparison with data in the big data facility 102.
A method 3200 for analyzing physical phenomena in a process plant is shown in fig. 32. The method 3200 includes detecting, in a mobile device, a physical phenomenon in a process plant (block 3205). In various embodiments, detecting the physical phenomenon may include detecting a visual scene, detecting sound, and/or detecting vibration. In various embodiments, by way of example and not limitation, detecting a physical phenomenon may include detecting a visual scene including flames, sounds associated with a combustion chamber, sounds associated with movement of a fluid, images or video of a chimney roof, and/or vibrations associated with a rotating element.
Method 3200 also includes converting, in the mobile device, the detected physical phenomenon into digital data representative of the physical phenomenon (block 3210). That is, the detected physical phenomenon (visual scene, sound, vibration, etc.) is acquired and converted into digital data in the form of, for example, a digital image, a digital video, a digital sound file, or a digital representation of the detected vibrations. In addition, the method 3200 includes sending the digital data to an expert system (block 3215) and analyzing the digital data in the expert system to determine the status of one or more process units (block 3220). For example: where the detected physical phenomenon is a visual scene of a flame, analyzing the data may include analyzing a color associated with one or more portions of the flame, analyzing the shape of the flame, and/or analyzing movement of the flame; where the detected physical phenomenon is a sound or vibration associated with movement of a fluid, analyzing the data may include detecting cavitation associated with the movement of the fluid; and where the detected physical phenomenon is a visual scene of the top of a chimney, analyzing the data may include analyzing the color or volume of the emitted smoke.
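By way of illustration, a minimal sketch of one such analysis, classifying flame color by averaging pixel channels; the thresholds and the color-to-state mapping are invented and not drawn from the disclosure:

```python
# Minimal sketch of one analysis from method 3200: classifying flame color
# from a digital image by averaging pixel channels over the flame region.
def classify_flame(pixels):
    """pixels: iterable of (r, g, b) tuples from the flame region."""
    n = 0
    r_sum = g_sum = b_sum = 0
    for r, g, b in pixels:
        r_sum, g_sum, b_sum, n = r_sum + r, g_sum + g, b_sum + b, n + 1
    r, g, b = r_sum / n, g_sum / n, b_sum / n
    if b > r:                      # bluish flame
        return "likely complete combustion"
    if r > 200 and g > 120:        # yellow/orange flame
        return "possible incomplete combustion"
    return "indeterminate"

print(classify_flame([(230, 150, 40), (240, 160, 50)]))  # possible incomplete combustion
```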
In various embodiments, the method 3200 may further include detecting an abnormal condition associated with one or more process units, determining a cause of the abnormal condition from the digital data, automatically initiating changes to one or more process control parameters to correct the abnormal condition, automatically creating a work item to cause a person to take an action to correct the abnormal condition, providing an indication to an operator that a corrective action is to be taken to address the abnormal condition, and/or determining a fuel composition associated with the flame or combustion chamber.
The following additional considerations may apply to the above discussion. In this description, the actions performed by server 150, UI device 112, or any other device or routine generally refer to the actions or processes of a processor manipulating or transforming data according to machine-readable instructions. The machine-readable instructions may be stored on and retrieved from a memory device communicatively coupled to the processor. That is, the methods described herein may be embodied by a set of machine-executable instructions stored on a computer-readable medium (i.e., on a memory device). When executed by one or more processors of a corresponding device (e.g., a server, a mobile device, etc.), the instructions cause the processors to perform the method. When instructions, routines, modules, processes, services, programs, and/or applications are referred to herein as being stored or stored on a computer-readable memory or on a computer-readable medium, the words "store" or "storing" are intended to exclude transitory signals.
The user interface devices are referred to interchangeably in this specification as "UI devices" and "mobile UI devices." In most descriptions, these devices are referred to simply as "UI devices," and in certain example uses the word "mobile" is added to indicate that the UI device in question may be a mobile UI device. The use or non-use of the word "mobile" should not be considered limiting, and the concepts described herein may be applied to any and all UI devices capable of being used in a process plant environment.
Although many of the examples herein refer to a browser displaying information, each of these examples contemplates the use of a native application that communicates with a server to provide the information. The native application may be designed for any mobile platform, any workstation operating system, or any combination of mobile platforms and/or workstation operating systems and/or web browsers. For example, the mobile UI device may operate on an Android™ platform, while a cooperating stationary UI device (e.g., a workstation) may run on a Windows® 7 platform.
Moreover, although the words "operator," "person," "user," and "technician" and other similar words have been used to describe personnel in a process plant environment who may interact with or use the systems, apparatus, and methods described herein, these words are not intended to be limiting. As can be appreciated from the foregoing, the systems, apparatus, and methods described herein may have the benefit or effect of freeing plant personnel, to some extent, from the traditional boundaries of process control systems. That is, operators may undertake some activities traditionally performed by technicians, technicians may engage in activities traditionally reserved for operators, and so on. Where specific words are used in this specification, they are used in part because of the activities in which such plant personnel have traditionally engaged, but are not intended to limit which personnel are capable of engaging in the specific activity.
Furthermore, in this specification, a plurality of examples may implement a component, an operation, or a structure described as a single example. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in the example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter set forth herein.
Unless specifically stated otherwise, discussions utilizing words such as "processing," "computing," "calculating," "determining," "identifying," "presenting," or "displaying" herein may refer to the action or processes of a machine (e.g., a computer), that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or a combination thereof), registers, or other machine components that receive, store, transmit, or display information.
When implemented as software, any of the applications, services, and engines described herein can be stored in any tangible, non-transitory computer-readable memory (e.g., magnetic disks, optical disks, solid-state memory devices, molecular memory storage devices, or other storage media), in RAM or ROM of a computer or processor, etc. Although the example systems disclosed herein are disclosed as including, among other components, software and/or firmware executed on hardware, it should be noted that such systems are merely illustrative and should not be considered as limiting. For example, it is contemplated that any or all of these hardware, software, and firmware components could be embodied exclusively in hardware, exclusively in software, or in any combination of hardware and software. Accordingly, one of ordinary skill in the art will readily appreciate that the examples provided are not the only way to implement such a system.
Thus, while the present invention has been described with reference to specific examples, which are intended to be illustrative only and not to be limiting of the invention, it will be apparent to those of ordinary skill in the art that various changes, additions or deletions may be made to the disclosed embodiments without departing from the spirit and scope of the invention.
Aspects
The following aspects of the disclosure are merely exemplary and are not intended to limit the scope of the disclosure.
1. An automated computer-implemented method of assigning tasks to employees in a process plant, the method performed by a supervisory module and comprising: receiving data from an expert system; creating a task-specified work item based on the data received from the expert system; selecting a person to perform the task specified in the work item; sending the work item to a device associated with the selected person; and receiving an indication that the selected person has accepted the work item.
2. The method of aspect 1, wherein the step of receiving data from an expert system includes receiving data indicative of a trend related to a process parameter.
3. The method of aspect 1 or aspect 2, wherein receiving data from an expert system includes receiving data indicative of a predicted problem in the process plant.
4. The method of any of aspects 1-3, wherein the step of receiving data from an expert system comprises receiving a request to provide a parameter value to the expert system.
5. The method according to aspect 4, wherein creating a task-specified work item from the data received from the expert system includes creating a work item in which the specified task is observing and recording a parameter value that is not automatically communicated from a device that senses the parameter.
6. The method of any of the preceding aspects, wherein receiving data from the expert system includes receiving instructions to perform a particular action with respect to the process control device.
7. The method according to any of the preceding aspects, wherein the step of creating a task-specified work item from the data received from the expert system comprises creating a work item in which the specified task is a maintenance task, a calibration task, a replacement task, an inspection task, or a repair task.
8. The method of any of the preceding aspects, wherein creating a task-specified work item comprises creating a work item that specifies a task and a device object associated with the specified task.
9. The method of aspect 8, wherein selecting a person to perform the task specified in the work item comprises selecting a person based on location data received from a device associated with the selected person.
10. The method according to any of the preceding aspects, further comprising creating and storing a permission token associated with the prescribed task, with the process control device associated with the prescribed task, or with both, wherein the permission token is required in order for the selected person to perform the prescribed task on the process control device associated with the prescribed task.
11. The method according to any of the preceding aspects, wherein selecting a person to perform the task specified in the work item includes selecting the person based on (1) (i) the task specified in the work item, (ii) a process control device associated with the specified task, or (iii) both, and (2) a plurality of person profiles accessible to the supervisor module.
12. The method of aspect 11, wherein the step of selecting the person based on a plurality of person profiles accessible to the supervisor module comprises selecting the person based on skill sets, roles, certifications, or credentials.
13. The method of any preceding aspect, wherein the step of receiving data from an expert system comprises receiving an indication to perform at least one of the following actions: (i) observing and recording a parameter; (ii) inspecting a process control device; (iii) calibrating a process control device; (iv) recording an audio sample; (v) capturing an image or video; (vi) maintaining a process control device; (vii) repairing a process control device; (viii) replacing a process control device; or (ix) adjusting a process control parameter.
14. The method according to any one of the preceding aspects, wherein the step of creating a work item comprises creating a work item that specifies a task and a device object related to the specified task, and that further specifies at least one of: (i) the tools or equipment required to perform the prescribed task; (ii) the priority of the work item; (iii) a skill set required to perform the prescribed task; (iv) a desired start time and/or date; or (v) a desired completion time and/or date.
15. The method of any of the preceding aspects, further comprising scheduling execution of the work item.
16. The method of aspect 15, wherein scheduling execution of the work item comprises scheduling execution of the work item based on at least one of: (i) a predetermined route through the process plant associated with the selected personnel; (ii) scheduled delivery of input material for a process performed by the process plant; (iii) scheduled delivery of products produced by the process plant; (iv) predicted weather conditions; (v) a scheduled shipment time for the product produced by the process plant; (vi) a predicted or predetermined completion time for a process of the process plant; or (vii) a predicted or scheduled arrival of a tool, device, or component required to complete the specified task.
17. The method of any of the preceding aspects, wherein the step of selecting a person to perform the task specified in the work item comprises storing the work item in a database from which the person selects a work item to perform.
18. The method of aspect 17, wherein the step of selecting a person to perform the task specified in the work item further comprises: receiving, from a device associated with a person, a request to perform the work item; and comparing a profile associated with the person to information stored in the work item to determine whether the person is eligible to perform the work item.
19. A computer-readable storage medium storing instructions executable on a processor for causing the processor to: receiving data from an expert system; creating a work item specifying a task based on the data received from the expert system; selecting a person to perform the task specified in the work item; transmitting the work item to a device associated with the selected person; and receiving an indication that the selected person has accepted the work item.
20. The computer-readable storage medium according to aspect 19, wherein the instructions for causing the processor to create a work item from the data received from the expert system include instructions for causing the processor to create a work item in which the prescribed task is a maintenance task, a calibration task, a replacement task, an inspection task, or a repair task.
21. The computer-readable storage medium recited in aspect 19 or aspect 20, wherein the instructions for causing the processor to create a work item that specifies a task comprise instructions for causing the processor to specify a device object that is related to the specified task.
22. The computer-readable storage medium of aspect 21, wherein the instructions for causing the processor to select a person to perform the task specified in the work item comprise instructions for causing the processor to select a person based on location data received from a device associated with the selected person.
23. The computer-readable storage medium of any of aspects 19 to 22, further comprising instructions for causing the processor to create and store a permission token associated with the prescribed task, with a process control device associated with the prescribed task, or with both, wherein the permission token is required in order for the selected person to perform the prescribed task on the process control device associated with the prescribed task.
24. The computer-readable storage medium of any of aspects 19-23, wherein the instructions for causing the processor to select a person to perform the task specified in the work item comprise instructions for causing the processor to select the person based on (1) (i) the task specified in the work item, (ii) a process control device associated with the specified task, or (iii) both, and (2) a plurality of person profiles accessible to the supervisor module.
25. The computer-readable storage medium of aspect 24, wherein the instructions for causing the processor to select the person based on a plurality of personnel profiles comprise instructions for causing the processor to select the person based on skill combinations, roles, certifications, or credentials.
26. The computer-readable storage medium of any of aspects 19 to 25, wherein the instructions for causing the processor to receive data from the expert system include instructions for causing the processor to receive an indication to generate a work item for one of: (i) observing and recording a parameter; (ii) inspecting a process control device; (iii) calibrating a process control device; (iv) recording an audio sample; (v) capturing an image or video; (vi) maintaining a process control device; (vii) repairing a process control device; (viii) replacing a process control device; or (ix) adjusting a process control parameter.
27. The computer readable storage medium of any of aspects 19-26, wherein the instructions for causing the processor to create work items include instructions for causing the processor to create work items that specify tasks and device objects related to the specified tasks, further specifying at least one of: (i) the tools or equipment required to perform the prescribed task; (ii) the priority of the work item; (iii) a combination of skills required to perform the prescribed task; (iv) the desired start time and/or date; or (v) a desired completion time and/or date.
28. The computer-readable storage medium of any of aspects 19 to 27, further comprising instructions for causing the processor to schedule execution of the work item.
29. The computer-readable storage medium of aspect 28 wherein the instructions for causing the processor to schedule execution of the work item comprise instructions for causing the processor to schedule execution of the work item based on at least one of: (i) a predetermined route through the process plant associated with the selected personnel; (ii) a scheduled delivery of input material for a process performed by the process plant; (iii) scheduled delivery of products produced by the process plant; (iv) predicted weather conditions; (v) a scheduled shipment time for the product produced by the process plant; (vi) a predicted or predetermined completion time for a process of the process plant; or (vii) a predicted or scheduled arrival of a tool, device, or component required to complete the specified task.
30. The computer-readable storage medium of any of aspects 19-29, wherein the instructions for causing the processor to select a person to perform the task specified by the work item include instructions for storing the work item in a database from which a person selects a work item to perform.
31. The computer-readable storage medium of aspect 30, wherein the instructions for causing the processor to select a person to perform the task specified in the work item further comprise instructions for causing the processor to: receive a request from a device associated with a person to perform the work item; and compare a profile associated with the person to the information stored in the work item to determine whether the person is eligible to perform the work item.
32. A process control system comprising: a plurality of process control devices; a historical big data storage storing sensor data and parameter data of the process control system; an expert system coupled to the historical big data storage and configured to analyze the data stored therein; and a supervisor module coupled to the expert system and configured to assign tasks to personnel in the process plant. The supervisor module is configured to: receive data from the expert system; create a work item specifying a task based on the data received from the expert system; select a person to perform the task specified in the work item; send the work item to a device associated with the selected person; and receive an indication that the selected person has accepted the work item.
33. The system of aspect 32, wherein receiving data from the expert system includes receiving data indicative of a trend associated with a process parameter.
34. The system of aspect 32 or 33, wherein receiving data from the expert system includes receiving data indicative of a predicted problem in the process plant.
35. The system of any of aspects 32-34, wherein receiving data from the expert system includes receiving a request to provide a parameter value to the expert system.
36. The system of aspect 35, wherein creating a work item specifying a task from the data received from the expert system comprises creating a work item in which the specified task is to observe and record a parameter value that is not automatically transmitted from the device sensing the parameter.
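As a rough illustration of aspect 36, the sketch below shows one way such a manual-reading work item might be represented in Python; all names (ManualReadingWorkItem, record) are hypothetical and not taken from the patent.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ManualReadingWorkItem:
        """Work item asking a person to read a parameter that the sensing
        device does not transmit automatically (e.g. a local-only gauge)."""
        parameter_tag: str                     # e.g. "FT-101.flow_rate"
        device_id: str                         # device whose display is read
        recorded_value: Optional[float] = None

        def record(self, value: float) -> None:
            # Store the value a field operator entered on a mobile device.
            self.recorded_value = value

    item = ManualReadingWorkItem(parameter_tag="FT-101.flow_rate",
                                 device_id="FT-101")
    item.record(12.7)   # value observed on the local gauge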
37. The system of any of aspects 32-36, wherein receiving data from the expert system includes receiving instructions to perform a particular action with respect to the process control device.
38. The system of any of aspects 32 to 37, wherein creating a work item specifying a task from the data received from the expert system comprises creating a work item in which the specified task is to perform a maintenance task, a calibration task, a replacement task, an inspection task, or a repair task.
39. The system according to any of aspects 32-38, wherein creating a work item that specifies a task includes creating a work item that specifies a task and specifying a device object related to the specified task.
40. The system of aspect 39, wherein selecting a person to perform the task specified in the work item comprises selecting a person based on location data received from a device associated with the selected person.
41. The system of any of aspects 32-40, wherein the supervisor module is further operable to create and store a license token associated with the prescribed task, with the process control device associated with the prescribed task, or with both, wherein the license token must be requested in order for the selected person to perform the prescribed task on the process control device associated with the prescribed task.
42. The system according to any of aspects 32-41, wherein selecting personnel to perform the task specified in the work item includes selecting personnel according to (1) (i) the task specified in the work item, (ii) a process control device associated with the specified task, or (iii) both, and (2) a plurality of personnel profiles accessible to the supervisor module.
43. The system of aspect 42, wherein selecting the person based on the plurality of personnel profiles accessible to the supervisor module comprises selecting the person based on skill combinations, roles, certifications, or credentials.
44. The system of any of aspects 32-43, wherein receiving data from the expert system comprises receiving an indication to perform at least one of the following: (i) observing and recording a parameter; (ii) inspecting a process control device; (iii) calibrating a process control device; (iv) recording an audio sample; (v) capturing an image or video; (vi) maintaining a process control device; (vii) repairing a process control device; (viii) replacing a process control device; or (ix) adjusting a process control parameter.
45. The system of any of aspects 32 to 44, wherein creating a work item comprises creating a work item that specifies a task and a device object related to the specified task, further specifying at least one of: (i) the tools or equipment required to perform the prescribed task; (ii) the priority of the work item; (iii) a combination of skills required to perform the prescribed task; (iv) the desired start time and/or date; or (v) a desired completion time and/or date.
46. The system of any of aspects 32-45, wherein the supervisor module is further configured to schedule execution of the work item.
47. The system of any of aspects 32-46, wherein scheduling execution of the work item comprises scheduling execution of the work item based on at least one of: (i) a predetermined route through the process plant associated with the selected personnel; (ii) a scheduled delivery of input material for a process performed by the process plant; (iii) scheduled delivery of products produced by the process plant; (iv) predicted weather conditions; (v) a scheduled shipment time for the product produced by the process plant; (vi) a predicted or predetermined completion time for a process of the process plant; or (vii) a predicted or scheduled arrival of a tool, device, or component required to complete the specified task.
48. The system according to any one of aspects 32 to 47, wherein selecting a person to perform the task specified in the work item comprises storing the work item in a database from which the person selects a work item to perform.
49. The system of aspect 48, wherein selecting a person to perform the task specified in the work item further comprises: receiving a request from a device associated with a person to perform the work item; and comparing the profile associated with the person to the information stored in the work item to determine whether the person is eligible to perform the work item.

Claims (27)

1. An automated computer-implemented method of assigning tasks to employees in a process plant, the method being performed by a supervisor module and comprising:
receiving data at the supervisor module from an expert system coupled to a historical big data storage and operable to analyze the data stored by the historical big data storage;
creating, by the supervisor module, a work item from the data received from the expert system, the work item specifying a maintenance, repair, or diagnostic task and further specifying at least a target device associated with the maintenance, repair, or diagnostic task;
selecting, by the supervisor module, a person to perform the maintenance, repair, or diagnostic task specified in the work item, wherein the person is selected based on an indication, received from a mobile control device associated with the selected person, of a proximity of the mobile control device to a process entity, the mobile control device being adapted to communicate with a controller that controls the process plant to perform an action in the process plant based on information obtained at the mobile control device;
sending the work item from the supervisor module to the mobile control device; and
receiving, at the supervisor module, an indication that the selected person has accepted the work item,
wherein the step of receiving data from the expert system comprises one or more of: (i) receiving data indicative of a trend related to a process parameter; (ii) receiving data indicative of a predicted problem in the process plant; (iii) receiving a request to provide a parameter value to the expert system; and (iv) receiving an instruction to perform a particular action with respect to a process control device.
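To make the claimed flow concrete, here is a minimal Python sketch of how a supervisor module might turn expert-system data into a work item and assign it by proximity. The names (ExpertDataKind, WorkItem, handle_expert_data) are invented for illustration; this is not the patented implementation.

    from dataclasses import dataclass
    from enum import Enum, auto
    from typing import Optional

    class ExpertDataKind(Enum):
        TREND = auto()               # (i)  trend related to a process parameter
        PREDICTED_PROBLEM = auto()   # (ii) predicted problem in the plant
        PARAMETER_REQUEST = auto()   # (iii) expert system asks for a value
        ACTION_INSTRUCTION = auto()  # (iv) instruction to act on a device

    @dataclass
    class WorkItem:
        task: str                    # maintenance, repair, or diagnostic task
        target_device: str
        assignee: Optional[str] = None
        accepted: bool = False

    def handle_expert_data(kind: ExpertDataKind, device: str,
                           proximity: dict[str, float]) -> WorkItem:
        """Create a work item and assign it to the nearest person's device."""
        item = WorkItem(task=f"diagnose {kind.name.lower()}",
                        target_device=device)
        # Select the person whose mobile control device reports the closest
        # proximity (in meters, say) to the affected process entity.
        item.assignee = min(proximity, key=proximity.get)
        return item

    item = handle_expert_data(ExpertDataKind.PREDICTED_PROBLEM, "PUMP-07",
                              proximity={"alice": 40.0, "bob": 12.5})
    item.accepted = True   # indication returned from the selected device
    print(item)

In a real system the proximity indications would arrive asynchronously from the mobile control devices rather than being passed in as a dictionary.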
2. The method of claim 1, wherein creating a work item specifying a maintenance, repair, or diagnostic task from the data received from the expert system comprises creating a work item in which the specified task is to observe and record a parameter value that is not automatically transmitted from the device sensing the parameter.
3. The method of claim 1, wherein creating a work item that specifies a maintenance, repair, or diagnostic task from the data received from the expert system comprises creating a work item in which the specified maintenance, repair, or diagnostic task is to perform a calibration task, a replacement task, or an inspection task.
4. The method of claim 1, wherein creating a work item that specifies a task comprises creating a work item that specifies a maintenance, repair, or diagnostic task, and further comprises specifying a device object related to the specified task.
5. The method of claim 1, further comprising creating and storing a license token associated with the prescribed task, the process control device associated with the prescribed task, or both, wherein the license token is requested in order for the selected personnel to perform the prescribed task on the process control device associated with the prescribed task.
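One possible reading of claim 5's license-token mechanism, sketched in Python with hypothetical names (issue_token, authorize); the patent does not prescribe this design, only that a token must be requested before the task may be performed on the device.

    import secrets

    # Token store keyed by (task id, device id); illustrative only.
    _token_store: dict[tuple[str, str], str] = {}

    def issue_token(task_id: str, device_id: str) -> str:
        """Create and store a token for a prescribed task on a device."""
        token = secrets.token_hex(16)
        _token_store[(task_id, device_id)] = token
        return token

    def authorize(task_id: str, device_id: str, presented: str) -> bool:
        """The device accepts the action only if the presented token matches."""
        return _token_store.get((task_id, device_id)) == presented

    t = issue_token("WI-0042", "VALVE-3")
    assert authorize("WI-0042", "VALVE-3", t)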
6. The method of claim 1, wherein selecting a person to perform the task specified in the work item comprises selecting the person based on (1) (i) the task specified in the work item, (ii) a process control device associated with the specified task, or (iii) both, and (2) a plurality of personnel profiles accessible to the supervisor module.
7. The method of claim 6, wherein selecting a person based on a plurality of personnel profiles accessible to the supervisor module comprises selecting the person based on skill combinations, roles, certifications, or credentials.
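A plausible shape for the profile-based selection of claims 6 and 7, with PersonnelProfile and select_person invented for this sketch; the matching criteria mirror the claim's skill/role/certification language.

    from dataclasses import dataclass, field

    @dataclass
    class PersonnelProfile:
        name: str
        skills: set = field(default_factory=set)
        roles: set = field(default_factory=set)
        certifications: set = field(default_factory=set)

    def select_person(required_skills, required_role, profiles):
        """Return the first profile satisfying the work item's requirements."""
        for p in profiles:
            if required_skills <= p.skills and required_role in p.roles:
                return p
        return None   # no qualified person; the work item stays unassigned

    profiles = [
        PersonnelProfile("alice", {"valve repair"}, {"maintenance tech"}),
        PersonnelProfile("bob", {"calibration"}, {"instrument tech"}),
    ]
    print(select_person({"calibration"}, "instrument tech", profiles).name)  # bob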
8. The method of claim 1, wherein receiving data from an expert system comprises receiving an indication to perform at least one of the following actions:
(i) observing and recording a parameter;
(ii) inspecting a process control device;
(iii) calibrating a process control device;
(iv) recording an audio sample;
(v) capturing an image or video;
(vi) maintaining a process control device;
(vii) repairing a process control device;
(viii) replacing a process control device; or
(ix) adjusting a process control parameter.
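The nine indicated actions could, for example, be encoded as an enumeration; this TaskAction enum is purely illustrative.

    from enum import Enum

    class TaskAction(Enum):
        OBSERVE_AND_RECORD = "observe and record a parameter"
        INSPECT = "inspect a process control device"
        CALIBRATE = "calibrate a process control device"
        RECORD_AUDIO = "record an audio sample"
        CAPTURE_MEDIA = "capture an image or video"
        MAINTAIN = "maintain a process control device"
        REPAIR = "repair a process control device"
        REPLACE = "replace a process control device"
        ADJUST_PARAMETER = "adjust a process control parameter"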
9. The method of claim 1, wherein creating a work item includes creating a work item that specifies a maintenance, repair, or diagnostic task and a device object related to the specified task, further specifying at least one of:
(i) a tool or device required to perform the specified task;
(ii) a priority of the work item;
(iii) a combination of skills required to perform the specified task;
(iv) a desired start time and/or date; or
(v) a desired completion time and/or date.
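A hypothetical data shape for such a fully specified work item, where every field beyond the task and device object is optional, mirroring the claim's "at least one of" language; DetailedWorkItem and its field names are assumptions.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class DetailedWorkItem:
        task: str
        device_object: str
        required_tools: Optional[list[str]] = None   # (i)
        priority: Optional[int] = None               # (ii) e.g. 1 = highest
        required_skills: Optional[set[str]] = None   # (iii)
        start_by: Optional[datetime] = None          # (iv)
        complete_by: Optional[datetime] = None       # (v)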
10. The method of claim 1, further comprising scheduling execution of the work item.
11. The method of claim 10, wherein scheduling execution of the work item comprises scheduling execution of the work item according to at least one of:
(i) a predetermined route through the process plant associated with a selected person;
(ii) a scheduled delivery of input material for a process performed by the process plant;
(iii) a scheduled delivery of a product produced by the process plant;
(iv) predicted weather conditions;
(v) a scheduled shipment time for a product produced by the process plant;
(vi) a predicted or predetermined completion time of a process of the process plant; or
(vii) a predicted or scheduled arrival of a tool, device, or component required to complete the specified task.
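One simple way to honor such constraints, sketched with an invented schedule helper that treats each known prerequisite as an earliest-start bound; the claim itself does not fix any particular scheduling algorithm.

    from datetime import datetime, timedelta

    def schedule(route_arrival=None, material_delivery=None,
                 parts_arrival=None, process_done=None) -> datetime:
        """Earliest start is the latest of the known prerequisites."""
        known = [t for t in (route_arrival, material_delivery,
                             parts_arrival, process_done) if t is not None]
        return max(known) if known else datetime.now() + timedelta(hours=1)

    slot = schedule(parts_arrival=datetime(2014, 3, 14, 9, 0),
                    process_done=datetime(2014, 3, 14, 11, 30))
    print(slot)   # 2014-03-14 11:30, since work cannot start mid-batch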
12. The method of claim 1, wherein the step of selecting a person to perform the task specified in the work item comprises storing the work item in a database from which the person selects a work item to perform.
13. The method of claim 12, wherein selecting a person to perform the task specified in the work item further comprises:
receiving a request from a device associated with a person to execute the work item; and
comparing the profile associated with the person to information stored in the work item to determine whether the person is eligible to execute the work item.
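A compact sketch of the pull-based assignment in claims 12 and 13, using invented dictionary shapes for the profile and work item; the eligibility test here is only one plausible comparison.

    def is_eligible(profile: dict, work_item: dict) -> bool:
        """Compare a person's profile to the requirements stored in the item."""
        return (set(work_item.get("required_skills", [])) <= set(profile["skills"])
                and work_item.get("required_role") in profile["roles"])

    # Work items sit in a database; a device requests one on a person's behalf.
    db = [{"id": "WI-7", "required_skills": ["calibration"],
           "required_role": "instrument tech"}]
    bob = {"skills": ["calibration"], "roles": ["instrument tech"]}
    requested = db[0]                    # item bob's device asked to perform
    print(is_eligible(bob, requested))   # True -> release the work item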
14. A process control system comprising:
a plurality of process control devices;
a historical big data storage storing sensor data and parameter data of the process control system;
an expert system coupled to the historical big data storage and operable to analyze the data stored therein; and
a supervisor module coupled to the expert system and configured to assign maintenance, repair, or diagnostic tasks to personnel in the process plant, the supervisor module being configured to:
receiving data from the expert system;
creating a work item specifying a maintenance, repair, or diagnostic task from the data received from the expert system;
selecting a person to perform the maintenance, repair, or diagnostic task specified in the work item, the work item further specifying at least a target device associated with the maintenance, repair, or diagnostic task, wherein the person is selected based on an indication, received from a mobile control device associated with the selected person, of a proximity of the mobile control device to a process entity, the mobile control device being adapted to communicate with a controller that controls the process plant to perform an action in the process plant based on information obtained at the mobile control device;
sending the work item to the mobile control device associated with the selected person; and
receiving an indication that the selected person has accepted the work item,
wherein receiving data from the expert system comprises one or more of: (i) receiving data indicative of a trend related to a process parameter; (ii) receiving data indicative of a predicted problem in the process plant; (iii) receiving a request to provide a parameter value to the expert system; and (iv) receiving an instruction to perform a particular action with respect to a process control device.
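The component wiring of claim 14 might be mocked up as follows; HistorianStore, ExpertSystem, and SupervisorModule are stand-in classes with a toy analysis rule, not the patented system.

    class HistorianStore:
        """Big data storage collecting sensor/parameter data from devices."""
        def __init__(self):
            self.rows = []
        def append(self, tag, value):
            self.rows.append((tag, value))

    class ExpertSystem:
        """Analyzes stored data; here, a toy threshold rule flags hot tags."""
        def __init__(self, store):
            self.store = store
        def findings(self):
            return [tag for tag, v in self.store.rows if v > 100.0]

    class SupervisorModule:
        """Turns expert-system findings into work items for personnel."""
        def __init__(self, expert):
            self.expert = expert
        def run(self):
            return [{"task": "inspect", "device": tag}
                    for tag in self.expert.findings()]

    store = HistorianStore()
    store.append("TT-12.temp", 104.2)   # sensor data from a plant device
    print(SupervisorModule(ExpertSystem(store)).run())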
15. The system of claim 14, wherein creating a work item specifying a maintenance, repair, or diagnostic task from the data received from the expert system comprises creating a work item in which the specified task is to observe and record a parameter value that is not automatically communicated from the device sensing the parameter.
16. The system of claim 14, wherein creating a work item specifying a maintenance, repair, or diagnostic task from the data received from the expert system comprises creating a work item in which the specified task is to perform a calibration task, a replacement task, or an inspection task.
17. The system of claim 14, wherein creating a work item that specifies a task comprises creating a work item that specifies a maintenance, repair, or diagnostic task, and further comprises specifying a device object related to the specified task.
18. The system of claim 17, wherein selecting a person to perform the task specified in the work item comprises selecting a person based on location data received from a device associated with the selected person.
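Claim 18's location-based choice could reduce to a nearest-device lookup; the coordinates and the nearest helper below are assumptions for illustration only.

    import math

    def nearest(person_locations: dict, device_xy: tuple) -> str:
        """Pick the person whose mobile device reports the closest position."""
        return min(person_locations,
                   key=lambda p: math.dist(person_locations[p], device_xy))

    locations = {"alice": (10.0, 4.0), "bob": (2.0, 3.0)}
    print(nearest(locations, (1.0, 2.0)))   # bob, closest to the device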
19. The system of claim 14, wherein the supervisor module is further configured to create and store a license token associated with the prescribed task, with the process control device associated with the prescribed task, or with both, wherein the license token must be requested in order for the selected person to perform the prescribed task on the process control device associated with the prescribed task.
20. The system of claim 14, wherein selecting a person to perform the task specified in the work item comprises selecting the person based on (1) (i) the task specified in the work item, (ii) a process control device associated with the specified task, or (iii) both, and (2) a plurality of personnel profiles accessible to the supervisor module.
21. The system of claim 20, wherein selecting a person based on a plurality of personnel profiles accessible to the supervisor module comprises selecting the person based on skill combinations, roles, certifications, or credentials.
22. The system of claim 14, wherein receiving data from the expert system comprises receiving an indication to perform at least one of:
(i) observing and recording a parameter;
(ii) inspecting a process control device;
(iii) calibrating a process control device;
(iv) recording an audio sample;
(v) capturing an image or video;
(vi) maintaining a process control device;
(vii) repairing a process control device;
(viii) replacing a process control device; or
(ix) adjusting a process control parameter.
23. The system of claim 14, wherein creating a work item includes creating a work item that specifies a maintenance, repair, or diagnostic task and a device object related to the specified task, further specifying at least one of:
(i) a tool or device required to perform the specified task;
(ii) a priority of the work item;
(iii) a combination of skills required to perform the specified task;
(iv) a desired start time and/or date; or
(v) a desired completion time and/or date.
24. The system of claim 14, wherein the supervisor module is further operable to schedule execution of the work item.
25. The system of claim 24, wherein scheduling execution of the work item comprises scheduling execution of the work item according to at least one of:
(i) a predetermined route through the process plant associated with a selected person;
(ii) a scheduled delivery of input material for a process performed by the process plant;
(iii) a scheduled delivery of a product produced by the process plant;
(iv) predicted weather conditions;
(v) a scheduled shipment time for a product produced by the process plant;
(vi) a predicted or predetermined completion time of a process of the process plant; or
(vii) a predicted or scheduled arrival of a tool, device, or component required to complete the specified task.
26. The system of claim 14, wherein selecting a person to perform the task specified in the work item comprises storing the work item in a database from which a person selects a work item to perform.
27. The system of claim 26, wherein selecting a person to perform the task specified in the work item further comprises: receiving a request from a device associated with a person to execute the work item; and comparing the profile associated with the person to information stored in the work item to determine whether the person is eligible to execute the work item.
CN202010503158.6A 2013-03-15 2014-03-14 Supervisory engine for process control Pending CN111624967A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201361792109P 2013-03-15 2013-03-15
US61/792,109 2013-03-15
US14/028,972 2013-09-17
US14/028,972 US11112925B2 (en) 2013-03-15 2013-09-17 Supervisor engine for process control
CN201410097873.9A CN104049584A (en) 2013-03-15 2014-03-14 Supervisor engine for process control

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201410097873.9A Division CN104049584A (en) 2013-03-15 2014-03-14 Supervisor engine for process control

Publications (1)

Publication Number Publication Date
CN111624967A true CN111624967A (en) 2020-09-04

Family

ID=50490617

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201410097873.9A Pending CN104049584A (en) 2013-03-15 2014-03-14 Supervisor engine for process control
CN202010503158.6A Pending CN111624967A (en) 2013-03-15 2014-03-14 Supervisory engine for process control

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201410097873.9A Pending CN104049584A (en) 2013-03-15 2014-03-14 Supervisor engine for process control

Country Status (3)

Country Link
CN (2) CN104049584A (en)
DE (1) DE102014103538A1 (en)
GB (1) GB2513958B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200349665A1 (en) * 2019-05-03 2020-11-05 United States Postal Service Informed mobility platform for an item processing supervisor or user within a distribution facility
CN115392804A (en) * 2022-10-28 2022-11-25 四川安洵信息技术有限公司 Talent enabling method and system based on big data

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014104673A1 (en) * 2014-04-02 2015-10-08 Infineon Technologies Ag Process support system and method for supporting a process
CN104574557B (en) * 2015-01-21 2017-02-22 浪潮通信信息系统有限公司 Alarm-based site polling method, alarm-based site polling manipulation device and alarm-based site polling system
JP6610124B2 (en) 2015-09-25 2019-11-27 富士ゼロックス株式会社 Information processing apparatus and program
EP3156857B1 (en) 2015-10-16 2021-01-13 RP-Technik GmbH Device and method for monitoring a building
US20170171176A1 (en) * 2015-12-11 2017-06-15 Lenovo Enterprise Solutions (Singapore) Pte. Ltd. Maintenance credential permitting performance of just maintenance-related actions when computing device requires repair and/or maintenance
DE102016113214A1 (en) * 2016-07-18 2018-01-18 Prominent Gmbh Dosing device with communication interface
KR102405377B1 (en) * 2016-08-25 2022-06-07 크라운 이큅먼트 코포레이션 Observation-based event tracking
CN109716249B (en) * 2016-09-09 2022-09-13 德马泰克公司 Communication system for operation and management of workflows and integration of multiple devices utilizing different operating platforms
CN106849364A (en) * 2017-03-20 2017-06-13 四川洪诚电气科技有限公司 Allocation transformer intelligent monitoring management system
DE102017108622A1 (en) * 2017-04-23 2018-10-25 Goodly Innovations GmbH SYSTEM FOR SUPPORTING TEAMWORK BY AUGMENTED REALITY
US10719795B2 (en) 2017-10-27 2020-07-21 International Business Machines Corporation Cognitive learning workflow execution
US10713084B2 (en) 2017-10-27 2020-07-14 International Business Machines Corporation Cognitive learning workflow execution
US10719365B2 (en) 2017-10-27 2020-07-21 International Business Machines Corporation Cognitive learning workflow execution
US10552779B2 (en) 2017-10-27 2020-02-04 International Business Machines Corporation Cognitive learning workflow execution
US11281877B2 (en) 2018-06-26 2022-03-22 Columbia Insurance Company Methods and systems for guided lock-tag-try process
CN109523124A (en) * 2018-10-15 2019-03-26 平安科技(深圳)有限公司 Asset data processing method, device, computer equipment and storage medium
JP6754471B1 (en) 2019-06-11 2020-09-09 株式会社 日立産業制御ソリューションズ Business systems and programs
RU2743136C1 (en) * 2020-04-24 2021-02-15 Игорь Николаевич Пантелеймонов System and method of automated accounting of workers' production operations
DE102021112662A1 (en) 2021-05-17 2022-11-17 Carmen Held Expert system implemented on an electronic data processing system for diagnosing a suitability for working in the home office, in particular for employees

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1409179A (en) * 2001-09-17 2003-04-09 费舍-柔斯芒特系统股份有限公司 Monitoring treatment property and device and controlling synthesizing
CN1409232A (en) * 2001-09-13 2003-04-09 费舍-柔斯芒特系统股份有限公司 Portable computer in process controlling environment
CN1757002A (en) * 2003-02-28 2006-04-05 费舍-柔斯芒特系统股份有限公司 Delivery of process plant notifications
US20070078696A1 (en) * 2005-08-30 2007-04-05 Invensys Systems Inc. Integrating high level enterprise-level decision- making into real-time process control
US20090150209A1 (en) * 2000-09-06 2009-06-11 Masterlink Corporation System and method for managing mobile workers
CN201465281U (en) * 2009-07-27 2010-05-12 天津市津达执行器有限公司 Hand-held controller with Bluetooth communicating function
CN102239452A (en) * 2008-12-05 2011-11-09 费希尔控制国际公司 Method and apparatus for operating field devices via a portable communicator

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE60135086D1 (en) * 2000-01-06 2008-09-11 Rapp Roy W APPARATUS AND METHOD FOR PAPER-FREE TABLET AUTOMATION
US7900152B2 (en) * 2005-03-03 2011-03-01 Microsoft Corporation Adaptable user interface for business software
JP4927694B2 (en) * 2007-12-10 2012-05-09 本田技研工業株式会社 Scheduling apparatus, work management method and program
CN102184489A (en) * 2011-05-27 2011-09-14 苏州两江科技有限公司 Knowledge-based workflow management system
CN102867237A (en) * 2012-09-08 2013-01-09 无锡中科苏惠自动化技术有限公司 Intelligent production management method


Also Published As

Publication number Publication date
CN104049584A (en) 2014-09-17
GB2513958B (en) 2020-07-08
GB2513958A (en) 2014-11-12
GB201403616D0 (en) 2014-04-16
DE102014103538A1 (en) 2014-09-18

Similar Documents

Publication Publication Date Title
US11169651B2 (en) Method and apparatus for controlling a process plant with location aware mobile devices
CN110244669B (en) Mobile control room with real-time environment perception
CN111624967A (en) Supervisory engine for process control
GB2513709A (en) Method and apparatus for managing a work flow in a process plant
GB2513956A (en) Context sensitive mobile control in a process plant
CN112631217A (en) Mobile analysis of physical phenomena in a process plant
GB2513455A (en) Generating checklists in a process control environment
GB2513708A (en) Method and apparatus for seamless state transfer between user interface devices in a mobile control room
GB2513457A (en) Method and apparatus for controlling a process plant with location aware mobile control devices
CN104049591B (en) Method for initiating or restoring a mobile control session in a process plant

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination