CN110968054A - Operator shift for enabling unmanned aerial vehicle - Google Patents

Operator shift for enabling unmanned aerial vehicle

Info

Publication number
CN110968054A
Authority
CN
China
Prior art keywords
robotic vehicle
process plant
unmanned robotic
user interface
unmanned
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910940142.9A
Other languages
Chinese (zh)
Other versions
CN110968054B (en)
Inventor
R·G·哈尔格伦三世
C·法亚德
G·K·劳
M·J·尼克松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fisher Rosemount Systems Inc
Original Assignee
Fisher Rosemount Systems Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US16/580,793 external-priority patent/US11281200B2/en
Application filed by Fisher Rosemount Systems Inc filed Critical Fisher Rosemount Systems Inc
Publication of CN110968054A publication Critical patent/CN110968054A/en
Application granted granted Critical
Publication of CN110968054B publication Critical patent/CN110968054B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/4185Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by the network communication
    • G05B19/4186Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by the network communication by protocol, e.g. MAP, TOP
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/4189Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by the transport system
    • G05B19/41895Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by the transport system using automatic guided vehicles [AGV]
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/4183Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by data acquisition, e.g. workpiece identification
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0027Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement involving a plurality of vehicles, e.g. fleet or convoy travelling
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0044Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/104Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircrafts, e.g. formation flying
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/003Flight plan management
    • G08G5/0034Assembly of a flight plan
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/55UAVs specially adapted for particular uses or applications for life-saving or rescue operations; for medical use
    • B64U2101/57UAVs specially adapted for particular uses or applications for life-saving or rescue operations; for medical use for bringing emergency supplies to persons or animals in danger, e.g. ropes or life vests

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Testing And Monitoring For Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A drone (e.g., an unmanned aerial vehicle or "UAV") equipped with cameras and sensors may be configured to travel throughout the field environment of a process plant to monitor process plant conditions. An onboard computing device associated with the drone controls movement of the drone through the field environment of the process plant. The on-board computing device interfaces with the cameras and other sensors and communicates with user interface devices, controllers, servers, and/or databases via a network. The on-board computing device may receive drone commands from the user interface device and/or the server, or may access drone commands stored in one or more databases. The on-board computing device may send data captured by the camera and/or other sensors to the UI device, controller, server, etc. Thus, the user interface device may display data captured by the drone camera and/or drone sensor (including a real-time video feed) to an operator in a human-machine interface application.

Description

Operator shift for enabling unmanned aerial vehicle
Technical Field
The present disclosure relates generally to industrial process plants and process control systems, and more particularly, to systems and methods for monitoring process plant conditions using drones.
Background
Distributed process control systems, such as systems used to manufacture, refine, transform, generate, or produce physical materials or products in chemical, petroleum, industrial, or other process plants, typically include one or more process controllers communicatively coupled to one or more field devices via analog, digital, or combined analog/digital buses or via wireless communication links or networks. The field devices, which may be, for example, valves, valve positioners, switches and transmitters (e.g., temperature, pressure, level and flow sensors), are located within the process environment and typically perform physical or process control functions such as opening or closing valves, measuring process and/or environmental parameters such as temperature or pressure, etc., to control one or more processes performed within the process plant or system. Smart field devices, such as field devices conforming to the well-known Fieldbus protocol, may also perform control calculations, alarm functions, and other control functions typically implemented within a controller. A process controller, also typically located within the plant environment, receives signals indicative of process measurements made by the field devices and/or other information pertaining to the field devices and executes a controller application that runs various control modules, such as different control modules that make process control decisions, generate control signals based on the received information, and coordinate with the control modules or blocks executing in the field devices (e.g., HART®, WirelessHART®, and FOUNDATION® Fieldbus field devices). A control module in the controller sends control signals to the field devices via the communication lines or links to control the operation of at least a portion of the process plant or system, for example, to control at least a portion of one or more industrial processes running or performed in the plant or system. For example, controllers and field devices control at least a portion of a process controlled by a process plant or system. I/O devices, also typically located within the plant environment, are typically disposed between the controller and one or more field devices and enable communication between them by converting electrical signals to digital values and vice versa. As used herein, field devices, controllers, and I/O devices are often referred to as "process control devices" and are often located, disposed, or installed in the field environment of a process control system or plant.
Information from the field devices and controllers is typically provided over a data highway or communications network to one or more other hardware devices, such as operator workstations, personal computers or computing devices, data historians, report generators, centralized databases, or other centralized management computing devices typically located in control rooms or other locations remote from the harsher field environment of the plant (e.g., in the back-end environment of the process plant). Each of these hardware devices is typically centralized across a process plant or a portion of a process plant. These hardware devices run applications that may, for example, enable an operator to perform functions with respect to controlling a process and/or operating a process plant, such as changing settings of a process control routine, modifying the operation of a controller or a control module within a field device, viewing the current state of the process, viewing alarms generated by the field device and the controller, simulating the operation of the process for the purpose of training personnel or testing process control software, maintaining and updating a configuration database, and so forth. The data highway utilized by the hardware devices, controllers, and field devices may include wired communication paths, wireless communication paths, or a combination of wired and wireless communication paths.
As an example, the DeltaV™ control system sold by Emerson Process Management includes a plurality of applications that are stored in and executed by different devices located at different locations within the process plant. A configuration application resident in one or more workstations or computing devices in the back-end environment of a process control system or plant enables a user to create or modify process control modules and download those process control modules to dedicated distributed controllers via a data highway. Typically, these control modules are made up of communicatively interconnected function blocks, which are objects in an object-oriented programming protocol that perform functions within the control scheme based on inputs thereto and provide outputs to other function blocks within the control scheme. The configuration application may also allow configuration designers to create or modify operator interfaces that are used by the viewing application to display data to an operator and to enable the operator to modify settings, such as set points, within the process control routine. Each dedicated controller, and in some cases one or more field devices, stores and executes a corresponding controller application that runs the control modules assigned and downloaded thereto to implement the actual process control functions. A viewing application, which may be executed on one or more operator workstations (or one or more remote computing devices communicatively connected to the operator workstations and the data highway), receives data from the controller application via the data highway and displays the data to a designer, operator, or user of the process control system using the user interface, and may provide any of a number of different views, such as an operator view, an engineer view, a technician view, etc. The data historian application is typically stored in and executed by a data historian device that collects and stores some or all of the data provided across the data highway, while the configuration database application may run in another computer attached to the data highway to store the current process control routine configuration and data associated therewith. Alternatively, the configuration database may be located in the same workstation as the configuration application.
While these viewing applications currently provide operators with a great deal of information via operator workstations that are remote from the plant field environment, operators must still risk entering the plant field environment to obtain certain types of information. For example, while these viewing applications may display real-time or still images captured from cameras located within the plant, these cameras typically have a fixed field of view. Thus, an operator may need to enter the field environment of the plant to investigate process plant events that fall outside the field of view of these cameras. In addition, operators currently walk through the field environment of the process plant during scheduled shifts, checking plant conditions, performing actions within the plant, etc., to complete plant safety checklists. However, because hazardous conditions (such as hazardous chemicals, high temperatures, hazardous equipment, and/or loud noise) are prevalent in the field environment of a process plant, it is generally unsafe for an operator to physically enter the field environment of the process plant.
Disclosure of Invention
Multiple drones equipped with cameras and sensors may be configured to travel throughout the field environment of a process plant to monitor the condition of the process plant. Generally, drones are unmanned robotic vehicles. In some examples, the drone is an airborne drone (e.g., an unmanned aerial vehicle or "UAV"), while in other examples, the drones are ground-based or water-based drones, or drones having some combination of air-based, ground-based, and/or water-based features. An onboard computing device associated with the drone controls movement of the drone through a field environment of the process plant. In addition, the on-board computing device interfaces with the cameras and other sensors and communicates with the user interface devices, controllers, servers, and/or databases via a network. For example, the on-board computing device may receive drone commands from the user interface device and/or the server, or may access drone commands stored in one or more databases. Additionally, the on-board computing device may send data captured by the camera and/or other sensors to the UI device, controller, server, etc. Thus, the user interface device may display data captured by the drone camera and/or drone sensor (including real-time video feedback) to an operator in a human-machine interface (HMI) application.
In one aspect, a method is provided. The method comprises the following steps: receiving, by a user interface device, an indication of a user selection of a piece of equipment in an overview of a process plant displayed by the user interface device at which the unmanned robotic vehicle is to be deployed, the overview of the process plant depicting a representation of a physical location of the equipment within the process plant; sending, by a user interface device, an indication of a user selection to an onboard computing device associated with the unmanned robotic vehicle; determining, by an onboard computing device associated with the unmanned robotic vehicle, a destination location of the unmanned robotic vehicle in the process plant based on the user-selected indication; controlling, by an onboard computing device associated with the unmanned robotic vehicle, the unmanned robotic vehicle to travel from a current location in the process plant to a destination location in the process plant; capturing camera data by one or more cameras of the unmanned robotic vehicle; sending, by an onboard computing device associated with the unmanned robotic vehicle, camera data to a user interface device; and displaying the camera data alongside the overview of the process plant via the user interface device.
In another aspect, a user interface device configured to communicate with an unmanned robotic vehicle equipped with one or more cameras is provided. The user interface device includes: a display; one or more processors; and one or more memories storing a set of computer-executable instructions. The computer-executable instructions, when executed by the one or more processors, cause the user interface device to: receive, from an onboard computing device of the unmanned robotic vehicle, data captured by the one or more cameras of the unmanned robotic vehicle; display an overview of the process plant alongside the data captured by the one or more cameras of the unmanned robotic vehicle, the overview of the process plant depicting a representation of physical locations of pieces of equipment within the process plant; receive an operator selection of a piece of equipment within the overview of the process plant at which the unmanned robotic vehicle is to be deployed; and send a command to the onboard computing device of the unmanned robotic vehicle, the command indicating the selected piece of equipment within the overview of the process plant at which the unmanned robotic vehicle is to be deployed.
In another aspect, an unmanned robotic vehicle is provided. The unmanned robotic vehicle includes: one or more cameras configured to capture images while the unmanned robotic vehicle is traveling in the process plant; one or more processors; and one or more memories storing a set of computer-executable instructions. The computer-executable instructions, when executed by the one or more processors, cause the unmanned robotic vehicle to: receive a command from a user interface device, wherein the command from the user interface device includes an indication of a piece of equipment in the process plant at which the unmanned robotic vehicle is to be deployed; determine a destination location in the process plant based on the indication of the piece of equipment in the process plant; travel from a current location in the process plant to the destination location in the process plant; and transmit the images captured by the one or more cameras to the user interface device.
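To make the interaction described in these aspects concrete, the following is a minimal Python sketch of the deploy-and-view flow: the operator selects a piece of equipment in the plant overview, the onboard computing device resolves a destination, travels there, and returns camera data for display. The class names, message fields, and equipment-to-location lookup are hypothetical illustrations under stated assumptions, not the patent's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical plant layout: equipment tag -> (x, y, z) vantage point near that equipment.
EQUIPMENT_LOCATIONS = {
    "TANK-101": (12.0, 40.5, 3.0),
    "PUMP-7": (55.2, 8.0, 1.5),
}

@dataclass
class DeployCommand:
    equipment_tag: str  # piece of equipment selected in the plant overview

class OnboardComputingDevice:
    """Sketch of the drone-side logic: resolve a destination and report camera data."""

    def __init__(self, current_location=(0.0, 0.0, 0.0)):
        self.current_location = current_location

    def handle_command(self, command: DeployCommand):
        destination = EQUIPMENT_LOCATIONS[command.equipment_tag]  # determine destination location
        self.travel_to(destination)
        return self.capture_camera_data()

    def travel_to(self, destination):
        # A real vehicle would plan and fly a route; here we simply record arrival.
        self.current_location = destination

    def capture_camera_data(self):
        # Stand-in for frames captured by the onboard camera(s).
        return {"location": self.current_location, "frame": "<jpeg bytes>"}

class UserInterfaceDevice:
    """Sketch of the operator-side logic: send the selection, display the returned data."""

    def __init__(self, drone: OnboardComputingDevice):
        self.drone = drone

    def on_equipment_selected(self, equipment_tag: str):
        camera_data = self.drone.handle_command(DeployCommand(equipment_tag))
        self.display_alongside_overview(camera_data)

    def display_alongside_overview(self, camera_data):
        print(f"Showing live view from {camera_data['location']} next to the plant overview")

if __name__ == "__main__":
    ui = UserInterfaceDevice(OnboardComputingDevice())
    ui.on_equipment_selected("TANK-101")  # operator clicks a piece of equipment
```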
Drawings
FIG. 1 depicts a block diagram illustrating an example process plant in which drones may be implemented and/or included to monitor process plant conditions;
FIG. 2 is a block diagram of an exemplary drone on-board computing device and an exemplary user interface device, shown schematically in FIG. 1;
FIG. 3 is an exemplary user interface display view of an operator application for monitoring process plant conditions using drones; and
FIG. 4 is a flow diagram of an exemplary method for monitoring process plant conditions using an unmanned robotic vehicle.
Detailed Description
As described above, while current viewing applications provide operators with a great deal of information via operator workstations that are remote from the field environment of the plant, operators must still risk entering the field environment of the plant to obtain certain types of information. For example, while these viewing applications may display real-time or still images captured from cameras located within the plant, these cameras typically have a fixed field of view. Thus, an operator may need to enter the field environment of the plant to investigate process plant events that fall outside the field of view of these cameras. In addition, operators currently walk through the field environment of the process plant during scheduled shifts, checking plant conditions, performing actions within the plant, etc., to complete plant safety checklists. However, because hazardous conditions (such as hazardous chemicals, high temperatures, hazardous equipment, and/or loud noise) are prevalent in the field environment of a process plant, it is often unsafe for an operator to physically enter the field environment of the process plant.
The systems, methods, devices, and techniques disclosed herein address these and other shortcomings of known systems by enabling operators to monitor conditions in process plants without physically accessing the hazardous field environment of these plants. In particular, systems, methods, devices, and techniques for monitoring process plant conditions using drones are disclosed herein. Advantageously, a drone equipped with a camera and other sensors can safely monitor areas of a process plant that are too dangerous for an operator to enter. For example, data captured by cameras and other sensors may be communicated to a user interface device and displayed to an operator located securely outside of the field environment.
Fig. 1 depicts a block diagram of an example process control network or system 2 operating in a process control system or process plant 10 having and/or in which embodiments of systems, methods, devices, and techniques for monitoring process plant conditions using drones described herein may be utilized. The process control network or system 2 may include a network backbone 5 that provides connectivity between various other devices, directly or indirectly. In embodiments, the devices coupled to the network backbone 5 include one or more access points 7a, one or more gateways 7b to other process plants (e.g., via an intranet or a corporate wide area network), one or more gateways 7c to external systems (e.g., to the internet), one or more User Interface (UI) devices 8, which may be fixed (e.g., conventional operator workstations) or mobile computing devices (e.g., mobile device smartphones), one or more servers 12 (e.g., which may be implemented as server banks, cloud computing systems, or other suitable configurations), a controller 11, input/output (I/O) cards 26 and 28, wired field devices 15-22, a wireless gateway 35, and a wireless communication network 70. The communication network 70 may include wireless devices including the wireless field devices 40-46, the wireless adapters 52a and 52b, the access points 55a and 55b, and the router 58. Wireless adapters 52a and 52b may be connected to non-wireless field devices 48 and 50, respectively. The controller 11 may include a processor 30, a memory 32, and one or more control routines 38. Although fig. 1 depicts only a single one of some of the devices directly and/or communicatively connected to the network backbone 5, it will be understood that each of the devices may have multiple instances on the network backbone 5, and in fact, the process plant 10 may include multiple network backbones 5.
UI device 8 may be communicatively connected to controller 11 and wireless gateway 35 via network backbone 5. The controller 11 may be communicatively connected to wired field devices 15-22 via input/output (I/O) cards 26 and 28, and may be communicatively connected to wireless field devices 40-46 via the network backbone 5 and a wireless gateway 35. The controller 11 may operate using at least some of the field devices 15-22 and 40-50 to implement a batch process or a continuous process. The controller 11 (which may be, for example, a DeltaV™ controller sold by Emerson) is communicatively connected to the process control network backbone 5. The controller 11 may also be communicatively connected to the field devices 15-22 and 40-50 using, for example, standard 4-20 mA communications, the I/O cards 26, 28, and/or any smart communication protocol (such as the FOUNDATION® Fieldbus protocol, the HART® protocol, the WirelessHART® protocol, etc.). In the embodiment illustrated in FIG. 1, the controller 11, the field devices 15-22, 48, 50, and the I/O cards 26, 28 are wired devices, and the field devices 40-46 are wireless field devices.
In operation of UI device 8, in some embodiments, UI device 8 may execute a user interface ("UI") allowing UI device 8 to accept input via an input interface and provide output at a display. UI device 8 may receive data (e.g., process-related data, such as process parameters, log data, and/or any other data that may be captured and stored) from server 12. In other embodiments, the UI may be executed, in whole or in part, at the server 12, where the server 12 may send the display data to the UI device 8. The UI device 8 may receive UI data (which may include display data and process parameter data) from other nodes in the process control network or system 2, such as the controller 11, the wireless gateway 35, and/or the server 12, via the backbone 5. Based on the UI data received at the UI device 8, the UI device 8 provides output (i.e., visual representations or graphics, some of which may be updated during runtime) representing various aspects of a process associated with the process control network or system 2. The user may also influence the control of the process by providing inputs on the UI device 8. To illustrate, UI device 8 may provide graphics representing, for example, a tank filling process. In this case, the user may read a tank level measurement and determine that the tank needs to be filled. The user may interact with an inlet valve graphic displayed at the UI device 8 and input a command to open the inlet valve.
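The tank-filling example above can be sketched in a few lines of Python: the UI device renders the tank level it received and translates the operator's click on the inlet valve graphic into a write request to the controller. The `Controller`, `UIDevice`, and tag names are hypothetical stand-ins for controller 11 and UI device 8, not the actual interfaces.

```python
class Controller:
    """Stand-in for controller 11: accepts writes to tagged process parameters."""
    def __init__(self):
        self.parameters = {"TANK-101/LEVEL": 18.0, "TANK-101/INLET_VALVE": "CLOSED"}

    def write(self, tag, value):
        self.parameters[tag] = value
        print(f"Controller: {tag} set to {value}")

class UIDevice:
    """Stand-in for UI device 8: renders values and forwards operator commands."""
    def __init__(self, controller):
        self.controller = controller

    def show_tank_graphic(self):
        level = self.controller.parameters["TANK-101/LEVEL"]
        print(f"Tank level: {level} %")

    def on_inlet_valve_clicked(self, command):
        # Translate the operator's input into a request to the controller.
        self.controller.write("TANK-101/INLET_VALVE", command)

ui = UIDevice(Controller())
ui.show_tank_graphic()            # operator reads the level and decides to fill the tank
ui.on_inlet_valve_clicked("OPEN")
```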
In some embodiments, UI device 8 may implement any type of client, such as a thin client, a web client, or a thick client. For example, UI device 8 may rely on other nodes, computers, UI devices, or servers to perform the bulk of the processing necessary for the operation of UI device 8, which may be the case if the UI device is limited in memory, battery power, etc. (e.g., in a wearable device). In such an example, the UI device 8 may communicate with the server 12 or with another UI device, where the server 12 or other UI device may communicate with one or more other nodes (e.g., servers) on the process control network or system 2 and may determine the display data and/or process data to send to the UI device 8. Further, UI device 8 may communicate any data related to received user input to server 12 so that server 12 may process the data related to the user input and operate accordingly. In other words, the UI device 8 may do little more than render graphics and act as a portal to one or more nodes or servers that store the data and execute the routines needed for the operation of the UI device 8. A thin-client UI device provides the advantage of minimal hardware requirements for the UI device 8.
In other embodiments, UI device 8 may be a web client. In such an embodiment, a user of the UI device 8 may interact with the process control system via a browser at the UI device 8. The browser enables the user to access data and resources at another node, such as the server 12, via the backbone 5. For example, the browser may receive UI data (such as display data or process parameter data) from the server 12, allowing the browser to render graphics for controlling and/or monitoring some or all of the process. The browser may also receive user input (such as a mouse click on a graphic). The user input may cause the browser to retrieve or access an information resource stored on the server 12. For example, a mouse click may cause the browser to retrieve (from the server 12) and display information related to the clicked graphic. In yet other embodiments, the bulk of the processing for UI device 8 may occur at UI device 8. For example, UI device 8 may execute the UI as previously discussed. UI device 8 may also store, access, and analyze data locally.
In operation, a user may interact with the UI device 8 to monitor or control one or more devices in the process control network or system 2, such as any of the field devices 15-22 or devices 40-50. Additionally, the user may interact with the UI device 8, for example, to modify or alter parameters associated with control routines stored in the controller 11. The processor 30 of the controller 11 implements or oversees one or more process control routines (stored in the memory 32), which may include control loops. The processor 30 may communicate with the field devices 15-22 and 40-50 and with other nodes communicatively connected to the backbone 5. It should be noted that any of the control routines or modules described herein (including the quality prediction and fault detection modules or functional blocks) may have portions implemented or executed by different controllers or other devices, if desired. Likewise, the control routines or modules described herein that are to be implemented within a process control system may take any form, including software, firmware, hardware, etc. The control routines may be implemented in any desired software format, such as using object oriented programming, ladder logic, sequential function charts, function block diagrams, or using any other software programming language or design paradigm. In particular, the control routine may be defined and implemented by a user through UI device 8. The control routines may be stored in any desired type of memory, such as Random Access Memory (RAM) or Read Only Memory (ROM). Likewise, the control routines may be hard-coded into, for example, one or more EPROMs, EEPROMs, Application Specific Integrated Circuits (ASICs), or any other hardware or firmware elements of the controller 11. Accordingly, controller 11 may be configured (in some embodiments by a user using UI device 8) to implement (e.g., receive, store, and/or execute) a control strategy or control routine in any desired manner.
In some embodiments of the UI device 8, a user may interact with the UI device 8 to define and implement a control strategy at the controller 11 using what are commonly referred to as function blocks, where each function block is an object or other part (e.g., a subroutine) of an overall control routine and operates in conjunction with other function blocks (via communications called links) to implement process control loops within the process control system. Control-based function blocks typically perform one of an input function (such as that associated with a transmitter, a sensor, or other process parameter measurement device), a control function (such as that associated with a control routine that performs PID, fuzzy logic, etc. control), or an output function that controls the operation of a device (such as a valve) to perform some physical function within the process control system. Of course, hybrid and other types of function blocks exist. The function blocks may have graphical representations provided at the UI device 8, allowing a user to easily modify the types of function blocks, the connections between function blocks, and the inputs/outputs associated with each of the function blocks implemented in the process control system. The function blocks may be downloaded to, stored in, and executed by the controller 11, which is typically the case when these function blocks are used with or associated with standard 4-20 mA devices and certain types of smart field devices (such as HART devices), or may be stored in and implemented by the field devices themselves, which may be the case with Fieldbus devices. The controller 11 may include one or more control routines 38 that may implement one or more control loops. Each control loop is typically referred to as a control module and may be implemented by executing one or more of the function blocks.
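The following is a minimal sketch of the function-block idea described above: an input block, a control block, and an output block linked into one control module (control loop). The block classes, the simplified proportional-integral logic, and the simulated measurements are illustrative assumptions, not the actual function-block API of any particular control system.

```python
class AnalogInputBlock:
    """Input function block: reads a process measurement (simulated here)."""
    def __init__(self, read_fn):
        self.read_fn = read_fn
    def execute(self):
        return self.read_fn()

class PIDBlock:
    """Control function block: simplified proportional-integral control toward a set point."""
    def __init__(self, set_point, kp=0.5, ki=0.1):
        self.set_point = set_point
        self.kp, self.ki = kp, ki
        self.integral = 0.0
    def execute(self, measurement):
        error = self.set_point - measurement
        self.integral += error
        return self.kp * error + self.ki * self.integral

class AnalogOutputBlock:
    """Output function block: drives a device such as a valve actuator."""
    def execute(self, output):
        print(f"Valve opening command: {output:.2f}")

# The "links" between blocks form one control module (control loop).
measurement_source = iter([45.0, 47.5, 49.0, 50.2])
ai = AnalogInputBlock(lambda: next(measurement_source))
pid = PIDBlock(set_point=50.0)
ao = AnalogOutputBlock()

for _ in range(4):               # one execution per controller scan cycle
    ao.execute(pid.execute(ai.execute()))
```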
Still referring to FIG. 1, the wireless field devices 40-46 communicate over the wireless network 70 using, for example, a wireless protocol such as the WirelessHART protocol. In some embodiments, UI device 8 may be capable of communicating with the wireless field devices 40-46 using the wireless network 70. Such wireless field devices 40-46 may communicate directly with one or more other nodes of the process control network or system 2 that are also configured to communicate wirelessly (e.g., using a wireless protocol). To communicate with one or more other nodes not configured for wireless communication, the wireless field devices 40-46 may utilize a wireless gateway 35 connected to the backbone 5. Of course, the field devices 15-22 and 40-46 may conform to any other desired standard(s) or protocol(s), such as any wired or wireless protocol, including any standards or protocols developed in the future.
The wireless gateway 35 may provide access to various wireless devices or nodes 40-46, 52-58 of the wireless communication network 70. In particular, the wireless gateway 35 provides a communicative coupling between the wireless devices 40-46, 52-58 and other nodes of the process control network or system 2, including the controller 11 of FIG. 1. In an exemplary embodiment, the wireless gateway 35 provides communicative coupling by providing routing, buffering, and timing services to lower layers of the wired and wireless protocol stacks (e.g., address translation, routing, packet segmentation, prioritization, etc.), in some cases while tunneling one or more shared layers of the wired and wireless protocol stacks. In other cases, the wireless gateway 35 may translate commands between wired and wireless protocols that do not share any protocol layers.
Similar to the wired field devices 15-22, the wireless field devices 40-46 of the wireless network 70 may perform physical control functions within the process plant 10, such as opening or closing valves or taking measurements of process parameters. However, the wireless field devices 40-46 are configured to communicate using the wireless protocol of the network 70. As such, the wireless field devices 40-46, the wireless gateway 35, and the other wireless nodes 52-58 of the wireless network 70 are producers and consumers of wireless communication packets.
In some cases, the wireless network 70 may include non-wireless devices 48, 50, which may be wired devices. For example, the field device 48 of FIG. 1 may be a conventional 4-20 mA device and the field device 50 may be a conventional wired HART device. To communicate within the network 70, the field devices 48 and 50 may be connected to the wireless communication network 70 via respective wireless adapters (WAs) 52a, 52b. In addition, the wireless adapters 52a, 52b may support other communication protocols, such as FOUNDATION® Fieldbus, PROFIBUS, DeviceNet, etc. Further, the wireless network 70 may include one or more network access points 55a, 55b, which may be separate physical devices in wired communication with the wireless gateway 35, or may be provided as an integral device with the wireless gateway 35. The wireless network 70 may also include one or more routers 58 to forward packets from one wireless device to another wireless device within the wireless communication network 70. The wireless devices 40-46 and 52-58 may communicate with each other and with the wireless gateway 35 via wireless links 60 of the wireless communication network 70.
In some embodiments, the process control network or system 2 may include other nodes connected to the network backbone 5 that communicate using other wireless protocols. For example, the process control network or system 2 may include one or more wireless access points 7a that utilize other wireless protocols, such as WiFi or other IEEE 802.11 compliant wireless local area network protocols, mobile communication protocols such as WiMAX (Worldwide Interoperability for Microwave Access), LTE (Long Term Evolution), or other ITU-R (International Telecommunication Union Radiocommunication Sector) compatible protocols, short-wavelength radio communications such as Near Field Communication (NFC) and Bluetooth, and/or other wireless communication protocols. Typically, such wireless access points 7a allow handheld or other portable computing devices to communicate over a respective wireless network that is different from the wireless network 70 and that supports a different wireless protocol than the wireless network 70. In some embodiments, the UI device 8 communicates over the process control network or system 2 using the wireless access point 7a. In some cases, in addition to portable computing devices, one or more process control devices (e.g., the controller 11, the field devices 15-22, the wireless devices 35, 40-46, 52-58) may also communicate using the wireless network supported by the access point 7a.
Additionally or alternatively, the process control network or system 2 may include one or more gateways 7b, 7c of systems external to the instant process control system. In such embodiments, UI device 8 may be used to control, monitor, or otherwise communicate with the external system. Typically, such systems are customers and/or suppliers of information generated or operated by process control systems. For example, the plant gateway node 7b may communicatively connect the instant process plant 10 (having its own respective process control data network backbone 5) with another process plant having its own respective network backbone. In an embodiment, a single network backbone 5 may serve multiple process plants or process control environments.
In another example, the plant gateway node 7b may communicatively connect the instant process plant to a conventional or prior art process plant that does not include a process control network or system 2 or backbone 5. In this example, the plant gateway node 7b may convert or translate messages between the protocol used by the process control big data backbone 5 of the plant 10 and different protocols used by conventional systems (e.g., Ethernet, Profibus, Fieldbus, DeviceNet, etc.). In such examples, the UI device 8 may be used to control, monitor, or otherwise communicate with systems or networks in the conventional or prior art process plant.
The process control network or system 2 may include one or more external system gateway nodes 7c to communicatively connect the process control network or system 2 with external public or private systems, such as laboratory systems (e.g., laboratory information management systems or LIMS), personnel shift databases, material handling systems, maintenance management systems, product inventory control systems, production scheduling systems, weather data systems, transportation and processing systems, packaging systems, the internet, another provider's process control system, and/or other external systems. The external system gateway node 7c may, for example, facilitate communication between the process control system and personnel outside the process plant (e.g., personnel within the home).
Although FIG. 1 illustrates a single controller 11 having a limited number of field devices 15-22, 40-46, and 48-50 communicatively connected thereto, this is merely an exemplary and non-limiting embodiment. Any number of controllers 11 may be included in the process control network or system 2 and any of the controllers 11 may communicate with any number of wired or wireless field devices 15-22, 40-50 to control a process within the plant 10. Additionally, the process plant 10 may include any number of wireless gateways 35, routers 58, access points 55, wireless process control communication networks 70, access points 7a, and/or gateways 7b, 7 c.
In one example configuration of the process plant 10, multiple drones 72a, 72b equipped with respective cameras 74a, 74b and/or other sensors 76a, 76b travel throughout the field environment of the process plant 10 to monitor the condition of the process plant. Generally, the drones 72a, 72b are unmanned robotic vehicles. In some examples, the drones 72a, 72b are aerial drones (e.g., unmanned aerial vehicles or "UAVs"), while in other examples, the drones 72a, 72b are ground-based or water-based drones, or drones having some combination of aerial-, ground-, and/or water-based features. On-board computing devices 78a, 78b associated with the drones 72a, 72b control movement of the drones 72a, 72b through the field environment of the process plant 10. Additionally, on-board computing devices 78a, 78b interface with cameras 74a, 74b and/or other sensors 76a, 76b and communicate with user interface devices 8, controller 11, server 12, and/or one or more databases, e.g., via network 70. For example, the on-board computing devices 78a, 78b may receive drone commands from the user interface device 8 and/or the server 12, or may access drone commands stored in one or more databases. As another example, on-board computing devices 78a, 78b may send data captured by cameras 74a, 74b and/or other sensors 76a, 76b to UI device 8, controller 11, server 12, and/or any other suitable computing device. Thus, the user interface device 8 displays data captured by the cameras 74a, 74b and/or other sensors 76a, 76b to the operator.
FIG. 2 illustrates a block diagram of an example on-board computing device 78 (e.g., on-board computing device 78a or 78b) associated with an example drone 72 (e.g., drone 72a or 72b) and an example UI device 8, which may be used in conjunction with embodiments of a system for monitoring process plant conditions using the drones described herein. As shown in FIG. 2, the exemplary drone 72 is equipped with one or more cameras 74 (which may include infrared cameras), one or more other sensors 76, and an onboard computing device 78. The camera 74, the sensor 76, and the onboard computing device 78 may be attached to the drone 72, contained within the drone 72, carried by the drone 72, or otherwise associated with the drone 72. The sensors 76 may include, for example, position sensors, temperature sensors, flame sensors, gas sensors, wind sensors, accelerometers, motion sensors, and/or other sensors. In some examples, the drone 72 is also equipped with additional accessories, such as lights, speakers, microphones, and so forth. Additionally, in some examples, a device for collecting a sample, a medical kit, a respiratory device, a defibrillator unit, or the like is attached to the drone 72, contained within the drone 72, carried by the drone 72, or otherwise associated with the drone 72. Further, the drone 72 may be equipped with other mechanical components for performing various operations within the plant 10. For example, the drone 72 may be equipped with a robotic arm for activating a switch or collecting samples.
The onboard computing device 78 interfaces with the cameras 74, sensors 76, and/or additional mechanical components and executes applications to receive drone path commands, control the path of the drone 72, and store and transmit data captured by the drone cameras 74 and/or drone sensors 76. On-board computing device 78 typically includes one or more processors or CPUs 80, memory 82, Random Access Memory (RAM)84, input/output (I/O) circuitry 86, and a communication unit 88 to send and receive data via a local area network, a wide area network, and/or any other suitable network, which may be wired and/or wireless (e.g., network 70). For example, on-board computing device 78 may communicate with UI device 8, controller 11, server 12, and/or any other suitable computing device.
Additionally, on-board computing device 78 includes a database 90 that stores data associated with drone 72. For example, the database 90 may store data captured by the drone camera 74 and/or drone sensor 76. In addition, the database 90 may store navigation data, such as maps of the plant 10. The map may include one or more paths for the drone 72 to travel through the plant 10, as well as indications of the location of various equipment within the plant 10. The indication of the location of the respective piece of plant equipment may include an indication of the location of the respective vantage point near each piece of equipment. Additionally, the indication of the location of the various pieces of plant equipment may include an indication of the location of one or more drone-activated switches (e.g., kill switches).
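The navigation data described above can be pictured as a small data structure: a plant map with named patrol paths, plus per-equipment entries carrying the equipment location, a nearby vantage point, and any drone-activated switch. This is a minimal sketch under assumed field names and coordinates; the patent does not prescribe a specific schema for database 90.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

Coordinate = Tuple[float, float, float]

@dataclass
class EquipmentEntry:
    location: Coordinate                      # where the equipment sits in the plant
    vantage_point: Coordinate                 # nearby point from which to observe it
    kill_switch: Optional[Coordinate] = None  # drone-activated switch location, if any

@dataclass
class PlantMap:
    paths: Dict[str, List[Coordinate]] = field(default_factory=dict)   # named patrol routes
    equipment: Dict[str, EquipmentEntry] = field(default_factory=dict)

plant_map = PlantMap(
    paths={"perimeter_patrol": [(0, 0, 5), (100, 0, 5), (100, 80, 5), (0, 80, 5)]},
    equipment={
        "PUMP-7": EquipmentEntry(location=(55, 8, 0), vantage_point=(53, 10, 2),
                                 kill_switch=(54, 8, 1)),
    },
)
print(plant_map.equipment["PUMP-7"].vantage_point)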
The memory 82 also includes a control unit 92 and an operating system 94 including one or more drone applications 96. The control unit 92 is configured to communicate with the UI device 8, the controller 11, the server 12, and/or any other suitable computing device. For example, in some embodiments, the control unit 92 may transmit data captured by the drone camera 74 and/or the drone sensor 76 to the UI device 8, the controller 11, the server 12, and/or any other suitable computing device. Further, the control unit 92 is configured to control the movement and actions of the drone 72 in the plant.
In general, the drone applications 96 running on the operating system 94 may include an application that provides instructions to the control unit 92 for controlling the movement and actions of the drone 72 within the plant. Additionally, the drone applications 96 may include applications for analyzing data captured by the camera 74 and/or the sensors 76, as well as for sending data to and receiving data from the UI device 8, the controller 11, the server 12, and/or any other suitable computing device.
For example, the drone application 96 that provides instructions to the control unit 92 to control the movement and actions of the drone 72 within the plant may include a drone navigation application. Generally, the drone navigation application utilizes some combination of navigation data stored in the database 90 and location data captured by the sensors 76 to determine the current location of the drone 72 within the plant 10. Upon receiving or determining a destination for the drone 72, the drone navigation application may calculate a route from the current location to the destination location.
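One way such a route calculation could look is a simple search over a waypoint graph of traversable corridors; the patent does not specify a routing algorithm, so the breadth-first search and waypoint names below are purely illustrative assumptions.

```python
from collections import deque

# Hypothetical waypoint graph of traversable corridors in the plant.
WAYPOINT_GRAPH = {
    "dock": ["corridor_a"],
    "corridor_a": ["dock", "corridor_b", "tank_farm"],
    "corridor_b": ["corridor_a", "pump_row"],
    "tank_farm": ["corridor_a"],
    "pump_row": ["corridor_b"],
}

def calculate_route(current, destination, graph=WAYPOINT_GRAPH):
    """Breadth-first search over plant waypoints; returns the list of waypoints to traverse."""
    queue = deque([[current]])
    visited = {current}
    while queue:
        path = queue.popleft()
        if path[-1] == destination:
            return path
        for neighbor in graph[path[-1]]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no route found

print(calculate_route("dock", "pump_row"))  # ['dock', 'corridor_a', 'corridor_b', 'pump_row']
```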
In some examples, the destination location of the drone 72 may be preconfigured and stored in the database 90. For example, the database 90 may store a route or path that the drone 72 will repeatedly travel, or may store a particular location (e.g., proximate to particular equipment within the plant) that the drone 72 will hover over. Additionally, the database 90 may store a list of paths or destinations that the drones are to travel based on various triggers or conditions within the plant 10. For example, the database 90 may store data indicating that the drone 72 is to travel to a particular location or to travel a particular route within the plant 10 based on alarms or plant conditions within the plant 10. For example, the database 90 may store an indication of a safe location to which the drone 72 is to travel, or a safe route for the drone 72 to travel into and out of the plant 10 in the event of a fire, toxic gas leak, spill, explosion, or the like. For example, individuals within the plant 10 may follow the drone 72 away from the plant during these hazardous conditions. In other cases, the database 90 may store indications of locations or routes near fires, toxic gas leaks, spills, explosions, etc. (e.g., to capture photographs, video or other data associated with alarms or other conditions, to create drone "fences" to ensure unsafe conditions around, or to direct emergency assistance to plant conditions). Further, the database 90 may store data indicating that the drone 72 is to travel to a particular location or to travel a particular route based on the capture of trigger sensor data (e.g., sensor data indicating plant conditions).
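The stored trigger-to-route associations described in this paragraph amount to a lookup from a plant condition or alarm to a preconfigured response; the condition names, route names, and purposes in this small Python sketch are hypothetical examples, not values defined by the patent.

```python
# Hypothetical mapping from a plant condition or alarm to a preconfigured drone response.
TRIGGERED_ROUTES = {
    "FIRE_ALARM":     {"route": "safe_egress_route",  "purpose": "lead personnel out of the plant"},
    "TOXIC_GAS_LEAK": {"route": "leak_perimeter_loop", "purpose": "form a drone fence around the leak"},
    "SPILL_DETECTED": {"route": "spill_overwatch",     "purpose": "capture photos and video of the spill"},
}

def route_for_condition(condition):
    """Look up the stored route the drone should travel when a trigger condition occurs."""
    return TRIGGERED_ROUTES.get(condition)

print(route_for_condition("TOXIC_GAS_LEAK"))
```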
In other examples, the destination location of drone 72 may be selected by the navigation application based on commands (e.g., operator commands) received from UI device 8, controller 11, server 12, and/or any other suitable computing device. For example, the operator may select an area of the plant 10 or a piece of equipment in the plant 10 via the UI device 8 and may select the destination based on the location of the area of the plant 10 or the piece of equipment stored in the database 90. Additionally, the operator may select a person within the plant 10 via the UI device 8, and may select a destination based on a location associated with the person (e.g., based on GPS data associated with the person, or based on a location associated with the person stored in the database 90). As another example, the operator may utilize directional control (e.g., left, right, up, down, forward, backward, etc.), and the destination may be selected by the navigation application based on directional movement from the current location of the drone 72.
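The operator-driven selection described above can be sketched as a small destination resolver that handles the three command types mentioned (equipment selection, person selection, and directional control). The lookup tables, command format, and coordinates are assumptions made for illustration only.

```python
EQUIPMENT_LOCATIONS = {"REACTOR-3": (30.0, 22.0, 4.0)}      # hypothetical plant coordinates
PERSONNEL_GPS = {"operator_42": (61.0, 15.0, 0.0)}          # last reported GPS fixes
DIRECTION_VECTORS = {"forward": (1, 0, 0), "up": (0, 0, 1), "left": (0, 1, 0)}

def resolve_destination(command, current_location):
    """Turn an operator command into a concrete destination for the navigation application."""
    kind, value = command
    if kind == "equipment":
        return EQUIPMENT_LOCATIONS[value]
    if kind == "person":
        return PERSONNEL_GPS[value]
    if kind == "direction":                        # nudge relative to the current location
        dx, dy, dz = DIRECTION_VECTORS[value]
        x, y, z = current_location
        return (x + dx, y + dy, z + dz)
    raise ValueError(f"unknown command type: {kind}")

print(resolve_destination(("equipment", "REACTOR-3"), (0, 0, 0)))
print(resolve_destination(("direction", "up"), (10, 10, 2)))
```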
Further, the drone application 96 that provides instructions to the control unit 92 to control the actions of the drone 72 within the plant may include applications that provide instructions to the control unit to cause the drone to activate a switch (e.g., a kill switch for shutting down equipment in an unsafe place, such as a drilling site), take measurements, perform evaluations, collect samples, issue alerts (via an auxiliary speaker system), listen for voice commands (via an auxiliary microphone), pick up objects (e.g., tools, medical kits, etc.) and bring them to a new location within the plant (e.g., a location near an individual), exchange objects as needed for various situations, assist individuals within the plant as needed (e.g., by running errands), and so forth. As with navigation destinations, in some examples, actions for the drone 72 to perform may be preconfigured and stored in the database 90, while in other examples, actions performed by the drone 72 may be selected based on commands (e.g., operator commands) received from the UI device 8, the controller 11, the server 12, and/or any other suitable computing device.
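A received action command could be dispatched to a matching handler on the drone side, as in the minimal sketch below; the handler names, arguments, and `dispatch` helper are hypothetical and stand in for whatever mechanical accessories a given drone actually carries.

```python
class DroneActions:
    """Hypothetical action handlers the control unit could invoke on command."""

    def activate_kill_switch(self, equipment_tag):
        print(f"Robotic arm toggling kill switch on {equipment_tag}")

    def collect_sample(self, location):
        print(f"Collecting sample at {location}")

    def issue_alert(self, message):
        print(f"Broadcasting over speaker: {message}")

    def deliver_object(self, item, location):
        print(f"Carrying {item} to {location}")

def dispatch(action_name, actions=DroneActions(), **kwargs):
    # Map a received command name onto the matching action handler.
    handler = getattr(actions, action_name)
    handler(**kwargs)

dispatch("activate_kill_switch", equipment_tag="DRILL-SITE-2")
dispatch("deliver_object", item="medical kit", location=(61.0, 15.0, 0.0))
```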
Additionally, the drone applications 96 for analyzing data captured by the camera 74 and/or the sensors 76 may include an image recognition application for analyzing photographs or video from the camera 74, or a sensor analysis application configured to analyze data from the drone sensors 76, or some combination of the two, for automatically identifying indications of various conditions within the process plant, such as overheating, fire, smoke, motion, leaks (including identifying the size of a leak), drips, puddles, vapors, or valve conditions (including identifying whether a valve is open or closed). Further, the applications running on the operating system 94 may include a sensor analysis application for determining prevailing wind speed and wind direction based on data from the drone sensors 76 (e.g., to help assess the effects of a toxic gas leak). The drone applications 96 for analyzing data captured by the camera 74 and/or the sensors 76 may further include a monitoring application. For example, the monitoring application may analyze facial features of individuals within the plant to identify unauthorized individuals within the plant. As another example, the monitoring application may analyze data captured by the motion sensor to determine whether an unauthorized individual is traversing the plant. Additionally, the monitoring application may analyze the photographs or videos captured by the drone camera 74 to identify indications of unauthorized entry into the process plant, such as a damaged fence or a broken door.
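As a simple illustration of the sensor-analysis side of these applications, the sketch below flags plant conditions from thresholded sensor readings and derives wind speed and direction from two wind-sensor axes. The thresholds, field names, and two-axis wind model are assumptions for demonstration; real condition detection would be considerably more involved (e.g., image recognition on camera frames).

```python
import math

def detect_conditions(sensor_readings):
    """Flag plant conditions from drone sensor data using simple illustrative thresholds."""
    conditions = []
    if sensor_readings.get("surface_temp_c", 0) > 120:
        conditions.append("overheating")
    if sensor_readings.get("gas_ppm", 0) > 50:
        conditions.append("possible gas leak")
    if sensor_readings.get("flame_detected"):
        conditions.append("fire")
    return conditions

def wind_vector(east_mps, north_mps):
    """Prevailing wind speed and direction (degrees from north) from two wind-sensor axes."""
    speed = math.hypot(east_mps, north_mps)
    direction = math.degrees(math.atan2(east_mps, north_mps)) % 360
    return speed, direction

print(detect_conditions({"surface_temp_c": 135, "gas_ppm": 12, "flame_detected": False}))
print(wind_vector(east_mps=3.0, north_mps=4.0))   # 5.0 m/s, roughly 37 degrees from north
```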
Additionally or alternatively, the drone applications 96 may include an application that causes the drone 72 to interact with other drones within the plant to create a field grid and/or wireless backhaul to enable communication in locations where no communication system is in place, or where communication functionality is temporarily lost. Additionally, the applications running on the operating system 94 may include a GPS application that causes the drone 72 to send GPS information to other drones, or to receive GPS information from a master drone, in order to create a local temporary GPS system.
Returning to the UI device 8, the device 8 may be a desktop computer, such as a traditional operator workstation or a control room display, or a mobile computing device, such as a laptop computer, a tablet computer, a smartphone, a Personal Digital Assistant (PDA), a wearable computing device, or any other suitable client computing device. UI device 8 may execute a graphical display configuration application used by a configuration engineer in a configuration environment to create, generate, and/or edit display view definitions or configurations and display view element definitions or configurations. UI device 8 may also execute operator applications used by an operator to monitor, observe, and react to various states and conditions of processes within the operating environment. UI device 8 may include a display 98. Further, UI device 8 includes one or more processors or CPUs 100, a memory 102, a Random Access Memory (RAM) 104, input/output (I/O) circuitry 105, and a communication unit 106 to send and receive data via a local area network, a wide area network, and/or any other suitable network, which may be wired and/or wireless (e.g., network 70). UI device 8 may communicate with controller 11, server 12, drone onboard computing device 78, and/or any other suitable computing device.
The memory 102 may include an operating system 108, applications (such as a graphic display configuration application and an operator application) running on the operating system 108, and a control unit 110 for controlling the display 98 and communicating with the controller 11 to control the online operation of the process plant. In some embodiments, the server 12 may transmit a graphical representation of a portion of the process plant to the UI device 8, and in turn, the control unit 110 may cause the graphical representation of the portion of the process plant to be presented on the display 98. In addition, the control unit 110 may obtain user input from the I/O circuitry 105, such as user input from an operator or configuration engineer (also referred to herein as a user), and translate the user input into a request to present a graphical display view in a particular language, a request to include a graphic indicating a particular control element in an active monitor or viewing window included on the display view, a request to display an adjustment to a process parameter included in one of the process portions, and so forth.
In some embodiments, the control unit 110 may communicate the converted user input to the server 12, which may generate and send the requested UI to the UI device 8 for display. In other embodiments, control unit 110 may generate a new UI based on the converted user input and present the new UI on display 98 of UI device 8. When the converted user input is a request to display an adjustment to a process parameter included in one of the process portions, the control unit 110 may adjust a value of the process parameter on the display 98 according to the user input from the operator and may provide instructions to the controller 11 to adjust the process parameter in the process plant. In other embodiments, the control unit 110 may communicate the converted user input to the server 12, and the server 12 may generate and send the adjusted process parameter values to the UI device 8 for display and provide instructions to the controller 11 to adjust the process parameters in the process plant.
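For illustration, a sketch of converting a raw user-input event into a display update and a request toward the controller might look like the following; the event format, the ControllerProxy class, and the tag names are hypothetical and are not part of this disclosure:

```python
from dataclasses import dataclass

@dataclass
class ParameterAdjustment:
    tag: str          # e.g., a process parameter tag such as "FIC-101/SP"
    new_value: float

class ControllerProxy:
    """Stand-in for the link from UI device 8 to controller 11 (illustrative only)."""
    def write_parameter(self, adj: ParameterAdjustment) -> None:
        print(f"controller: set {adj.tag} = {adj.new_value}")

def handle_user_input(event: dict, controller: ControllerProxy) -> None:
    """Convert a raw UI event into a display update and a controller request."""
    if event["type"] == "adjust_parameter":
        adj = ParameterAdjustment(event["tag"], float(event["value"]))
        # Update the locally displayed value first, then push the change downstream.
        print(f"display: {adj.tag} -> {adj.new_value}")
        controller.write_parameter(adj)

handle_user_input({"type": "adjust_parameter", "tag": "FIC-101/SP", "value": 42.5},
                  ControllerProxy())
```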
Turning now to fig. 3, an example user interface display view 200 of an operator application for monitoring process plant conditions using drones (e.g., drones 72, 72a, 72b) is depicted in accordance with an embodiment. The exemplary user interface display view 200 shown in fig. 3 is displayed via the display 98 of the user interface device 8. The user interface display view 200 includes an overview 202 of the process plant displayed alongside real-time video feedback 204, 206 associated with the first and second drones and a set of drone controls 207. Although two drone video feeds are shown in fig. 3, in embodiments, real-time video feeds associated with additional or fewer drones may be displayed. Further, in some embodiments, the user interface display view 200 includes a display associated with sensor data captured by sensors associated with drones within the process plant.
In general, the overview 202 of the process plant depicts a representation of the physical location of equipment within the plant, as well as the physical location of drones relative to devices within the process plant. For example, drone 1 (208) is depicted as being near equipment 210. As another example, drone 2 (212) is depicted near equipment 214, and drone 3 (216) is depicted near equipment 218. Thus, the real-time video feedback 204 associated with drone 1 depicts real-time video showing the current state of equipment 210. Similarly, the real-time video feedback 206 associated with drone 2 depicts real-time video showing the current state of equipment 214. In one example, when the operator selects a drone (e.g., using the cursor 220), real-time feedback associated with the drone is displayed. For example, when the operator selects drone 3 (216), real-time video feedback associated with drone 3 is displayed, depicting the current state of equipment 218.
In some examples, the drone moves along a fixed path within the process plant. In other examples, the drone is entirely controlled by the operator. In still other examples, the drone generally moves along a fixed path within the process plant until an operator takes action to control the path of the drone. In one example, an operator manually controls the physical movement of each drone within the process plant using the drone controls 207. For example, the operator may use the drone controls 207 to physically move the drone up and down, rotate it, and move it back and forth. Thus, the drone may be moved within the process plant to obtain a close-up view of the equipment within the plant, thereby enabling an operator to check the conditions within the plant via the user interface device 8 (i.e., without having to enter the field environment).
In another example, the operator controls the physical movement of the drones within the process plant by selecting the depictions 208, 212, 216 of each drone within the overview 202 of the process plant, or by selecting the area within the overview 202 of the process plant in which a drone is to be deployed. In one example, when the operator selects a drone (e.g., using the cursor 220) and drags it to a new location, the drone is configured to physically travel to the new location. For example, an aerial drone may automatically take off and fly along a predetermined safe route toward a hover point associated with the selected location. In another example, when the operator selects (e.g., using the cursor 220) an area of the process plant overview 202 (and/or equipment depicted in the process plant overview 202) in which no drone is currently located, the drone is configured to physically travel to that area of the process plant. Thus, once the drone reaches the destination, the operator may view real-time video feedback of the selected area of the process plant.
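A hedged sketch of this click-and-drag interaction, assuming a simple linear mapping from overview pixels to plant coordinates (the scale factor, route label, and command fields are illustrative only), might be:

```python
from dataclasses import dataclass

@dataclass
class OverviewPoint:
    px: int  # pixel coordinates in the plant overview display
    py: int

@dataclass
class PlantPoint:
    x: float  # plant coordinates in the same frame as equipment locations
    y: float

def overview_to_plant(p: OverviewPoint, scale: float = 0.1) -> PlantPoint:
    """Map a point on the overview display to plant coordinates (linear scale assumed)."""
    return PlantPoint(p.px * scale, p.py * scale)

def on_drone_dragged(drone_id: str, drop: OverviewPoint) -> dict:
    """Build the deployment command that the UI device would send to the drone."""
    dest = overview_to_plant(drop)
    return {"drone": drone_id,
            "deploy_to": (dest.x, dest.y),
            "route": "predetermined_safe_route"}

print(on_drone_dragged("drone_3", OverviewPoint(320, 180)))
```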
In an example, an operator selects a drone within the overview 202 of the process plant to travel away from its current location to a safe location (which may be selected or predetermined by the operator). For example, if an individual is trapped in an unsafe location within a process plant, the operator may select a drone near the individual to travel away from its current location to the safe location, and the individual may follow the drone to the safe location. In another example, an operator selects a drone within the overview 202 of the process plant to travel from an entrance of the process plant to a particular area of the process plant. For example, if an emergency situation exists within the plant, emergency personnel may follow the drone from the plant entrance to the area of the plant associated with the emergency situation.
Further, in an example, an operator selects a drone, or an area within the overview 202 of the process plant, to cause the drone to perform other tasks within the plant. For example, the drone may be configured to alert individuals within the process plant of danger, for example, by broadcasting a warning or marking the perimeter of an unsafe area. For instance, the operator may select an unsafe area within the overview 202, causing the drone to move to that area and alert individuals in that area of the danger. Furthermore, in some cases, the operator may select a drone to transport a medical kit, a respiratory device, or a defibrillator unit to individuals trapped in an unsafe area of the plant. Further, the operator may select the drone to activate a switch within the plant (including a kill switch for shutting off equipment in an unsafe location, such as a drilling site), take measurements within the plant, and/or collect product samples from the plant.
Referring now to FIG. 4, a flow diagram 400 of an example method for monitoring process plant conditions using an unmanned robotic vehicle is shown, in accordance with some embodiments. For example, memory 82 of on-board computing device 78 of unmanned robotic vehicle 72 and/or memory 102 of user interface device 8 may store instructions that, when executed by processor 80 or 100, respectively, cause unmanned robotic vehicle 72 and/or user interface device 8, respectively, to perform at least a portion of method 400. In embodiments, method 400 may include additional, fewer, and/or alternative acts.
At block 402, the user interface device 8 may receive an indication of a user selection of a piece of equipment in an overview of the process plant displayed by the user interface device 8 at which the unmanned robotic vehicle 72 is to be deployed. For example, an overview of a process plant may depict a representation of the physical location of equipment within the process plant. In some examples, displaying the overview of the process plant may include displaying a representation of a physical location of the unmanned robotic vehicle within the process plant relative to equipment in the overview of the process plant.
At block 404, an indication of the user selection may be transmitted to the onboard computing device 78 associated with the unmanned robotic vehicle 72.
At block 406, a destination location of the unmanned robotic vehicle 72 in the process plant may be determined based on the indication of the user selection. In some examples, determining the destination location of the unmanned robotic vehicle in the process plant may include determining a safe hover point near a physical location of the piece of equipment selected by the user.
At block 408, the unmanned robotic vehicle 72 may be controlled to travel from a current location in the process plant to a destination location in the process plant. For example, in some examples, controlling the unmanned robotic vehicle 72 to travel from the current location to the destination location may include identifying a safe route from the current location in the process plant to the destination location in the process plant and controlling the unmanned robotic vehicle 72 to travel from the current location in the process plant to the destination location in the process plant via the safe route.
At block 410, camera data may be captured by one or more cameras of the unmanned robotic vehicle 72. In some examples, the camera data may include real-time video feedback. In some examples, the camera data may include infrared camera data captured by one or more infrared cameras of the unmanned robotic vehicle 72.
At block 412, the camera data may be transmitted to the user interface device 8.
At block 414, the camera data may be displayed alongside the overview of the process plant. In some examples, displaying the camera data alongside the overview of the process plant may include displaying real-time video feedback based on the camera data alongside the overview of the process plant.
In some examples, the method 400 may further include: capturing sensor data by one or more sensors of the unmanned robotic vehicle, transmitting the sensor data to the user interface device, and/or displaying the sensor data via the user interface device (not shown in fig. 4). Capturing sensor data by one or more sensors of the unmanned robotic vehicle may include capturing sensor data from one or more of the following sensors: a position sensor, a temperature sensor, a flame sensor, a gas sensor, a wind sensor, an accelerometer, and/or a motion sensor of the unmanned robotic vehicle. Additionally, the method 400 may further include analyzing the camera data and/or the sensor data to identify an indication of a condition within the process plant. For example, the condition may be an overheating condition, a fire condition, a smoke condition, a leak condition, a steam condition, a water accumulation or drip condition, a valve condition, a motion condition within the process plant, and the like.
Further, in some examples, method 400 may further include: receiving an indication of a user selection of a switch within a process plant to be activated by the unmanned robotic vehicle; and controlling a robotic arm of the unmanned robotic vehicle to activate a switch within the process plant.
Additionally, in some examples, the method 400 may further include controlling a robotic arm of the unmanned robotic vehicle to collect the sample within the process plant.
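Taken together, blocks 402 through 414 (with the optional sensor and robotic-arm extensions described above) could be sketched end to end as follows; the class, method, and tag names are hypothetical stand-ins rather than the claimed implementation:

```python
from dataclasses import dataclass
from typing import Iterator

@dataclass
class Frame:
    seq: int
    data: bytes

class Drone:
    """Toy stand-in for the on-board computing device 78 (illustrative only)."""
    def __init__(self) -> None:
        self.position = (0.0, 0.0)

    def determine_destination(self, equipment_tag: str) -> tuple:
        # Block 406: in practice this would consult the plant database for a
        # safe hover point near the selected piece of equipment.
        return {"tank_101": (12.0, 48.5)}.get(equipment_tag, self.position)

    def travel_to(self, dest: tuple) -> None:
        # Block 408: safe-route planning and flight control are abstracted away here.
        self.position = dest

    def stream_camera(self, n_frames: int = 3) -> Iterator[Frame]:
        # Blocks 410/412: capture camera data and transmit it to the UI device.
        for i in range(n_frames):
            yield Frame(i, b"...")

def operator_rounds(equipment_tag: str) -> None:
    drone = Drone()
    dest = drone.determine_destination(equipment_tag)   # blocks 402-406
    drone.travel_to(dest)                                # block 408
    for frame in drone.stream_camera():                  # blocks 410-414
        print(f"display frame {frame.seq} alongside the plant overview")

operator_rounds("tank_101")
```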
The following additional considerations apply to the foregoing discussion. Throughout the specification, actions described as being performed by any device or routine generally refer to the actions or processes of a processor manipulating or transforming data in accordance with machine-readable instructions. The machine-readable instructions may be stored on and retrieved from a storage device communicatively coupled to the processor. That is, the methods described herein may be embodied by a set of machine-executable instructions stored on a computer-readable medium (i.e., a storage device), such as that shown in fig. 2. The instructions, when executed by one or more processors of a respective device (e.g., a server, a user interface device, etc.), cause the processors to perform the method. When instructions, routines, modules, processes, services, programs, and/or applications are referred to herein as being stored or saved on a computer-readable memory or computer-readable medium, the words "stored" and "saved" are intended to exclude transitory signals.
Further, although the terms "operator," "person," "user," "technician," and other similar terms are used to describe persons in a process plant environment who may use or interact with the systems, apparatus, and methods described herein, these terms are not intended to be limiting. Where a specific term is used in the description, the term is used in part because of the traditional activities undertaken by plant personnel, but is not intended to limit the personnel who may engage in that specific activity.
In addition, throughout the specification, multiple instances may implement a component, an operation, or a structure described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in the exemplary configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements may fall within the scope of the subject matter claimed herein.
When implemented in software, any of the applications, services, and engines described herein may be stored in any tangible, non-transitory computer-readable memory, such as on a magnetic disk, a laser disk, a solid-state memory device, a molecular memory storage device, or other storage medium, or in a RAM or ROM of a computer or processor, or the like. Although the exemplary systems disclosed herein are disclosed as including, among other components, software and/or firmware executed on hardware, it should be noted that such systems are merely illustrative and should not be considered as limiting. For example, it is contemplated that any or all of these hardware, software, and firmware components could be embodied exclusively in hardware, exclusively in software, or in any combination of hardware and software. Thus, one of ordinary skill in the art will readily appreciate that the examples provided are not the only way to implement such a system.
Thus, while the present invention has been described with reference to specific examples, which are intended to be illustrative only and not to be limiting of the invention, it will be apparent to those of ordinary skill in the art that changes, additions or deletions may be made to the disclosed embodiments without departing from the spirit and scope of the invention. Moreover, while the foregoing text sets forth a detailed description of numerous different embodiments, this detailed description should be construed as exemplary only and does not describe every possible embodiment since describing every possible embodiment would be impractical, if not impossible. Numerous alternative embodiments could be implemented, using either current technology or technology developed after the filing date of this patent.

Claims (29)

1. A method, comprising:
receiving, by a user interface device, an indication of a user selection of a piece of equipment in an overview of a process plant displayed by the user interface device at which an unmanned robotic vehicle is to be deployed, the overview of the process plant depicting a representation of a physical location of equipment within the process plant;
sending, by the user interface device, an indication of the user selection to an onboard computing device associated with the unmanned robotic vehicle;
determining, by an on-board computing device associated with the unmanned robotic vehicle, a destination location of the unmanned robotic vehicle in the process plant based on the user-selected indication;
controlling, by an on-board computing device associated with the unmanned robotic vehicle, the unmanned robotic vehicle to travel from a current location in the process plant to the destination location in the process plant;
capturing camera data by one or more cameras of the unmanned robotic vehicle;
sending, by an onboard computing device associated with the unmanned robotic vehicle, the camera data to the user interface device; and
displaying, by the user interface device, the camera data alongside the overview of the process plant.
2. The method of claim 1, wherein determining the destination location of the unmanned robotic vehicle in the process plant based on the user-selected indication comprises: determining a safe hover point near the user selected physical location of the piece of equipment.
3. The method of claim 1, wherein controlling the unmanned robotic vehicle to travel from a current location to the destination location comprises:
identifying, by an on-board computing device associated with the unmanned robotic vehicle, a safe route from the current location in the process plant to the destination location in the process plant; and
controlling, by the on-board computing device, the unmanned robotic vehicle to travel from the current location in the process plant to the destination location in the process plant via the safe route.
4. The method of claim 1, wherein displaying the camera data alongside the overview of the process plant comprises: displaying real-time video feedback based on the camera data alongside the overview of the process plant.
5. The method of claim 1, further comprising:
displaying, by the user interface device, a representation of a physical location of the unmanned robotic vehicle within the process plant relative to the equipment in the overview of the process plant.
6. The method of claim 1, wherein capturing camera data by one or more cameras of the unmanned robotic vehicle comprises: capturing infrared camera data by one or more infrared cameras of the unmanned robotic vehicle.
7. The method of claim 1, further comprising:
capturing sensor data by one or more sensors of the unmanned robotic vehicle;
sending, by an on-board computing device associated with the unmanned robotic vehicle, the sensor data to the user interface device; and
displaying, by the user interface device, the sensor data.
8. The method of claim 7, wherein capturing sensor data by one or more sensors of the unmanned robotic vehicle comprises capturing sensor data from one or more of: a position sensor, a temperature sensor, a flame sensor, a gas sensor, a wind sensor, an accelerometer, and a motion sensor of the unmanned robotic vehicle.
9. The method of claim 7, further comprising:
analyzing, by an on-board computing device associated with the unmanned robotic vehicle, one or more of the camera data and the sensor data to identify an indication of a condition within the process plant.
10. The method of claim 9, wherein the condition is one or more of the following conditions: an overheating condition, a fire condition, a smoke condition, a leakage condition, a steam condition, a water accumulation or dripping condition, a valve condition, or a movement condition within the process plant.
11. The method of claim 1, further comprising:
receiving, by a user interface device, an indication of a user selection of a switch within the process plant to be activated by the unmanned robotic vehicle; and
controlling, by the on-board computing device, a robotic arm of the unmanned robotic vehicle to activate the switch within the process plant.
12. The method of claim 1, further comprising:
controlling, by the on-board computing device, a robotic arm of the unmanned robotic vehicle to collect samples within the process plant.
13. A user interface device configured to communicate with an unmanned robotic vehicle equipped with one or more cameras, the user interface device comprising:
a display;
one or more processors; and
one or more memories storing a set of computer-executable instructions that, when executed by the one or more processors, cause the user interface device to:
receiving, from an on-board computing device of the unmanned robotic vehicle, data captured by the one or more cameras of the unmanned robotic vehicle;
displaying an overview of a process plant alongside the data captured by the one or more cameras of the unmanned robotic vehicle, the overview of the process plant depicting a representation of physical locations of equipment within the process plant;
receiving an operator selection of a piece of equipment within the overview of the process plant at which the unmanned robotic vehicle is to be deployed; and
sending a command to the onboard computing device of the unmanned robotic vehicle, the command indicating the selected piece of equipment within the overview of the process plant at which the unmanned robotic vehicle is to be deployed.
14. The user interface device of claim 13, wherein the instructions that cause the user interface device to display the data captured by the one or more cameras of the unmanned robotic vehicle comprise: instructions for displaying live video feedback based on the data captured by the one or more cameras.
15. The user interface device of claim 13, wherein the instructions that cause the user interface device to display the overview of the process plant comprise: instructions for displaying a representation of a physical location of the unmanned robotic vehicle within the process plant relative to the equipment in the overview of the process plant.
16. The user interface device of claim 13, wherein the instructions further cause the user interface device to display sensor data captured by one or more sensors of the unmanned robotic vehicle.
17. The user interface device of claim 16, wherein the sensor data comprises data captured by one or more of the following sensors: a position sensor, a temperature sensor, a flame sensor, a gas sensor, a wind sensor, an accelerometer, or a motion sensor of the unmanned robotic vehicle.
18. The user interface device of claim 13, wherein the instructions further cause the user interface device to:
receiving an indication of a user selection of a switch within the process plant to be activated by the unmanned robotic vehicle; and
sending a command to the on-board computing device of the unmanned robotic vehicle, the command indicating a selection of the switch to be activated by the unmanned robotic vehicle.
19. An unmanned robotic vehicle comprising:
one or more cameras configured to capture images while the unmanned robotic vehicle is traveling in a process plant,
one or more processors; and
one or more memories storing a set of computer-executable instructions that, when executed by the one or more processors, cause the unmanned robotic vehicle to:
receiving a command from a user interface device, wherein the command from the user interface device comprises a command indicating a piece of equipment in the process plant to which the unmanned robotic vehicle is to be deployed;
determining a destination location in the process plant based on the indication of the piece of equipment in the process plant;
travel from a current location in the process plant to the destination location in the process plant; and
transmitting the images captured by the one or more cameras to the user interface device.
20. The unmanned robotic vehicle of claim 19, wherein the instructions that cause the unmanned robotic vehicle to determine the destination location of the unmanned robotic vehicle in the process plant based on the indication of the piece of equipment in the process plant comprise: instructions that cause the unmanned robotic vehicle to determine a safe hover point near a physical location of the piece of equipment in the process plant.
21. The unmanned robotic vehicle of claim 19, wherein the instructions that cause the unmanned robotic vehicle to travel from the current location to the destination location comprise: instructions that cause the unmanned robotic vehicle to:
identifying a safe route from the current location in the process plant to the destination location in the process plant; and
travel from the current location in the process plant to the destination location in the process plant via the safe route.
22. The unmanned robotic vehicle of claim 19, wherein the unmanned robotic vehicle is an unmanned aerial vehicle.
23. The unmanned robotic vehicle of claim 19, wherein the one or more cameras comprise one or more infrared cameras.
24. The unmanned robotic vehicle of claim 19, further comprising:
one or more sensors configured to capture sensor data while the unmanned robotic vehicle is traveling in the process plant; and wherein the instructions further cause the unmanned robotic vehicle to transmit sensor data captured by the one or more sensors to the user interface device.
25. The unmanned robotic vehicle of claim 24, wherein the one or more sensors comprise one or more of the following sensors: position sensors, temperature sensors, flame sensors, gas sensors, wind sensors, accelerometers, and motion sensors.
26. The unmanned robotic vehicle of claim 24, wherein the instructions further cause the unmanned robotic vehicle to:
analyzing one or more of the camera data and the sensor data to identify an indication of a condition within the process plant.
27. The unmanned robotic vehicle of claim 26, wherein the condition is one or more of: an overheating condition, a fire condition, a smoke condition, a leakage condition, a steam condition, a water accumulation or dripping condition, a valve condition, or a movement condition within the process plant.
28. The unmanned robotic vehicle of claim 19, further comprising:
a robotic arm configured to perform one or more of the following operations: (i) activate a switch within the process plant, or (ii) collect a sample within the process plant.
29. The unmanned robotic vehicle of claim 28, wherein the instructions further cause the unmanned robotic vehicle to:
receiving an indication of a user selection of a switch within the process plant to be activated by the unmanned robotic vehicle; and
controlling the robotic arm of the unmanned robotic vehicle to activate the switch within the process plant.
CN201910940142.9A 2018-10-01 2019-09-30 Operator shift enabling unmanned aerial vehicle Active CN110968054B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201862739829P 2018-10-01 2018-10-01
US62/739,829 2018-10-01
US16/580,793 US11281200B2 (en) 2018-10-01 2019-09-24 Drone-enabled operator rounds
US16/580,793 2019-09-24

Publications (2)

Publication Number Publication Date
CN110968054A true CN110968054A (en) 2020-04-07
CN110968054B CN110968054B (en) 2024-08-20

Family

ID=69945748

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910940142.9A Active CN110968054B (en) 2018-10-01 2019-09-30 Operator shift enabling unmanned aerial vehicle

Country Status (2)

Country Link
JP (1) JP7530166B2 (en)
CN (1) CN110968054B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114482885A (en) * 2022-01-25 2022-05-13 西南石油大学 Pressure-controlled drilling intelligent control system

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022012110A (en) * 2020-07-01 2022-01-17 三菱電機株式会社 Plant monitoring control system

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8682502B2 (en) * 2007-03-28 2014-03-25 Irobot Corporation Remote vehicle control system and method
US20160282872A1 (en) * 2015-03-25 2016-09-29 Yokogawa Electric Corporation System and method of monitoring an industrial plant
WO2016200508A1 (en) * 2015-06-11 2016-12-15 Intel Corporation Drone controlling device and method
CN106325290A (en) * 2016-09-30 2017-01-11 北京奇虎科技有限公司 Monitoring system and device based on unmanned aerial vehicle
CN106339079A (en) * 2016-08-08 2017-01-18 清华大学深圳研究生院 Method and device for realizing virtual reality by using unmanned aerial vehicle based on computer vision
CN108132675A (en) * 2017-11-23 2018-06-08 东南大学 Unmanned plane is maked an inspection tour from main path cruise and intelligent barrier avoiding method by a kind of factory
KR101866239B1 (en) * 2017-05-30 2018-06-12 (주)이피에스이앤이 Method for Monitoring Water Quality Environment Using Drone
CN108268121A (en) * 2016-12-30 2018-07-10 昊翔电能运动科技(昆山)有限公司 Control method, control device and the control system of unmanned vehicle

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002167737A (en) * 2000-09-22 2002-06-11 Mitsubishi Heavy Ind Ltd Remote monitor control responding type drainage system and drainage pump station remote monitor device
JP2004171127A (en) 2002-11-18 2004-06-17 Hitachi Ltd Field work support method, system therefor and recording medium
JP4300060B2 (en) 2003-05-20 2009-07-22 株式会社日立製作所 Monitoring system and monitoring terminal
JP3834651B2 (en) * 2003-09-04 2006-10-18 防衛庁技術研究本部長 Traveling robot
JP6413510B2 (en) * 2014-09-03 2018-10-31 村田機械株式会社 Carrier vehicle system and route periphery monitoring method
US9489852B1 (en) 2015-01-22 2016-11-08 Zipline International Inc. Unmanned aerial vehicle management system
CN111542479B (en) * 2018-12-07 2022-07-26 乐天集团股份有限公司 Method for determining article transfer location, method for determining landing location, article transfer system, and information processing device

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8682502B2 (en) * 2007-03-28 2014-03-25 Irobot Corporation Remote vehicle control system and method
US20160282872A1 (en) * 2015-03-25 2016-09-29 Yokogawa Electric Corporation System and method of monitoring an industrial plant
WO2016200508A1 (en) * 2015-06-11 2016-12-15 Intel Corporation Drone controlling device and method
CN106339079A (en) * 2016-08-08 2017-01-18 清华大学深圳研究生院 Method and device for realizing virtual reality by using unmanned aerial vehicle based on computer vision
CN106325290A (en) * 2016-09-30 2017-01-11 北京奇虎科技有限公司 Monitoring system and device based on unmanned aerial vehicle
CN108268121A (en) * 2016-12-30 2018-07-10 昊翔电能运动科技(昆山)有限公司 Control method, control device and the control system of unmanned vehicle
KR101866239B1 (en) * 2017-05-30 2018-06-12 (주)이피에스이앤이 Method for Monitoring Water Quality Environment Using Drone
CN108132675A (en) * 2017-11-23 2018-06-08 东南大学 Unmanned plane is maked an inspection tour from main path cruise and intelligent barrier avoiding method by a kind of factory

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHEN WEIDONG et al.: "UAV-based PM2.5 Air Quality Monitoring System", Science and Technology Innovation, no. 19, pages 51-52 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114482885A (en) * 2022-01-25 2022-05-13 西南石油大学 Pressure-controlled drilling intelligent control system
CN114482885B (en) * 2022-01-25 2024-03-29 西南石油大学 Intelligent control system for pressure-controlled drilling

Also Published As

Publication number Publication date
CN110968054B (en) 2024-08-20
JP2020064623A (en) 2020-04-23
JP7530166B2 (en) 2024-08-07

Similar Documents

Publication Publication Date Title
US11281200B2 (en) Drone-enabled operator rounds
US11265513B2 (en) Virtual reality and augmented reality for industrial automation
US10535202B2 (en) Virtual reality and augmented reality for industrial automation
JP6457182B2 (en) Method for starting or resuming a mobile control session in a process plant
CN104049597A (en) Method and apparatus for managing a work flow in a process plant
CN104049582A (en) Generating checklists in a process control environment
CN104049586A (en) Mobile control room with function of real-time environment awareness
CN104049594A (en) Method and apparatus for seamless state transfer between user interface devices in a mobile control room
CN110968054B (en) Operator shift enabling unmanned aerial vehicle
US20200278675A1 (en) Remotely controlled airborne vehicle providing field sensor communication and site imaging during factory failure conditions
DE102014103377A1 (en) Method and apparatus for seamless state transfer between UI devices in a mobile switching center
CN104049268A (en) Method and apparatus for determining the position of a mobile control device in a process plant
WO2021168810A1 (en) Unmanned aerial vehicle control method and apparatus, and unmanned aerial vehicle
Poma et al. Open-Source Web-Based Ground Control Station for Long-Range Inspection with Multiple UAV s

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant