WO2020086071A1 - System and method to automatically optically monitor field devices and assess state information - Google Patents

System and method to automatically optically monitor field devices and assess state information

Info

Publication number
WO2020086071A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
computer
imaging device
operable
state
Prior art date
Application number
PCT/US2018/057274
Other languages
French (fr)
Inventor
Joshua S. MCCONKEY
Tao CUI
Original Assignee
Siemens Energy, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Energy, Inc. filed Critical Siemens Energy, Inc.
Priority to PCT/US2018/057274 priority Critical patent/WO2020086071A1/en
Publication of WO2020086071A1 publication Critical patent/WO2020086071A1/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/04Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/048Monitoring; Safety
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/04Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0428Safety, monitoring
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B23/00Testing or monitoring of control systems or parts thereof
    • G05B23/02Electric testing or monitoring
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/20Pc systems
    • G05B2219/24Pc safety
    • G05B2219/24097Camera monitors controlled machine
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/20Pc systems
    • G05B2219/26Pc applications
    • G05B2219/2612Data acquisition interface
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/20Pc systems
    • G05B2219/26Pc applications
    • G05B2219/2651Camera, photo

Definitions

  • the present disclosure is directed, in general, to a system and method to read field instrumentation, and more specifically to a wireless system for automatically reading field instrumentation and/or determining the state of field devices.
  • a device monitor for monitoring a remote device and for providing data to a separate system spaced apart from the remote device includes an imaging device positioned to capture an image of the remote device, a computer proximate to and coupled to the imaging device and operable to extract an operating parameter from the image, the operating parameter being a text value, and a transmitter proximate to and coupled to the computer and operable to wirelessly transmit the operating parameter to the separate system.
  • a system for monitoring the operation of a machine including a plurality of remote devices and a control system spaced apart from the plurality of remote devices includes a plurality of device monitors each associated with one of the plurality of remote devices, each of the plurality of device monitors including an imaging device positioned to capture an image of an associated one of the remote devices, a computer connected to the imaging device and operable to extract an operating parameter from the image, the operating parameter being a text value, a transmitter coupled to the computer and operable to transmit the operating parameter to the control system, and a power supply operably connected to the imaging device, the computer, and the transmitter.
  • the system also includes a receiver coupled to the control system and operable to receive the transmitted operating parameter from each of the plurality of device monitors.
  • transmitting that data to a control system includes positioning an imaging device adjacent the remote device, capturing an image of the remote device with the imaging device, analyzing the image to extract information regarding a state of the remote device, converting the information to an operating parameter that is a text value, and wirelessly transmitting the operating parameter to the control system.
  • a device monitor for monitoring an object of interest having a first state and a second state and for providing data to a separate system spaced apart from the object of interest includes an imaging device positioned to capture an image of the object of interest, and a computer proximate to and coupled to the imaging device and operable to determine a current state of the object of interest, the current state being one of the first state and the second state, and further assigning a first text value if the current state is the first state and a second text value if the current state is the second state.
  • a transmitter is located proximate to and coupled to the computer and is operable to wirelessly transmit the text value to the separate system.
  • a method of gathering data from an object of interest includes positioning an imaging device adjacent the object of interest to allow the imaging device to periodically capture an image, determining an allowable quantity of a substance that can be associated with the object of interest, and programming a computer to assign a numerical value to the image based on a quantity of the substance that is visible in the image and the allowable quantity of the substance.
  • the method also includes wirelessly transmitting the numerical value from the computer with a transmitter and powering the imaging device, the computer, and the transmitter with no more than 1.5 watts of electricity.
  • Fig. 1 is a schematic illustration of a pre-existing facility including a plurality of remote devices and device monitors.
  • Fig. 2 is a schematic illustration of a device monitor suitable for use in the facility of Fig. 1.
  • Fig. 3 is an image of a first instrument including several bounding boxes.
  • Fig. 4 is an image of a second instrument in the form of an operating fan.
  • Fig. 5 is an image of the second instrument in a non-operating condition.
  • Fig. 6 is an image of a third instrument including several bounding boxes.
  • Fig. 7 is an image of the text extracted from the third instrument for transmission.
  • Although the terms "first", "second", "third" and so forth may be used herein to refer to various elements, information, functions, or acts, these elements, information, functions, or acts should not be limited by these terms. Rather, these numeral adjectives are used to distinguish different elements, information, functions, or acts from each other. For example, a first element, information, function, or act could be termed a second element, information, function, or act, and, similarly, a second element, information, function, or act could be termed a first element, information, function, or act, without departing from the scope of the present disclosure.
  • The term "adjacent to" may mean that an element is relatively near to but not in contact with a further element, or that the element is in contact with the further element, unless the context clearly indicates otherwise.
  • The phrase "based on" is intended to mean "based, at least in part, on" unless explicitly stated otherwise.
  • The terms "about" or "substantially" or like terms are intended to cover variations in a value that are within normal industry manufacturing tolerances for that dimension. If no industry standard is available, a variation of 20 percent would fall within the meaning of these terms unless otherwise stated.
  • Fig. 1 schematically illustrates a facility 10 such as a power plant, a steel mill, a paper mill, and the like.
  • the facility 10 includes operating equipment such as turbines, blast furnaces, rolling mills, paper cutters, etc. and devices 20 (e.g., gauges, sensors, controls, switches, etc.) positioned to measure parameters of the equipment 15 or to operate as desired to support operation of the equipment 15.
  • a power plant may include a stand-alone hydraulic system that provides pressurized oil to a mechanical control system. The system pressure may be monitored by a simple pressure gauge attached to a pressure tank.
  • This tank must be checked periodically by plant personnel.
  • a fan might be positioned near other equipment to cool the equipment.
  • the fan may be controlled by a simple temperature switch positioned near the equipment. The temperature is not actually transmitted to a central control system 25 and the operating status of the fan is not directly monitored. Rather, plant personnel periodically inspect the fan to verify that the switch is functioning and that the fan is operating as desired.
  • the new data must be provided to a central control system 25 or other device to be useful. This could require additional programming or a separate stand-alone device 20 that displays the results.
  • Fig. 2 illustrates a device monitor 30 that could be used in conjunction with some or all the devices 20 of Fig. 1 to gather the important data and deliver it to a user or the central control system 25.
  • the device monitor 30 of Fig. 2 is positioned to monitor one or both of a meter 35 and a fan 40 and includes an imaging device 45, a computer 50, a transmitter 55, and a power supply 60. It should be understood that the device monitor 30 is capable of monitoring virtually any device 20 and should not be limited to the example devices 35, 40 described herein.
  • the imaging device 45 is preferably an inexpensive low-power camera capable of capturing still images. Suitable cameras include chip-based CMOS or other image sensors or other suitable devices. Low-power imaging devices 45 include cameras that operate on less than one watt when capturing images. More preferable cameras operate on less than two hundred milliwatts of electricity when capturing video at a frame rate of 30 fps. In preferred constructions, the imaging device 45 is capable of capturing color images as well as video. However, some constructions operate with a simple imaging sensor capable of capturing only black and white or gray-scale still images.
  • the camera is positioned to image two separate devices 20, specifically the meter 35 and the fan 40.
  • the imaging device 45 can be positioned to capture the important areas of both devices 35, 40 in a single image.
  • the imaging device 45 is movably supported to allow movement between two (or more) positions to capture separate images of each device 35, 40.
  • a simple motor, electrical actuator or other electrically movable device could be employed to change the position of the imaging device 45.
  • the imaging device 45 could be positioned to image more than two devices 20 together or in sequence.
  • the computer 50 is connected to the imaging device 45 to receive images captured by the imaging device 45.
  • the computer 50 illustrated in Fig. 2 includes a central processing unit (CPU) 65, a storage device 70, and an artificial intelligence (AI) co-processor 75.
  • the term "computer" as used herein should be interpreted broadly to cover any device that receives and processes the images captured by the image sensor 45.
  • while the illustrated computer 50 includes the CPU 65, the storage device 70, and the AI co-processor 75, other computers 50 may omit one or more of these components or may include additional components such as additional processors, memory, storage, communication devices, and the like.
  • the computer 50 generally includes an input/output device that allows for access to the software regardless of where it is stored, one or more processors 65, 75, memory devices, user input devices, and output devices such as monitors, printers, and the like.
  • the processors 65, 75 may include standard micro-processors 65 and/or the artificial intelligence accelerator or processor 75 specifically designed to perform artificial intelligence applications such as artificial neural networks, machine vision, and machine learning. Typical applications include algorithms for robotics, internet of things, and other data- intensive or sensor-driven tasks. Often AI accelerators 75 are multi-core designs and generally focus on low-precision arithmetic, novel dataflow architectures, or in-memory computing capability. In still other applications, the AI processor 75 may include a graphics processing unit (GPU) designed for the manipulation of images and the calculation of local image properties.
  • the computer 50 may also include communication devices, in addition to the transmitter 55, that allow for communication between other computers or computer networks, as well as for communication with other devices such as machine tools, work stations, actuators, controllers, sensors, and the like.
  • the CPU 65, storage device 70, and AI co-processor 75 cooperate to process images and extract certain data from the images.
  • the data is in the form of text or numbers (referred to herein as "text") that represent an operating parameter being monitored, measured, or controlled by the device 20.
  • other applications can detect other characteristics from the image. For example, one application determines if the fan 40 is operating by looking for a blurred or unclear image of the fan blades.
  • an infrared imaging device is used to detect heat from a motor powering the fan 40 during operation.
  • the computer 50 is a low-power device and consumes less than three watts of electricity during operation.
  • a more preferred computer 50 operates on less than one watt of electricity and is operated periodically.
  • the transmitter 55 is connected to the computer 50 to receive and transmit the operating parameter or other data from the computer 50 to an external device such as a user, the central control 25, a secondary control, an internet connection, or a repeater that retransmits the data to extend the range of transmission.
  • the transmitter 55 includes a wireless radio 80 and a wired digital interface 85.
  • the term "transmitter" should be read broadly to include any device or arrangement capable of outputting the operating parameter or other data received from the computer 50.
  • the transmitter 55 could omit one of the components illustrated in Fig. 2 or could include additional components such as an analog interface, a cellular transmitter, an internet connector, and the like.
  • the transmitter 55 is preferably capable of receiving external transmissions (i.e., a transceiver) such as instructions, software updates, and the like.
  • the wireless radio 80 is operable to wirelessly transmit the operating parameter or other data from the device monitor 30 to an external receiver.
  • the wireless radio 80 is capable of transmission of at least one hundred meters with further transmission being possible.
  • repeaters or other components can be positioned throughout the facility 10 to extend the range of the wireless radio 80 if necessary.
  • the digital interface 85 could include an RS-485 or similar serial interface that allows for two-way communication between the device monitor 30 and an external device. While the digital interface 85 could be used to transmit the operating parameter or other data, it is preferably provided for other purposes such as software updates and the like.
  • the transmitter 55 is a low-power transmitter 55 that requires less than one watt of electricity during transmission of the operating parameter or other data.
  • a more preferred transmitter 55 uses less than twenty milliwatts of electricity to complete the necessary transmissions.
  • the combination of the computer, the imaging device, and the transmitter uses less than five watts of power, with more preferred systems operating using less than 1.22 watts. Typical systems operate at a power level of 1.5 watts or less.
  • the device monitor 30 includes low-power features that allow the device monitor 30 to operate on less than five watts of electricity with more optimized designs using less than one watt of electricity.
  • the power supply 60 is coupled to each of the imaging device 45, the computer 50, and the transmitter 55 to provide the necessary power for each of these devices without the need to connect to an external power supply.
  • the power supply 60 includes a photovoltaic (PV) cell 90 capable of providing at least five watts of electricity at a voltage of five volts or less to power the imaging device 45, the computer 50, and the transmitter 55.
  • the PV cell 90 has a surface area of less than thirty square inches (194 square cm) with larger or smaller PV cells 90 being possible depending on the specific power needs of the device monitor 30 and the amount of light available at the installation location of the device monitor 30. In one construction, the PV cell 90 is fifteen square inches (97 square cm). The PV cell 90 is positioned and oriented during installation to allow for the capture of energy from whatever light source might be available within the facility 10.
  • the PV cell 90 is also connected to a rechargeable battery 95 that is charged when excess energy from the PV cell 90 is available.
  • the battery 95 is then capable of providing power to the imaging device 45, the computer 50, and the transmitter 55 when and if the PV cell 90 is not able to deliver the necessary power.
  • Still other constructions could operate on batteries alone. However, the need to periodically replace those batteries would negatively affect the benefits provided by the device monitor 30.
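The PV-first, battery-fallback arrangement described above can be sketched as a small routing rule. This is a minimal illustration under assumed figures (5 W PV cell, a hypothetical 1.2 W load), not the disclosed power electronics:

```python
def select_source(pv_watts, load_watts, battery_charge_wh):
    """Route power as described above: run from the PV cell when it can
    carry the load (any surplus goes to charging the battery), fall back
    to the battery otherwise, and stay asleep when neither can supply it.
    All values are illustrative assumptions."""
    if pv_watts >= load_watts:
        surplus = pv_watts - load_watts
        return "pv", surplus          # surplus charges the battery 95
    if battery_charge_wh > 0:
        return "battery", 0
    return "none", 0                  # monitor remains shut down

print(select_source(5.0, 1.2, 3.0))   # well lit: run from PV, charge battery
print(select_source(0.0, 1.2, 3.0))   # dark: run from the battery
```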
  • Figs. 3-7 illustrate the operation of the device monitor 30.
  • Fig. 3 illustrates an example of an image 100 captured by the imaging device 45 and transferred to the computer 50 for processing.
  • the first step performed by the computer 50 is the placement of bounding boxes 105 around text detected in the image 100.
  • the CPU 65, the storage device 70, and the AI co-processor 75 cooperate to define the bounding boxes 105.
  • the storage device 70 includes pre-trained models that aid the CPU 65 and AI co-processor 75 in identifying the text and placing the bounding boxes 105 around the text.
  • the AI co-processor 75 may utilize a neural network or deep learning techniques to enable the placement of the bounding boxes 105 and to determine which of the bounding boxes 105 contains the desired data.
  • the device monitor 30 can be trained to determine which bounding box 105 contains the important data. Alternatively, the device monitor 30 learns the format of the important text (e.g., five numbers with a decimal point) and looks for a bounding box 105 that contains text that matches the desired pattern. In still other constructions, the correct bounding box 105 is defined by a user at installation and the remaining bounding boxes 105 are ignored.
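The pattern-matching alternative above can be sketched in a few lines. This is a minimal illustration, not the patented implementation: the (x, y, w, h, text) box tuples and the example pattern are assumptions standing in for the output of the text-detection stage.

```python
import re

def select_reading(boxes, pattern=r"^\d+\.\d{2}$"):
    """Pick the bounding box whose recognized text matches the expected
    format of the reading (here: digits, a decimal point, two digits).
    `boxes` is a list of (x, y, w, h, text) tuples; every other box is
    ignored, mirroring the construction described above."""
    rx = re.compile(pattern)
    for *_, text in boxes:
        if rx.match(text):
            return text
    return None  # no box matched the expected format

boxes = [
    (10, 5, 40, 12, "PSI"),       # unit label, ignored
    (12, 30, 60, 20, "103.47"),   # the reading of interest
    (10, 55, 50, 12, "MAX 150"),  # static legend, ignored
]
print(select_reading(boxes))  # -> 103.47
```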
  • the CPU 65 and AI co-processor 75 operate to extract the value of the data from the image 100 (e.g., a number or a word such as “on”).
  • This process is well known and is commonly referred to as optical character recognition (OCR).
  • Several different OCR techniques are available, including seven-segment optical character recognition (SSOCR) and variations thereof, and Tesseract. Tesseract is an open-source OCR engine able to employ both traditional computer vision and deep learning engines. Some device monitors 30 may include multiple OCR engines that each run to allow the comparison of the results to enhance the accuracy of the extracted value.
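Comparing the results of several OCR engines, as described above, can be reduced to a simple majority vote. A minimal sketch; the engine outputs shown are illustrative, and real SSOCR/Tesseract integration is not included:

```python
from collections import Counter

def reconcile(readings):
    """Keep the value that a majority of the OCR engines agree on;
    return None when no value wins outright, so the monitor can retry
    rather than transmit a dubious reading."""
    counts = Counter(r for r in readings if r is not None)
    if not counts:
        return None
    value, n = counts.most_common(1)[0]
    return value if n > len(readings) / 2 else None

# e.g. SSOCR plus two Tesseract configurations (names are illustrative)
print(reconcile(["1.34", "1.34", "7.34"]))  # two of three agree
print(reconcile(["1.34", "7.34", None]))    # no majority: retry
```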
  • the OCR process extracts the operating parameter or other data that is then transferred to the transmitter 55 for transmission from the device monitor 30. Rather than sending an image, the text of the operating parameter or other data is sent, thereby requiring a much smaller transmission. This saves both time and energy, thereby further allowing the device monitor 30 to operate at very low power levels.
  • Figs. 6 and 7 illustrate another image 110 of a meter and the resulting data 115 that is transmitted to an external device or control system 25.
  • the image 110 includes several bounding boxes 105 with the bounding box 105 containing the data of interest 115 being the largest box 105.
  • the computer 50 is either trained or learns to look for this bounding box 105 and to extract the data 115 contained therein.
  • the data 115 is the number "1.34".
  • the computer 50 performs the OCR process and transfers the resulting data 115 to the transmitter 55.
  • the transmitter 55 transmits the data 115 (as a number and not an image) to an external device or control system 25 which displays the data 115 as illustrated in Fig. 7.
  • Fig. 4 illustrates another application of the device monitor 30 in which the device monitor 30 captures images of the fan 40 to determine the current state of the fan 40 (i.e., is it on or off).
  • if the fan 40 includes an indicator light, the imaging device 45 could be directed at that indicator to capture an image. If the imaging device 45 is capable of detecting in the IR spectrum, the imaging device 45 could be aimed at the motor that drives the fan 40 to detect the increased temperature during operation. Alternatively, the imaging device 45 is directed at the fan blades themselves and captures a still image of the blades. The image is then transferred to the computer 50, which has been trained to detect something that is not text based. In the case where the fan 40 has an indicator light, the computer can simply determine if the light is on or off based on the color or the brightness of the light. The computer 50 would then generate data indicative of the condition of the light (i.e., the word "on" if the fan 40 is operating and the word "off" if the fan 40 is not operating).
  • the fan 40 does not include a textual or text-based indicator (i.e., numbers or words) and the computer must determine the state of the fan 40 based on the image of the blades.
  • the computer 50 is trained or otherwise learns that when an image of the fan 120 includes a blurred region in the vicinity of the blades, the fan 40 is in the "on" state.
  • the computer 50 can perform an edge detection operation in which the computer 50 attempts to locate an edge 125 of one or more of the blades as illustrated in Fig. 5.
  • the computer 50 would be trained or would have otherwise learned what the edge 125 or edges 125 should look like.
  • the edges 125 of the blades appear blurred and cannot be defined.
  • the computer 50 would generate data indicative of an "on" state when the edges 125 cannot be identified (i.e., the word "on") and would generate data indicative of an "off" state when the edges 125 can be identified (i.e., the word "off").
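The blurred-blade test described above amounts to scoring edge sharpness and applying a threshold. A minimal pure-Python sketch; the gradient measure and the threshold value are assumptions chosen for illustration, not taken from the disclosure:

```python
def sharpness(image):
    """Mean absolute horizontal gradient of a grayscale image
    (a list of rows of 0-255 ints). Crisp blade edges produce large
    neighbour-to-neighbour jumps; motion blur smears them out."""
    total = count = 0
    for row in image:
        for a, b in zip(row, row[1:]):
            total += abs(b - a)
            count += 1
    return total / count

def fan_state(image, threshold=30.0):
    """Report "off" when blade edges are crisp enough to detect and
    "on" when they are blurred below the threshold. The threshold is
    an assumed calibration value, set per installation."""
    return "off" if sharpness(image) >= threshold else "on"

sharp   = [[0, 255, 0, 255]] * 4      # hard edges: fan stopped
blurred = [[120, 128, 124, 130]] * 4  # smeared edges: fan spinning
print(fan_state(sharp), fan_state(blurred))  # -> off on
```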
  • the computer 50 can perform an end-to-end object detection using convolutional neural networks (CNN).
  • a CNN model can be trained to directly map the image of the fan, as a matrix of pixels, to the state of the fan as the word "on" or "off".
  • far less transmission power is required to transmit a short word such as "on" or "off" than is required to transmit the image of the fan 120.
  • the digitized information can be conveniently integrated into the control system.
  • the computer 50 includes an error detection process that reviews the captured images 100, 120 for potential errors with the system. For example, if the image 100, 120 is blurred or partially obscured, the computer 50 could generate data indicative of the problem. A simple message such as "camera failure" could be transmitted to indicate that the device monitor 30 should be inspected and repaired or replaced.
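An error detection process of the kind described above might, for example, reject frames that are nearly black or nearly featureless before attempting any recognition. A hedged sketch; the brightness and contrast thresholds are assumed calibration values:

```python
def image_health(image, dark=15, low_contrast=10):
    """Flag likely camera problems before recognition is attempted:
    an almost-black frame suggests a failed or obstructed camera, and a
    near-uniform frame suggests a blocked or fogged lens. Thresholds
    are illustrative assumptions."""
    pixels = [p for row in image for p in row]
    mean = sum(pixels) / len(pixels)
    spread = max(pixels) - min(pixels)
    if mean < dark:
        return "camera failure"   # frame essentially black
    if spread < low_contrast:
        return "camera failure"   # frame essentially featureless
    return "ok"

print(image_health([[0, 1], [2, 0]]))       # -> camera failure
print(image_health([[10, 200], [90, 40]]))  # -> ok
```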
  • the imaging device 45 or the device monitor 30 itself, could be supported for motion between two or more positions or orientations to monitor two or more devices 20.
  • the device monitor 30 is mounted to an autonomous or semi-autonomous drone such as a land-based vehicle or an aerial drone such as a multi-rotor helicopter.
  • the drone could be pre-programmed to proceed to waypoints within or around the facility 10 where images of devices 20 are captured, analyzed, and transferred to an outside device such as the central control system 25. This system could essentially travel the path formerly followed by facility personnel to take the necessary readings.
  • the device monitor is capable of monitoring a number of other instruments or conditions and should not be limited to the examples discussed to this point.
  • most applications of the device monitor fall into one of four types of applications. These applications can be categorized as "Boolean Detection", "Boolean/Value Detection", "Value Detection", and "Position Detection", with some overlap existing between each of these various detection systems.
  • the device monitor is positioned adjacent a device, object, space, or virtually any feature or component (collectively referred to as "an object of interest") that is capable of having two, and generally only two, states.
  • one Boolean detection system may be positioned adjacent a door that is either opened or closed. The device monitor would be trained to determine from an image if the door is opened, which might correspond to the number one, or if the door is closed, which would then correspond to the number zero. Thus, only a single digit needs to be transmitted which would require a very small amount of power.
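The door example above reduces the whole analysis to a one-digit payload. A minimal sketch: the `gap_pixels` evidence measure and its threshold are assumptions standing in for whatever the trained model extracts from the image.

```python
def door_state(gap_pixels, open_threshold=50):
    """Collapse the door image analysis to a single digit for
    transmission, as in the example above: 1 = opened, 0 = closed.
    `gap_pixels` is a hypothetical measure of the visible opening."""
    return 1 if gap_pixels > open_threshold else 0

payload = str(door_state(gap_pixels=220))
print(payload, len(payload.encode()), "byte")  # a one-byte transmission
```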
  • other examples of a Boolean detection system might include detecting and reporting the status of a light (i.e., is it on or off), detecting and reporting the condition of a drain (opened or closed), and detecting and reporting the presence of a person in a particular space (present, not present).
  • a light may have the states of Off, On (yellow), and On (green).
  • a Boolean detection system would detect only if the light is on or off and would ignore the color of the light. If the color is not important data, the Boolean detection system may be appropriate.
  • Boolean/Value detection systems are closely related to Boolean detection systems.
  • the device monitor operates much like a Boolean system to detect a state of something, but further detects a value associated with the monitored device.
  • the device monitor could determine first if the light is on or off, and if it is on it would determine the color. Again, the Boolean state could be reduced to a single digit. In addition, the color could be reduced to a single digit with each digit representing a particular color. Thus, the device could send two digits to identify the state and color of the light. Alternatively, a zero could indicate that the light is off, with numbers 1, 2, 3, etc. each representing the light being on and emitting a particular color.
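The single-digit encoding described above (zero for off, with each nonzero digit naming an "on" colour) can be sketched directly. The particular digit-to-colour assignment here is arbitrary, as the passage itself allows:

```python
# 0 = off; each nonzero digit identifies an "on" colour (assignment arbitrary)
COLOUR_CODE = {"yellow": 1, "green": 2, "red": 3}

def encode_light(is_on, colour=None):
    """Collapse the Boolean state and the colour into one digit: zero
    means the light is off, any other digit means it is on and
    identifies its colour."""
    if not is_on:
        return 0
    return COLOUR_CODE[colour]

def decode_light(digit):
    """Recover (state, colour) at the receiving end."""
    if digit == 0:
        return ("off", None)
    names = {v: k for k, v in COLOUR_CODE.items()}
    return ("on", names[digit])

print(encode_light(True, "green"))  # single digit on the wire
print(decode_light(2))              # recovered at the control system
```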
  • a device monitor could be positioned to monitor the buildup of ice in a problematic location of an air intake for an engine. While a Boolean system could be used, it would only provide information regarding the presence or lack of ice.
  • the Boolean/Value detection system is arranged to allow the device monitor to detect the presence of ice but to also quantify how much ice is present.
  • the user defines a maximum amount of ice that is allowable. This value becomes 100 percent.
  • the system quantifies and reports the quantity of ice in a percentage of the maximum allowable.
  • the system transmits a single value (the percent quantity of ice) and uses a minimal amount of power.
  • the system could also be arranged to transmit an actual image of the ice if the quantity exceeds a predetermined percentage (e.g., 90 percent). While this uses a significant amount of power, the importance of the image overrides the importance of maximizing power efficiency.
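The ice-quantification scheme above (user-defined maximum = 100 percent, escalate to a full image past a predetermined percentage) can be sketched as follows. Pixel counts are an assumption standing in for whatever quantity measure the trained model produces:

```python
def ice_report(ice_pixels, max_allowable_pixels, image_threshold=90):
    """Express the visible ice as a percentage of the user-defined
    allowable maximum, and flag that the actual image should also be
    transmitted once the predetermined percentage (90 here) is
    exceeded, as described above."""
    percent = round(100 * ice_pixels / max_allowable_pixels)
    send_image = percent > image_threshold
    return percent, send_image

print(ice_report(450, 1000))  # routine: a single value is transmitted
print(ice_report(950, 1000))  # critical: the image is also transmitted
```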
  • Boolean/Value detection systems are typically used in situations where knowing a value is desirable but where no gauge or measuring device exists to accurately measure that value.
  • Another application might be the monitoring of corrosion in a particular location.
  • a device monitor could be positioned to periodically image the location (e.g., once per day) and transmit a value (e.g., percent of maximum allowable corrosion) to the central control system.
  • a Boolean/Value detection system could be used to determine the number of objects (e.g., people, cows, boxes, cars, airplanes, etc.) in a space.
  • a device monitor is positioned to image the desired space. The image is then analyzed to separate the desired objects and to count those objects. Once counted, a single integer value can be transmitted to the central control system or other desired device.
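Separating and counting objects so that only a single integer is transmitted, as described above, can be illustrated with a connected-component count over a binary mask. The mask is an assumption standing in for the output of the object-separation step:

```python
def count_objects(mask):
    """Count connected regions of 1s in a binary mask (a list of rows),
    each region standing in for one separated object of interest; only
    the final integer count would be transmitted. Uses a 4-connected
    flood fill."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1
                stack = [(r, c)]
                while stack:  # flood-fill one object
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and mask[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

mask = [
    [1, 1, 0, 0],
    [0, 0, 0, 1],
    [1, 0, 0, 1],
]
print(count_objects(mask))  # -> 3
```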
  • Value detection systems are particularly suited to reading gauges, meters or other devices that present data as a measured value.
  • a device monitor is positioned to image the device and analyze the image to select the region of the image that includes the desired data and to convert the value to a number rather than an image (e.g., optical character recognition).
  • a system similar to this was described in detail with regard to Figs. 3, 6, and 7.
  • Value Detection systems are therefore well-suited to extracting data in which the data of interest is produced directly on the device being monitored.
  • Position Detection systems are used to detect the position of something and to assign a number value to that position.
  • a position detection system is used to determine the position of a valve.
  • Valves can be arranged to be positioned at any point between opened and closed (analog) or could have fixed positions such as closed, 25 percent open, 50 percent open, 75 percent open, and opened (digital).
  • a device monitor is positioned to image a portion of the valve indicative of the valve position. This could include an actuator handle, a knob, a selector switch, an actuator arm, or some other component. As with other examples, the device monitor determines a position of the valve from the image of the component and reports that value as a number that could be indicative of a valve position or a percentage of open.
  • a Value Detection system is not appropriate as there is no single number to read for the valve position. Rather, the detection system must detect the position of an object (e.g., a pointer) and then must associate that position with a value.
  • the device monitor provides a very inexpensive and completely wireless system that can perform the task.
  • the cost of running wires to the various conventional devices is often significantly greater than the cost of the device monitor.
  • the very efficient use of power, achieved by taking only periodic measurements (e.g., once per hour, once per day, etc.), assures that the device monitors can be powered by small solar power cells with minimal light to eliminate the need to change batteries or provide other power sources.
  • the preferred embodiment of the device monitor includes the imaging device, the power supply in the form of a solar cell, a computer, and a wireless transmitter.
  • the computer preferably runs AI software that is provided, programmed, or trained for the specific application where the device monitor is positioned. While self-learning software could be employed, the training often is more difficult or complex than simply programming the device to operate as desired.
  • the entire system can be shut down between measurement intervals to allow for the storage of power and the reduction in total power use.
  • device monitors can operate for a week or more with only one hour of light exposure to power the device.
  • the computer utilizes the AI co-processor 75 to facilitate deep-learning, neural networks or other AI techniques to learn the various controls, objects, devices, or areas that are typically measured such that in some applications little to no training or programming is required to implement the system.
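The Position Detection idea sketched above — detecting the position of a pointer and associating it with a value — can be illustrated in Python. The angle range and detent positions below are illustrative assumptions, not values taken from the disclosure; an upstream vision step is assumed to have already located the pointer and reported its angle.

```python
# Hypothetical mapping from a detected pointer angle to a valve position value.
# The angle endpoints below are assumed for illustration only.

CLOSED_ANGLE = 0.0    # pointer angle when the valve is fully closed (assumed)
OPEN_ANGLE = 270.0    # pointer angle when the valve is fully open (assumed)

def angle_to_percent_open(angle_deg):
    """Convert a pointer angle to a percent-open value for transmission."""
    span = OPEN_ANGLE - CLOSED_ANGLE
    fraction = (angle_deg - CLOSED_ANGLE) / span
    return round(max(0.0, min(1.0, fraction)) * 100)

def snap_to_detent(percent, detents=(0, 25, 50, 75, 100)):
    """For digital valves with fixed positions, snap to the nearest detent."""
    return min(detents, key=lambda d: abs(d - percent))
```

Either the continuous value or the snapped detent could then be handed to the transmitter as a short number rather than an image.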

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

A device monitor for monitoring a remote device and for providing data to a separate system spaced apart from the remote device includes an imaging device positioned to capture an image of the remote device, a computer proximate to and coupled to the imaging device and operable to extract an operating parameter from the image, the operating parameter being a text value, and a transmitter proximate to and coupled to the computer and operable to wirelessly transmit the operating parameter to the separate system.

Description

SYSTEM AND METHOD TO AUTOMATICALLY OPTICALLY MONITORING FIELD
DEVICES AND ASSESS STATE INFORMATION
TECHNICAL FIELD
[0001] The present disclosure is directed, in general, to a system and method to read field instrumentation, and more specifically to a wireless system for automatically reading field instrumentation and/or determining the state of field devices.
BACKGROUND
[0002] Pre-existing facilities such as power plants, oil refineries, steel mills, factories, and the like include hundreds of instruments, valves, and other devices dispersed throughout the facility. Most of these devices are not connected to a centralized control system and therefore require periodic checking or adjustment by plant personnel. This process can be very time consuming and has the potential for error. Replacing the instruments or other devices with modern connected instruments or devices can be cost prohibitive as it would require expensive components, wiring, testing, and other costly effort. In addition, the replacement of known working components with new and different instruments or devices can increase the potential for unexpected operation.
SUMMARY
[0003] A device monitor for monitoring a remote device and for providing data to a separate system spaced apart from the remote device includes an imaging device positioned to capture an image of the remote device, a computer proximate to and coupled to the imaging device and operable to extract an operating parameter from the image, the operating parameter being a text value, and a transmitter proximate to and coupled to the computer and operable to wirelessly transmit the operating parameter to the separate system.

[0004] In another construction, a system for monitoring the operation of a machine including a plurality of remote devices and a control system spaced apart from the plurality of remote devices includes a plurality of device monitors each associated with one of the plurality of remote devices, each of the plurality of device monitors including an imaging device positioned to capture an image of an associated one of the remote devices, a computer connected to the imaging device and operable to extract an operating parameter from the image, the operating parameter being a text value, a transmitter coupled to the computer and operable to transmit the operating parameter to the control system, and a power supply operably connected to the imaging device, the computer, and the transmitter. The system also includes a receiver coupled to the control system and operable to receive the transmitted operating parameter from each of the plurality of device monitors.
[0005] In another construction, a method of acquiring data from a remote device and transmitting that data to a control system includes positioning an imaging device adjacent the remote device, capturing an image of the remote device with the imaging device, analyzing the image to extract information regarding a state of the remote device, converting the information to an operating parameter that is a text value, and wirelessly transmitting the operating parameter to the control system.
[0006] In another construction, a device monitor for monitoring an object of interest having a first state and a second state and for providing data to a separate system spaced apart from the object of interest includes an imaging device positioned to capture an image of the object of interest, and a computer proximate to and coupled to the imaging device and operable to determine a current state of the object of interest, the current state being one of the first state and the second state, and further assigning a first text value if the current state is the first state and a second text value if the current state is the second state. A transmitter is located proximate to and coupled to the computer and is operable to wirelessly transmit the text value to the separate system.
[0007] In another construction, a method of gathering data from an object of interest includes positioning an imaging device adjacent the object of interest to allow the imaging device to periodically capture an image, determining an allowable quantity of a substance that can be associated with the object of interest, and programming a computer to assign a numerical value to the image based on a quantity of the substance that is visible in the image and the allowable quantity of the substance. The method also includes wirelessly transmitting the numerical value from the computer with a transmitter and powering the imaging device, the computer, and the transmitter with no more than 1.5 watts of electricity.
[0008] The foregoing has outlined rather broadly the technical features of the present disclosure so that those skilled in the art may better understand the detailed description that follows.
Additional features and advantages of the disclosure will be described hereinafter that form the subject of the claims. Those skilled in the art will appreciate that they may readily use the conception and the specific embodiments disclosed as a basis for modifying or designing other structures for carrying out the same purposes of the present disclosure. Those skilled in the art will also realize that such equivalent constructions do not depart from the spirit and scope of the disclosure in its broadest form.
[0009] Also, before undertaking the Detailed Description below, it should be understood that various definitions for certain words and phrases are provided throughout this specification and those of ordinary skill in the art will understand that such definitions apply in many, if not most, instances to prior as well as future uses of such defined words and phrases. While some terms may include a wide variety of embodiments, the appended claims may expressly limit these terms to specific embodiments.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] Fig. 1 is a schematic illustration of a pre-existing facility including a plurality of remote devices and device monitors.
[0011] Fig. 2 is a schematic illustration of a device monitor suitable for use in the facility of Fig. 1.
[0012] Fig. 3 is an image of a first instrument including several bounding boxes.
[0013] Fig. 4 is an image of a second instrument in the form of an operating fan.

[0014] Fig. 5 is an image of the second instrument in a non-operating condition.
[0015] Fig. 6 is an image of a third instrument including several bounding boxes.
[0016] Fig. 7 is an image of the text extracted from the third instrument for transmission.
[0017] Before any embodiments of the invention are explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.
DETAILED DESCRIPTION
[0018] Various technologies that pertain to systems and methods will now be described with reference to the drawings, where like reference numerals represent like elements throughout.
The drawings discussed below, and the various embodiments used to describe the principles of the present disclosure in this patent document are by way of illustration only and should not be construed in any way to limit the scope of the disclosure. Those skilled in the art will understand that the principles of the present disclosure may be implemented in any suitably arranged apparatus. It is to be understood that functionality that is described as being carried out by certain system elements may be performed by multiple elements. Similarly, for instance, an element may be configured to perform functionality that is described as being carried out by multiple elements. The numerous innovative teachings of the present application will be described with reference to exemplary non-limiting embodiments.
[0019] Also, it should be understood that the words or phrases used herein should be construed broadly, unless expressly limited in some examples. For example, the terms “including,” “having,” and “comprising,” as well as derivatives thereof, mean inclusion without limitation.
The singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Further, the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. The term “or” is inclusive, meaning and/or, unless the context clearly indicates otherwise. The phrases “associated with” and “associated therewith,” as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, or the like.
[0020] Also, although the terms "first", "second", "third" and so forth may be used herein to refer to various elements, information, functions, or acts, these elements, information, functions, or acts should not be limited by these terms. Rather these numeral adjectives are used to distinguish different elements, information, functions or acts from each other. For example, a first element, information, function, or act could be termed a second element, information, function, or act, and, similarly, a second element, information, function, or act could be termed a first element, information, function, or act, without departing from the scope of the present disclosure.
[0021] In addition, the term "adjacent to" may mean: that an element is relatively near to but not in contact with a further element; or that the element is in contact with the further element, unless the context clearly indicates otherwise. Further, the phrase “based on” is intended to mean “based, at least in part, on” unless explicitly stated otherwise. Terms “about” or “substantially” or like terms are intended to cover variations in a value that are within normal industry manufacturing tolerances for that dimension. If no industry standard is available, a variation of 20 percent would fall within the meaning of these terms unless otherwise stated.
[0022] Fig. 1 schematically illustrates a facility 10 such as a power plant, a steel mill, a paper mill, and the like. As illustrated, the facility 10 includes operating equipment 15 such as turbines, blast furnaces, rolling mills, paper cutters, etc. and devices 20 (e.g., gauges, sensors, controls, switches, etc.) positioned to measure parameters of the equipment 15 or to operate as desired to support operation of the equipment 15. Many of these facilities 10 were built years or decades ago and as such include dozens if not hundreds of remote instruments or devices 20 that must be periodically monitored or adjusted. For example, a power plant may include a stand-alone hydraulic system that provides pressurized oil to a mechanical control system. The system pressure may be monitored by a simple pressure gauge attached to a pressure tank. The pressure of this tank must be checked periodically by plant personnel. In another example, a fan might be positioned near other equipment to cool the equipment. The fan may be controlled by a simple temperature switch positioned near the equipment. The temperature is not actually transmitted to a central control system 25 and the operating status of the fan is not directly monitored. Rather, plant personnel periodically inspect the fan to verify that the switch is functioning and that the fan is operating as desired.
[0023] While new controls and gauges are available and could replace the aforementioned examples, the cost of implementing these new devices 20 in a large plant or facility 10 is prohibitive. Not only do the devices 20 (e.g., gauges, sensors, controls, switches, etc.) need to be replaced but often new wiring, tubing, transmission devices, receivers, etc. need to be installed.
In addition, the new data must be provided to a central control system 25 or other device to be useful. This could require additional programming or a separate stand-alone device 20 that displays the results.
[0024] Fig. 2 illustrates a device monitor 30 that could be used in conjunction with some or all the devices 20 of Fig. 1 to gather the important data and deliver it to a user or the central control system 25. The device monitor 30 of Fig. 2 is positioned to monitor one or both of a meter 35 and a fan 40 and includes an imaging device 45, a computer 50, a transmitter 55, and a power supply 60. It should be understood that the device monitor 30 is capable of monitoring virtually any device 20 and should not be limited to the example devices 35, 40 described herein.
[0025] The imaging device 45 is preferably an inexpensive low-power camera capable of capturing still images. Suitable cameras include chip-based CMOS or other image sensors or other suitable devices. Low-power imaging devices 45 include cameras that operate on less than one watt when capturing images. Preferred cameras operate on less than two-hundred milliwatts of electricity when capturing video at a frame rate of 30 fps. In preferred constructions, the imaging device 45 is capable of capturing color images as well as video. However, some constructions operate with a simple imaging sensor capable of capturing only black and white or gray-scale still images.
[0026] In the construction of Fig. 2, the camera is positioned to image two separate devices 20, specifically the meter 35 and the fan 40. The imaging device 45 can be positioned to capture the important areas of both devices 35, 40 in a single image. However, in preferred constructions, the imaging device 45 is movably supported to allow movement between two (or more) positions to capture separate images of each device 35, 40. A simple motor, electrical actuator or other electrically movable device could be employed to change the position of the imaging device 45. In some constructions, the imaging device 45 could be positioned to image more than two devices 20 together or in sequence.
[0027] The computer 50 is connected to the imaging device 45 to receive images captured by the imaging device 45. The computer 50 illustrated in Fig. 2 includes a central processing unit (CPU) 65, a storage device 70, and an artificial intelligence (AI) co-processor 75. It should be noted that the term“computer” as used herein should be interpreted broadly to cover any device that receives and processes the images captured by the image sensor 45. While the illustrated computer 50 includes the CPU 65, the storage device 70, and the AI co-processor 75, other computers 50 may omit one or more of these components or may include additional components such as additional processors, memory, storage, communication devices, and the like.
[0028] As is well understood, the software aspects of the present invention could be stored on virtually any computer readable medium including a local disk drive system, a remote server, internet, or cloud-based storage location. In addition, aspects could be stored on portable devices or memory devices as may be required. The computer 50 generally includes an input/output device that allows for access to the software regardless of where it is stored, one or more processors 65, 75, memory devices, user input devices, and output devices such as monitors, printers, and the like.
[0029] The processors 65, 75, as discussed, may include standard micro-processors 65 and/or the artificial intelligence accelerator or processor 75 specifically designed to perform artificial intelligence applications such as artificial neural networks, machine vision, and machine learning. Typical applications include algorithms for robotics, internet of things, and other data-intensive or sensor-driven tasks. Often AI accelerators 75 are multi-core designs and generally focus on low-precision arithmetic, novel dataflow architectures, or in-memory computing capability. In still other applications, the AI processor 75 may include a graphics processing unit (GPU) designed for the manipulation of images and the calculation of local image properties.
The mathematical bases of neural networks and image manipulation are similar, leading GPUs to become increasingly used for machine learning tasks. Of course, other processors or arrangements could be employed if desired. Other options include but are not limited to field-programmable gate arrays (FPGA), application-specific integrated circuits (ASIC), and the like.
[0030] The computer 50 may also include communication devices, in addition to the transmitter 55, that allow for communication between other computers or computer networks, as well as for communication with other devices such as machine tools, work stations, actuators, controllers, sensors, and the like.
[0031] The CPU 65, storage device 70, and AI co-processor 75 cooperate to process images and extract certain data from the images. In the most common application, the data is in the form of text or numbers (referred to herein as “text”) that represent an operating parameter being monitored, measured, or controlled by the device 20. However, other applications can detect other characteristics from the image. For example, one application determines if the fan 40 is operating by looking for a blurred or unclear image of the fan blades. In another application, an infrared imaging device is used to detect heat from a motor powering the fan 40 during operation.
[0032] As with the imaging device 45, the computer 50 is a low-power device and consumes less than three watts of electricity during operation. A more preferred computer 50 operates on less than one watt of electricity and is operated periodically.
[0033] The transmitter 55 is connected to the computer 50 to receive and transmit the operating parameter or other data from the computer 50 to an external device such as a user, the central control 25, a secondary control, an internet connection, or a repeater that retransmits the data to extend the range of transmission. In the illustrated construction, the transmitter 55 includes a wireless radio 80 and a wired digital interface 85. The term “transmitter” should be read broadly to include any device or arrangement capable of outputting the operating parameter or other data received from the computer 50. The transmitter 55 could omit one of the components illustrated in Fig. 2 or could include additional components such as an analog interface, a cellular transmitter, an internet connector, and the like. In addition, the transmitter 55 is preferably capable of receiving external transmissions (i.e., a transceiver) such as instructions, software updates, and the like.

[0034] The wireless radio 80 is operable to wirelessly transmit the operating parameter or other data from the device monitor 30 to an external receiver. In preferred constructions, the wireless radio 80 is capable of transmission of at least one hundred meters with further transmission being possible. In addition, repeaters or other components can be positioned throughout the facility 10 to extend the range of the wireless radio 80 if necessary.
[0035] The digital interface 85 could include an RS-485 or similar serial interface that allows for two-way communication between the device monitor 30 and an external device. While the digital interface 85 could be used to transmit the operating parameter or other data, it is preferably provided for other purposes such as software updates and the like.
[0036] As with the imaging device 45 and the computer 50, the transmitter 55 is a low-power transmitter 55 that requires less than one watt of electricity during transmission of the operating parameter or other data. A more preferred transmitter 55 uses less than twenty milliwatts of electricity to complete the necessary transmissions. Thus, the combination of the imaging device, the computer, and the transmitter uses less than five watts of power, with more preferred systems operating using less than 1.22 watts. Typical systems operate at a power level of 1.5 watts or less.
[0037] As described, the device monitor 30 includes low-power features that allow the device monitor 30 to operate on less than five watts of electricity with more optimized designs using less than one watt of electricity. The power supply 60 is coupled to each of the imaging device 45, the computer 50, and the transmitter 55 to provide the necessary power for each of these devices without the need to connect to an external power supply.
[0038] In the construction illustrated in Fig. 2, the power supply 60 includes a photovoltaic (PV) cell 90 capable of providing at least five watts of electricity at a voltage of five volts or less to power the imaging device 45, the computer 50, and the transmitter 55. To generate the necessary power, the PV cell 90 has a surface area of less than thirty square inches (194 square cm) with larger or smaller PV cells 90 being possible depending on the specific power needs of the device monitor 30 and the amount of light available at the installation location of the device monitor 30. In one construction, the PV cell 90 is fifteen square inches (97 square cm). The PV cell 90 is positioned and oriented during installation to allow for the capture of energy from whatever light source might be available within the facility 10. In some constructions, the PV cell 90 is also connected to a rechargeable battery 95 that is charged when excess energy from the PV cell 90 is available. The battery 95 is then capable of providing power to the imaging device 45, the computer 50, and the transmitter 55 when and if the PV cell 90 is not able to deliver the necessary power. Still other constructions could operate on batteries alone. However, the need to periodically replace those batteries would negatively affect the benefits provided by the device monitor 30.
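The duty-cycling arithmetic behind these power figures can be made concrete with a back-of-envelope calculation. The peak power and PV output come from the description; the awake time per reading and the readings-per-day figure are illustrative assumptions.

```python
# Back-of-envelope energy budget for a duty-cycled device monitor.
# PEAK_POWER_W and the 5 W PV output are from the text; the duty-cycle
# figures (awake time, readings per day) are assumed for illustration.

PEAK_POWER_W = 1.5        # upper bound on system draw while active
ACTIVE_SECONDS = 10       # assumed awake time per measurement cycle
READINGS_PER_DAY = 24     # e.g., one reading per hour

# Energy consumed per day, in joules (watts x seconds).
energy_per_day_j = PEAK_POWER_W * ACTIVE_SECONDS * READINGS_PER_DAY

# Seconds of full 5 W PV output needed to replace one day's consumption.
recharge_seconds = energy_per_day_j / 5.0
```

Under these assumptions the monitor consumes 360 J per day, which the PV cell replaces in a little over a minute of full output — consistent with the claim that brief light exposure can sustain operation for long periods.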
[0039] Figs. 3-7 illustrate the operation of the device monitor 30. Fig. 3 illustrates an example of an image 100 captured by the imaging device 45 and transferred to the computer 50 for processing. The first step performed by the computer 50 is the placement of bounding boxes 105 around text detected in the image 100. The CPU 65, the storage device 70, and the AI co-processor 75 cooperate to define the bounding boxes 105. In some device monitors 30, the storage device 70 includes pre-trained models that aid the CPU 65 and AI co-processor 75 in identifying the text and placing the bounding boxes 105 around the text. The AI co-processor 75 may utilize a neural network or deep learning techniques to enable the placement of the bounding boxes 105 and to determine which of the bounding boxes 105 contains the desired data. In some constructions, the device monitor 30 can be trained to determine which bounding box 105 contains the important data. Alternatively, the device monitor 30 learns the format of the important text (e.g., five numbers with a decimal point) and looks for a bounding box 105 that contains text that matches the desired pattern. In still other constructions, the correct bounding box 105 is defined by a user at installation and the remaining bounding boxes 105 are ignored.
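The pattern-matching alternative described here — selecting the bounding box whose text matches a known format — might be sketched as follows. The list-of-detections structure stands in for the output of a real text detector and is an assumption, as is the specific value pattern.

```python
import re

# Minimal sketch: pick the bounding box whose recognized text matches the
# expected value format (here, digits with a decimal point). The detections
# list is a stand-in for real text-detector output.

VALUE_PATTERN = re.compile(r"^\d+\.\d+$")

def select_value_box(detections):
    """Return the first (box, text) pair whose text matches the pattern."""
    for box, text in detections:
        if VALUE_PATTERN.match(text):
            return box, text
    return None  # no box matched; caller could report an error instead

detections = [
    ((10, 5, 60, 20), "PSI"),      # unit label, ignored
    ((12, 30, 90, 55), "1.34"),    # the reading of interest
    ((10, 70, 70, 85), "MAX 5"),   # scale marking, ignored
]
```

Only the matching box's text would then proceed to transmission; the other boxes are discarded without further processing.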
[0040] Once the correct bounding box 105 is identified, the CPU 65 and AI co-processor 75 operate to extract the value of the data from the image 100 (e.g., a number or a word such as “on”). This process is well known and is commonly referred to as optical character recognition (OCR). Several different OCR techniques are available, including seven-segment optical character recognition (SSOCR) and variations thereof, as well as Tesseract. Tesseract is an open-source OCR engine able to employ both traditional computer vision and deep learning engines. Some device monitors 30 may include multiple OCR engines that each run to allow the comparison of the results to enhance the accuracy of the extracted value.

[0041] The OCR process extracts the operating parameter or other data that is then transferred to the transmitter 55 for transmission from the device monitor 30. Rather than sending an image, the text of the operating parameter or other data is sent, thereby requiring a much smaller transmission. This saves both time and energy, thereby further allowing the device monitor 30 to operate at very low power levels.
[0042] Figs. 6 and 7 illustrate another image 110 of a meter and the resulting data 115 that is transmitted to an external device or control system 25. As illustrated, the image 110 includes several bounding boxes 105 with the bounding box 105 containing the data of interest 115 being the largest box 105. The computer 50 is either trained or learns to look for this bounding box 105 and to extract the data 115 contained therein. In this example, the data 115 is the number “1.34”. The computer 50 performs the OCR process and transfers the resulting data 115 to the transmitter 55. The transmitter 55 then transmits the data 115 (as a number and not an image) to an external device or control system 25 which displays the data 115 as illustrated in Fig. 7.
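The savings from transmitting the extracted number rather than the image can be quantified with simple arithmetic; the frame dimensions below are an assumed figure for illustration.

```python
# Compare the payload of the extracted text against a raw image frame.
# The 640x480 grayscale frame size is an illustrative assumption.

text_payload = "1.34".encode("utf-8")     # the transmitted reading
image_payload = 640 * 480                 # bytes for one 8-bit grayscale frame

savings_ratio = image_payload // len(text_payload)
```

Under these assumptions the text payload is four bytes against roughly 300 kB for the frame — a reduction of several orders of magnitude, which is what permits the very low transmit power described above.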
[0043] Fig. 4 illustrates another application of the device monitor 30 in which the device monitor 30 captures images of the fan 40 to determine the current state of the fan 40 (i.e., is it on or off).
If the fan 40 includes a light or other indicator, the imaging device 45 could be directed at that indicator to capture an image. If the imaging device 45 is capable of detecting in the IR spectrum, the imaging device 45 could be aimed at the motor that drives the fan 40 to detect the increased temperature during operation. Alternatively, the imaging device 45 is directed at the fan blades themselves and captures a still image of the blades. The image is then transferred to the computer 50 which has been trained to detect something that is not text based. In the case where the fan 40 has an indicator light, the computer can simply determine if the light is on or off based on the color or the brightness of the light. The computer 50 would then generate data indicative of the condition of the light (i.e., the word “on” if the fan 40 is operating and the word “off” if the fan 40 is not operating).
[0044] In the example of Fig. 4, the fan 40 does not include a textual or text-based indicator (i.e., numbers or words) and the computer must determine the state of the fan 40 based on the image of the blades. In this case, the computer 50 is trained or otherwise learns that when an image of the fan 120 includes a blurred region in the vicinity of the blades, the fan 40 is in the “on” state. As an alternative, the computer 50 can perform an edge detection operation in which the computer 50 attempts to locate an edge 125 of one or more of the blades as illustrated in Fig. 5. The computer 50 would be trained or would have otherwise learned what the edge 125 or edges 125 should look like. When the fan 40 is operating (Fig. 4), the edges 125 of the blades appear blurred and cannot be defined. Thus, the computer 50 would generate data indicative of an “on” state when the edges 125 cannot be identified (i.e., the word “on”) and would generate data indicative of an “off” state when the edges 125 can be identified (i.e., the word “off”). As another alternative, the computer 50 can perform end-to-end object detection using convolutional neural networks (CNN). A CNN model can be trained to map the image of the fan, as a matrix of pixels, directly to the state of the fan as the word “on” or “off”. As with the text transmission of Fig. 3, far less transmission power is required to transmit a short word such as “on” or “off” than is required to transmit the image of the fan 120. Moreover, by converting images into words and numbers, the digitized information can be conveniently integrated into the control system.
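The blurred-edge test can be sketched in Python using a Laplacian-variance measure of sharpness, a common blur heuristic. A real system would operate on camera frames, typically via a vision library; here a tiny synthetic "image" (a 2-D list of pixel values) is used and the threshold is an illustrative assumption.

```python
# Toy sketch of the blurred-blade test: sharp edges raise the variance of a
# 4-neighbour Laplacian; motion blur suppresses it. Threshold is illustrative.

def laplacian_variance(img):
    """Variance of a 4-neighbour Laplacian over a 2-D list of pixel values."""
    vals = []
    for y in range(1, len(img) - 1):
        for x in range(1, len(img[0]) - 1):
            lap = (img[y - 1][x] + img[y + 1][x] +
                   img[y][x - 1] + img[y][x + 1] - 4 * img[y][x])
            vals.append(lap)
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def fan_state(img, threshold=100.0):
    """Report "off" when sharp edges are found, "on" when they are blurred."""
    return "off" if laplacian_variance(img) > threshold else "on"

sharp = [[0, 0, 255, 255]] * 4     # hard edge: stationary blade visible
blurred = [[60, 70, 80, 90]] * 4   # smooth gradient: blade in motion
```

Only the one-word result ("on" or "off") would be passed to the transmitter, in keeping with the low-power transmission scheme described above.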
[0045] In some constructions, the computer 50 includes an error detection process that reviews the captured images 100, 120 for potential errors with the system. For example, if the image 100, 120 is blurred or partially obscured, the computer 50 could generate data indicative of the problem. A simple message such as “camera failure” could be transmitted to indicate that the device monitor 30 should be inspected and repaired or replaced.
[0046] As previously discussed, the imaging device 45, or the device monitor 30 itself, could be supported for motion between two or more positions or orientations to monitor two or more devices 20. In another construction, the device monitor 30 is mounted to an autonomous or semi-autonomous drone such as a land-based vehicle or an aerial drone such as a multi-rotor helicopter. The drone could be pre-programmed to proceed to waypoints within or around the facility 10 where images of devices 20 are captured, analyzed, and transferred to an outside device such as the central control system 25. This system could essentially travel the path formerly followed by facility personnel to take the necessary readings.
[0047] The foregoing description included several detailed examples. To be clear, the device monitor is capable of monitoring a number of other instruments or conditions and should not be limited to the examples discussed to this point. In general, most applications of the device monitor fall into one of four categories: “Boolean Detection”, “Boolean/Value Detection”, “Value Detection”, and “Position Detection”, with some overlap existing between each of these various detection systems.
[0048] In a Boolean detection system, the device monitor is positioned adjacent a device, object, space, or virtually any feature or component (collectively referred to as "an object of interest") that is capable of having two and generally only two states. For example, one Boolean detection system may be positioned adjacent a door that is either opened or closed. The device monitor would be trained to determine from an image if the door is opened, which might correspond to the number one, or if the door is closed, which would then correspond to the number zero. Thus, only a single digit needs to be transmitted, which would require a very small amount of power. Other examples of a Boolean detection system might include detecting and reporting the status of a light (i.e., is it on or off), detecting and reporting the condition of a drain (opened or closed), and detecting and reporting the presence of a person in a particular space (present, not present). One of ordinary skill in the art will understand that many other Boolean detection applications are possible.
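The door example can be reduced to a one-digit payload with very little code. The sketch below is illustrative only; the mean-brightness test and its threshold are assumed stand-ins for the trained classifier described above.

```python
import numpy as np

def door_state(image: np.ndarray, threshold: float = 128.0) -> str:
    """Reduce a door image to a single-character Boolean payload.

    Assumes an open doorway images brighter than a closed door:
    "1" means open, "0" means closed.
    """
    return "1" if image.mean() > threshold else "0"

door_open = np.full((8, 8), 200.0)    # bright: doorway visible
door_closed = np.full((8, 8), 40.0)   # dark: door blocks the light
```

Transmitting the resulting single character, rather than the pixel data, is the entire power advantage of the Boolean scheme.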
[0049] It should be noted that many features may be capable of more than two states, but in a Boolean detection system only two states are important. For example, a light may have the states of Off, On (yellow), and On (green). However, a Boolean detection system would detect only whether the light is on or off and would ignore the color of the light. If the color is not important data, the Boolean detection system may be appropriate.
[0050] Boolean/Value detection systems are closely related to Boolean detection systems. In a Boolean/Value detection system, the device monitor operates much like a Boolean system to detect a state of something, but it further detects a value associated with the monitored device. For the light example discussed with regard to the Boolean detection system, if the color of the light is deemed important, the device monitor could determine first if the light is on or off, and if it is on, it would determine the color. Again, the Boolean state could be reduced to a single digit. In addition, the color could be reduced to a single digit, with each digit representing a particular color. Thus, the device could send two digits to identify the state and color of the light. Alternatively, a zero could indicate that the light is off, with the numbers 1, 2, 3, etc. each representing the light being on and emitting a particular color.
[0051] In another example, a device monitor could be positioned to monitor the buildup of ice in a problematic location of an air intake for an engine. While a Boolean system could be used, it would only provide information regarding the presence or absence of ice. The Boolean/Value detection system is arranged to allow the device monitor to detect the presence of ice and also to quantify how much ice is present. In one application, the user defines a maximum amount of ice that is allowable. This value becomes 100 percent. The system then quantifies and reports the quantity of ice as a percentage of the maximum allowable. Thus, again, the system transmits a single value (the percent quantity of ice) and uses a minimal amount of power. The system could also be arranged to transmit an actual image of the ice if the quantity exceeds a predetermined percentage (e.g., 90 percent). While this uses a significant amount of power, the importance of the image overrides the importance of maximizing power efficiency.
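The ice example amounts to a pixel count normalized by a user calibration. The sketch below is an assumed implementation: treating bright pixels as ice, and the calibration constant of 1000 pixels, are illustrative choices only.

```python
import numpy as np

MAX_ICE_PIXELS = 1000      # assumed calibration: pixel count at 100 percent
ALERT_PERCENT = 90         # above this, the raw image is sent as well

def ice_report(image: np.ndarray, ice_threshold: float = 200.0):
    """Quantify visible ice as a percentage of the allowable maximum.

    Returns the one-number payload and a flag indicating whether the
    full image should also be transmitted.
    """
    ice_pixels = int((image > ice_threshold).sum())
    percent = min(100, round(100.0 * ice_pixels / MAX_ICE_PIXELS))
    return percent, percent > ALERT_PERCENT
```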
[0052] Boolean/Value detection systems are typically used in situations where knowing a value is desirable but where no gauge or measuring device exists to accurately measure that value. Another application might be the monitoring of corrosion in a particular location. A device monitor could be positioned to periodically image the location (e.g., once per day) and transmit a value (e.g., percent of maximum allowable corrosion) to the central control system.
[0053] In yet another example, a Boolean/Value detection system could be used to determine the number of objects (e.g., people, cows, boxes, cars, airplanes, etc.) in a space. In this arrangement, a device monitor is positioned to image the desired space. The image is then analyzed to separate the desired objects and to count those objects. Once counted, a single integer value can be transmitted to the central control system or other desired device.
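The object-counting step can be sketched as a connected-component count over a segmented (binary) image. The segmentation itself is assumed to have already happened upstream; the flood fill below only illustrates the "separate and count" portion.

```python
from collections import deque

def count_objects(grid):
    """Count 4-connected regions of 1s in a binary image.

    Each connected blob of foreground pixels counts as one object,
    so only the final integer needs to be transmitted.
    """
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                count += 1                      # new blob found
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                    # flood-fill the blob
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

scene = [                  # three separated blobs -> three objects
    [1, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 0, 0],
]
```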
[0054] Value detection systems are particularly suited to reading gauges, meters, or other devices that present data as a measured value. In this application, a device monitor is positioned to image the device, analyze the image to select the region of the image that includes the desired data, and convert the value to a number rather than an image (e.g., by optical character recognition). A system similar to this was described in detail with regard to Figs. 3, 6, and 7.
[0055] Value Detection systems are therefore well-suited to extracting data in which the data of interest is produced directly on the device being monitored.
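For displays with a known glyph set, the character-recognition step can be far simpler than general OCR. The sketch below assumes a seven-segment display and that an earlier imaging stage reports which segments are lit; both are illustrative assumptions, not part of the disclosure.

```python
# Standard seven-segment layout, segments named a-g:
#   aaa
#  f   b
#   ggg
#  e   c
#   ddd
SEGMENTS_TO_DIGIT = {
    frozenset("abcdef"): "0",  frozenset("bc"): "1",
    frozenset("abdeg"): "2",   frozenset("abcdg"): "3",
    frozenset("bcfg"): "4",    frozenset("acdfg"): "5",
    frozenset("acdefg"): "6",  frozenset("abc"): "7",
    frozenset("abcdefg"): "8", frozenset("abcdfg"): "9",
}

def read_display(lit_segments_per_digit):
    """Turn per-digit lit-segment sets into the short text to transmit."""
    return "".join(SEGMENTS_TO_DIGIT[frozenset(s)]
                   for s in lit_segments_per_digit)
```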
[0056] Position Detection systems are used to detect the position of something and to assign a number value to that position. In one example, a position detection system is used to determine the position of a valve. Valves can be arranged to be positioned at any point between opened and closed (analog) or could have fixed positions such as closed, 25 percent open, 50 percent open, 75 percent open, and opened (digital). A device monitor is positioned to image a portion of the valve indicative of the valve position. This could include an actuator handle, a knob, a selector switch, an actuator arm, or some other component. As with other examples, the device monitor determines a position of the valve from the image of the component and reports that value as a number that could be indicative of a valve position or a percentage of open. While many valves include a scale positioned near the valve to indicate the valve position, a Value Detection system is not appropriate as there is no single number to read for the valve position. Rather, the detection system must detect the position of an object (e.g., a pointer) and then must associate that position with a value.
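The pointer-to-value association can be sketched as an angle computation. The geometry below is assumed: a 270-degree sweep from 225 degrees (fully closed) down to -45 degrees (fully open), with the pointer tip supplied by some upstream detector; both the sweep and the detector are hypothetical tuning details that depend on the actual valve.

```python
import math

def valve_percent_open(center, pointer_tip,
                       closed_deg=225.0, open_deg=-45.0):
    """Map a detected pointer tip to a percent-open value.

    Angles are measured counterclockwise from the positive x-axis,
    with the image y-axis pointing down (hence the sign flip on dy).
    """
    dx = pointer_tip[0] - center[0]
    dy = center[1] - pointer_tip[1]
    angle = math.degrees(math.atan2(dy, dx))
    if angle < open_deg:
        angle += 360.0          # wrap into the sweep's angular range
    fraction = (closed_deg - angle) / (closed_deg - open_deg)
    return round(100.0 * max(0.0, min(1.0, fraction)))
```

The single returned integer, rather than the valve image, is what the transmitter sends.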
[0057] As one of ordinary skill in the art will realize, not all potential applications will fit neatly into one of the categories identified. However, the four categories broadly define the potential operation of the device monitor.
[0058] While many of the measurements taken by, or the data gathered by, the device monitor could possibly be gathered using more conventional means, the device monitor provides a very inexpensive and completely wireless system that can perform the task. The cost of running wires to the various conventional devices is often significantly greater than the cost of the device monitor. In addition, the very efficient use of power, enhanced by taking only periodic measurements (e.g., once per hour, once per day, etc.), assures that the device monitors can be powered by small solar power cells with minimal light, eliminating the need to change batteries or provide other power sources.
[0059] The preferred embodiment of the device monitor includes the imaging device, the power supply in the form of a solar cell, a computer, and a wireless transmitter. The computer preferably runs AI software that is provided, programmed, or trained for the specific application where the device monitor is positioned. While self-learning software could be employed, the training often is more difficult or complex than simply programming the device to operate as desired.
[0060] To save power, the entire system can be shut down between measurement intervals to allow for the storage of power and a reduction in total power use. In some examples, device monitors can operate for a week or more with only one hour of light exposure to power the device.
[0061] It should also be noted that other systems could be used to capture a single image that includes multiple devices to be analyzed. Data could be extracted from each of the devices such that a single image is used to gather data for two or more devices. In addition, the data extracted could fall into different categories if desired. For example, from a single image the system could extract a Boolean value for an indicator light, a positional value for a valve handle, an oil level within a sight glass, and a pump pressure from a pressure indicator. Each value could be extracted, converted to a textual value, and transmitted with an identifier or in a predefined order.
[0062] In addition, in preferred systems, the computer utilizes the AI co-processor 75 to facilitate deep learning, neural networks, or other AI techniques to learn the various controls, objects, devices, or areas that are typically measured, such that in some applications little to no training or programming is required to implement the system.
[0063] Although an exemplary embodiment of the present disclosure has been described in detail, those skilled in the art will understand that various changes, substitutions, variations, and improvements disclosed herein may be made without departing from the spirit and scope of the disclosure in its broadest form.
[0064] None of the description in the present application should be read as implying that any particular element, step, act, or function is an essential element, which must be included in the claim scope: the scope of patented subject matter is defined only by the allowed claims.
Moreover, none of these claims are intended to invoke a means plus function claim construction unless the exact words "means for" are followed by a participle.

Claims

What is claimed is:
1. A system for monitoring the state of a remote object, the system comprising: an imaging device positioned to capture an image of the remote object;
a computer proximate to and coupled to the imaging device and operable to extract a textual parameter indicative of a state of the object from the image, the image not including a textual indication of the state of the object; and
a transmitter proximate to and coupled to the computer and operable to wirelessly transmit the textual parameter.
2. The system of claim 1, wherein the imaging device includes a camera that is operable to capture a still image of the remote object.
3. The system of claim 1, wherein the transmitter includes a wireless transceiver.
4. The system of claim 1, further comprising a power supply operable to provide power to each of the imaging device, the computer, and the transmitter.
5. The system of claim 4, wherein the power supply includes a photovoltaic (PV) cell having a surface area of no more than 30 square inches (194 square cm).
6. The system of claim 1, wherein the imaging device, the computer, and the transmitter are coupled to a movable drone to facilitate movement between a plurality of remote objects.
7. The system of claim 1, wherein the imaging device is movable to capture images of a second remote object spaced apart from the remote object.
8. The system of claim 1, wherein the computer is operable to detect an error in the image and the transmitter is operable to transmit the image in response to the computer detecting the error.
9. The system of claim 1, wherein the computer, the imaging device, and the transmitter cooperate to consume less than 1.5 watts of power during operation.
10. The system of claim 1, wherein the computer is operable to extract a second textual parameter from the image.
11. A system for monitoring the state of a remote object, the system comprising: an imaging device positioned to capture an image of the remote object;
a computer proximate to and coupled to the imaging device and operable to extract a textual parameter indicative of a state of the object from the image;
a transmitter proximate to and coupled to the computer and operable to wirelessly transmit the textual parameter; and
a photovoltaic power supply coupled to and operable to power each of the imaging device, the computer, and the transmitter.
12. The system of claim 11, wherein the image does not include a textual indication of the state of the object.
13. The system of claim 12, wherein the imaging device includes a camera that is operable to capture a still image of the remote object.
14. The system of claim 12, wherein the transmitter includes a wireless transceiver.
15. The system of claim 12, wherein the photovoltaic power supply includes a cell having a surface area of no more than 30 square inches (194 square cm).
16. The system of claim 12, wherein the imaging device is movable to capture images of a second remote object spaced apart from the remote object.
17. The system of claim 12, wherein the computer is operable to detect an error in the image and the transmitter is operable to transmit the image in response to the computer detecting the error.
18. The system of claim 12, wherein the computer, the imaging device, and the transmitter cooperate to consume less than 1.5 watts of power during operation.
19. A method of acquiring data from a remote object and transmitting that data to a control system, the method comprising:
positioning an imaging device adjacent the remote object;
capturing an image of the remote object with the imaging device;
analyzing the image to extract textual information regarding a state of the remote object from the image, wherein the image does not contain textual data indicative of the state of the remote object; and
wirelessly transmitting the textual information to the control system.
20. The method of claim 19, further comprising moving the imaging device to capture an image of a second remote object.
21. The method of claim 19, further comprising powering the imaging device, the computer, and the transmitter with a power supply.
22. The method of claim 21, wherein the power supply includes a photovoltaic (PV) cell having a surface area of no more than 30 square inches (194 square cm).
23. The method of claim 21, further comprising extracting second textual information from the image.
24. A device monitor for monitoring an object of interest having a first state and a second state and for providing data to a separate system spaced apart from the object of interest, the device monitor comprising:
an imaging device positioned to capture an image of the object of interest;
a computer proximate to and coupled to the imaging device and operable to determine a current state of the object of interest, the current state being one of the first state and the second state, and further assigning a first value to a text value if the current state is the first state and a second value to the text value if the current state is the second state; and
a transmitter proximate to and coupled to the computer and operable to wirelessly transmit the text value to the separate system.
25. A method of gathering data from an object of interest, the method comprising: positioning an imaging device adjacent the object of interest to allow the imaging device to periodically capture an image;
determining an allowable quantity of a substance that can be associated with the object of interest;
programming a computer to assign a numerical value to the image based on a quantity of the substance that is visible in the image and the allowable quantity of the substance;
wirelessly transmitting the numerical value from the computer with a transmitter; and powering the imaging device, the computer, and the transmitter with no more than 1.5 watts of electricity.
PCT/US2018/057274 2018-10-24 2018-10-24 System and method to automatically optically monitoring field devices and assess state information WO2020086071A1 (en)


Publications (1)

Publication Number: WO2020086071A1 (en)
Publication Date: 2020-04-30

Family ID: 64308810

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002031442A1 (en) * 2000-10-07 2002-04-18 Lattice Intellectual Property Ltd Method and apparatus for obtaining information from a utility meter
US20090322884A1 (en) * 2008-06-27 2009-12-31 Honeywell International Inc. Apparatus and method for reading gauges and other visual indicators in a process control system or other data collection system
US20140307106A1 (en) * 2010-11-05 2014-10-16 Google Inc. Methods and Systems for Remotely Controlling Electronic Devices
WO2015047596A1 (en) * 2013-09-26 2015-04-02 Rosemount Inc. Wireless industrial process field device with imaging
WO2017029611A1 (en) * 2015-08-17 2017-02-23 H3 Dynamics Holdings Pte. Ltd. Drone box
US20180040123A1 (en) * 2012-10-04 2018-02-08 Cerner Innovation, Inc. Mobile processing device system for patient monitoring data acquisition



Legal Events

121 - EP: the EPO has been informed by WIPO that EP was designated in this application (ref document number 18803506; country of ref document: EP; kind code of ref document: A1)
NENP - Non-entry into the national phase (ref country code: DE)
122 - EP: PCT application non-entry in European phase (ref document number 18803506; country of ref document: EP; kind code of ref document: A1)