US20170329307A1 - Robot system for asset health management - Google Patents
Robot system for asset health management
- Publication number
- US20170329307A1 US15/584,995 US201715584995A
- Authority
- US
- United States
- Prior art keywords
- asset
- defect
- robot
- processor
- instructions
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1661—Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/163—Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1671—Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F01—MACHINES OR ENGINES IN GENERAL; ENGINE PLANTS IN GENERAL; STEAM ENGINES
- F01D—NON-POSITIVE DISPLACEMENT MACHINES OR ENGINES, e.g. STEAM TURBINES
- F01D5/00—Blades; Blade-carrying members; Heating, heat-insulating, cooling or antivibration means on the blades or the members
- F01D5/005—Repairing methods or devices
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F22—STEAM GENERATION
- F22B—METHODS OF STEAM GENERATION; STEAM BOILERS
- F22B37/00—Component parts or details of steam boilers
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/04—Programme control other than numerical control, i.e. in sequence controllers or logic controllers
- G05B19/042—Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/04—Programme control other than numerical control, i.e. in sequence controllers or logic controllers
- G05B19/048—Monitoring; Safety
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/18—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
- G05B19/406—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by monitoring or safety
- G05B19/4065—Monitoring tool breakage, life or condition
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/18—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
- G05B19/4097—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by using design data to control NC machines, e.g. CAD/CAM
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0094—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/35—Nc in input of data, input till input file format
- G05B2219/35134—3-D cad-cam
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40323—Modeling robot environment for sensor based robot system
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/42—Servomotor, servo controller kind till VSS
- G05B2219/42329—Defective measurement, sensor failure
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/49—Nc machine tool, till multiple
- G05B2219/49007—Making, forming 3-D object, model, surface
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S901/00—Robots
- Y10S901/01—Mobile robot
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S901/00—Robots
- Y10S901/30—End effector
- Y10S901/44—End effector inspection
Definitions
- the subject matter disclosed herein relates to asset management, and more particularly, to monitoring and managing health of an asset using a robotic system.
- assets may include physical or mechanical devices or structures, which may, in some instances, have electrical and/or chemical aspects as well.
- assets may be used or maintained for a variety of purposes and may be characterized as capital infrastructure, inventory, or by other nomenclature depending on the context.
- assets may include distributed assets, such as a pipeline or an electrical grid as well as individual or discrete assets, such as an airplane, a tower, or a vehicle.
- Assets may be subject to various types of defects (e.g., spontaneous mechanical defects, electrical defects as well as routine wear-and-tear) that may impact their operation. For example, over time, the asset may undergo corrosion or cracking due to weather or may exhibit deteriorating performance or efficiency due to the wear or failure of component parts.
- one or more human inspectors may inspect, maintain, and repair the asset. For example, an inspector may locate corrosion on the asset and clean the corrosion from the asset. However, depending on the location, size, and/or complexity of the asset, having one or more human inspectors perform inspection of the asset may consume time the inspectors could spend on other tasks. Additionally, some inspection tasks may be dull, dirty, or otherwise unsuitable for a human to perform. For instance, some assets may have locations that are not accessible to humans due to height, confined spaces, or the like. Further, inspections may be performed on fixed schedules, resulting in either over-inspection or under-inspection. Accordingly, improved systems and techniques for managing the health of various types of assets are desirable.
- a robotic system for monitoring health of an asset includes at least one robot comprising at least one sensor capable of detecting one or more characteristics of an asset and at least one effector capable of performing a repair or maintenance operation on the asset, wherein the at least one robot is configured to inspect an asset with the at least one sensor, and a processing system having at least one processor operatively coupled to at least one memory, wherein the processor is configured to receive sensor data from the at least one sensor indicating one or more characteristics of the asset, generate, update, or maintain a digital representation that models the one or more characteristics of the asset, detect a defect of the asset based at least in part on the one or more characteristics, and generate an output signal encoding or conveying instructions to provide a recommendation to an operator, to control the at least one robot to address the defect on the asset, or both, based on the defect and the digital representation of the asset.
- a non-transitory, computer readable medium includes instructions configured to be executed by a processor of a robotic system including at least one robot, wherein the instructions include instructions configured to cause the processor to receive sensor data from at least one sensor of the at least one robot indicating one or more characteristics of the asset, generate, update, or maintain a digital representation that models the one or more characteristics of the asset, detect a defect of the asset based at least in part on the one or more characteristics, and generate an output signal encoding or conveying instructions to provide a recommendation to an operator, to control the at least one robot to address the defect on the asset, or both, based on the defect and the digital representation of the asset.
- a method in a third embodiment, includes receiving sensor data from at least one sensor of at least one robot indicating one or more characteristics of an asset, generating, updating, or maintaining a digital representation that models the one or more characteristics of the asset, detecting a defect of the asset based at least in part on the one or more characteristics, and generating an output signal encoding or conveying instructions to provide a recommendation to an operator, to control the at least one robot to address the defect on the asset, or both, based on the defect and the digital representation of the asset.
- FIG. 1 is a perspective view of a robotic system with a set of robots to monitor and manage the health of an asset, in accordance with aspects of the present disclosure.
- FIG. 2 is a block diagram of the robotic system of FIG. 1 having a second set of robots to manage another asset with a control system, in accordance with aspects of the present disclosure.
- FIG. 3 is a flow diagram of a process performed by a controller of the control system of FIG. 2 to manage asset health, in accordance with aspects of the present disclosure.
- FIG. 4 is a flow diagram of a process performed by the controller when performing the process of FIG. 3 , in accordance with aspects of the present disclosure.
- FIG. 5 is a flow diagram of another process performed by the controller when performing the process of FIG. 3 , in accordance with aspects of the present disclosure.
- FIG. 6 is a flow diagram of another process performed by the controller when performing the process of FIG. 3 , in accordance with aspects of the present disclosure.
- FIG. 7 is a schematic diagram of a user interface displayed to a user of the control system of FIG. 2 , in accordance with aspects of the present disclosure.
- the subject matter disclosed herein relates to managing repair and/or maintenance of an asset with a robotic system.
- Such an approach may be useful in monitoring or repairing assets associated with various entities, including business or corporate entities, governments, individuals, non-profit organizations, and so forth.
- assets may be generally discrete or limited in their extent (e.g., a vehicle such as a plane, helicopter, ship, submersible, space launch vehicle, satellite, locomotive, and so forth) or may be geographically distributed (e.g., a road or rail track, a port or airport, a pipeline or electrical infrastructure, a power generation facility or manufacturing plant, and so forth).
- the present approach as described herein may be used to monitor and maintain assets of these types (as well as others not listed) in an autonomous or semi-autonomous manner using robotic intermediaries.
- the robotic intermediaries may be used to facilitate one or both of health monitoring of the asset and repair, remediation, or improvement of the asset with limited or no human support.
- assets such as distributed assets and/or individual assets may be used to perform any number of operations.
- assets may deteriorate due to weather, physical wear, or the like.
- one or more components of an asset may wear or deteriorate due to rain and wind or other environmental conditions or due to inadequate maintenance.
- spontaneous failures of one or more components or systems of an asset may occur which may be unrelated to wear or maintenance conditions but may instead be attributable to an undetected defect or an unknown stressor.
- the health of the asset depends on identifying and addressing such defects in a timely and effective manner.
- one or more human agents may inspect the asset for wear at limited intervals to maintain health of the asset and/or to replace parts that appear worn.
- the human agents may be unable to inspect components or locations that may not be easily accessible to humans, such as below the waterline of a marine asset, within a tank or pipe of a pipeline or storage facility, on the exterior surfaces or components of a vehicle in motion (such as a flying plane or helicopter, or a moving truck or locomotive), and so forth.
- a robot system may be used to monitor and manage the health of an asset in a manner that reduces or eliminates human intervention.
- a robot may be a machine (e.g., electro-mechanical) capable of carrying out a set of tasks (e.g., movement of all or part of the machine, operation of one or more type of sensors to acquire sensed data or measurements, and so forth) automatically (e.g., at least partially without input, oversight, or control by a user), such as a set of tasks programmed by a computer.
- the robot may include one or more sensors to detect one or more characteristics of an asset and one or more effectors to perform an operation based on a plan to assess, repair, or service the asset.
- the robot system may include a processing system that includes one or more processors operatively coupled to memory and storage components. While this may be conceptualized and described below in the context of a single processor-based system to simplify explanation, the overall processing system used in implementing an asset management system as discussed herein may be distributed throughout the robotic system and/or implemented as a centralized control system.
- the processor may be configured to generate a plan to assess the asset for defects. For example, the processor may determine a plan based on the tasks (e.g., desired inspection coverage of the asset) and/or resources (e.g., robots) available. Based on the generated plan, the processor may implement the plan by sending signal(s) to the robots providing instructions to perform the tasks defined in the plan.
- a controller of each robot may process any received instructions and in turn send signal(s) to one or more effectors controlled by the respective robot to control operation of the robot to perform the assigned tasks.
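- The patent specifies no data format for plans or task instructions; purely as an illustration, the following Python sketch (all names and fields are hypothetical) shows one way a central processor could represent a plan as per-robot tasks and dispatch them to the robot controllers:

```python
# Illustrative sketch only; the patent does not define these structures.
from dataclasses import dataclass, field

@dataclass
class Task:
    robot_id: str   # which robot should execute the task
    action: str     # e.g., "move", "capture_image", "spray", "weld"
    params: dict = field(default_factory=dict)

@dataclass
class Plan:
    tasks: list

def dispatch(plan: Plan, send_signal) -> None:
    """Send each task to the controller of its assigned robot."""
    for task in plan.tasks:
        send_signal(task.robot_id, {"action": task.action, **task.params})

# Example: position a drone near one face of the asset, then capture images.
plan = Plan(tasks=[
    Task("rgb_drone_14", "move", {"waypoint": (10.0, 2.0, 5.0)}),
    Task("rgb_drone_14", "capture_image", {"interval_s": 2.0}),
])
dispatch(plan, send_signal=lambda rid, msg: print(rid, msg))
```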
- the processor may determine a plan to monitor the asset.
- the plan may include one or more tasks to be performed by one or more robots of the robotic system.
- the processor may adjust (e.g., revise) the plan based on the data received from the sensors related to the asset. For example, the plan may be adjusted based on acquired data indicative of a potential defect of the asset.
- the processor may send a signal(s) encoding or conveying instructions to travel a specified distance and/or direction that enables the robot to acquire additional data related to the asset associated with the potential defect.
- the processor may assess the quality of data received from the sensors. Due to a variety of factors, the quality of the data may be below a threshold level of quality. For example, pressure sensors or acoustic sensors may have background noise due to the conditions proximate to the asset. As such, the processor may determine a signal-to-noise ratio of the signals from the sensors that indicates a relationship between a desired signal and background noise. If the processor determines that the signal-to-noise ratio falls below a threshold level of quality, the processor may adapt the plan to acquire additional data. If the processor determines that the signal-to-noise ratio is above a threshold level of quality, the processor may proceed to perform maintenance actions based on the sensor data.
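- As a minimal sketch of the quality gate described above (the decibel threshold and the power-based SNR estimate are assumptions, not taken from the patent):

```python
import numpy as np

SNR_THRESHOLD_DB = 10.0  # hypothetical threshold; the patent leaves this value unspecified

def snr_db(signal: np.ndarray, noise: np.ndarray) -> float:
    """Signal-to-noise ratio in decibels, estimated from sample power."""
    p_signal = np.mean(signal.astype(float) ** 2)
    p_noise = np.mean(noise.astype(float) ** 2)
    return 10.0 * np.log10(p_signal / p_noise)

def needs_reacquisition(signal: np.ndarray, noise: np.ndarray) -> bool:
    """True when quality falls below the threshold, so the plan should be adapted."""
    return snr_db(signal, noise) < SNR_THRESHOLD_DB

# A clean tone well above the noise floor passes the gate.
rng = np.random.default_rng(0)
tone = np.sin(np.linspace(0.0, 8.0 * np.pi, 1000))
noise = 0.05 * rng.standard_normal(1000)
print(needs_reacquisition(tone + noise, noise))  # False: data quality is sufficient
```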
- the processor may generate, maintain, and update a digital representation of the asset based on one or more characteristics that may be monitored using robot intermediaries and/or derived from known operating specifications. For example, the processor may create a digital representation that includes, among other aspects, a 3D structural model of the asset (which may include separately modeling components of the asset as well as the asset as a whole). Such a structural model may include material data for one or more components, lifespan and/or workload data derived from specifications and/or sensor data, and so forth.
- the digital representation in some implementations may also include operational or functional models of the asset, such as flow models, pressure models, temperature models, acoustic models, lifing models, and so forth.
- the digital representation may incorporate or separately model environmental factors relevant to the asset, such as environmental temperature, humidity, and pressure (such as in the context of a submersible asset, airborne asset, or space-based asset).
- one or more defects in the asset as a whole or components of the asset may also be modeled based on sensor data communicated to the processing components.
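- One plausible, purely hypothetical way to organize such a digital representation in code, keeping the structural, functional, environmental, and defect layers together:

```python
# Hypothetical data layout; the patent does not prescribe one.
from dataclasses import dataclass, field

@dataclass
class Defect:
    kind: str        # e.g., "crack", "corrosion", "missing part"
    location: tuple  # (x, y, z) coordinates on the structural model
    severity: float  # normalized 0..1

@dataclass
class DigitalRepresentation:
    mesh_file: str                                         # 3D structural model
    materials: dict = field(default_factory=dict)          # component -> material data
    functional_models: dict = field(default_factory=dict)  # e.g., "flow", "thermal"
    environment: dict = field(default_factory=dict)        # temperature, humidity, ...
    defects: list = field(default_factory=list)

    def record_defect(self, defect: Defect) -> None:
        """Localize a newly detected defect on the model."""
        self.defects.append(defect)

twin = DigitalRepresentation(mesh_file="asset.stl")
twin.environment.update({"temperature_c": 21.5, "humidity_pct": 40.0})
twin.record_defect(Defect("corrosion", (1.2, 0.4, 3.3), severity=0.35))
```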
- the processor may generate a plan specifying one or more tasks or actions, such as acquiring additional data related to the asset. For example, if the processor determines that acquired data of a location on the structural model is below a threshold quality or is otherwise insufficient, the processor may generate or update a revised plan that includes one or more tasks that position the robot to acquire additional data regarding the location.
- the sensor data used to generate, maintain, and update the digital representation, including modeling of defects may be derived from sensor data collected using one or more of sensors mounted on robots controlled by the processing components and/or by sensors integral to the asset itself which communicate their sensor data to the processing components.
- the robots used to collect sensor data, as well as effect repairs may be autonomous and capable of movement and orientation in one- (such as along a track), two- (such as along connected roads or along a generally planar surface), or three-dimensions (such as three-dimensional movement within a body of water, air, or space).
- the sensors used to collect the sensor data may vary between robots and/or may be interchangeable so as to allow customization of robots depending on need.
- Examples of sensors include, but are not limited to, cameras or visual sensors capable of imaging in one or more of visible, low-light, ultraviolet, and/or infrared (i.e., thermal) contexts, thermistors or other temperature sensors, material and electrical sensors, pressure sensors, acoustic sensors, radiation sensors or imagers, probes that apply non-destructive testing technology, and so forth.
- the robot may contact or interact physically with the asset to acquire data.
- the digital representation may incorporate or be updated based on a combination of factors detected from one or more sensors on the robot (or integral to the asset itself).
- the processor may receive visual image data from image sensors (e.g., cameras) on the robots to create or update a 3D model of the asset to localize defects on the 3D model.
- the processor may detect a defect, such as a crack, a region of corrosion, or missing part, of the asset.
- the processor may detect a crack on a location of a vehicle based on visual image data that includes color and/or depth information indicative of the crack.
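- A toy stand-in for this kind of visual defect detection (real systems would use trained models and known camera poses; the intensity threshold here is invented for illustration):

```python
import numpy as np

def dark_region_mask(gray: np.ndarray, threshold: int = 60) -> np.ndarray:
    """Flag unusually dark pixels as crack candidates (crude illustrative rule)."""
    return gray < threshold

def localize_candidates(mask: np.ndarray) -> np.ndarray:
    """Return (row, col) pixel coordinates of candidate defects. Projecting these
    onto the 3D model would additionally require the camera pose, omitted here."""
    return np.argwhere(mask)

gray = np.full((8, 8), 200, dtype=np.uint8)
gray[3, 2:6] = 30  # a synthetic dark "crack"
print(localize_candidates(dark_region_mask(gray)))
```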
- the 3D model may additionally be used as a basis for modeling other layers of information related to the asset.
- the processor may determine risk associated with a potential or imminent defect based on the digital representation. Depending on the risk and a severity of the defect, the processor, as described above, may send signal(s) to the robots indicating instructions to repair or otherwise address a present or pending defect.
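- For illustration only, such a severity- and risk-based decision rule might look like the sketch below; the thresholds and action names are invented:

```python
def recommended_action(severity: float, risk: float) -> str:
    """Map defect severity and modeled risk (both 0..1) to a response."""
    if severity > 0.8 or risk > 0.8:
        return "dispatch_repair_robot"        # address immediately
    if severity > 0.4:
        return "recommend_repair_to_operator"
    return "monitor"                          # re-inspect on a future pass

print(recommended_action(severity=0.35, risk=0.9))  # dispatch_repair_robot
```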
- the processor may create a 3D model of a part or component pieces of the part of the asset needed for the repair.
- the processor may generate descriptions of printable parts or part components (i.e., parts suitable for generation using additive manufacturing techniques) that may be used by a 3D printer (or other additive manufacturing apparatus) to generate the part or part components.
- the 3D printer may create the 3D printed part to be attached to or integrated with the asset as part of a repair process.
- one or more robots may be used to repair the asset with the 3D printed part(s). While a 3D printed part is described in this example, other repair or remediation approaches may also be employed.
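- As a hedged example of producing a printable part description, the sketch below writes a triangle mesh in the standard ASCII STL format that common 3D-printing toolchains accept; the geometry itself is a placeholder, not a repair part:

```python
def write_ascii_stl(path: str, triangles, name: str = "patch") -> None:
    """Write triangles (each a tuple of three (x, y, z) vertices) as ASCII STL.

    Facet normals are left as zero vectors; most slicers recompute them.
    """
    with open(path, "w") as f:
        f.write(f"solid {name}\n")
        for v0, v1, v2 in triangles:
            f.write("  facet normal 0 0 0\n    outer loop\n")
            for x, y, z in (v0, v1, v2):
                f.write(f"      vertex {x:.6f} {y:.6f} {z:.6f}\n")
            f.write("    endloop\n  endfacet\n")
        f.write(f"endsolid {name}\n")

# A single triangular facet, purely for illustration.
write_ascii_stl("patch.stl", [((0, 0, 0), (1, 0, 0), (0, 1, 0))])
```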
- the processor may send signal(s) indicating instructions to a controller of a robot to control the robot to spray a part of the asset (e.g., with a lubricant or spray paint) or to replace a part of the asset from an available inventory of parts.
- a robot may include a welding apparatus that may be autonomously employed to perform an instructed repair.
- the processor may send signal(s) to a display to indicate to an operator to enable the operator to repair the defect.
- FIG. 1 shows a perspective view of a robotic system 10 that manages health of an asset 12 by inspecting and/or repairing the asset 12 .
- the robotic system 10 may include a fleet of robots, such as drones (capable of autonomous movement in one-, two-, or three-dimensions, including movement with or without an attached electrical and/or data tether), machines, computing systems, and so forth.
- Each of the robots may receive data via sensors and/or may control operation of one or more effectors of the robot.
- the robotic system 10 includes robots, such as drones, that each have red-green-blue (RGB) sensors, such as cameras, image sensors, photodiodes, or the like, to generate signals indicating characteristics of the asset 12 when the RGB sensor is directed toward the asset.
- the drones with RGB sensors are referred to as a first red-green-blue (RGB) drone 14 , a second RGB drone 16 , and a third RGB drone 18 .
- the drones may be referred to more generally as robots.
- the first RGB drone 14 , the second RGB drone 16 , and the third RGB drone 18 may receive signals indicative of colors of an exterior of the asset 12 .
- the robotic system 10 may include a drone having an infrared (IR) camera, referred to as an IR drone 20 .
- In addition to the IR drone 20 , any suitable robot that operates at least partially autonomously (e.g., without input from an occupant within the vehicle) may be included in the robotic system 10 , such as unmanned aerial vehicles, unmanned ground vehicles (e.g., autonomous trucks or locomotives), unmanned underwater or surface water vehicles, unmanned space vehicles, crawling robots, or a combination thereof.
- the robots may operate in one dimension, two dimensions, or three dimensions. While the illustrated embodiment includes four drones, this is meant to be an example, and any suitable number and types of robots (e.g., drones) may be employed.
- the robots may include drones that are manually guided by an operator. For example, the operator may have a remote control that sends signal(s) to the manually guided drone to control a location and/or orientation of the drone.
- the RGB drones 14 , 16 , and 18 may move with respect to the asset 12 to receive signal(s) from the RGB sensors indicating the characteristics of the asset 12 . That is, the drones 14 , 16 , and 18 may fly in an at least partially autonomous manner. For instance, the drones 14 , 16 , and 18 may obtain instructions to control a propeller or wings of the respective drones to adjust the position of the drone with respect to the asset such that the respective drone 14 , 16 , and 18 may acquire additional characteristics of the asset 12 from another perspective.
- the instructions may be received from a control system or the instructions may be stored on memory of the RGB drones 14 , 16 , and 18 .
- the instructions may instruct each of the RGB drones 14 , 16 , and 18 to capture images at regular intervals in a flight path with respect to the asset 12 with or without continuous communication and instruction from a separate controller (e.g., a centralized controller).
- each of the RGB drones 14 , 16 , and 18 may move along a respective path 22 , 24 , and 26 that orbits the asset 12 and/or directs the RGB sensors towards the asset 12 .
- the IR drone 20 may move along a path 32 that orbits the asset and/or directs the IR sensor towards the asset 12 to enable the IR drone 20 to capture infrared data indicating depth information of the asset 12 .
- the robotic system 10 may be self-organizing, in which tasks are allocated to various members of a multi-robot team based on each robot's capabilities.
- a control system may include a controller that acquires a list of the robots and the capabilities of each robot.
- the controller may determine task assignments of each robot based on the respective capabilities of each robot. For example, a light drone having more flight endurance may be assigned by the controller to perform rough identification of anomalies. Another drone carrying a high resolution camera having less flight time may be assigned by the controller to move to specific locations to capture high resolution imagery.
- the controller may send signal(s) to each of the drones indicating instructions to perform the assigned tasks based on the capabilities of each robot.
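- A greedy capability-matching sketch along these lines (robot names and capability labels are invented):

```python
def assign_tasks(robots: dict, tasks: list) -> dict:
    """Assign each task to the first idle robot advertising the needed capability.

    robots: robot_id -> set of capabilities; tasks: list of (task_id, capability).
    """
    assignments, busy = {}, set()
    for task_id, needed in tasks:
        for robot_id, capabilities in robots.items():
            if needed in capabilities and robot_id not in busy:
                assignments[task_id] = robot_id
                busy.add(robot_id)
                break
    return assignments

robots = {
    "endurance_drone": {"rough_anomaly_scan"},  # long flight time, coarse sensing
    "hires_drone": {"high_res_imaging"},        # short flight time, sharp camera
}
tasks = [("scan_asset", "rough_anomaly_scan"), ("image_hotspots", "high_res_imaging")]
print(assign_tasks(robots, tasks))
# {'scan_asset': 'endurance_drone', 'image_hotspots': 'hires_drone'}
```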
- the robotic system 10 may include inspection systems 28 (e.g., video cameras) positioned in locations proximate to the asset 12 to acquire various characteristics of the asset 12 .
- Such positioned systems may be stationary or have limited movement from a fixed position, such as being mounted on a remotely controlled moveable arm or having pan, tilt, zoom functionality.
- the inspection systems may acquire color and/or depth information related to the asset 12 as well as acquire information related to the process performed by the drones 14 , 16 , 18 , and 20 , such as flight path information with respect to the asset 12 , flight path information with respect to each other, altitude information, or the like.
- the robotic system 10 may include a manually controlled drone 30 to acquire various characteristics of the asset 12 , similar to those described above regarding the autonomous RGB and IR drones 14 , 16 , 18 , and 20 .
- the robotic system 10 may plan a mission to inspect the asset and analyze data from the inspection to find one or more defects. Such a plan may be generated based on available inspection and/or repair assets (e.g., what robots are available having what sensing modalities or which can be outfitted with what sensing modalities, what robots are available having what repair modalities, what stationary or integral sensor data is available for the asset, and so forth) as well as on the age and/or inspection and repair history of the asset.
- FIG. 2 is a block diagram of the robotic system 10 having a second set of robots that each include one or more sensors and one or more effectors.
- the robotic system 10 includes a control system 34 , a first robot 36 , a second robot 38 , a third robot 40 , and a three-dimensional (3D) printer 48 .
- In the depicted embodiment, the first robot 36 may be an RGB drone, the second robot 38 may be an autonomous vehicle, and the third robot 40 may be a manipulator system.
- the robots used in FIG. 2 are simply meant to be an example, and any suitable robots (e.g., crawling robots, underwater robots, manually controlled robots, etc.) may be included.
- the robots 36 , 38 , and 40 include a first processing system 42 , a second processing system 44 , and a third processing system 46 , respectively. While the robotic system 10 may include the centralized control system 34 as shown in FIG. 2 , in other embodiments, parts of the planning and/or control may be distributed to each of the processing systems of the robotic system 10 . Further, while three processing systems are shown, it should be appreciated that any suitable number of processing systems may be used.
- the control system 34 , the processing systems 42 , 44 , and 46 , and the 3D printer 48 each include a controller 50 , 52 , 54 , 56 , and 58 , respectively.
- Each controller 50 , 52 , 54 , 56 , and 58 includes a processor 60 , 62 , 64 , 66 , and 68 , respectively.
- the controllers 50 , 52 , 54 , 56 , and 58 may also include one or more storage devices and/or other suitable components, such as the memory devices 70 , 72 , 74 , 76 , and 78 , respectively, operatively coupled to the processors 60 , 62 , 64 , 66 , and 68 , respectively, to execute software, such as software for controlling the vehicles (e.g., drones, autonomous vehicles, etc.), detecting defects of the asset 12 , repairing and/or maintaining the asset 12 , and so forth.
- the processors 60 , 62 , 64 , 66 , and 68 may each include multiple processors, one or more “general-purpose” microprocessors, one or more special-purpose microprocessors, and/or one or more application specific integrated circuits (ASICs), or some combination thereof.
- each processor 60 , 62 , 64 , 66 , and 68 may include one or more reduced instruction set (RISC) processors.
- Each memory device 70 , 72 , 74 , 76 , and 78 may include a volatile memory, such as random access memory (RAM), and/or a nonvolatile memory, such as read-only memory (ROM). Each memory device 70 , 72 , 74 , 76 , and 78 may store a variety of information that may be used for various purposes.
- each memory device 70 , 72 , 74 , 76 , and 78 may store processor-executable instructions (e.g., firmware or software) for the respective processors 60 , 62 , 64 , 66 , and 68 to execute, such as instructions for controlling the vehicles (e.g., drones, autonomous vehicles, etc.), detecting defects of the asset 12 , repairing and/or maintaining the asset 12 , and so forth.
- the storage device(s) (e.g., nonvolatile storage) may store data (e.g., planned flight paths, sensor data, etc.), the model of the asset used for health management, instructions (e.g., software or firmware for controlling the vehicle, etc.), and any other suitable data.
- the control system 34 , the processing systems 42 , 44 , and 46 , and the 3D printer 48 may each include a radio frequency (RF) antenna 80 , 82 , 84 , 86 , and 88 , respectively, to communicate with each other.
- Each of the controllers 50 , 52 , 54 , 56 , and 58 may communicate using any suitable standard, such as WiFi (e.g., IEEE 802.11), ZigBee (e.g., IEEE 802.15.4), or Bluetooth, among others.
- the first processing system 42 of the RGB drone 36 may send signal(s), via the antenna 82 , to the antenna 80 of the control system 34 indicative of a position of the RGB drone 36 .
- the control system 34 may send signal(s), via the antenna 80 , to the antenna 82 of the first processing system 42 of the RGB drone 36 indicative of instructions to control the RGB drone 36 based on the position of the RGB drone 36 .
- each of the controllers 50 , 52 , 54 , 56 , and 58 may communicate with each other to synchronize inspection of the asset 12 . That is, the flight patterns (e.g., direction, distance, and timing) of drones may be synchronized with one another to prevent drones from interfering with one another while inspecting the asset 12 .
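- Purely as an illustration of such deconfliction, the sketch below staggers the orbit phase of each drone so that drones circling the same asset never converge on one sector:

```python
import math

def staggered_orbit_waypoints(n_drones: int, radius: float, steps: int):
    """Yield per-step (x, y) positions with each drone offset by 2*pi/n in phase."""
    for step in range(steps):
        positions = []
        for d in range(n_drones):
            theta = 2.0 * math.pi * (step / steps + d / n_drones)
            positions.append((radius * math.cos(theta), radius * math.sin(theta)))
        yield positions

# Three drones orbiting at 15 m, sampled at four waypoints per revolution.
for positions in staggered_orbit_waypoints(n_drones=3, radius=15.0, steps=4):
    print([f"({x:.1f}, {y:.1f})" for x, y in positions])
```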
- Each of the processing systems 42 , 44 , and 46 may include spatial locating devices 90 , 92 , and 94 , respectively, which are each mounted to the respective robot, and configured to determine a position of the drone 36 , the autonomous vehicle 38 , and the ground robot 40 , respectively.
- the spatial locating devices 90 , 92 , and 94 may include any suitable system configured to determine the position of the drone 36 , the autonomous vehicle 38 , and the ground robot 40 , respectively, such as global positioning system (GPS) receivers, for example.
- the processing systems 42 , 44 , and 46 may receive signal(s) via one or more sensors 96 , 98 , and 100 , respectively, indicative of visual inputs of the environment.
- Each of the respective processors 62 , 64 , and 68 may generate a map of the environment and localize the respective robot 36 , 38 , or 40 within the map. Further, localization may include an absolute position (e.g., fixed global coordinate system or fixed local coordinate system) as well as position in relation to the asset (e.g., orientation, distance, etc.).
- Each of the processing systems 42 , 44 , and 46 may include one or more sensors 96 , 98 , and 100 that send signal(s) to the respective controllers 52 , 54 , and 56 to facilitate control of the respective robots 36 , 38 , and 40 as well as to acquire data indicative of various properties of the asset 12 .
- the sensors 96 , 98 , 100 may include infrared sensors, ultrasonic sensors, magnetic sensors, thermal sensors, radiation detection sensors, imaging sensors (e.g., RGB sensors), Light Detection and Ranging (LIDAR) sensors, or the like.
- each robot 36 , 38 , and 40 may include one or more types of sensors.
- each of the robots 36 , 38 , and 40 may include one or more effectors, such as actuators, motors, or other controls.
- the robot 36 may include one or more motors 102 that control operation of the robot 36 .
- Each of the robots 36 , 38 , and 40 may be self-powered (e.g., an engine and/or battery) and/or receive power from another power source (e.g., via a power tether).
- the controller 52 may send signal(s) to the motors 102 of the robot 36 to control a speed of the rotor of the motor 102 , thereby controlling the position of the robot 36 .
- the controller may send signal(s) indicating instructions to increase or decrease speed of one or more of the rotors of the motors 102 to adjust yaw, pitch, roll, or altitude, of the robot 36 .
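- For context, the sketch below shows the standard quadrotor motor-mixing idea behind such commands; the rotor layout and signs are assumptions, since real firmware depends on the frame geometry:

```python
def quad_motor_mix(thrust: float, roll: float, pitch: float, yaw: float):
    """Per-rotor speed commands from body-axis demands for an 'X' quadrotor."""
    return (
        thrust + roll + pitch - yaw,  # front-left
        thrust - roll + pitch + yaw,  # front-right
        thrust - roll - pitch - yaw,  # rear-right
        thrust + roll - pitch + yaw,  # rear-left
    )

# Hold attitude while climbing and yawing slightly.
print(quad_motor_mix(thrust=0.6, roll=0.0, pitch=0.0, yaw=0.05))
```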
- the controller 54 of the robot 38 may generate and send signal(s) to control one or more operations of the robot 38 .
- the controller 54 may send signal(s) to a steering control system 104 to control a direction of movement of the robot 38 and/or to a speed control system 106 to control a speed of the robot 38 .
- the steering control system 104 may include a wheel angle control system 106 that rotates one or more wheels and/or tracks of the robot 38 to steer the robot 38 along a desired route.
- the wheel angle control system 106 may rotate front wheels/tracks, rear wheels/tracks, and/or intermediate wheels/tracks of the robot 38 , either individually or in groups.
- a differential braking system may independently vary the braking force on each lateral side of the robot 38 to direct the robot 38 along the desired route.
- a torque vectoring system may differentially apply torque from an engine to wheels and/or tracks on each lateral side of the robot 38 , thereby directing the robot 38 along a desired route. While the illustrated embodiment of the steering control system 104 includes the wheel angle control system 106 , it should be appreciated that alternative embodiments may include one, two, or more of these systems, among others, in any suitable combination.
- the speed control system 106 may include an engine output control system 110 and/or a braking control system 112 .
- the engine output control system 110 is configured to vary the output of the engine to control the speed of the robot 38 .
- the engine output control system 110 may vary a throttle setting of the engine, a fuel/air mixture of the engine, a timing of the engine, other suitable engine parameters to control engine output, or a combination thereof.
- the braking control system 112 may adjust braking force, thereby controlling the speed of the robot 38 . While the illustrated automated speed control system 106 includes the engine output control system 110 and the braking control system 112 , it should be appreciated that alternative embodiments may include one of these systems, among other systems.
- the robot 40 may include a mechanism to repair, replace, or otherwise maintain the asset 12 , such as a manipulator, magnet, suction system, sprayer, or lubricator.
- the robot 40 includes a manipulator arm 114 , such as an electronic, hydraulic, or mechanical arm.
- the robot 40 includes an effector 116 , such as a clamp, a container, a handler, or the like. The manipulator arm 114 and the effector 116 may operate in conjunction with each other to maintain the asset 12 .
- the controller 56 of the robot 40 may send signal(s) indicating instructions to control one or more motors 118 of the manipulator arm 114 and the effector 116 to control the position of the manipulator arm 114 and the effector 116 to perform the desired operation.
- the controller 56 may send signal(s) indicating instructions to cause the motors to move the manipulator arm 114 to a position and/or to secure a 3D printed part 119 onto a defect 121 of the asset 12 .
- the controller 50 may determine a plan that instructs the robots 36 , 38 , 40 , and 48 to address the defect (e.g., repair, remediate, or otherwise prevent).
- the plan may include one or more tasks to be performed by the robots 36 , 38 , 40 , and 48 .
- One or more of the controllers 50 , 52 , 54 , 56 , and 58 may determine a path (e.g., distance, direction, and/or orientation) along which one or more of the robots 36 , 38 , 40 , and 48 is moved to address the defect.
- the controller 50 may send signal(s) to the robots 36 , 38 , 40 , and 48 indicative of one or more tasks to spray a part of the asset, weld a part of the asset, replace a part of the asset 12 from an inventory of parts or with a 3D printed part, or the like.
- the controller 50 may determine a plan that instructs the robots 36 , 38 , 40 , and 48 to acquire data to confirm the sufficiency of the repair or preventative maintenance, e.g., indicative that the defect was addressed.
- the controller 50 may send signal(s) encoding or conveying instructions to control the robots 36 , 38 , 40 , and 48 to travel along a path planned with respect to the asset 12 .
- the controller 50 may acquire sensor data from the sensors 96 , 98 , and 100 indicative of one or more characteristics of the asset 12 (e.g., via the antennas 80 , 82 , 84 , and 86 ).
- the controller 50 may then adjust the plan to monitor the addressed defect of the asset 12 by adjusting or adding one or more tasks to the plan to acquire additional data related to the asset 12 .
- the controller 50 may then send signal(s) to a display 130 to present data related to the asset 12 , such as detected defects, potential defects, recommendations, repairs, replacement parts, or the like, to an operator.
- the robotic system 10 may include a 3D printer 48 that prints a 3D printed part 119 . While a 3D printer is described in detail, this is meant to be an example. In certain embodiments, the 3D model may be sent to another suitable fabrication device capable of fabricating the part using additive manufacturing, in which a device deposits particles at the asset or another location. For example, the particles may be deposited at a location in successive layers to create an object.
- the 3D printer 48 may include a gantry 120 or other structure that supports a printer head having an extruder 122 that moves across a build platform.
- the 3D printer 48 may also include one or more motors 124 (e.g., stepper motors) that move the extruder 122 with respect to the build platform.
- the processor 66 may send signal(s) indicating instructions to control the one or more motors 124 and the extruder 122 to heat a source material 126 and extrude successive layers of the source material 126 to create the 3D printed part 119 .
- there are various types of 3D printers 48 that may print 3D printed parts in any suitable manner.
- the control system 34 may include a user interface 128 having a display 130 to display data related to the asset 12 , such as detected defects, potential defects, recommendations, repairs, replacement parts, or the like, to an operator. Further, in some embodiments, the robots may be monitored and/or controlled by an operator from the control system 34 via the user interface 128 (e.g., touchscreen display).
- Each of the robots 36 , 38 , and 40 may include a collision avoidance system 148 , 150 , and 152 , respectively.
- the collision avoidance systems 148 , 150 , and 152 may include circuitry and/or instructions (e.g., processor-executable code) to control the sensors 96 , 98 , and 100 and motors 102 and 118 of the robots 36 , 38 , and 40 .
- the collision avoidance system 148 on-board the robot 36 may send signals to instruct the motors 102 to control the robot 36 based on a location of a detected obstacle. That is, the controller 52 may determine, via the collision avoidance system 148 , a path for the robot 36 to travel that avoids interacting with the obstacle while still completing the tasks assigned to the robot 36 .
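- One simple, hypothetical realization is a potential-field step that attracts the robot toward its goal while repelling it from a nearby obstacle:

```python
import math

def avoid_obstacle(position, goal, obstacle, clearance=2.0):
    """Return a unit-scale (x, y) step: goal attraction plus obstacle repulsion."""
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1])

    def norm(v):
        return math.hypot(v[0], v[1]) or 1e-9  # avoid division by zero

    to_goal = sub(goal, position)
    from_obs = sub(position, obstacle)
    d = norm(from_obs)
    # Repulsion acts only inside the clearance radius and grows as distance shrinks.
    gain = max(0.0, (clearance - d) / clearance)
    return (to_goal[0] / norm(to_goal) + gain * from_obs[0] / d,
            to_goal[1] / norm(to_goal) + gain * from_obs[1] / d)

# Step toward (10, 0) while veering away from an obstacle at (1.0, 0.5).
print(avoid_obstacle(position=(0.0, 0.0), goal=(10.0, 0.0), obstacle=(1.0, 0.5)))
```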
- the robotic systems 10 of FIGS. 1 and 2 are meant to be examples, and any suitable combination of robots, including as few as one robot, may be used.
- the processor 60 of the control system and/or the processor 62 of the robot 36 are used as examples, and any suitable combination of robots and/or control systems may be used. For example, some of the steps performed by the control system may be distributed and performed by the processors 62 , 64 , and 68 of the respective robots 36 , 38 , and 40 .
- the processor 60 may determine inspection, maintenance, or repair actions to be performed by the robots 36 , 38 , and 40 based on the digital representation.
- the processor 60 may determine a time, schedule, or location at which to perform the inspection, maintenance, or repair, based on the digital representation. Further, the processor 60 may predict health of the asset by comparing the digital representation to data of other assets. That is, the processor 60 may use domain knowledge of the digital representation to predict when a defect is likely to occur on the asset 12 . For instance, the processor 60 may perform an inspection based on the digital representation that indicates a prediction of a condition of the asset. As such, the processor may perform inspection or maintenance at times based on the condition of the asset, thereby reducing time spent on inspection or maintenance as compared to inspections or maintenance actions performed according to a schedule.
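- As a sketch of such condition-based scheduling (the linear wear model and safety margin are assumptions standing in for the patent's comparison against data from other assets):

```python
def next_inspection_days(wear_rate_per_day: float, current_wear: float,
                         defect_threshold: float = 1.0, margin: float = 0.8) -> float:
    """Days until the next inspection, scheduled before predicted wear
    crosses a safety margin of the defect threshold."""
    remaining = margin * defect_threshold - current_wear
    if remaining <= 0.0:
        return 0.0  # inspect immediately
    return remaining / wear_rate_per_day

print(next_inspection_days(wear_rate_per_day=0.004, current_wear=0.55))  # 62.5
```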
- FIG. 3 shows a high level flow diagram of a method 134 that the robotic system 10 may perform to manage the asset 12 to reduce or eliminate human intervention and improve the lifespan of the asset 12 .
- the robotic system 10 may first obtain an asset 12 for inspection.
- the manner in which robotic system 10 obtains the asset 12 may depend on the type of asset.
- the robots may move to certain assets 12 (e.g., an oil pipeline, power transmission lines, etc.) to assess the asset for defects (e.g., cracks in an oil pipeline).
- the processor 60 may determine a plan to assess the asset 12 for defects. As explained in detail below, the plan may include one or more tasks based on the resources (e.g., available robots) and/or the asset 12 .
- the robotic system may then detect and assess a defect 121 associated with the asset 12 .
- the robotic system 10 may manage the asset based on the defect 121 .
- the robotic system may repair and/or replace one or more parts of the asset 12 based on the severity of the defect 121 .
- FIG. 4 is a flow diagram of a method that may be performed at block 138 of FIG. 3 by the processor of the control system 34 .
- the processor 60 may plan a mission based on one or more tasks and/or resources. Tasks may include instructions for one of the processors 62 , 64 , 66 , and 68 to send signal(s) to cause the robots 36 , 38 , 40 , and 48 to move (e.g., localize the robots 36 , 38 , 40 , and 48 around the asset 12 ), to angle the robots 36 , 38 , and 40 , to adjust settings, to wait, to capture data, or to perform any other suitable action.
- the processors 62 , 64 , 66 , and 68 may perform a task of movement by sending signal(s) to the motors 102 , the steering control system 104 , the speed control system 106 , the motors 118 , and the motors 124 , respectively, to move the respective robot 36 , 38 , 40 , and 48 .
- the processors 62 , 64 , 66 , and 68 may perform a task of acquiring data by receiving signal(s) via the sensors.
- the processor 60 may generate the plan based on predictions or risk assessments derived from the digital representation. For example, certain conditions may be associated with a certain level of risk.
- the processor 60 may control an overall objective of the robotic system 10 in a manner that manages risk. As such, in some embodiments, the processor 60 may generate the plan in a manner that maximizes availability of the asset (e.g., minimizes operation disruption) while applying economic remediation measures (e.g., to minimize cost).
- the processor 60 may plan the paths 22 , 24 , 26 , and 32 of the drones 14 , 16 , 18 , and 20 , respectively, in a manner that enables the robot to provide data of the asset 12 based desired coverage of the asset, excluded areas from visibility of the asset, high risk areas of the asset 12 more likely to have defects than other areas of the asset 12 , or the like.
- the processor 60 may plan tasks of the robots based on a type of robot available, a type of sensor available, energy usage, or any combination thereof. For example, certain locations on the asset 12 may be difficult to inspect on the ground, and may be suitable to be inspected aerially due to the location (e.g., on top of the asset).
- the processor 60 may assign tasks based on the location on the asset 12 to be inspected and the robot available.
- the processor 60 may then send signal(s) to the robots 14 , 16 , 18 , and 20 indicating instructions to perform the tasks of the plan.
- the processors 62 , 64 , and 68 may begin by receiving the signal(s) indicating instructions to perform the tasks.
- the robots may then receive data related to the asset 12 by performing the tasks.
- the processors 62 , 64 , and 68 may then send and/or signal(s) from the sensors and or effectors, as described above, to execute the tasks.
- the processors 62 , 64 , and 68 may then receive sensor data indicating one or more characteristics of the asset 12 and send the sensor data to the controller 50 of the control system 34 .
- each of the processors 62 , 64 , and 68 may analyze the sensor data to detect defects (e.g., defect recognition), as described in detail below, and/or to assess quality of the data.
- the processor 60 may send signal(s) to the robots 36 , 38 , and 46 to perform adaptive sampling in which data received from the sensors is included in the determination of the plan to adjust a manner in which later data is received to better capture data (e.g., where data is most desired). For example, data received from the robots 36 , 38 , and 46 may be used to adapt the plan to acquire more data related to missing information regarding the asset.
- each of the controllers may determine if the sensor data is above the threshold level of quality desired.
- each of the processors 62 , 64 , and 68 may determine if the signal-to-noise ratio of the signals from the sensors indicates that the quality of data is above the threshold level of quality.
- each of the processors 62 , 64 , and 68 may adapt the plan (e.g., on-the-fly) to acquire additional data (e.g., via the sensors 96 , 98 , and 100 ) related to a potential defect and/or to send signal(s) (e.g., via the antennas 82 , 84 , and 88 ) to the control system to inform an operator of the quality of data. If sufficient data has been acquired that is greater than the threshold level of quality, the process may continue to block 140 , as shown at block 156 .
- FIG. 5 shows a process performed at block 140 by one or more of the processors 60 , 62 , 64 , and 68 to perform automated defect recognition (ADR).
- each controller 52 , 54 , and 56 of the robots 36 , 38 , and 40 may acquire data related to one or more characteristics of an asset 12 .
- the data may be acquired via the respective sensors 96 , 98 , and 100 of the robots 36 , 38 , and 40 .
- the data may include environmental data from one or more environmental sensors other than the sensors 96 , 98 , and 100 .
- the asset 12 may include one or more sensors to provide information to the robotic system 10 .
- the processor 60 of the control system 34 may receive the data from the robots 36 , 38 , and 40 and generate a digital representation of the asset 12 based on the one or more characteristics. That is, the data collected may be used to build, update, and maintain a digital representation of the asset 12 as described above. Additionally and/or alternatively, the processor 60 may generate the digital representation based in part on physics models and/or domain knowledge.
- the digital representation may include a mathematical model that has variables extrapolated from various parts of the asset 12 .
- the processor 60 may generate a digital representation that includes physical geometry of the asset 12 (e.g., gathered via the sensors 96 , 98 , and 100 ), a 3D model of the asset 12 , materials of the asset 12 , lifespan of the asset 12 , observed or measured performance of the asset 12 , or any combination thereof.
- each of the robots 36 , 38 , and 40 may, solely or collaboratively, generate a digital representation of all or part of the asset 12 based on the acquired data.
- one or more of the processors 60 , 62 , 64 , and 68 may detect the defect 121 of the asset 12 based on the one or more characteristics.
- the defect 121 may include a crack in the physical structure of the asset 12 , corrosion on the asset 12 , debris on the asset 12 , material aging of the asset 12 , missing parts of the asset 12 , or any other suitable anomaly of the asset 12 .
- the digital representation may include a location of the defect with respect to geometry of the asset 12 .
- the processor 60 may recognize the defect 121 by comparing the one or more characteristics with prior knowledge of the asset 12 or by analysis of the digital representation against known parameters or patterns.
- the controller 50 of the control system 34 may send signal(s) to the controllers 52 , 54 , and 56 indicating instructions to adapt the plans to acquire additional data related to the potential defect.
- the controller 50 may send signal(s) to the display 130 to display data related to the defect 121 to inform an operator.
- the processor 60 may determine risk associated with the defect 121 of the asset 12 based on the severity of the defect, the location of the defect, the likelihood of poor performance due to the defect, among others. Further, depending on the risk associated with the defect, the processor 60 may determine whether or not to perform a maintenance action. For example, if the processor 60 determines that a likelihood of improved performance from repairing the defect 121 of the asset 12 outweighs the cost associated with repairing the defect, then the processor 60 may send signal(s) indicating instructions to perform the maintenance action (block 172 ). For example, the controller 50 may send signal(s) to the 3D printer indicating instructions to print a 3D printed part, as described in detail below. In some embodiments, the maintenance actions may be related to robot fleet management.
- the processor 60 may send signal(s) indicating instructions to inspect areas based on previously detected anomalies and the risk of the anomalies. For instance, the processor 60 may send signal(s) indicating instructions to inspect an area of the asset 12 that is prone to cracking.
- a maintenance action related to robot fleet management may relate to setting or modifying an inspection interval, specifying certain types of robots and/or sensors be deployed for an inspection, acquiring operation or functional data related to asset performance that might relate to a possible or pending defect, and so forth.
- FIG. 6 shows an example of a process performed at block 142 by one or more of the processors 60 , 62 , 64 , and 68 to manage the asset 12 based on the defect 121 .
- the example shown describes a process of 3D printing a repair part. The process described is meant to be an example, and other processes may be performed to manage the asset 12 , such as replacing, removing, cleaning, welding, or lubricating a part of the asset 12 , among others.
- the processor 60 may send signal(s) to the display 130 indicating instructions to display a recommendation to an operator.
- the processor 60 may determine an action to be performed, such as a maintenance and/or a repair operation.
- the processor may create a 3D model of a repair to a part of an asset 12 from the digital representation of the asset 12 .
- each of the robots 36 , 38 , and 40 may acquire visual image data from image sensors as well as depth information from infrared sensors.
- the robot controllers 52 , 54 , and 56 may send signal(s), via the antennas 82 , 84 , and 88 , to the controller 50 indicating the visual image data and depth information of the asset 12 .
- the controller 50 may receive the signal(s) via the antenna 80 and the processor 60 may construct a 3D model of the repair to the part of the asset 12 based on the visual image data and depth information.
- the 3D model may constructed by having a part library that includes each parts of the asset 12 . Further, the 3D model associated with the part having the defect may be printed to replace the existing part of the asset 12 .
- the asset 12 may have known repair parts that are associated with defects from prior inspections. Upon recognizing a defect that shares characteristics of the prior defect, the processor 60 may select the 3D model from the known repair parts. In some embodiments, the processor 60 augment the 3D model via coloring based on the sensor data and provide the augmented 3D model to an operator via the display.
- the processor 60 may create the 3D model based on domain knowledge regarding the asset 12 .
- the processor 60 may be assessing an oil and gas pipeline for defects.
- the processor 60 may create a 3D model that secures the oil and gas within the pipeline by detecting locations and distances of edges of the aperture on the pipeline to create a 3D model having a size and shape that matches the detected locations and distances.
- the processor 60 may determine whether the oil and gas pipeline is liquid tight such that liquids would not leak from the application of the 3D model.
- the processor 60 may split the 3D model into one or more 3D printable parts to meet desired print times and/or based on the source materials 126 used to print the 3D printed part 119 .
- the processor 60 may send signal(s) to the controller 58 of the 3D printer 48 indicating the 3D model to be printed (block 186 ).
- the controller 58 of the 3D printer 48 may receive the 3D model via the antenna 86 and send signal(s) to the motors 124 to control the gantry 120 and/or the extruder 122 to create the 3D printed part 119 from the 3D model.
- the robotic system 10 may repair the asset with the 3D printed part.
- the controller 58 of the 3D printer may send signal(s) to the controller 56 of the robot 40 indicating that the 3D printed part 119 is created.
- the robot 40 may send signal(s) indicating instructions to manipulate the manipulator arm 114 and the effector 116 to receive the 3D printed part 119 and to install the 3D printed part 119 onto the defect 121 of the asset 12 .
- FIG. 7 shows an example of a user interface 128 displayed on the display 130 to a user of the control system 34 of FIG. 2, in accordance with aspects of the present disclosure. The user interface 128 may include a display panel 196 that displays sensor data, such as images, from the robotic system 10 (e.g., from the controllers 50, 52, 54, 56, and 58). The user interface 128 may include one or more overlays 198 that may overlay features of the data on the display panel 196. The controller 50 may receive signals (e.g., via a touchscreen, a keyboard, a mouse, etc.) indicating a selection of one or more overlays 198. The controller 50 may then send signal(s) indicating instructions to display a heat map overlaid on a model 200 of the asset 12 in the display panel 196, having heat signatures 202 from an IR sensor in an identifying color to enable the user to recognize the heat signatures 202 on the asset 12. Similarly, the controller 50 may receive a selection indicative of instructions to overlay recognized defects on the display panel 196, or any other overlay suitable for an operator to assess the asset 12, such as corrosion, cracks, or the like.
- In summary, a robotic system may plan one or more paths for robots to perform tasks to acquire characteristics of an asset. The robots may inspect the asset and receive data from sensors indicating the characteristics of the asset. A processing system of the robotic system may then detect a defect of the asset. The robotic system may then repair the asset by replacing a part, 3D printing a part, or performing another maintenance operation. For example, the processing system may create a 3D model to print to repair the asset and send signal(s) to a 3D printer to print the 3D model. Further, the processing system may display the model of the asset on a display, and the display may display one or more overlays onto the model to enable an operator to assess various characteristics of the asset, such as heat signatures.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Business, Economics & Management (AREA)
- Robotics (AREA)
- Economics (AREA)
- Manufacturing & Machinery (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Strategic Management (AREA)
- Remote Sensing (AREA)
- Quality & Reliability (AREA)
- Marketing (AREA)
- Tourism & Hospitality (AREA)
- General Business, Economics & Management (AREA)
- Theoretical Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Operations Research (AREA)
- Development Economics (AREA)
- Human Resources & Organizations (AREA)
- Entrepreneurship & Innovation (AREA)
- Thermal Sciences (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Investigating Or Analyzing Materials By The Use Of Magnetic Means (AREA)
Abstract
Description
- This application claims priority to U.S. Provisional Patent Application No. 62/343,615, entitled “ROBOT SYSTEM FOR ASSET HEALTH MANAGEMENT”, filed May 31, 2016, and U.S. Provisional Patent Application No. 62/336,332, entitled “ROBOT SYSTEM FOR ASSET HEALTH MANAGEMENT”, filed May 13, 2016, which are both herein incorporated by reference in their entirety for all purposes.
- The subject matter disclosed herein relates to asset management, and more particularly, to monitoring and managing health of an asset using a robotic system.
- Various entities may own or maintain various types of assets as part of their operation. Such assets may include physical or mechanical devices or structures, which may, in some instances, have electrical and/or chemical aspects as well. Such assets may be used or maintained for a variety of purposes and may be characterized as capital infrastructure, inventory, or by other nomenclature depending on the context. For example, assets may include distributed assets, such as a pipeline or an electrical grid, as well as individual or discrete assets, such as an airplane, a tower, or a vehicle. Assets may be subject to various types of defects (e.g., spontaneous mechanical defects, electrical defects, as well as routine wear-and-tear) that may impact their operation. For example, over time, the asset may undergo corrosion or cracking due to weather or may exhibit deteriorating performance or efficiency due to the wear or failure of component parts.
- Typically, one or more human inspectors may inspect, maintain, and repair the asset. For example, the inspector may locate corrosion on the asset and clean the corrosion from the asset. However, depending on the location, size, and/or complexity of the asset, having one or more human inspectors perform inspection of the asset may take time away from other tasks the inspectors could perform. Additionally, some inspection tasks may be dull, dirty, or otherwise unsuitable for a human to perform. For instance, some assets may have locations that are not accessible to humans due to height, confined spaces, or the like. Further, inspections may be performed at times that are based on schedules, resulting in either over-inspection or under-inspection. Accordingly, improved systems and techniques for managing the health of various types of assets are desirable.
- Certain embodiments commensurate in scope with the originally claimed disclosure are summarized below. These embodiments are not intended to limit the scope of the claimed disclosure, but rather these embodiments are intended only to provide a brief summary of possible forms of the disclosure. Indeed, embodiments may encompass a variety of forms that may be similar to or different from the embodiments set forth below.
- In a first embodiment, a robotic system for monitoring health of an asset includes at least one robot comprising at least one sensor capable of detecting one or more characteristics of an asset and at least one effector capable of performing a repair or maintenance operation on the asset, wherein the at least one robot is configured to inspect an asset with the at least one sensor, and a processing system having at least one processor operatively coupled to at least one memory, wherein the processor is configured to receive sensor data from the at least one sensor indicating one or more characteristics of the asset, generate, update, or maintain a digital representation that models the one or more characteristics of the asset, detect a defect of the asset based at least in part on the one or more characteristics, and generate an output signal encoding or conveying instructions to provide a recommendation to an operator, to control the at least one robot to address the defect on the asset, or both, based on the defect and the digital representation of the asset.
- In a second embodiment, a non-transitory, computer readable medium includes instructions configured to be executed by a processor of a robotic system including at least one robot, wherein the instructions include instructions configured to cause the processor to receive sensor data from at least one sensor of the at least one robot indicating one or more characteristics of the asset, generate, update, or maintain a digital representation that models the one or more characteristics of the asset, detect a defect of the asset based at least in part on the one or more characteristics, and generate an output signal encoding or conveying instructions to provide a recommendation to an operator, to control the at least one robot to address the defect on the asset, or both, based on the defect and the digital representation of the asset.
- In a third embodiment, a method includes receiving sensor data from at least one sensor of at least one robot indicating one or more characteristics of an asset, generating, updating, or maintaining a digital representation that models the one or more characteristics of the asset, detecting a defect of the asset based at least in part on the one or more characteristics, and generating an output signal encoding or conveying instructions to provide a recommendation to an operator, to control the at least one robot to address the defect on the asset, or both, based on the defect and the digital representation of the asset.
- These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
- FIG. 1 is a perspective view of a robotic system with a set of robots to monitor and manage the health of an asset, in accordance with aspects of the present disclosure;
- FIG. 2 is a block diagram of the robotic system of FIG. 1 having a second set of robots to manage another asset with a control system, in accordance with aspects of the present disclosure;
- FIG. 3 is a flow diagram of a process performed by a controller of the control system of FIG. 2 to manage asset health, in accordance with aspects of the present disclosure;
- FIG. 4 is a flow diagram of a process performed by the controller when performing the process of FIG. 3, in accordance with aspects of the present disclosure;
- FIG. 5 is a flow diagram of another process performed by the controller when performing the process of FIG. 3, in accordance with aspects of the present disclosure;
- FIG. 6 is a flow diagram of another process performed by the controller when performing the process of FIG. 3, in accordance with aspects of the present disclosure; and
- FIG. 7 is a schematic diagram of a user interface displayed to a user of the control system of FIG. 2, in accordance with aspects of the present disclosure.
- The subject matter disclosed herein relates to managing repair and/or maintenance of an asset with a robotic system. Such an approach may be useful in monitoring or repairing assets associated with various entities, including business or corporate entities, governments, individuals, non-profit organizations, and so forth. As discussed herein, such assets may be generally discrete or limited in their extent (e.g., a vehicle such as a plane, helicopter, ship, submersible, space launch vehicle, satellite, locomotive, and so forth) or may be geographically distributed (e.g., a road or rail track, a port or airport, a pipeline or electrical infrastructure, a power generation facility or manufacturing plant, and so forth). The present approach as described herein may be used to monitor and maintain assets of these types (as well as others not listed) in an autonomous or semi-autonomous manner using robotic intermediaries. As discussed herein, the robotic intermediaries may be used to facilitate one or both of health monitoring of the asset and repair, remediation, or improvement of the asset with limited or no human support.
- With this in mind, it will be appreciated that in a variety of fields, assets, such as distributed assets and/or individual assets, may be used to perform any number of operations. Over time, assets may deteriorate due to weather, physical wear, or the like. For example, over months or years, one or more components of an asset may wear or deteriorate due to rain and wind or other environmental conditions or due to inadequate maintenance. Alternatively, in some instances, spontaneous failures of one or more components or systems of an asset may occur, which may be unrelated to wear or maintenance conditions but may instead be attributable to an undetected defect or an unknown stressor. Regardless of whether an asset defect is due to a gradual process or a sudden occurrence, the health of the asset depends on identifying and addressing such defects in a timely and effective manner.
- In conventional approaches, one or more human agents (e.g., field engineers, operators, or other users of the asset) may inspect the asset for wear at limited intervals to maintain health of the asset and/or to replace parts that appear worn. However, the human agents may be unable to inspect components or locations that may not be easily accessible to humans, such as below the waterline of a marine asset, within a tank or pipe of a pipeline or storage facility, on the exterior surfaces or components of a vehicle in motion (such as a flying plane or helicopter, or a moving truck or locomotive), and so forth. Further, when defects are located, for some assets, human-based repair may require taking the asset out of operation to implement a repair. As such, it is desirable to find improved ways to monitor and maintain various assets.
- With the preceding in mind, in certain embodiments disclosed herein, a robot system may be used to monitor and manage health of an asset in a manner that reduces or eliminates human intervention. A robot may be a machine (e.g., electro-mechanical) capable of carrying out a set of tasks (e.g., movement of all or part of the machine, operation of one or more types of sensors to acquire sensed data or measurements, and so forth) automatically (e.g., at least partially without input, oversight, or control by a user), such as a set of tasks programmed by a computer. For example, the robot may include one or more sensors to detect one or more characteristics of an asset and one or more effectors to perform an operation based on a plan to assess, repair, or service the asset. The robot system may include a processing system that includes one or more processors operatively coupled to memory and storage components. While this may be conceptualized and described below in the context of a single processor-based system to simplify explanation, the overall processing system used in implementing an asset management system as discussed herein may be distributed throughout the robotic system and/or implemented as a centralized control system. With this in mind, the processor may be configured to generate a plan to assess the asset for defects. For example, the processor may determine a plan based on the tasks (e.g., desired inspection coverage of the asset) and/or resources (e.g., robots) available. Based on the generated plan, the processor may implement the plan by sending signal(s) to the robots providing instructions to perform the tasks defined in the plan. A controller of each robot may process any received instructions and in turn send signal(s) to one or more effectors controlled by the respective robot to control operation of the robot to perform the assigned tasks.
- The processor may determine a plan to monitor the asset. The plan may include one or more tasks to be performed by one or more robots of the robotic system. Further, the processor may adjust (e.g., revise) the plan based on the data received from the sensors related to the asset. For example, the plan may be adjusted based on acquired data indicative of a potential defect of the asset. The processor may send signal(s) encoding or conveying instructions to travel a specified distance and/or direction that enables the robot to acquire additional data related to the potential defect.
- Once the assigned tasks have been performed, the processor may assess the quality of the data received from the sensors. Due to a variety of factors, the quality of the data may be below a threshold level of quality. For example, pressure sensors or acoustic sensors may have background noise due to the conditions proximate to the asset. As such, the processor may determine a signal-to-noise ratio of the signals from the sensors that indicates a relationship between a desired signal and background noise. If the processor determines that the signal-to-noise ratio falls below a threshold level of quality, the processor may adapt the plan to acquire additional data. If the processor determines that the signal-to-noise ratio is above a threshold level of quality, the processor may proceed to perform maintenance actions based on the sensor data.
- In certain embodiments, to perform maintenance actions, the processor may generate, maintain, and update a digital representation of the asset based on one or more characteristics that may be monitored using robot intermediaries and/or derived from known operating specifications. For example, the processor may create a digital representation that includes, among other aspects, a 3D structural model of the asset (which may include separately modeling components of the asset as well as the asset as a whole). Such a structural model may include material data for one or more components, lifespan and/or workload data derived from specifications and/or sensor data, and so forth. The digital representation, in some implementations, may also include operational or functional models of the asset, such as flow models, pressure models, temperature models, acoustic models, lifing models, and so forth. Further, the digital representation may incorporate or separately model environmental factors relevant to the asset, such as environmental temperature, humidity, or pressure (such as in the context of a submersible asset, airborne asset, or space-based asset). As part of maintaining and updating the digital representation, one or more defects in the asset as a whole or components of the asset may also be modeled based on sensor data communicated to the processing components.
- Depending on the characteristics of the structural model, the processor may generate a plan specifying one or more tasks or actions, such as acquiring additional data related to the asset. For example, if the processor determines that acquired data of a location on the structural model is below a threshold quality or is otherwise insufficient, the processor may generate or update a revised plan that includes one or more tasks that position the robot to acquire additional data regarding the location.
- The sensor data used to generate, maintain, and update the digital representation, including modeling of defects, may be derived from sensor data collected using one or more sensors mounted on robots controlled by the processing components and/or by sensors integral to the asset itself, which communicate their sensor data to the processing components. As used herein, the robots used to collect sensor data, as well as effect repairs, may be autonomous and capable of movement and orientation in one (such as along a track), two (such as along connected roads or along a generally planar surface), or three dimensions (such as three-dimensional movement within a body of water, air, or space). The sensors used to collect the sensor data may vary between robots and/or may be interchangeable so as to allow customization of robots depending on need. Examples of sensors include, but are not limited to, cameras or visual sensors capable of imaging in one or more of visible, low-light, ultraviolet, and/or infrared (i.e., thermal) contexts, thermistors or other temperature sensors, material and electrical sensors, pressure sensors, acoustic sensors, radiation sensors or imagers, probes that apply non-destructive testing technology, and so forth. With respect to probes, for example, the robot may contact or interact physically with the asset to acquire data.
- The digital representation may incorporate or be updated based on a combination of factors detected from one or more sensors on the robot (or integral to the asset itself). For instance, the processor may receive visual image data from image sensors (e.g., cameras) on the robots to create or update a 3D model of the asset to localize defects on the 3D model. Based on the sensor data, as incorporated into the 3D model, the processor may detect a defect, such as a crack, a region of corrosion, or a missing part of the asset. For example, the processor may detect a crack on a location of a vehicle based on visual image data that includes color and/or depth information indicative of the crack. The 3D model may additionally be used as a basis for modeling other layers of information related to the asset. Further, the processor may determine risk associated with a potential or imminent defect based on the digital representation. Depending on the risk and a severity of the defect, the processor, as described above, may send signal(s) to the robots indicating instructions to repair or otherwise address a present or pending defect.
- In some embodiments, to repair, remediate, or otherwise prevent a defect, the processor may create a 3D model of a part, or component pieces of the part, of the asset needed for the repair. The processor may generate descriptions of printable parts or part components (i.e., parts suitable for generation using additive manufacturing techniques) that may be used by a 3D printer (or other additive manufacturing apparatus) to generate the part or part components. Based on the generated instructions or descriptions, the 3D printer may create the 3D printed part to be attached to or integrated with the asset as part of a repair process. Further, one or more robots may be used to repair the asset with the 3D printed part(s). While a 3D printed part is described in this example, other repair or remediation approaches may also be employed. For example, in other embodiments, the processor may send signal(s) indicating instructions to a controller of a robot to control the robot to spray a part of the asset (e.g., with a lubricant or spray paint) or to replace a part of the asset from an available inventory of parts. Similarly, in some embodiments, a robot may include a welding apparatus that may be autonomously employed to perform an instructed repair. In some embodiments, the processor may send signal(s) to a display to present information that enables an operator to repair the defect.
- With the preceding introductory comments in mind,
FIG. 1 shows a perspective view of a robotic system 10 that manages health of an asset 12 by inspecting and/or repairing the asset 12. The robotic system 10 may include a fleet of robots, such as drones (capable of autonomous movement in one, two, or three dimensions, including movement with or without an attached electrical and/or data tether), machines, computing systems, and so forth. Each of the robots may receive data via sensors and/or may control operation of one or more effectors of the robot. In the illustrated embodiment, the robotic system 10 includes robots, such as drones, that each have red-green-blue (RGB) sensors, such as cameras, image sensors, photodiodes, or the like, to generate signals indicating characteristics of the asset 12 when the RGB sensor is directed toward the asset. In the present disclosure, the drones with RGB sensors are referred to as a first red-green-blue (RGB) drone 14, a second RGB drone 16, and a third RGB drone 18. Alternatively, the drones may be referred to more generally as robots. The first RGB drone 14, the second RGB drone 16, and the third RGB drone 18 may receive signals indicative of colors of an exterior of the asset 12. Further, the robotic system 10 may include a drone having an infrared (IR) camera, referred to as an IR drone 20. While the robotic system 10 of the illustrated embodiment includes drones, any suitable robot that operates at least partially autonomously (e.g., without input from an occupant within the vehicle) may be included in the robotic system 10, such as unmanned aerial vehicles, unmanned ground vehicles (e.g., autonomous trucks or locomotives), unmanned underwater or surface water vehicles, unmanned space vehicles, crawling robots, or a combination thereof. Further, the robots may operate in one dimension, two dimensions, or three dimensions. While the illustrated embodiment includes four drones, this is meant to be an example, and any suitable number and types of robots (e.g., drones) may be employed. Additionally and/or alternatively, the robots may include drones that are manually guided by an operator. For example, the operator may have a remote control that sends signal(s) to the manually guided drone to control a location and/or orientation of the drone.
- In some embodiments, it may be desirable to have robots that move (autonomously or under direction) to various positions proximate to the asset 12 to acquire sensor data describing one or more characteristics of the asset from different perspectives with respect to the asset 12. For example, the RGB drones 14, 16, and 18 may move with respect to the asset 12 to receive signal(s) from the RGB sensors indicating the characteristics of the asset 12. That is, the drones may receive instructions to move the respective drone to capture the asset 12 from another perspective. In some embodiments, the instructions may be received from a control system, or the instructions may be stored in memory of the RGB drones 14, 16, and 18. For example, the instructions may instruct each of the RGB drones 14, 16, and 18 to capture images at regular intervals in a flight path with respect to the asset 12, with or without continuous communication and instruction from a separate controller (e.g., a centralized controller). For example, each of the RGB drones 14, 16, and 18 may move along a respective path 22, 24, and 26 that orbits the asset 12 and/or directs the RGB sensors toward the asset 12. Similarly, the IR drone 20 may move along a path 32 that orbits the asset and/or directs the IR sensor toward the asset 12 to enable the IR drone 20 to capture infrared data indicating depth information of the asset 12.
- The robotic system 10 may be self-organizing, in which tasks are allocated to various members of a multi-robot team based on each robot's capabilities. For example, a control system may include a controller that acquires a list of robots along with the capabilities of each robot. The controller may determine task assignments for each robot based on the respective capabilities of each robot. For example, a light drone having more flight endurance may be assigned by the controller to perform rough identification of anomalies. Another drone carrying a high resolution camera, but having less flight time, may be assigned by the controller to move to specific locations to capture high resolution imagery. The controller may send signal(s) to each of the drones indicating instructions to perform the assigned tasks based on the capabilities of each robot.
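- As an illustration only (not the disclosed implementation), capability-based allocation of this kind might be sketched as follows; the robot names, capability tags, endurance figures, and greedy assignment rule are all hypothetical:

```python
# Illustrative sketch of capability-based task allocation (hypothetical names and figures).
from dataclasses import dataclass

@dataclass
class Robot:
    name: str
    capabilities: set        # e.g., {"aerial", "high_res_camera"}
    endurance_min: float     # remaining mission endurance, in minutes

@dataclass
class Task:
    name: str
    required: set            # capabilities the task needs
    est_duration_min: float

def assign_tasks(robots, tasks):
    """Greedily assign each task to a capable robot with the most endurance left."""
    assignments = {}
    for task in sorted(tasks, key=lambda t: -t.est_duration_min):
        candidates = [r for r in robots
                      if task.required <= r.capabilities
                      and r.endurance_min >= task.est_duration_min]
        if not candidates:
            assignments[task.name] = None   # no capable robot; task deferred
            continue
        best = max(candidates, key=lambda r: r.endurance_min)
        best.endurance_min -= task.est_duration_min
        assignments[task.name] = best.name
    return assignments

robots = [Robot("light_drone", {"aerial"}, 90.0),
          Robot("camera_drone", {"aerial", "high_res_camera"}, 25.0)]
tasks = [Task("rough_anomaly_scan", {"aerial"}, 60.0),
         Task("close_up_imaging", {"aerial", "high_res_camera"}, 20.0)]
print(assign_tasks(robots, tasks))
# -> {'rough_anomaly_scan': 'light_drone', 'close_up_imaging': 'camera_drone'}
```

As in the example above, the long-endurance drone ends up with the rough scan while the shorter-endurance, high-resolution drone takes the close-up imaging.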
- Moreover, the robotic system 10 may include inspection systems 28 (e.g., video cameras) positioned in locations proximate to the asset 12 to acquire various characteristics of the asset 12. Such positioned systems, unlike the drones described above, may be stationary or have limited movement from a fixed position, such as being mounted on a remotely controlled moveable arm or having pan, tilt, and zoom functionality. For example, the inspection systems may acquire color and/or depth information related to the asset 12, as well as information related to the process performed by the drones, such as position with respect to the asset 12, flight path information with respect to each other, altitude information, or the like. Similarly, the robotic system 10 may include a manually controlled drone 30 to acquire various characteristics of the asset 12, similar to those described above regarding the autonomous RGB and IR drones 14, 16, 18, and 20.
- The robotic system 10 may plan a mission to inspect the asset and analyze data from the inspection to find one or more defects. Such a plan may be generated based on available inspection and/or repair assets (e.g., what robots are available having what sensing modalities or which can be outfitted with what sensing modalities, what robots are available having what repair modalities, what stationary or integral sensor data is available for the asset, and so forth) as well as on the age and/or inspection and repair history of the asset.
- FIG. 2 is a block diagram of the robotic system 10 having a second set of robots that each include one or more sensors and one or more effectors. In the illustrated embodiment, the robotic system 10 includes a control system 34, a first robot 36, a second robot 38, a third robot 40, and a three-dimensional (3D) printer 48. Further, in the example shown in FIG. 2, the first robot 36 may be an RGB drone, the second robot 38 may be an autonomous vehicle, and the third robot 40 may be a manipulator system. While the robotic system 10 includes an RGB drone, an autonomous vehicle, and a manipulator system, the robots used in FIG. 2 are simply meant to be an example, and any suitable robots (e.g., crawling robots, underwater robots, manually controlled robots, etc.) may be included. The robots 36, 38, and 40 may include a first processing system 42, a second processing system 44, and a third processing system 46, respectively. While the robotic system 10 may include the centralized control system 34 as shown in FIG. 2, in other embodiments, parts of the planning and/or control may be distributed to each of the processing systems of the robotic system 10. Further, while three processing systems are shown, it should be appreciated that any suitable number of processing systems may be used.
- In the illustrated embodiment, the control system 34, the first processing system 42, the second processing system 44, the third processing system 46, and the 3D printer 48 each include a controller 50, 52, 54, 56, and 58, respectively. Each controller 50, 52, 54, 56, and 58 may include a processor 60, 62, 64, 66, and 68. The controllers 50, 52, 54, 56, and 58 may also include one or more memory devices 70, 72, 74, 76, and 78 that store instructions executable by the processors 60, 62, 64, 66, and 68 to carry out the presently disclosed techniques, such as inspecting the asset 12, repairing and/or maintaining the asset 12, and so forth. Moreover, the processors 60, 62, 64, 66, and 68 may include multiple microprocessors, one or more "general-purpose" microprocessors, one or more special-purpose microprocessors, and/or one or more application specific integrated circuits (ASICs), or some combination thereof. For example, the processors 60, 62, 64, 66, and 68 may include one or more reduced instruction set (RISC) processors.
memory device memory device memory device respective processors asset 12, repairing and/or maintaining theasset 12, and so forth. The storage device(s) (e.g., nonvolatile storage) may include ROM, flash memory, a hard drive, or any other suitable optical, magnetic, or solid-state storage medium, or a combination thereof. The storage device(s) may store data (e.g., planned flight paths, sensor data, etc.), the model of the asset used for health management, instructions (e.g., software or firmware for controlling the vehicle, etc.), and any other suitable data. - Further, the
- Further, the control system 34, the first processing system 42, the second processing system 44, the third processing system 46, and the 3D printer 48 may each include a radio frequency (RF) antenna 80, 82, 84, 86, and 88, respectively, via which the controllers 50, 52, 54, 56, and 58 may send and/or receive signals. For example, the first processing system 42 of the RGB drone 36 may send signal(s), via the antenna 82, to the antenna 80 of the control system 34 indicative of a position of the RGB drone 36. The control system 34 may send signal(s), via the antenna 80, to the antenna 82 of the first processing system 42 of the RGB drone 36 indicative of instructions to control the RGB drone 36 based on the position of the RGB drone 36. For instance, each of the controllers may communicate with one another to coordinate movement of the respective robots with respect to the asset 12. That is, the flight patterns (e.g., direction, distance, and timing) of the drones may be synchronized with one another to prevent the drones from interfering with one another while inspecting the asset 12.
processing systems spatial locating devices drone 36, theautonomous vehicle 38, and theground robot 40, respectively. As will be appreciated, thespatial locating devices drone 36, theautonomous vehicle 38, and theground robot 40, respectively, such as global positioning system (GPS) receivers, for example. In certain embodiments, theprocessing systems more sensors respective processors respective robot - Each of the
processing systems more sensors respective controllers respective robots asset 12. Thesensors robot - Moreover, each of the
robots robot 36 may include one ormore motors 102 that control operation of therobot 36. Each of therobots controller 52 may send signal(s) to themotors 102 of therobot 36 to control a speed of the rotor of themotor 102, thereby controlling the position of therobot 36. For example, the controller may send signal(s) indicating instructions to increase or decrease speed of one or more of the rotors of themotors 102 to adjust yaw, pitch, roll, or altitude, of therobot 36. - The
- The controller 54 of the robot 38 may generate and send signal(s) to control one or more operations of the robot 38. For instance, the controller 54 may send signal(s) to a steering control system 104 to control a direction of movement of the robot 38 and/or to a speed control system 106 to control a speed of the robot 38. For example, the steering control system 104 may include a wheel angle control system that rotates one or more wheels and/or tracks of the robot 38 to steer the robot 38 along a desired route. By way of example, the wheel angle control system may rotate front wheels/tracks, rear wheels/tracks, and/or intermediate wheels/tracks of the robot 38, either individually or in groups. In certain embodiments, a differential braking system may independently vary the braking force on each lateral side of the robot 38 to direct the robot 38 along the desired route. Similarly, a torque vectoring system may differentially apply torque from an engine to wheels and/or tracks on each lateral side of the robot 38, thereby directing the robot 38 along a desired route. While the illustrated embodiment of the steering control system 104 includes the wheel angle control system, it should be appreciated that alternative embodiments may include one, two, or more of these systems, among others, in any suitable combination.
- In the illustrated embodiment, the speed control system 106 may include an engine output control system 110 and/or a braking control system 112. The engine output control system 110 is configured to vary the output of the engine to control the speed of the robot 38. For example, the engine output control system 110 may vary a throttle setting of the engine, a fuel/air mixture of the engine, a timing of the engine, other suitable engine parameters to control engine output, or a combination thereof. Furthermore, the braking control system 112 may adjust braking force, thereby controlling the speed of the robot 38. While the illustrated automated speed control system 106 includes the engine output control system 110 and the braking control system 112, it should be appreciated that alternative embodiments may include one of these systems, among other systems.
- The robot 40 may include a mechanism to repair, replace, or otherwise maintain the asset 12, such as a manipulator, magnet, suction system, sprayer, or lubricator. In the illustrated embodiment, the robot 40 includes a manipulator arm 114, such as an electronic, hydraulic, or mechanical arm. Further, the robot 40 includes an effector 116, such as a clamp, a container, a handler, or the like. The manipulator arm 114 and the effector 116 may operate in conjunction with each other to maintain the asset 12. As an example, the controller 56 of the robot 40 may send signal(s) indicating instructions to control one or more motors 118 of the manipulator arm 114 and the effector 116 to control the position of the manipulator arm 114 and the effector 116 to perform the desired operation. As will be appreciated, the controller 56 may send signal(s) indicating instructions to cause the motors 118 to move the manipulator arm 114 to a position and/or to secure a 3D printed part 119 onto a defect 121 of the asset 12.
- In some embodiments, the controller 50 may determine a plan that instructs the robots 36, 38, and 40 to inspect the asset 12 for defects. Upon detecting a defect, the controller 50 may send signal(s) to the robots 36, 38, and 40 indicating instructions to address the defect, for example, by applying a patch, by replacing a part of the asset 12 from an inventory of parts or with a 3D printed part, or the like. Upon addressing the defect (e.g., applying a patch, replacing a part, or spraying a part), the controller 50 may determine a plan that instructs the robots 36, 38, and 40 to confirm that the defect has been addressed. For example, the controller 50 may send signal(s) encoding or conveying instructions to control the robots 36, 38, and 40 to acquire additional data of the asset 12. The controller 50 may acquire sensor data from the sensors 96, 98, and 100 via the respective antennas. The controller 50 may then adjust the plan to monitor the addressed defect of the asset 12 by adjusting or adding one or more tasks to the plan to acquire additional data related to the asset 12. Upon acquiring data indicative of the defect being addressed, the controller 50 may then send signals to a display 130 to display data related to the asset 12, such as detected defects, potential defects, recommendations, repairs, replacement parts, or the like, to an operator.
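- A minimal sketch of the detect-repair-confirm cycle described above follows; the state dictionary, the inspect/repair callables, and the retry limit are hypothetical stand-ins for the robot-driven operations, not the disclosed control logic:

```python
# Minimal sketch of the inspect -> repair -> verify cycle (hypothetical stand-ins).
def manage_defects(inspect, repair, max_verify_passes=3):
    """inspect() returns the currently observed defects; repair(defect) attempts a fix.

    After each repair, re-inspection is performed to confirm the defect was
    addressed, mirroring the plan adjustment described above.
    """
    for defect in inspect():
        repair(defect)
        for attempt in range(max_verify_passes):
            if defect not in inspect():     # re-acquire data to confirm the fix
                print(f"{defect}: repair confirmed after {attempt + 1} pass(es)")
                break
        else:
            print(f"{defect}: still present; escalating to operator display")

# Toy stand-ins for robot-driven inspection and repair:
state = {"defects": {"crack_A"}}
manage_defects(inspect=lambda: sorted(state["defects"]),
               repair=lambda d: state["defects"].discard(d))
# -> crack_A: repair confirmed after 1 pass(es)
```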
- The robotic system 10 may include a 3D printer 48 that prints a 3D printed part 119. While a 3D printer is described in detail, this is meant to be an example. In certain embodiments, the 3D model may be sent to another suitable fabrication device capable of fabricating the part using additive manufacturing, in which a device deposits particles of material at the asset or another location. For example, the particles may be deposited at a location in successive layers to create an object. The 3D printer 48 may include a gantry 120 or other structure that supports a printer head having an extruder 122 that moves across a build platform. The 3D printer 48 may also include one or more motors 124 (e.g., stepper motors) that move the extruder 122 with respect to the build platform. For example, the processor 68 may send signal(s) indicating instructions to control the one or more motors 124 and the extruder 122 to heat a source material 126 and extrude successive layers of the source material 126 to create the 3D printed part 119. As will be appreciated, there are various types of 3D printers 48 that may print 3D printed parts in any suitable manner.
- The control system 34 may include a user interface 128 having a display 130 to display data related to the asset 12, such as detected defects, potential defects, recommendations, repairs, replacement parts, or the like, to an operator. Further, in some embodiments, the robots may be monitored and/or controlled by an operator from the control system 34 via the user interface 128 (e.g., a touchscreen display).
robots collision avoidance system collision avoidance systems sensors motors robots robot 36 is traveling along a path provided by the plan, thecollision avoidance system 148 on-board therobot 36 may send signals to instruct themotors 102 to control therobot 36 based on a location of the obstacle. That is, thecontroller 52 may determine, via thecollision avoidance system 148, a path for therobot 36 to travel that avoids interacting with the obstacle while still completing the tasks assigned to therobot 36. - As mentioned above, the
- As mentioned above, the robotic systems 10 of FIGS. 1 and 2 are meant to be examples, and any suitable combination of robots, including as few as one robot, may be used. Further, as described in detail below, the processor 60 of the control system and/or the processor 62 of the robot 36 are used as examples, and any suitable combination of robots and/or control systems may be used. For example, some of the steps performed by the control system may be distributed and performed by the processors 62, 64, and 66 of the respective robots 36, 38, and 40. The processor 60 may determine inspection, maintenance, or repair actions to be performed by the robots 36, 38, and 40. Further, the processor 60 may determine a time, schedule, or location at which to perform the inspection, maintenance, or repair, based on the digital representation. Further, the processor 60 may predict health of the asset by comparing the digital representation to data of other assets. That is, the processor 60 may use domain knowledge of the digital representation to predict when a defect is likely to occur on the asset 12. For instance, the processor 60 may perform an inspection based on the digital representation that indicates a prediction of a condition of the asset. As such, the processor may perform inspection or maintenance at times based on the condition of the asset, thereby reducing time spent on inspection or maintenance as compared to inspections or maintenance actions performed according to a schedule.
- FIG. 3 shows a high level flow diagram of a method 134 that the robotic system 10 may perform to manage the asset 12 to reduce or eliminate human intervention and improve the lifespan of the asset 12. At block 136, the robotic system 10 may first obtain an asset 12 for inspection. The manner in which the robotic system 10 obtains the asset 12 may depend on the type of asset. For example, the robots may move to certain assets 12 (e.g., an oil pipeline, power transmission lines, etc.) to assess the asset for defects (e.g., cracks in an oil pipeline). At block 138, the processor 60 may determine a plan to assess the asset 12 for defects. As explained in detail below, the plan may include one or more tasks based on the resources (e.g., available robots) and/or the asset 12. At block 140, the robotic system may then detect and assess a defect 121 associated with the asset 12. At block 142, the robotic system 10 may manage the asset based on the defect 121. For example, the robotic system may repair and/or replace one or more parts of the asset 12 based on the severity of the defect 121. Each of blocks 138, 140, and 142 is described in more detail below.
- FIG. 4 is a flow diagram of a method that may be performed at block 138 of FIG. 3 by the processor of the control system 34. At block 144, the processor 60 may plan a mission based on one or more tasks and/or resources. Tasks may include instructions for one of the processors 62, 64, 66, and 68 to send signal(s) to cause the robots 36, 38, 40, and 48 to move (e.g., localize the robots 36, 38, 40, and 48 around the asset 12), to angle the robots 36, 38, and 40, to adjust settings, to wait, to capture data, or to perform any other suitable action. For example, the processors 62, 64, 66, and 68 may perform a task of movement by sending signal(s) to the motors 102, the steering control system 104, the speed control system 106, the motors 118, and the motors 124, respectively, to move the respective robot 36, 38, 40, and 48. The processors 62, 64, 66, and 68 may perform a task of acquiring data by receiving signal(s) via the sensors.
- The processor 60 may generate the plan based on predictions derived from the digital representation or risk. For example, certain conditions may be associated with a certain level of risk. The processor 60 may control an overall objective of the robotic system 10 in a manner that manages risk. As such, in some embodiments, the processor 60 may generate the plan in a manner that maximizes availability of the asset (e.g., minimizes operation disruption) while applying economic remediation measures (e.g., to minimize cost).
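- One hedged way to picture such risk-managed planning is to score candidate plans by total expected cost, trading remediation cost against lost availability and residual risk. The candidate plans, weights, and dollar figures below are entirely hypothetical:

```python
# Hypothetical plan-scoring sketch: trade asset availability against remediation cost and risk.
candidate_plans = [
    {"name": "patch_now",       "downtime_h": 2.0,  "cost": 5000.0,  "residual_risk": 0.05},
    {"name": "defer_to_outage", "downtime_h": 0.0,  "cost": 1500.0,  "residual_risk": 0.30},
    {"name": "full_replace",    "downtime_h": 12.0, "cost": 20000.0, "residual_risk": 0.01},
]

# Illustrative weights: the cost of an hour of lost availability, and the cost
# of a failure weighted by the probability of leaving the risk unremediated.
HOURLY_DOWNTIME_COST = 800.0
FAILURE_COST = 50000.0

def total_expected_cost(plan):
    return (plan["cost"]
            + plan["downtime_h"] * HOURLY_DOWNTIME_COST
            + plan["residual_risk"] * FAILURE_COST)

best = min(candidate_plans, key=total_expected_cost)
print(best["name"], total_expected_cost(best))
# -> patch_now 9100.0
```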
- Further, the processor 60 may plan the paths 22, 24, 26, and 32 of the drones 14, 16, 18, and 20, respectively, in a manner that enables the robots to provide data of the asset 12 based on desired coverage of the asset, areas excluded from visibility of the asset, high risk areas of the asset 12 more likely to have defects than other areas of the asset 12, or the like. Moreover, the processor 60 may plan tasks of the robots based on a type of robot available, a type of sensor available, energy usage, or any combination thereof. For example, certain locations on the asset 12 may be difficult to inspect from the ground and may be suitable to be inspected aerially due to the location (e.g., on top of the asset). As such, the processor 60 may assign tasks based on the location on the asset 12 to be inspected and the robots available. The processor 60 may then send signal(s) to the robots 14, 16, 18, and 20 indicating instructions to perform the tasks of the plan. The processors 62, 64, and 68 may begin by receiving the signal(s) indicating instructions to perform the tasks.
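- As a sketch of such coverage-driven path planning (assumed geometry, not the disclosed planner), waypoints might be generated on an orbit around the asset, with denser sampling over arcs flagged as high risk:

```python
# Hypothetical orbit-path sketch: denser waypoints over high-risk arcs of the asset.
import math

def orbit_waypoints(center, radius, altitude, base_points=12, high_risk_arcs=()):
    """Waypoints on a circle around the asset; arcs in high_risk_arcs,
    given as (start_deg, end_deg), get double angular sampling."""
    angles = []
    step = 360.0 / base_points
    a = 0.0
    while a < 360.0:
        in_risk = any(lo <= a < hi for lo, hi in high_risk_arcs)
        angles.append(a)
        a += step / 2 if in_risk else step
    return [(center[0] + radius * math.cos(math.radians(t)),
             center[1] + radius * math.sin(math.radians(t)),
             altitude) for t in angles]

# Double coverage on the 90-180 degree quadrant (e.g., a crack-prone area):
path = orbit_waypoints(center=(0.0, 0.0), radius=15.0, altitude=10.0,
                       high_risk_arcs=[(90.0, 180.0)])
print(len(path), "waypoints")   # -> 15 waypoints (12 base + 3 extra in the risk arc)
```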
- At block 146, the robots may then receive data related to the asset 12 by performing the tasks. For example, the processors 62, 64, and 68 may send and/or receive signal(s) from the sensors and/or effectors, as described above, to execute the tasks. The processors 62, 64, and 68 may then receive sensor data indicating one or more characteristics of the asset 12 and send the sensor data to the controller 50 of the control system 34. In some embodiments, each of the processors 62, 64, and 68 may analyze the sensor data to detect defects (e.g., defect recognition), as described in detail below, and/or to assess the quality of the data.
robots processor 60 may send signal(s) to therobots robots - Due to a variety of reasons, the sensor data may be captured below a threshold level of quality. For example, image sensors may have smudges due to weather, deterioration, or the like. At decision block 154, each of the controllers may determine if the sensor data is above the threshold level of quality desired. For example, each of the
processors block 158, each of theprocessors sensors antennas block 156. -
- FIG. 5 shows a process performed at block 140 by one or more of the processors 60, 62, 64, and 68 to perform automated defect recognition (ADR). Each controller 52, 54, and 56 of the robots 36, 38, and 40 may acquire data related to one or more characteristics of an asset 12. As mentioned above, the data may be acquired via the respective sensors 96, 98, and 100 of the robots 36, 38, and 40. Additionally and/or alternatively, the data may include environmental data from one or more environmental sensors other than the sensors 96, 98, and 100. Further, the asset 12 may include one or more sensors to provide information to the robotic system 10.
- At block 166, the processor 60 of the control system 34 may receive the data from the robots 36, 38, and 40 and generate a digital representation of the asset 12 based on the one or more characteristics. That is, the data collected may be used to build, update, and maintain a digital representation of the asset 12, as described above. Additionally and/or alternatively, the processor 60 may generate the digital representation based in part on physics models and/or domain knowledge. The digital representation may include a mathematical model that has variables extrapolated from various parts of the asset 12. For example, the processor 60 may generate a digital representation that includes physical geometry of the asset 12 (e.g., gathered via the sensors 96, 98, and 100), a 3D model of the asset 12, materials of the asset 12, lifespan of the asset 12, observed or measured performance of the asset 12, or any combination thereof. In certain embodiments, each of the robots 36, 38, and 40 may, solely or collaboratively, generate a digital representation of all or part of the asset 12 based on the acquired data.
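- A sketch of how such a digital representation might be organized as a data record is given below; the field names, types, and example values are hypothetical, not the disclosed model:

```python
# Sketch of a digital-representation record with the kinds of fields named above
# (field names, types, and values are hypothetical).
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Defect:
    kind: str                              # "crack", "corrosion", "debris", "missing_part", ...
    location: Tuple[float, float, float]   # position with respect to asset geometry
    severity: float                        # 0..1

@dataclass
class DigitalRepresentation:
    asset_id: str
    geometry_mesh: str                     # e.g., path/handle to a 3D model built from sensor data
    materials: List[str] = field(default_factory=list)
    design_life_hours: float = 0.0
    observed_performance: dict = field(default_factory=dict)
    defects: List[Defect] = field(default_factory=list)

    def update_from_sensors(self, readings: dict):
        """Fold newly acquired characteristics into the model."""
        self.observed_performance.update(readings)

twin = DigitalRepresentation("pipeline_07", "meshes/pipeline_07.obj",
                             materials=["carbon steel"], design_life_hours=2.0e5)
twin.update_from_sensors({"surface_temp_C": 41.2})
twin.defects.append(Defect("crack", (12.5, 0.4, 1.1), severity=0.6))
print(twin.asset_id, len(twin.defects))    # -> pipeline_07 1
```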
- At block 168, one or more of the processors 60, 62, 64, and 68 may detect the defect 121 of the asset 12 based on the one or more characteristics. For example, the defect 121 may include a crack in the physical structure of the asset 12, corrosion on the asset 12, debris on the asset 12, material aging of the asset 12, missing parts of the asset 12, or any other suitable anomaly of the asset 12. The digital representation may include a location of the defect with respect to the geometry of the asset 12. The processor 60 may recognize the defect 121 by comparing the one or more characteristics with prior knowledge of the asset 12 or by analysis of the digital representation against known parameters or patterns. Further, if a potential defect is detected, the controller 50 of the control system 34 may send signal(s) to the controllers 52, 54, and 56 indicating instructions to adapt the plans to acquire additional data related to the potential defect. Additionally, the controller 50 may send signal(s) to the display 130 to display data related to the defect 121 to inform an operator.
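- As an illustration of recognizing a defect by comparison against known parameters, a toy tolerance check might look like the following; the characteristic names, nominal values, and tolerances are hypothetical:

```python
# Illustrative defect flagging: compare measured characteristics against
# known-good parameters from prior knowledge of the asset (values hypothetical).
BASELINE = {"wall_thickness_mm": (9.5, 0.5),    # (nominal, tolerance)
            "surface_temp_C":    (35.0, 10.0),
            "vibration_mm_s":    (2.0, 1.5)}

def flag_anomalies(measured):
    flags = []
    for key, value in measured.items():
        nominal, tol = BASELINE[key]
        if abs(value - nominal) > tol:
            flags.append((key, value, nominal))
    return flags

print(flag_anomalies({"wall_thickness_mm": 8.2,    # thinned wall: possible corrosion
                      "surface_temp_C": 38.0,
                      "vibration_mm_s": 2.4}))
# -> [('wall_thickness_mm', 8.2, 9.5)]
```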
- At block 170, the processor 60 may determine risk associated with the defect 121 of the asset 12 based on the severity of the defect, the location of the defect, and the likelihood of poor performance due to the defect, among others. Further, depending on the risk associated with the defect, the processor 60 may determine whether or not to perform a maintenance action. For example, if the processor 60 determines that a likelihood of improved performance from repairing the defect 121 of the asset 12 outweighs the cost associated with repairing the defect, then the processor 60 may send signal(s) indicating instructions to perform the maintenance action (block 172). For example, the controller 50 may send signal(s) to the 3D printer indicating instructions to print a 3D printed part, as described in detail below. In some embodiments, the maintenance actions may be related to robot fleet management. That is, the processor 60 may send signal(s) indicating instructions to inspect areas based on previously detected anomalies and the risk of the anomalies. For instance, the processor 60 may send signal(s) indicating instructions to inspect an area of the asset 12 that is prone to cracking. Similarly, a maintenance action related to robot fleet management may relate to setting or modifying an inspection interval, specifying that certain types of robots and/or sensors be deployed for an inspection, acquiring operation or functional data related to asset performance that might relate to a possible or pending defect, and so forth.
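- The block 170/172 decision can be pictured as a simple expected-value rule: repair when the expected loss avoided exceeds the repair cost. The probabilities and costs below are hypothetical, chosen only to make the rule concrete:

```python
# Sketch of the block 170/172 decision rule (all numbers hypothetical):
# repair when the expected loss avoided by fixing the defect exceeds the repair cost.
def should_repair(prob_failure, failure_cost, repair_cost, repair_effectiveness=0.95):
    expected_loss_avoided = prob_failure * failure_cost * repair_effectiveness
    return expected_loss_avoided > repair_cost

# A severe crack in a high-consequence location:
print(should_repair(prob_failure=0.2, failure_cost=100000.0, repair_cost=4000.0))   # True
# Cosmetic debris on a low-risk surface:
print(should_repair(prob_failure=0.01, failure_cost=2000.0, repair_cost=500.0))     # False
```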
- FIG. 6 shows an example of a process performed at block 142 by one or more of the processors to manage the asset 12 based on the defect 121. The example shown describes a process of 3D printing a repair part. The process described is meant to be an example, and other processes may be performed to manage the asset 12, such as replacing, removing, cleaning, welding, or lubricating a part of the asset 12, among others. As another example, the processor 60 may send signal(s) to the display 130 indicating instructions to display a recommendation to an operator. As mentioned above, the processor 60 may determine an action to be performed, such as a maintenance and/or a repair operation. At block 182, the processor may create a 3D model of a repair to a part of the asset 12 from the digital representation of the asset 12. For example, each of the robots may acquire visual image data and depth information, and the robot controllers may send signal(s) via their antennas to the controller 50 indicating the visual image data and depth information of the asset 12. The controller 50 may receive the signal(s) via the antenna 80, and the processor 60 may construct a 3D model of the repair to the part of the asset 12 based on the visual image data and depth information. For instance, the 3D model may be constructed from a part library that includes each part of the asset 12. Further, the 3D model associated with the part having the defect may be printed to replace the existing part of the asset 12. As another example, the asset 12 may have known repair parts that are associated with defects from prior inspections. Upon recognizing a defect that shares characteristics with a prior defect, the processor 60 may select the 3D model from the known repair parts. In some embodiments, the processor 60 may augment the 3D model via coloring based on the sensor data and provide the augmented 3D model to an operator via the display.
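- The two selection paths described above (part library by part ID, or known repair parts by defect similarity) can be sketched as a simple lookup with a fallback. The library contents and file names below are invented for illustration.

```python
# Hypothetical sketch of block 182: choosing the repair 3D model either from a
# part library keyed by part ID, or from repair parts already associated with
# similar defects in prior inspections.

PART_LIBRARY = {"valve-cover-A": "valve-cover-A.stl"}          # part ID -> mesh
KNOWN_REPAIRS = {("crack", "pipe-segment"): "pipe-patch.stl"}  # defect signature -> mesh

def select_repair_model(part_id=None, defect_signature=None):
    if part_id in PART_LIBRARY:
        # Replace the defective part outright with its library model.
        return PART_LIBRARY[part_id]
    if defect_signature in KNOWN_REPAIRS:
        # Reuse a repair part proven against a similar prior defect.
        return KNOWN_REPAIRS[defect_signature]
    # Otherwise fall back to constructing a model from image and depth data.
    return None

print(select_repair_model(part_id="valve-cover-A"))                     # valve-cover-A.stl
print(select_repair_model(defect_signature=("crack", "pipe-segment")))  # pipe-patch.stl
```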
- In certain embodiments, the processor 60 may create the 3D model based on domain knowledge regarding the asset 12. For example, the processor 60 may be assessing an oil and gas pipeline for defects. Upon locating an aperture in the pipeline, the processor 60 may detect the locations of, and distances between, the edges of the aperture, and create a 3D model having a size and shape that matches the detected geometry, thereby securing the oil and gas within the pipeline. Further, the processor 60 may determine whether the oil and gas pipeline is liquid tight such that liquids would not leak once the part corresponding to the 3D model is applied.
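- Sizing such a patch from detected edge points can be sketched as a bounding-box computation with an overlap margin. The margin and thickness values below are invented for illustration.

```python
# Hypothetical sketch of deriving a patch for a pipeline aperture: the detected
# edge points bound the opening, and the patch is sized to cover it with an
# overlap margin.

def patch_dimensions(edge_points, margin_mm=25.0, thickness_mm=6.0):
    """edge_points: (x, y) aperture edge coordinates in mm on the pipe surface."""
    xs = [x for x, _ in edge_points]
    ys = [y for _, y in edge_points]
    width = (max(xs) - min(xs)) + 2 * margin_mm   # cover the aperture plus overlap
    height = (max(ys) - min(ys)) + 2 * margin_mm
    return width, height, thickness_mm

print(patch_dimensions([(0, 0), (40, 5), (35, 30), (5, 28)]))  # (90.0, 80.0, 6.0)
```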
- At block 184, the processor 60 may split the 3D model into one or more 3D printable parts to meet desired print times and/or based on the source materials 126 used to print the 3D printed part 119. The processor 60 may send signal(s) to the controller 58 of the 3D printer 48 indicating the 3D model to be printed (block 186). At block 188, the controller 58 of the 3D printer 48 may receive the 3D model via the antenna 86 and send signal(s) to the motors 124 to control the gantry 120 and/or the extruder 122 to create the 3D printed part 119 from the 3D model.
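- The split-to-meet-print-time step can be sketched by modeling print time as proportional to part volume; the proportionality constant here is an illustrative assumption, not a figure from the disclosure.

```python
import math

# Hypothetical sketch of block 184: splitting a print job so that no single
# piece exceeds a target print time.

def split_for_print_time(model_volume_cm3, max_print_hours, hours_per_cm3=0.02):
    """Return (number of pieces, volume per piece) for the 3D model."""
    total_hours = model_volume_cm3 * hours_per_cm3
    pieces = max(1, math.ceil(total_hours / max_print_hours))
    return pieces, model_volume_cm3 / pieces

pieces, volume = split_for_print_time(model_volume_cm3=900, max_print_hours=6)
print(f"{pieces} pieces of {volume:.0f} cm^3 each")  # 3 pieces of 300 cm^3 each
```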
- At block 190, the robotic system 10 may repair the asset with the 3D printed part. For example, in FIG. 2, the controller 58 of the 3D printer may send signal(s) to the controller 56 of the robot 40 indicating that the 3D printed part 119 has been created. Upon creation of the 3D printed part, the robot 40 may send signal(s) indicating instructions to manipulate the manipulator arm 114 and the effector 116 to receive the 3D printed part 119 and to install the 3D printed part 119 onto the defect 121 of the asset 12.
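- The signaling between the printer controller and the robot controller can be pictured as simple message passing. This sketch is an assumed model of that handoff; the message format and function names are hypothetical.

```python
import queue

# Hypothetical sketch of the printer-to-robot handoff at block 190: the printer
# controller announces the finished part, and the robot controller sequences
# the pick-and-install steps. In a real system these steps would command the
# manipulator arm and effector.

messages = queue.Queue()

def printer_controller_finish(part_id):
    messages.put({"event": "part_ready", "part_id": part_id})

def robot_controller_step():
    message = messages.get()
    if message["event"] == "part_ready":
        print(f"gripping {message['part_id']} from the print bed")
        print(f"installing {message['part_id']} at the defect location")

printer_controller_finish("pipe-patch-001")
robot_controller_step()
```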
- Data acquired via the controllers regarding the asset 12 may be displayed to the user in a variety of ways. FIG. 7 shows an example of a user interface 128 displayed on the display 130 to a user of the control system 34 of FIG. 2, in accordance with aspects of the present disclosure. The user interface 128 may include a display panel 196 that displays sensor data, such as images, from the robotic system 10 (e.g., from the controllers of the robots). Further, the user interface 128 may include one or more overlays 198 that may overlay features of the data on the display panel 196. The controller 50 may receive signals (e.g., via a touchscreen, a keyboard, a mouse, etc.) indicating a selection of one or more overlays 198. The controller 50 may then send signal(s) indicating instructions to display a heat map overlaid on a model 200 of the asset 12 in the display panel 196, with heat signatures 202 from an IR sensor shown in an identifying color to enable the user to recognize the heat signatures 202 on the asset 12. Further, the controller 50 may receive a selection indicative of instructions to overlay recognized defects on the display panel 196, or any other overlay suitable for an operator to assess the asset 12, such as corrosion, cracks, or the like.
- Technical effects of the disclosure include management of the health of an asset. A robotic system may plan one or more paths for robots to perform tasks to acquire characteristics of the asset. The robots may inspect the asset and receive data from sensors indicating the characteristics of the asset. A processing system of the robotic system may then detect a defect of the asset. The robotic system may then repair the asset by replacing a part, 3D printing a part, or performing another maintenance operation. For example, the processing system may create a 3D model to print to repair the asset. The processing system may send signal(s) to a 3D printer to print the 3D model. Further, the processing system may display the model of the asset on a display. In certain embodiments, the display may display one or more overlays onto the model to enable an operator to assess various characteristics of the asset, such as heat signatures.
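- The overlay selection in FIG. 7 can be sketched as a set of toggleable layers composited onto the base model. The renderer table and layer strings below are invented for illustration and are not part of the disclosure.

```python
# Hypothetical sketch of the overlay selection in FIG. 7: the interface keeps a
# set of enabled overlays, and rendering composites only the selected layers
# (heat signatures, recognized defects) onto the base model of the asset.

enabled_overlays = set()

OVERLAY_RENDERERS = {
    "heat_map": lambda frame: frame + ["IR heat signatures in an identifying color"],
    "defects":  lambda frame: frame + ["markers at recognized defect locations"],
}

def toggle_overlay(name):
    # Selecting an overlay a second time switches it back off.
    enabled_overlays.symmetric_difference_update({name})

def render_display_panel():
    frame = ["3D model of the asset"]             # base layer of the display panel
    for name in sorted(enabled_overlays):
        frame = OVERLAY_RENDERERS[name](frame)    # composite each selected overlay
    return frame

toggle_overlay("heat_map")
print(render_display_panel())  # ['3D model of the asset', 'IR heat signatures ...']
```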
- This written description uses examples to disclose various embodiments, including the best mode, and also to enable any person skilled in the art to practice the disclosure, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the disclosure is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.
Priority Applications (12)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/584,995 US20170329307A1 (en) | 2016-05-13 | 2017-05-02 | Robot system for asset health management |
CA3023722A CA3023722A1 (en) | 2016-05-13 | 2017-05-12 | Robot system for asset health management |
PCT/US2017/032329 WO2018026409A2 (en) | 2016-05-13 | 2017-05-12 | Robot system for asset health management |
BR112018072406-8A BR112018072406A2 (en) | 2016-05-13 | 2017-05-12 | robotic system and method for managing asset integrity and non-transient computer readable media and method |
EP17822058.8A EP3455803A2 (en) | 2016-05-13 | 2017-05-12 | Robot system for asset health management |
US16/114,318 US10300601B2 (en) | 2014-11-14 | 2018-08-28 | Vehicle control system with task manager |
US16/379,976 US11660756B2 (en) | 2014-11-14 | 2019-04-10 | Control system with task manager |
US16/411,788 US11358615B2 (en) | 2002-06-04 | 2019-05-14 | System and method for determining vehicle orientation in a vehicle consist |
US16/692,784 US11312018B2 (en) | 2014-11-14 | 2019-11-22 | Control system with task manager |
US17/726,594 US20220241975A1 (en) | 2014-11-14 | 2022-04-22 | Control system with task manager |
US18/050,341 US11865726B2 (en) | 2014-11-14 | 2022-10-27 | Control system with task manager |
US18/299,517 US20230249351A1 (en) | 2014-11-14 | 2023-04-12 | Fastener system and method |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662336332P | 2016-05-13 | 2016-05-13 | |
US201662343615P | 2016-05-31 | 2016-05-31 | |
US15/584,995 US20170329307A1 (en) | 2016-05-13 | 2017-05-02 | Robot system for asset health management |
Related Parent Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/473,384 Continuation-In-Part US10518411B2 (en) | 2002-06-04 | 2017-03-29 | Robotic repair or maintenance of an asset |
US15/473,345 Continuation-In-Part US10618168B2 (en) | 2002-06-04 | 2017-03-29 | Robot system path planning for asset health management |
US16/114,318 Continuation-In-Part US10300601B2 (en) | 2002-06-04 | 2018-08-28 | Vehicle control system with task manager |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/541,370 Continuation-In-Part US10110795B2 (en) | 2002-06-04 | 2014-11-14 | Video system and method for data communication |
US15/585,502 Continuation-In-Part US10521960B2 (en) | 2002-06-04 | 2017-05-03 | System and method for generating three-dimensional robotic inspection plan |
US16/114,318 Continuation-In-Part US10300601B2 (en) | 2002-06-04 | 2018-08-28 | Vehicle control system with task manager |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170329307A1 (en) | 2017-11-16 |
Family
ID=60294677
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/473,345 Active 2038-02-03 US10618168B2 (en) | 2002-06-04 | 2017-03-29 | Robot system path planning for asset health management |
US15/473,384 Active 2037-11-17 US10518411B2 (en) | 2002-06-04 | 2017-03-29 | Robotic repair or maintenance of an asset |
US15/584,995 Abandoned US20170329307A1 (en) | 2002-06-04 | 2017-05-02 | Robot system for asset health management |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/473,345 Active 2038-02-03 US10618168B2 (en) | 2002-06-04 | 2017-03-29 | Robot system path planning for asset health management |
US15/473,384 Active 2037-11-17 US10518411B2 (en) | 2002-06-04 | 2017-03-29 | Robotic repair or maintenance of an asset |
Country Status (5)
Country | Link |
---|---|
US (3) | US10618168B2 (en) |
EP (1) | EP3455803A2 (en) |
BR (1) | BR112018072406A2 (en) |
CA (1) | CA3023722A1 (en) |
WO (1) | WO2018026409A2 (en) |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190004489A1 (en) * | 2017-06-30 | 2019-01-03 | Laird Technologies, Inc. | Wireless emergency stop systems, and corresponding methods of operating a wireless emergency stop system for a machine safety interface |
US20190161103A1 (en) * | 2017-11-28 | 2019-05-30 | Westinghouse Air Brake Technologies Corporation | System, Method, and Computer Program Product for Automatic Inspection of a Train |
WO2019135835A1 (en) * | 2018-01-02 | 2019-07-11 | General Electric Company | Systems and method for robotic learning of industrial tasks based on human demonstration |
US10452078B2 (en) * | 2017-05-10 | 2019-10-22 | General Electric Company | Self-localized mobile sensor network for autonomous robotic inspection |
EP3588405A1 (en) * | 2018-06-29 | 2020-01-01 | Tata Consultancy Services Limited | Systems and methods for scheduling a set of non-preemptive tasks in a multi-robot environment |
US10597054B2 (en) * | 2016-12-15 | 2020-03-24 | Progress Rail Locomotive Inc. | Real-time drone infrared inspection of moving train |
US10970586B2 (en) * | 2018-06-28 | 2021-04-06 | General Electric Company | Systems and methods of 3D scene segmentation and matching for robotic operations |
US11086311B2 (en) | 2016-05-09 | 2021-08-10 | Strong Force Iot Portfolio 2016, Llc | Systems and methods for data collection having intelligent data collection bands |
US11126173B2 (en) | 2017-08-02 | 2021-09-21 | Strong Force Iot Portfolio 2016, Llc | Data collection systems having a self-sufficient data acquisition box |
US20220001549A1 (en) * | 2020-07-06 | 2022-01-06 | International Business Machines Corporation | Managing shared robots in a data center |
US11221613B2 (en) | 2016-05-09 | 2022-01-11 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for noise detection and removal in a motor |
US11261797B2 (en) | 2018-11-05 | 2022-03-01 | General Electric Company | System and method for cleaning, restoring, and protecting gas turbine engine components |
US20220092234A1 (en) * | 2020-09-23 | 2022-03-24 | International Business Machines Corporation | Detection of defects within physical infrastructure by leveraging ai |
US11300543B2 (en) | 2019-11-13 | 2022-04-12 | Honeywell International Inc. | Anomaly and fault detection of industrial assets using magnetic mapping |
US11315272B2 (en) * | 2017-08-24 | 2022-04-26 | General Electric Company | Image and video capture architecture for three-dimensional reconstruction |
US11487263B2 (en) | 2020-01-24 | 2022-11-01 | Cattron North America, Inc. | Wireless emergency stop systems including mobile device controllers linked with safety stop devices |
US11555413B2 (en) | 2020-09-22 | 2023-01-17 | General Electric Company | System and method for treating an installed and assembled gas turbine engine |
US11645925B2 (en) * | 2018-05-03 | 2023-05-09 | Arkidan Systems Inc. | Computer-assisted aerial surveying and navigation |
US11774944B2 (en) | 2016-05-09 | 2023-10-03 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for the industrial internet of things |
US11809200B1 (en) * | 2019-12-06 | 2023-11-07 | Florida A&M University | Machine learning based reconfigurable mobile agents using swarm system manufacturing |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10251331B2 (en) | 2016-05-16 | 2019-04-09 | International Business Machines Corporation | Automated deployment of autonomous devices performing localized environment altering actions |
GB2552019B (en) * | 2016-07-08 | 2020-01-08 | Rolls Royce Plc | Methods and apparatus for controlling at least one of a first robot and a second robot to collaborate within a system |
US11561251B2 (en) * | 2018-08-01 | 2023-01-24 | Florida Power & Light Company | Remote autonomous inspection of utility system components utilizing drones and rovers |
US20200175438A1 (en) * | 2018-12-04 | 2020-06-04 | General Electric Company | Method and system for strategic deployment of components |
US11562227B2 (en) * | 2019-03-13 | 2023-01-24 | Accenture Global Solutions Limited | Interactive assistant |
EP3751370B1 (en) | 2019-06-14 | 2024-07-24 | General Electric Company | Additive manufacturing-coupled digital twin ecosystem based on multi-variant distribution model of performance |
EP3751368B1 (en) | 2019-06-14 | 2023-09-27 | General Electric Company | Additive manufacturing-coupled digital twin ecosystem based on a surrogate model of measurement |
GB2586621A (en) * | 2019-08-29 | 2021-03-03 | Rolls Royce Plc | Automated operation of unmanned waterborne vessels |
CN111443730B (en) * | 2020-05-12 | 2021-02-12 | 江苏方天电力技术有限公司 | Unmanned aerial vehicle track automatic generation method and device for power transmission line inspection |
CN112150072A (en) * | 2020-09-27 | 2020-12-29 | 北京海益同展信息科技有限公司 | Asset checking method and device based on intelligent robot, electronic equipment and medium |
EP4351968A1 (en) * | 2021-06-11 | 2024-04-17 | Netdrones, Inc. | Systems and methods for 3d model based drone flight planning and control |
US11841695B2 (en) | 2021-06-22 | 2023-12-12 | International Business Machines Corporation | 3D printing in a confined space |
US20230166452A1 (en) * | 2021-11-29 | 2023-06-01 | International Business Machines Corporation | Self-repairing 3d printer |
US11961030B2 (en) * | 2022-01-27 | 2024-04-16 | Applied Materials, Inc. | Diagnostic tool to tool matching methods for manufacturing equipment |
US20230259112A1 (en) * | 2022-01-27 | 2023-08-17 | Applied Materials, Inc. | Diagnostic tool to tool matching and comparative drill-down analysis methods for manufacturing equipment |
US11628869B1 (en) | 2022-03-04 | 2023-04-18 | Bnsf Railway Company | Automated tie marking |
US11565730B1 (en) | 2022-03-04 | 2023-01-31 | Bnsf Railway Company | Automated tie marking |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070050221A1 (en) * | 2005-08-29 | 2007-03-01 | Abtar Singh | Dispatch management model |
US20160307377A1 (en) * | 2015-04-17 | 2016-10-20 | Snecma | System and method for maintaining an aircraft engine |
Family Cites Families (47)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5947051A (en) | 1997-06-04 | 1999-09-07 | Geiger; Michael B. | Underwater self-propelled surface adhering robotically operated vehicle |
US6425865B1 (en) * | 1998-06-12 | 2002-07-30 | The University Of British Columbia | Robotically assisted medical ultrasound |
US20070094268A1 (en) | 2005-10-21 | 2007-04-26 | Tabe Joseph A | Broadband centralized transportation communication vehicle for extracting transportation topics of information and monitoring terrorist data |
US6453272B1 (en) * | 2000-02-28 | 2002-09-17 | The Foxboro Company | Spurious noise filter |
US20160078695A1 (en) | 2000-05-01 | 2016-03-17 | General Electric Company | Method and system for managing a fleet of remote assets and/or ascertaining a repair for an asset |
IL152185A0 (en) * | 2001-02-12 | 2003-05-29 | Raytheon Co | A system and method for time-to-intercept determination |
US6714831B2 (en) | 2002-01-24 | 2004-03-30 | Ford Motor Company | Paint defect automated seek and repair assembly and method |
CA2595453C (en) | 2005-01-18 | 2016-02-23 | Redzone Robotics, Inc. | Autonomous inspector mobile platform |
KR100812724B1 (en) | 2006-09-29 | 2008-03-12 | 삼성중공업 주식회사 | Multi function robot for moving on wall using indoor global positioning system |
US8070473B2 (en) | 2008-01-08 | 2011-12-06 | Stratasys, Inc. | System for building three-dimensional objects containing embedded inserts, and method of use thereof |
US8060270B2 (en) * | 2008-02-29 | 2011-11-15 | The Boeing Company | System and method for inspection of structures and objects by swarm of remote unmanned vehicles |
KR101010267B1 (en) * | 2008-08-25 | 2011-01-24 | 주식회사 이제이텍 | Method for controlling robot |
US8193987B2 (en) * | 2008-08-25 | 2012-06-05 | DRS Soneticom. Inc. | Apparatus and method for determining signal quality in a geolocation system |
US8583313B2 (en) | 2008-09-19 | 2013-11-12 | International Electronic Machines Corp. | Robotic vehicle for performing rail-related actions |
US20100215212A1 (en) | 2009-02-26 | 2010-08-26 | Honeywell International Inc. | System and Method for the Inspection of Structures |
US8812154B2 (en) * | 2009-03-16 | 2014-08-19 | The Boeing Company | Autonomous inspection and maintenance |
US8666553B2 (en) | 2010-02-10 | 2014-03-04 | Electric Power Research Institute, Inc. | Line inspection robot and system |
CN201828247U (en) | 2010-05-24 | 2011-05-11 | 天津工业大学 | Device for detecting three-dimensional shape of laser remanufacturing part on line |
WO2012035718A1 (en) * | 2010-09-17 | 2012-03-22 | パナソニック株式会社 | Welding condition determining method, and welding device |
US20120261144A1 (en) | 2011-04-14 | 2012-10-18 | The Boeing Company | Fire Management System |
EP2537642A1 (en) | 2011-06-23 | 2012-12-26 | Raytheon BBN Technologies Corp. | Robot fabricator |
US9183527B1 (en) | 2011-10-17 | 2015-11-10 | Redzone Robotics, Inc. | Analyzing infrastructure data |
US8833169B2 (en) | 2011-12-09 | 2014-09-16 | General Electric Company | System and method for inspection of a part with dual multi-axis robotic devices |
WO2013157978A1 (en) | 2012-04-19 | 2013-10-24 | Esaulov Evgeny Igorevich | A self-propelled system of cleanup, inspection and repairs of the surface of vessel hulls and underwater objects |
US9533773B1 (en) * | 2012-04-30 | 2017-01-03 | The Boeing Company | Methods and systems for automated vehicle asset tracking |
US9162753B1 (en) * | 2012-12-31 | 2015-10-20 | Southern Electrical Equipment Company, Inc. | Unmanned aerial vehicle for monitoring infrastructure assets |
US20140336928A1 (en) * | 2013-05-10 | 2014-11-13 | Michael L. Scott | System and Method of Automated Civil Infrastructure Metrology for Inspection, Analysis, and Information Modeling |
US9665843B2 (en) | 2013-06-03 | 2017-05-30 | Abb Schweiz Ag | Industrial asset health profile |
US9944412B2 (en) | 2013-10-04 | 2018-04-17 | Busek Co., Inc. | Spacecraft system for debris disposal and other operations and methods pertaining to the same |
EP3068607B1 (en) | 2013-11-13 | 2020-08-05 | ABB Schweiz AG | System for robotic 3d printing |
US9193068B2 (en) | 2013-11-26 | 2015-11-24 | Elwha Llc | Structural assessment, maintenance, and repair apparatuses and methods |
US9193402B2 (en) * | 2013-11-26 | 2015-11-24 | Elwha Llc | Structural assessment, maintenance, and repair apparatuses and methods |
US9734448B2 (en) | 2013-11-27 | 2017-08-15 | Shawn Patrick Bolich | Software application for managing a collection of robot repairing resources for a technician |
JP5859065B2 (en) * | 2014-06-04 | 2016-02-10 | 株式会社神戸製鋼所 | Welding condition deriving device |
US10589465B2 (en) * | 2014-08-13 | 2020-03-17 | Lg Electronics Inc. | Terminal apparatus, system comprising terminal apparatus, and method for controlling terminal apparatus |
US9862149B2 (en) | 2014-08-29 | 2018-01-09 | Microsoft Technology Licensing, Llc | Print bureau interface for three-dimensional printing |
US9129355B1 (en) * | 2014-10-09 | 2015-09-08 | State Farm Mutual Automobile Insurance Company | Method and system for assessing damage to infrastructure |
JP6387782B2 (en) * | 2014-10-17 | 2018-09-12 | ソニー株式会社 | Control device, control method, and computer program |
US10417076B2 (en) | 2014-12-01 | 2019-09-17 | Uptake Technologies, Inc. | Asset health score |
WO2016106715A1 (en) * | 2014-12-31 | 2016-07-07 | SZ DJI Technology Co., Ltd. | Selective processing of sensor data |
US20160249021A1 (en) * | 2015-02-23 | 2016-08-25 | Industrial Technology Group, LLC | 3d asset inspection |
CN105171742B (en) | 2015-07-20 | 2016-06-22 | 南京工业大学 | 3D printing and welding method using multi-degree-of-freedom robot |
CN105154870B (en) | 2015-09-01 | 2018-01-23 | 广东工业大学 | A kind of metal parts Stress Control 3D printing reproducing method |
US9711851B1 (en) * | 2016-02-04 | 2017-07-18 | Proxy Technologies, Inc. | Unmanned vehicle, system and method for transmitting signals |
US20170345317A1 (en) | 2016-05-24 | 2017-11-30 | Sharper Shape Oy | Dynamic routing based on captured data quality |
US10184813B2 (en) * | 2016-11-09 | 2019-01-22 | The Boeing Company | System and method for performing an automated inspection operation |
US11099581B2 (en) * | 2017-09-28 | 2021-08-24 | Gopro, Inc. | Position-based control of unmanned aerial vehicles |
2017
- 2017-03-29 US US15/473,345 patent/US10618168B2/en active Active
- 2017-03-29 US US15/473,384 patent/US10518411B2/en active Active
- 2017-05-02 US US15/584,995 patent/US20170329307A1/en not_active Abandoned
- 2017-05-12 CA CA3023722A patent/CA3023722A1/en not_active Abandoned
- 2017-05-12 WO PCT/US2017/032329 patent/WO2018026409A2/en unknown
- 2017-05-12 BR BR112018072406-8A patent/BR112018072406A2/en not_active IP Right Cessation
- 2017-05-12 EP EP17822058.8A patent/EP3455803A2/en not_active Withdrawn
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070050221A1 (en) * | 2005-08-29 | 2007-03-01 | Abtar Singh | Dispatch management model |
US20160307377A1 (en) * | 2015-04-17 | 2016-10-20 | Snecma | System and method for maintaining an aircraft engine |
Cited By (91)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11353852B2 (en) | 2016-05-09 | 2022-06-07 | Strong Force Iot Portfolio 2016, Llc | Method and system of modifying a data collection trajectory for pumps and fans |
US11366455B2 (en) | 2016-05-09 | 2022-06-21 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for optimization of data collection and storage using 3rd party data from a data marketplace in an industrial internet of things environment |
US12099911B2 (en) | 2016-05-09 | 2024-09-24 | Strong Force loT Portfolio 2016, LLC | Systems and methods for learning data patterns predictive of an outcome |
US12079701B2 (en) | 2016-05-09 | 2024-09-03 | Strong Force Iot Portfolio 2016, Llc | System, methods and apparatus for modifying a data collection trajectory for conveyors |
US12039426B2 (en) | 2016-05-09 | 2024-07-16 | Strong Force Iot Portfolio 2016, Llc | Methods for self-organizing data collection, distribution and storage in a distribution environment |
US11996900B2 (en) | 2016-05-09 | 2024-05-28 | Strong Force Iot Portfolio 2016, Llc | Systems and methods for processing data collected in an industrial environment using neural networks |
US11838036B2 (en) | 2016-05-09 | 2023-12-05 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for detection in an industrial internet of things data collection environment |
US11836571B2 (en) | 2016-05-09 | 2023-12-05 | Strong Force Iot Portfolio 2016, Llc | Systems and methods for enabling user selection of components for data collection in an industrial environment |
US11797821B2 (en) | 2016-05-09 | 2023-10-24 | Strong Force Iot Portfolio 2016, Llc | System, methods and apparatus for modifying a data collection trajectory for centrifuges |
US11791914B2 (en) | 2016-05-09 | 2023-10-17 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for detection in an industrial Internet of Things data collection environment with a self-organizing data marketplace and notifications for industrial processes |
US11086311B2 (en) | 2016-05-09 | 2021-08-10 | Strong Force Iot Portfolio 2016, Llc | Systems and methods for data collection having intelligent data collection bands |
US11119473B2 (en) | 2016-05-09 | 2021-09-14 | Strong Force Iot Portfolio 2016, Llc | Systems and methods for data collection and processing with IP front-end signal conditioning |
US11126171B2 (en) | 2016-05-09 | 2021-09-21 | Strong Force Iot Portfolio 2016, Llc | Methods and systems of diagnosing machine components using neural networks and having bandwidth allocation |
US11774944B2 (en) | 2016-05-09 | 2023-10-03 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for the industrial internet of things |
US11137752B2 (en) | 2016-05-09 | 2021-10-05 | Strong Force loT Portfolio 2016, LLC | Systems, methods and apparatus for data collection and storage according to a data storage profile |
US11770196B2 (en) | 2016-05-09 | 2023-09-26 | Strong Force TX Portfolio 2018, LLC | Systems and methods for removing background noise in an industrial pump environment |
US11169511B2 (en) | 2016-05-09 | 2021-11-09 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for network-sensitive data collection and intelligent process adjustment in an industrial environment |
US11755878B2 (en) | 2016-05-09 | 2023-09-12 | Strong Force Iot Portfolio 2016, Llc | Methods and systems of diagnosing machine components using analog sensor data and neural network |
US11181893B2 (en) | 2016-05-09 | 2021-11-23 | Strong Force Iot Portfolio 2016, Llc | Systems and methods for data communication over a plurality of data paths |
US11194319B2 (en) | 2016-05-09 | 2021-12-07 | Strong Force Iot Portfolio 2016, Llc | Systems and methods for data collection in a vehicle steering system utilizing relative phase detection |
US11194318B2 (en) | 2016-05-09 | 2021-12-07 | Strong Force Iot Portfolio 2016, Llc | Systems and methods utilizing noise analysis to determine conveyor performance |
US11728910B2 (en) | 2016-05-09 | 2023-08-15 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for detection in an industrial internet of things data collection environment with expert systems to predict failures and system state for slow rotating components |
US11360459B2 (en) | 2016-05-09 | 2022-06-14 | Strong Force Iot Portfolio 2016, Llc | Method and system for adjusting an operating parameter in a marginal network |
US11243528B2 (en) | 2016-05-09 | 2022-02-08 | Strong Force Iot Portfolio 2016, Llc | Systems and methods for data collection utilizing adaptive scheduling of a multiplexer |
US11243522B2 (en) | 2016-05-09 | 2022-02-08 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for detection in an industrial Internet of Things data collection environment with intelligent data collection and equipment package adjustment for a production line |
US11243521B2 (en) | 2016-05-09 | 2022-02-08 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for data collection in an industrial environment with haptic feedback and data communication and bandwidth control |
US11256242B2 (en) | 2016-05-09 | 2022-02-22 | Strong Force Iot Portfolio 2016, Llc | Methods and systems of chemical or pharmaceutical production line with self organizing data collectors and neural networks |
US11256243B2 (en) | 2016-05-09 | 2022-02-22 | Strong Force loT Portfolio 2016, LLC | Methods and systems for detection in an industrial Internet of Things data collection environment with intelligent data collection and equipment package adjustment for fluid conveyance equipment |
US11262737B2 (en) | 2016-05-09 | 2022-03-01 | Strong Force Iot Portfolio 2016, Llc | Systems and methods for monitoring a vehicle steering system |
US11663442B2 (en) | 2016-05-09 | 2023-05-30 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for detection in an industrial Internet of Things data collection environment with intelligent data management for industrial processes including sensors |
US11269319B2 (en) | 2016-05-09 | 2022-03-08 | Strong Force Iot Portfolio 2016, Llc | Methods for determining candidate sources of data collection |
US11281202B2 (en) | 2016-05-09 | 2022-03-22 | Strong Force Iot Portfolio 2016, Llc | Method and system of modifying a data collection trajectory for bearings |
US11646808B2 (en) | 2016-05-09 | 2023-05-09 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for adaption of data storage and communication in an internet of things downstream oil and gas environment |
US11609552B2 (en) | 2016-05-09 | 2023-03-21 | Strong Force Iot Portfolio 2016, Llc | Method and system for adjusting an operating parameter on a production line |
US11307565B2 (en) | 2016-05-09 | 2022-04-19 | Strong Force Iot Portfolio 2016, Llc | Method and system of a noise pattern data marketplace for motors |
US11609553B2 (en) | 2016-05-09 | 2023-03-21 | Strong Force Iot Portfolio 2016, Llc | Systems and methods for data collection and frequency evaluation for pumps and fans |
US11327475B2 (en) | 2016-05-09 | 2022-05-10 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for intelligent collection and analysis of vehicle data |
US11334063B2 (en) | 2016-05-09 | 2022-05-17 | Strong Force Iot Portfolio 2016, Llc | Systems and methods for policy automation for a data collection system |
US11340589B2 (en) | 2016-05-09 | 2022-05-24 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for detection in an industrial Internet of Things data collection environment with expert systems diagnostics and process adjustments for vibrating components |
US11347205B2 (en) | 2016-05-09 | 2022-05-31 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for network-sensitive data collection and process assessment in an industrial environment |
US11347215B2 (en) | 2016-05-09 | 2022-05-31 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for detection in an industrial internet of things data collection environment with intelligent management of data selection in high data volume data streams |
US11347206B2 (en) | 2016-05-09 | 2022-05-31 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for data collection in a chemical or pharmaceutical production process with haptic feedback and control of data communication |
US11353850B2 (en) | 2016-05-09 | 2022-06-07 | Strong Force Iot Portfolio 2016, Llc | Systems and methods for data collection and signal evaluation to determine sensor status |
US11353851B2 (en) | 2016-05-09 | 2022-06-07 | Strong Force Iot Portfolio 2016, Llc | Systems and methods of data collection monitoring utilizing a peak detection circuit |
US11507075B2 (en) | 2016-05-09 | 2022-11-22 | Strong Force Iot Portfolio 2016, Llc | Method and system of a noise pattern data marketplace for a power station |
US11586181B2 (en) | 2016-05-09 | 2023-02-21 | Strong Force Iot Portfolio 2016, Llc | Systems and methods for adjusting process parameters in a production environment |
US11221613B2 (en) | 2016-05-09 | 2022-01-11 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for noise detection and removal in a motor |
US11366456B2 (en) | 2016-05-09 | 2022-06-21 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for detection in an industrial internet of things data collection environment with intelligent data management for industrial processes including analog sensors |
US11372395B2 (en) | 2016-05-09 | 2022-06-28 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for detection in an industrial Internet of Things data collection environment with expert systems diagnostics for vibrating components |
US11372394B2 (en) | 2016-05-09 | 2022-06-28 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for detection in an industrial internet of things data collection environment with self-organizing expert system detection for complex industrial, chemical process |
US11378938B2 (en) | 2016-05-09 | 2022-07-05 | Strong Force Iot Portfolio 2016, Llc | System, method, and apparatus for changing a sensed parameter group for a pump or fan |
US11385622B2 (en) | 2016-05-09 | 2022-07-12 | Strong Force Iot Portfolio 2016, Llc | Systems and methods for characterizing an industrial system |
US11385623B2 (en) | 2016-05-09 | 2022-07-12 | Strong Force Iot Portfolio 2016, Llc | Systems and methods of data collection and analysis of data from a plurality of monitoring devices |
US11392109B2 (en) | 2016-05-09 | 2022-07-19 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for data collection in an industrial refining environment with haptic feedback and data storage control |
US11392111B2 (en) | 2016-05-09 | 2022-07-19 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for intelligent data collection for a production line |
US11586188B2 (en) | 2016-05-09 | 2023-02-21 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for a data marketplace for high volume industrial processes |
US11397421B2 (en) | 2016-05-09 | 2022-07-26 | Strong Force Iot Portfolio 2016, Llc | Systems, devices and methods for bearing analysis in an industrial environment |
US11397422B2 (en) | 2016-05-09 | 2022-07-26 | Strong Force Iot Portfolio 2016, Llc | System, method, and apparatus for changing a sensed parameter group for a mixer or agitator |
US11402826B2 (en) | 2016-05-09 | 2022-08-02 | Strong Force Iot Portfolio 2016, Llc | Methods and systems of industrial production line with self organizing data collectors and neural networks |
US11409266B2 (en) | 2016-05-09 | 2022-08-09 | Strong Force Iot Portfolio 2016, Llc | System, method, and apparatus for changing a sensed parameter group for a motor |
US11415978B2 (en) | 2016-05-09 | 2022-08-16 | Strong Force Iot Portfolio 2016, Llc | Systems and methods for enabling user selection of components for data collection in an industrial environment |
US11573558B2 (en) | 2016-05-09 | 2023-02-07 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for sensor fusion in a production line environment |
US11507064B2 (en) | 2016-05-09 | 2022-11-22 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for industrial internet of things data collection in downstream oil and gas environment |
US11493903B2 (en) | 2016-05-09 | 2022-11-08 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for a data marketplace in a conveyor environment |
US10597054B2 (en) * | 2016-12-15 | 2020-03-24 | Progress Rail Locomotive Inc. | Real-time drone infrared inspection of moving train |
US10452078B2 (en) * | 2017-05-10 | 2019-10-22 | General Electric Company | Self-localized mobile sensor network for autonomous robotic inspection |
US10782665B2 (en) * | 2017-06-30 | 2020-09-22 | Cattron North America, Inc. | Wireless emergency stop systems, and corresponding methods of operating a wireless emergency stop system for a machine safety interface |
US20190004489A1 (en) * | 2017-06-30 | 2019-01-03 | Laird Technologies, Inc. | Wireless emergency stop systems, and corresponding methods of operating a wireless emergency stop system for a machine safety interface |
US11073811B2 (en) | 2017-06-30 | 2021-07-27 | Cattron North America, Inc. | Wireless emergency stop systems, and corresponding methods of operating a wireless emergency stop system for a machine safety interface |
US11126173B2 (en) | 2017-08-02 | 2021-09-21 | Strong Force Iot Portfolio 2016, Llc | Data collection systems having a self-sufficient data acquisition box |
US11442445B2 (en) | 2017-08-02 | 2022-09-13 | Strong Force Iot Portfolio 2016, Llc | Data collection systems and methods with alternate routing of input channels |
US11397428B2 (en) | 2017-08-02 | 2022-07-26 | Strong Force Iot Portfolio 2016, Llc | Self-organizing systems and methods for data collection |
US11144047B2 (en) | 2017-08-02 | 2021-10-12 | Strong Force Iot Portfolio 2016, Llc | Systems for data collection and self-organizing storage including enhancing resolution |
US11175653B2 (en) | 2017-08-02 | 2021-11-16 | Strong Force Iot Portfolio 2016, Llc | Systems for data collection and storage including network evaluation and data storage profiles |
US11315272B2 (en) * | 2017-08-24 | 2022-04-26 | General Electric Company | Image and video capture architecture for three-dimensional reconstruction |
US20190161103A1 (en) * | 2017-11-28 | 2019-05-30 | Westinghouse Air Brake Technologies Corporation | System, Method, and Computer Program Product for Automatic Inspection of a Train |
US10913154B2 (en) | 2018-01-02 | 2021-02-09 | General Electric Company | Systems and method for robotic learning of industrial tasks based on human demonstration |
WO2019135835A1 (en) * | 2018-01-02 | 2019-07-11 | General Electric Company | Systems and method for robotic learning of industrial tasks based on human demonstration |
US11645925B2 (en) * | 2018-05-03 | 2023-05-09 | Arkidan Systems Inc. | Computer-assisted aerial surveying and navigation |
US11670178B2 (en) * | 2018-05-03 | 2023-06-06 | Arkidan Systems Inc. | Computer-assisted aerial surveying and navigation |
US10970586B2 (en) * | 2018-06-28 | 2021-04-06 | General Electric Company | Systems and methods of 3D scene segmentation and matching for robotic operations |
EP3588405A1 (en) * | 2018-06-29 | 2020-01-01 | Tata Consultancy Services Limited | Systems and methods for scheduling a set of non-preemptive tasks in a multi-robot environment |
US11261797B2 (en) | 2018-11-05 | 2022-03-01 | General Electric Company | System and method for cleaning, restoring, and protecting gas turbine engine components |
US11300543B2 (en) | 2019-11-13 | 2022-04-12 | Honeywell International Inc. | Anomaly and fault detection of industrial assets using magnetic mapping |
US11809200B1 (en) * | 2019-12-06 | 2023-11-07 | Florida A&M University | Machine learning based reconfigurable mobile agents using swarm system manufacturing |
US11693381B2 (en) | 2020-01-24 | 2023-07-04 | Cattron North America, Inc. | Wireless emergency stop systems including mobile device controllers linked with safety stop devices |
US11487263B2 (en) | 2020-01-24 | 2022-11-01 | Cattron North America, Inc. | Wireless emergency stop systems including mobile device controllers linked with safety stop devices |
US20220001549A1 (en) * | 2020-07-06 | 2022-01-06 | International Business Machines Corporation | Managing shared robots in a data center |
US11555413B2 (en) | 2020-09-22 | 2023-01-17 | General Electric Company | System and method for treating an installed and assembled gas turbine engine |
US20220092234A1 (en) * | 2020-09-23 | 2022-03-24 | International Business Machines Corporation | Detection of defects within physical infrastructure by leveraging ai |
US11651119B2 (en) * | 2020-09-23 | 2023-05-16 | International Business Machines Corporation | Detection of defects within physical infrastructure by leveraging AI |
Also Published As
Publication number | Publication date |
---|---|
WO2018026409A2 (en) | 2018-02-08 |
US10618168B2 (en) | 2020-04-14 |
CA3023722A1 (en) | 2018-02-08 |
US20170329297A1 (en) | 2017-11-16 |
BR112018072406A2 (en) | 2019-02-19 |
EP3455803A2 (en) | 2019-03-20 |
US10518411B2 (en) | 2019-12-31 |
WO2018026409A3 (en) | 2018-03-15 |
US20170326729A1 (en) | 2017-11-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10618168B2 (en) | Robot system path planning for asset health management | |
US10777004B2 (en) | System and method for generating three-dimensional robotic inspection plan | |
US10633093B2 (en) | Three-dimensional robotic inspection system | |
US10452078B2 (en) | Self-localized mobile sensor network for autonomous robotic inspection | |
US11145051B2 (en) | Three-dimensional modeling of an object | |
US20220169381A1 (en) | Deep learning-based localization of uavs with respect to nearby pipes | |
KR102675315B1 (en) | Systems and methods for automatically inspecting surfaces | |
Bonnin-Pascual et al. | On the use of robots and vision technologies for the inspection of vessels: A survey on recent advances | |
US10682677B2 (en) | System and method providing situational awareness for autonomous asset inspection robot monitor | |
EP2625105B1 (en) | Automated visual inspection system | |
JP5697592B2 (en) | System and method for inspection of structures and objects by remote unmanned transport means | |
US11661190B2 (en) | Rapid aircraft inspection with autonomous drone base station systems | |
EP3850456B1 (en) | Control and navigation systems, pose optimisation, mapping, and localisation techniques | |
GB2552092A (en) | Inspection system and method for automatic visual inspection of a motor vehicle | |
Donadio et al. | Artificial intelligence and collaborative robot to improve airport operations | |
Shim et al. | Remote robotic system for 3D measurement of concrete damage in tunnel with ground vehicle and manipulator | |
GB2581403A (en) | Pose optimisation, mapping, and localisation techniques | |
Rodríguez et al. | Inspection of aircrafts and airports using UAS: a review | |
Ameli et al. | Impact of UAV Hardware Options on Bridge Inspection Mission Capabilities. Drones 2022, 6, 64 | |
Ortiz et al. | New Steps towards the Integration of Robotic and Autonomous Systems in the Inspection of Vessel Holds | |
Bibuli et al. | The minoas project: Marine inspection robotic assistant system | |
US20230242279A1 (en) | Automated method and system for aircraft inspection with data validation | |
Futterlieb et al. | Air-Cobot: Aircraft Enhanced Inspection by Smart and Collaborative Robot | |
Sun et al. | Drone-based Automated Exterior Inspection of an Aircraft using Reinforcement Learning Technique | |
Ortiz et al. | Visual Inspection of Vessels Cargo Holds: Use of a Micro-Aerial Vehicle as a Smart Assistant |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GENERAL ELECTRIC COMPANY, NEW YORK Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CASTILLO-EFFEN, MAURICIO;ABATE, VICTOR ROBERT;LIZZI, JOHN MICHAEL, JR.;AND OTHERS;SIGNING DATES FROM 20161116 TO 20170110;REEL/FRAME:042222/0616 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |