GB2589418A - Fabric maintenance system and method of use - Google Patents
- Publication number
- GB2589418A (application GB2012374.1A / GB202012374A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- work
- plan
- sensor
- manipulator assembly
- robotic manipulator
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B05—SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
- B05B—SPRAYING APPARATUS; ATOMISING APPARATUS; NOZZLES
- B05B13/00—Machines or plants for applying liquids or other fluent materials to surfaces of objects or other work by spraying, not covered by groups B05B1/00 - B05B11/00
- B05B13/005—Machines or plants for applying liquids or other fluent materials to surfaces of objects or other work by spraying, not covered by groups B05B1/00 - B05B11/00 mounted on vehicles or designed to apply a liquid on a very large surface, e.g. on the road, on the surface of large containers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B05—SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
- B05B—SPRAYING APPARATUS; ATOMISING APPARATUS; NOZZLES
- B05B13/00—Machines or plants for applying liquids or other fluent materials to surfaces of objects or other work by spraying, not covered by groups B05B1/00 - B05B11/00
- B05B13/02—Means for supporting work; Arrangement or mounting of spray heads; Adaptation or arrangement of means for feeding work
- B05B13/04—Means for supporting work; Arrangement or mounting of spray heads; Adaptation or arrangement of means for feeding work the spray heads being moved during spraying operation
- B05B13/0431—Means for supporting work; Arrangement or mounting of spray heads; Adaptation or arrangement of means for feeding work the spray heads being moved during spraying operation with spray heads moved by robots or articulated arms, e.g. for applying liquid or other fluent material to 3D-surfaces
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/005—Manipulators for mechanical processing tasks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/0075—Manipulators for painting or coating
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/088—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
- B25J13/089—Determining the position of the robot with reference to its environment
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
- B25J5/005—Manipulators mounted on wheels or on carriages mounted on endless tracks or belts
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
- B25J5/02—Manipulators mounted on wheels or on carriages travelling along a guideway
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
- B25J9/1666—Avoiding collision or forbidden zones
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0221—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0248—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B05—SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
- B05B—SPRAYING APPARATUS; ATOMISING APPARATUS; NOZZLES
- B05B12/00—Arrangements for controlling delivery; Arrangements for controlling the spray area
- B05B12/08—Arrangements for controlling delivery; Arrangements for controlling the spray area responsive to condition of liquid or other fluent material to be discharged, of ambient medium or of target ; responsive to condition of spray devices or of supply means, e.g. pipes, pumps or their drive means
- B05B12/084—Arrangements for controlling delivery; Arrangements for controlling the spray area responsive to condition of liquid or other fluent material to be discharged, of ambient medium or of target ; responsive to condition of spray devices or of supply means, e.g. pipes, pumps or their drive means responsive to condition of liquid or other fluent material already sprayed on the target, e.g. coating thickness, weight or pattern
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B24—GRINDING; POLISHING
- B24C—ABRASIVE OR RELATED BLASTING WITH PARTICULATE MATERIAL
- B24C1/00—Methods for use of abrasive blasting for producing particular effects; Use of auxiliary equipment in connection with such methods
- B24C1/003—Methods for use of abrasive blasting for producing particular effects; Use of auxiliary equipment in connection with such methods using material which dissolves or changes phase after the treatment, e.g. ice, CO2
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B24—GRINDING; POLISHING
- B24C—ABRASIVE OR RELATED BLASTING WITH PARTICULATE MATERIAL
- B24C3/00—Abrasive blasting machines or devices; Plants
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45066—Inspection robot
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45082—Sanding robot, to clean surfaces
Abstract
Apparatus for industrial fabric maintenance of a work object 325 in a work location comprises a robotic manipulator assembly 311, a functional module 322 at an operative end of the robotic manipulator assembly 311, a sensor system and a processing module. The sensor system is operable to collect data relating to the work location and/or work object 325 and provide a data set to the processing module, and the processing module is operable to generate a plan for a fabric maintenance operation on a work area comprising at least a part of the work object 325. The plan comprises at least a head movement path plan 329 for the functional module in relation to the work object 325. The functional module 322 may be for inspection, surface treatment or coating. The surface treatment may be water jetting, dry ice blasting or abrasive blasting and the coating may be painting.
Description
FABRIC MAINTENANCE SYSTEM AND METHOD OF USE
The present invention relates to inspection, surface preparation and coating of objects and structures in an industrial environment. Aspects of the invention relate to apparatus and methods for industrial fabric maintenance of work objects and structures in a work location. The invention has particular application to the blasting and painting of steel surfaces.
Background to the invention
Large industrial complexes including oil and gas platforms, petrochemical plants and refineries contain significant areas of steel surfaces which are often exposed to corrosive environments and so must be regularly protected against the elements.
The inspection, surface preparation and coating of these structures (known as "fabric maintenance") requires an array of human labour, handheld tools, machines and safety and power equipment.
Approaches to fabric maintenance vary but rely on manual or remote-controlled methods of inspection, surface preparation and coating. With a reliance on human labour, asset owners often accumulate a backlog of required fabric maintenance which can equate to many years of required effort.
Technology development in this area has focussed to date on manual or remote-controlled equipment and does not address the labour challenge.
Summary of the invention
It is an object of an aspect of the present invention to obviate or at least mitigate the foregoing disadvantages of fabric maintenance prior art.
It is another object of an aspect of the present invention to provide an apparatus and method for industrial fabric maintenance of a work object in a work location in a safe and effective manner.
It is a further object of an aspect of the present invention to provide an apparatus capable of conducting industrial fabric maintenance of a work object reliably, remotely and/or autonomously.
Further aims of the invention will become apparent from the following description.
According to a first aspect of the invention there is provided an apparatus for industrial fabric maintenance of a work object in a work location, the apparatus comprising:
a robotic manipulator assembly;
a functional module at an operative end of the robotic manipulator assembly;
a sensor system;
and one or more processing modules;
wherein the sensor system is operable to collect data relating to the work location and/or work object and provide a data set to the processing module, and the processing module is operable to generate a plan for a fabric maintenance operation on a work area comprising at least a part of the work object;
and wherein the plan comprises at least a head movement path plan for the functional module in relation to the work object.
The industrial fabric maintenance of a work object in a work location may include inspection operations, surface preparation operations and/or coating operations.
The plan may comprise a robotic movement sequence plan for the robotic manipulator assembly. The robotic movement sequence plan may define the head movement path for the functional module. The plan may comprise a robot translation plan for the apparatus, which may define the displacement of the apparatus in the work location in relation to the work object and/or the work area.
The apparatus may be configured to present at least a part of the plan to an operator in a plan presentation, which may be a visual representation of the movement path plan in a virtual representation of the work location. The movement path plan may for example be represented as a line in a 3D representation of the work location.
The plan presentation may comprise a robotic movement sequence plan presentation, which may be a confirmation that the robotic movement sequence does not collide or otherwise interfere with items or structures in the work location. Alternatively, a robotic movement sequence plan presentation may comprise an animation of the robotic movement sequence plan in a 3D representation of the work location.
The plan presentation may comprise a robot translation plan presentation, which may be a confirmation that the robot translation does not collide or otherwise interfere with items or structures in the work location. Alternatively, a robot translation plan presentation may comprise an animation of the robot translation plan in a 3D representation of the work location.
Preferably the generation of the plan uses a planning algorithm. The planning algorithm preferably comprises a machine learning algorithm. The planning algorithm may be configured to optimise the plan in relation to one or more characteristics, including but not limited to: duration of operation; time of operation; duration of translation of apparatus; distance of apparatus from items or objects in the work location; user interventions required during the operation; user priorities.
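Purely as an illustration of how such an optimisation could be framed (the class, field names and weights below are assumptions introduced for this sketch, not features disclosed in the patent), candidate plans can be scored with a weighted cost over these characteristics and the lowest-cost plan retained:

```python
from dataclasses import dataclass

@dataclass
class CandidatePlan:
    # Hypothetical summary statistics produced by simulating one candidate plan.
    operation_duration_s: float      # time spent treating the work area
    translation_duration_s: float    # time spent repositioning the apparatus
    min_clearance_m: float           # closest approach to items in the work location
    operator_interventions: int      # manual interventions the plan would require

def plan_cost(plan, weights=(1.0, 2.0, 5.0, 10.0), min_safe_clearance_m=0.5):
    """Weighted cost over the characteristics the planner may optimise; lower is better."""
    w_op, w_move, w_clear, w_interv = weights
    clearance_penalty = max(0.0, min_safe_clearance_m - plan.min_clearance_m)
    return (w_op * plan.operation_duration_s
            + w_move * plan.translation_duration_s
            + w_clear * clearance_penalty
            + w_interv * plan.operator_interventions)

def select_plan(candidates):
    """Return the candidate plan with the lowest weighted cost."""
    return min(candidates, key=plan_cost)
```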
The apparatus may have the ability to show asset owners the planned path of the apparatus for the fabric maintenance operation. This may comprise a virtual representation of the environment to be worked in, including the area to be prepared and/or coated, with the manipulator and heads, and show how both the manipulator and the base of the robot (which may be fixed or could be mobile) will move. Key statistics may be calculated from simulations of operations, including surface area blasted and/or painted per unit time, number of base station moves, expected volume of paint, grit, dry ice etc., along with areas that the manipulator will not be able to cover (which may be flagged for a human to do).
The functional module may be selected from the group consisting of an inspection system, a surface treatment system, or a coating system.
The surface treatment system may be selected from the group consisting of a water jetting system, a dry ice blasting system, and an abrasive blasting system such as a grit blasting system.
The dry ice blasting, grit blasting, water or steam jetting systems may either be closed or open loop. The closed loop system may contain two or more hoses with at least one hose configured to project the dry ice, grit, steam or water medium onto the required surface and at least one hose acting as a return line for the residue dry ice, grit, steam or water medium and treated surface waste such as rust.
The coating device may be a spray paint device. The coating may be a paint. The paint may be a clear paint, a mica paint, a metallic paint, a water-based paint, a solvent-based paint and/or a multi-component paint.
The sensor system may comprise at least one sensor. The at least one sensor may be selected from the group consisting of camera, laser, a range sensor, ultrasonic sensor, spectrometer, wet film thickness sensor, load cell, inertial measurement unit, infra-red sensor, proximity sensor, inductive sensor or a lidar sensor.
The apparatus may be configured to undertake fabric maintenance tasks remotely and/or autonomously. The sensor system may be configured to enable real time feedback to assess the operation of the functional module and/or the quality of the fabric maintenance.
The sensor system may be configured to avoid obstacles and collisions with the external environment and personnel.
The one or more processing modules may be configured to process the collected data using at least two algorithms, wherein a first algorithm processes collected sensor data to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy and/or wherein a second algorithm may process the collected data to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy.
All or part of the data set may be processed in an algorithm to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy. All or part of the same data set may be processed in another algorithm to locate a robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy. The second resolution or accuracy is higher than the first resolution or accuracy.
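As a non-authoritative sketch of the coarse-then-fine structure described above (the function names, pose representation and correction step are assumptions for illustration only), a first low-accuracy estimate can be refined using a precise single-point distance measurement:

```python
import numpy as np

def coarse_localise(prior_pose):
    """First algorithm: low-accuracy estimate of the end effector pose relative to the
    work object, e.g. from camera features matched against a model or scan.
    Placeholder: simply passes through a prior 4x4 homogeneous transform."""
    return np.asarray(prior_pose, dtype=float)

def fine_localise(pose, measured_standoff_m, nominal_standoff_m):
    """Second algorithm: refines the coarse pose with a precise single-point distance
    measurement (e.g. from a laser), correcting the stand-off along the tool's z axis."""
    refined = pose.copy()
    approach_axis = pose[:3, 2]                     # tool z axis in the work frame
    error = measured_standoff_m - nominal_standoff_m
    refined[:3, 3] += approach_axis * error         # shift along the approach direction
    return refined

# Example: correct a pose so the head sits 300 mm from the surface,
# given a laser stand-off reading of 325 mm.
corrected = fine_localise(coarse_localise(np.eye(4)),
                          measured_standoff_m=0.325, nominal_standoff_m=0.300)
```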
The at least two algorithms may be considered to be at least two separate algorithms or at least two parts of the same algorithm. An algorithm is considered to be made up of at least one component where a core part of the algorithm may be used to complete an intended purpose of the algorithm. However, aspects of the core part of the algorithm may be augmented to improve the performance of the algorithm. The core part of the algorithm may be augmented by a second component of the same algorithm or by another algorithm.
An algorithm comprising two (or more) components, where one of those components could be used as part of a system independently of the second component to perform a task such as estimating the position of a work piece, is considered to be two algorithms.
An algorithm may have multiple components that work together to provide a single accuracy; each component may provide independent information that is complementary to the other components. For example, a core part of an algorithm may process small movements between each data frame that is captured (say at 30 frames per second) and another part of the algorithm may independently identify when the apparatus intersects with a previous position and closes a loop. Where a loop closure is detected, this may be used to improve the accuracy of the core part of the algorithm.
Multiple parts of one algorithm may be considered to be separate algorithms. For example, a first part of an algorithm which identifies a loop closure may be used independently of a second part of the algorithm which may be used to estimate the position of the workpiece. These two parts of the same algorithm may be used independently, albeit with lower accuracy, and may be considered as two different algorithms.
The sensor system may comprise two or more sensors. The sensor system data may be processed to localise the robotic manipulator assembly and/or the at least one functional module with respect to the work location and/or work object. The sensor data may be processed using algorithms to recognize a workpiece and estimate its most likely position. A model or a scan of the work location and/or work object may be used to estimate where the robotic manipulator assembly and/or functional module may be in relation to the work location and/or work object. This may be a probabilistic estimate which may be updated using information gathered from the sensor system.
Different sensors may be used to provide varying degrees of accuracy in localising the position of the robotic manipulator assembly and/or functional module. A first sensor may provide a high-level, low accuracy location. Alternatively and/or additionally a second sensor may provide fine alignment or a high accuracy location and may be capable of precision distance measurement to a point. The first and/or second sensors may be selected from the list consisting of camera, laser, a range sensor, ultrasonic sensor, spectrometer, wet film thickness sensor, load cell, inertial measurement unit, infra-red sensor, proximity sensor, inductive sensor or a lidar sensor.
The first and second sensors may both be cameras. Preferably the first sensor is a camera and the second sensor is a laser.
The sensor data may be used to create a probabilistic map of free and occupied volumes, which may be updated in real time. The sensor data may be used in a simulated environment for path planning to ensure that there are no collisions.
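A common way to realise such a probabilistic map of free and occupied volumes is a log-odds occupancy grid updated from each sensor sweep; the following minimal sketch assumes a voxel grid and illustrative update constants and is not taken from the patent:

```python
import numpy as np

class OccupancyGrid:
    """Minimal 3D log-odds occupancy grid (illustrative sketch)."""

    def __init__(self, shape=(100, 100, 100), l_occ=0.85, l_free=-0.4):
        self.log_odds = np.zeros(shape)   # 0.0 corresponds to probability 0.5 (unknown)
        self.l_occ, self.l_free = l_occ, l_free

    def update(self, hit_cells, free_cells):
        """hit_cells / free_cells: iterables of (i, j, k) voxel indices from one sweep."""
        for i, j, k in hit_cells:
            self.log_odds[i, j, k] += self.l_occ    # evidence of occupancy at the return
        for i, j, k in free_cells:
            self.log_odds[i, j, k] += self.l_free   # evidence of free space along the ray

    def occupancy_probability(self):
        return 1.0 - 1.0 / (1.0 + np.exp(self.log_odds))

    def is_occupied(self, threshold=0.7):
        """Boolean mask used e.g. for collision checking during path planning."""
        return self.occupancy_probability() > threshold
```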
The sensor system data may be processed to localise the robotic manipulator assembly and/or functional module with respect to the work location and/or work object.
The at least one processing module may comprise one or more localisation algorithms. The at least one processing module may be configured to process sensor data using one or more localisation algorithms to recognize a work object, estimate its position and/or align the robotic assembly with the work object.
The at least one processing module may comprise two or more localisation algorithms. The at least one processing module may be configured to process sensor data from one or more sensors. A first localisation algorithm may be configured to align the robotic assembly roughly or approximately with the work object. A second localisation algorithm may be configured to provide finer alignment or more accurately align the robotic assembly with the work object.
The apparatus may be configured to enable the user to select paths for the separated clusters or parts of a surface to be treated. The at least one processing module may provide a reference plan which intersects with the separated parts of the surface, which may be separated by edges. The at least one processing module may generate neighbouring paths by calculating a path that is a constant (user-defined) distance from the first path.
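Assuming a locally planar surface patch, a neighbouring path at a constant user-defined distance from the first path can be generated by offsetting each way-point along the in-plane normal of the path direction; the sketch below is only an illustration under that assumption:

```python
import numpy as np

def offset_path(reference_path, offset_m):
    """Offset a planar polyline path sideways by a constant distance.

    reference_path: (N, 2) ordered (x, y) way-points of the first path on the surface.
    offset_m: user-defined spacing between neighbouring passes (e.g. spray overlap).
    """
    reference_path = np.asarray(reference_path, dtype=float)
    # Tangent direction at each way-point (central differences at interior points).
    tangents = np.gradient(reference_path, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    # In-plane normal is the tangent rotated by 90 degrees.
    normals = np.stack([-tangents[:, 1], tangents[:, 0]], axis=1)
    return reference_path + offset_m * normals

# Example: three parallel passes 50 mm apart from a straight reference path.
first_pass = np.array([[0.0, 0.0], [0.5, 0.0], [1.0, 0.0]])
second_pass = offset_path(first_pass, 0.05)
third_pass = offset_path(first_pass, 0.10)
```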
The work object may be located in a booth. The fabric maintenance operation may be conducted in a paint and/or blast booth. The booth may be a sealed area that allows inspection, painting or blasting. In the case of blasting the booth may allow open blasting (grit fired at objects to remove rust and other contaminants). The booth may have a grit recovery system that allows the grit to be reused. The booth may comprise a standard 20ft or 40ft container.
According to a second aspect of the invention there is provided a method for a fabric maintenance operation of a work object in a work location, the method comprising:
providing a fabric maintenance apparatus comprising:
a robotic manipulator assembly;
a functional module at an operative end of the robotic manipulator assembly;
a sensor system;
and one or more processing modules;
collecting data relating to the work location and/or work object and providing a data set to the processing module;
generating a plan for a fabric maintenance operation on a work area comprising at least a part of the work object; and
carrying out the fabric maintenance operation on the work area in accordance with the plan.
The method may comprise carrying out the fabric maintenance operation of a work object in accordance with the plan in response to an approval signal input by an operator. The plan may comprise at least a movement path plan for the functional module in relation to the work object. The method may comprise presenting the plan to an operator for approval of the plan.
The method may comprise using an existing 3D model or generating a 3D model of the work location and/or work object. The method may comprise selecting, on at least a part of the 3D model, the work area to be worked on.
The method may comprise using at least one algorithm to generate the path for the robotic manipulator assembly or a functional module at an operative end of the robotic manipulator assembly. The method may comprise performing a process simulation to check for possible collisions or to assess the efficiency of the plan.
The method may comprise collecting data from at least one sensor located on the robotic manipulator assembly and/or a functional module during the fabric maintenance operation to accurately position the robotic manipulator assembly and/or a functional module.
The method may comprise, subsequent to carrying out the fabric maintenance operation on the work area, collecting data relating to the performance of the operation over the work area, and identifying parts of the work area that require a further operation. The method may comprise generating a second plan for carrying out a further fabric maintenance on the work area. For example, the method may comprise inspecting the quality of the operation and identifying areas that require a repeat operation.
Embodiments of the second aspect of the invention may include one or more features of the first aspect of the invention or its embodiments, or vice versa.
According to a third aspect of the invention there is provided a method for planning a fabric maintenance operation of a work object in a work location, the method comprising:
providing a fabric maintenance apparatus comprising:
a robotic manipulator assembly;
a functional module at an operative end of the robotic manipulator assembly;
a sensor system;
and one or more processing modules;
collecting data relating to the work location and/or work object and providing a data set to the processing module;
generating a plan for a fabric maintenance operation on a work area comprising at least a part of the work object; and
presenting the plan to an operator of the system for approval of the plan.
The method may comprise adjusting or changing the plan. The plan may be adjusted to the operator's specifications. The plan may be adjusted to suit the client objectives or to adapt to changes in the environment where the work is to be performed.
The method may comprise carrying out the fabric maintenance operation on the work area in accordance with the approved plan.
Embodiments of the third aspect of the invention may include one or more features of the first or second aspects of the invention or their embodiments, or vice versa.
According to a fourth aspect of the invention there is provided a method for planning a fabric maintenance operation of a work object in a work location, the method comprising:
providing a sensor system;
and one or more processing modules;
collecting data relating to the work location and/or work object and providing a data set to the processing module, and
generating a plan for a fabric maintenance operation on a work area comprising at least a part of the work object.
The method may comprise presenting the plan to an operator of the system for approval of the plan. The method may comprise providing a fabric maintenance apparatus comprising a robotic manipulator assembly. The method may comprise a functional module at an operative end of the robotic manipulator assembly.
The method may comprise generating a plan for manoeuvring a robotic manipulator assembly or a functional module at an operative end of the robotic manipulator assembly to conduct a fabric maintenance operation of a work object in a work location.
The method may comprise generating a 3D model of the work location and/or work object. The method may comprise selecting, on at least a part of the 3D model, the work area to be worked on. The method may comprise using an algorithm to generate a path for the robotic manipulator assembly or a functional module at an operative end of the robotic manipulator assembly. The method may comprise performing a process simulation to check for possible collisions or assess the efficiency of the plan.
The method may comprise communicating the plan to the robotic manipulator assembly. The method may comprise carrying out the fabric maintenance operation on the work area in accordance with the plan.
The method may comprise adjusting the plan. The plan may be adjusted in real time or repeated after the plan has been executed.
The method may comprise, subsequent to carrying out the fabric maintenance operation on the work area, collecting data relating to the performance of the operation over the work area, and identifying parts of the work area that require a further operation. The method may comprise generating a second plan for carrying out a further fabric maintenance on the work area. For example, the method may comprise inspecting the quality of the operation and identifying areas that require a repeat operation.
The method may comprise collecting data from sensors located on the robotic manipulator assembly or a functional module at an operative end of the robotic manipulator assembly during or after the fabric maintenance operation. The plan may be adjusted based on the analysis of the collected data from the sensors located on the robotic manipulator assembly or a functional module at an operative end of the robotic manipulator assembly.
The sensor system may be located on the robotic manipulator assembly. The sensor system may be a handheld sensor system.
Embodiments of the fourth aspect of the invention may include one or more features of the first to third aspects of the invention or their embodiments, or vice versa.
According to a fifth aspect of the invention, there is provided an apparatus for industrial fabric maintenance of a work object in a work location, the apparatus comprising:
a robotic manipulator assembly;
a sensor system operable to collect data relating to the work location and/or work object;
and a functional module at an operative end of the robotic manipulator assembly;
wherein the functional module comprises:
a first inlet and a first outlet for a surface treatment medium;
a first conduit between the first inlet and first outlet;
and a connector for removably connecting the functional module to the robotic manipulator assembly such that the first inlet is coupled to a source of the surface treatment medium;
wherein the functional module comprises a shape or form selected for delivery of the surface treatment medium in dependence on the geometry of a work area on the work object.
The apparatus may be configured to position and/or move the functional module according to a plan. The plan may comprise a robotic movement sequence plan for the robotic manipulator assembly. The robotic movement sequence plan may define the head movement path for the functional module.
The functional module may comprise a surface preparation head, and the surface treatment medium may be a surface preparation medium. The functional module may comprise a surface coating head, and the surface treatment medium may be a surface coating medium. The surface coating medium may comprise a paint.
The shape or form of the functional module may comprise a head surface profile, which may be configured to be presented to the work area of the work object, and which may be configured to engage or otherwise interact with the work area of the work object.
The head surface profile may comprise a substantially flat or flat planar surface configured to be presented to the work area of the work object. Such a head surface profile may be particularly suitable for a substantially flat or flat planar work area surface. The head surface profile may comprise a concave surface configured to be presented to the work area of the work object. Such a head surface profile may be particularly suitable for a convex work area surface. The head surface profile may comprise a convex surface configured to be presented to the work area of the work object. Such a head surface profile may be particularly suitable for a concave work area surface.
The head surface profile may comprise a cylindrical or part-cylindrical surface configured to be presented to the work area of the work object. Such a head surface profile may be particularly suitable for a cylindrical or part-cylindrical work area surface, for example the surface of a pipe.
The head surface profile may comprise a surface configured to be presented to the work area of the work object that is curved with respect to two orthogonal axes.
The head surface profile may comprise one or more surface projections configured to be presented to the work area of the work object. Such a head surface profile may be particularly suitable for a recess, groove, or relief in a work area surface.
The apparatus may comprise one or more sensors. The one or more sensors may be mounted on the functional module, such that they are removably connected to the robotic manipulator assembly with the functional module.
Alternatively, the one or more sensors may be mounted on the robotic manipulator assembly, for example such that they remain on the robotic manipulator assembly when the functional module is removed. Where the apparatus comprises a plurality of sensors, a first subset of the sensors may be mounted on the functional module, and a second subset of sensors may be mounted on the robotic manipulator assembly.
The one or more sensors may form a part of a sensor system. The sensor system may comprise a first optical system for collecting a first data set relating to a work location and/or work object and may comprise a second optical system for collecting a second data set relating to a work location and/or work object. The sensor system may comprise at least one processing module for processing the first and second data sets.
The first optical system may comprise an optical camera, and the first data set may comprise camera imaging data. The second optical system may comprise a laser positioning system, and the second data set may comprise laser positioning data.
Preferably, the sensor system is operable to process the first data set to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy. More preferably, the sensor system is operable to process the second data set to locate a robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy, wherein the second resolution or accuracy is higher than the first resolution or accuracy.
Embodiments of the fifth aspect of the invention may include one or more features of the first to fourth aspects of the invention or their embodiments, or vice versa.
According to a sixth aspect of the invention, there is provided a functional module for removable connection to the apparatus of the fifth aspect of the invention.
The apparatus may be configured to position and/or move the functional module according to a plan. The plan may comprise a robotic movement sequence plan for the robotic manipulator assembly. The robotic movement sequence plan may define the head movement path for the functional module.
Embodiments of the sixth aspect of the invention may include one or more features of the first to fifth aspects of the invention or their embodiments, or vice versa.
According to a seventh aspect of the invention, there is provided a modular system of components comprising the apparatus of the first aspect of the invention and a plurality of functional modules interchangeable on the robotic manipulator assembly of the apparatus, wherein each functional module comprises a shape or form selected for delivery of a surface treatment medium in dependence on the geometry of a work area on the work object.
Embodiments of the seventh aspect of the invention may include one or more features of the first to sixth aspects of the invention or their embodiments, or vice versa.
According to an eighth aspect of the invention, there is provided a method of performing a fabric maintenance operation using the apparatus according to the first or fifth aspects of the invention or the system according to the sixth aspect of the invention, the method comprising removing a first functional module from the apparatus, and connecting a second functional module to the apparatus.
Embodiments of the eighth aspect of the invention may include one or more features of the first to seventh aspects of the invention or their embodiments, or vice versa.
According to a ninth aspect of the invention there is provided a system for industrial fabric maintenance of a work object in a work location, the system comprising:
a robotic manipulator assembly;
a functional module at an operative end of the robotic manipulator assembly;
a sensor system;
and one or more processing modules;
wherein the sensor system is operable to collect data relating to the work location and/or work object and provide a data set to the processing module, and the processing module is operable to generate a plan for a fabric maintenance operation on a work area comprising at least a part of the work object;
and wherein the plan comprises at least a head movement path plan for the functional module in relation to the work object.
Embodiments of the ninth aspect of the invention may include one or more features of the first to eighth aspects of the invention or their embodiments, or vice versa.
Brief description of the drawings
There will now be described, by way of example only, various embodiments of the invention with reference to the drawings, of which:
Figures 1A and 1B are a sketch and a schematic view of a fabric maintenance system for inspection, surface preparation and coating of steel structures according to the invention;
Figure 2 is a schematic view of different elements of a sensor system which may be incorporated into the fabric maintenance system of Figure 1A;
Figures 3A and 3B are a sketch and a schematic view of a robotic manipulator system set up for water jetting surface preparation according to one embodiment of the invention;
Figure 4A is a flow diagram showing an example blasting operation for surface preparation according to an embodiment of the invention;
Figure 4B is a flow diagram showing an example paint operation of a prepared surface according to an embodiment of the invention;
Figure 5A is a flow diagram showing an example process flow of a surface preparation operation according to an embodiment of the invention;
Figure 5B is a flow diagram showing an example process flow of a painting operation according to an embodiment of the invention;
Figure 6 shows a fabric maintenance system 300 set up for surface preparation according to an embodiment of the invention;
Figures 7A and 7B are a sketch and an enlarged schematic view of a manipulator arm of a fabric maintenance system with a sensor system according to the invention;
Figure 8 is a schematic view of a selection of different end effector heads that may be removably attached to the fabric maintenance system of Figure 1A;
Figure 9 is a schematic view of a fabric maintenance system with an end effector head attachment system according to an embodiment of the invention;
Figures 10A and 10B are side and end profile schematic views of a fabric maintenance system mounted on a rail according to the invention;
Figure 11A is a schematic side view of a fabric maintenance system mounted on a rail with a powered mechanism according to the invention;
Figure 11B is a perspective schematic view of a component of the powered mechanism described in Figure 11A; and
Figures 12A and 12B are side and perspective views of a fabric maintenance system mounted on a rail according to the invention.
Detailed description of preferred embodiments
Figures 1A and 1B show a fabric maintenance system 10 for inspection, surface preparation and coating of steel structures according to the invention.
The system 10 comprises a robotic manipulator assembly 11 having a base 12 mounted on endless tracks 14. Each of the endless tracks is mounted on rollers 15. An articulated manipulator arm 16 is pivotally mounted at a first end 16a on the base 12. The manipulator arm 16 has joints 18 which may be individually actuated to provide the arm 16 with at least six degrees of freedom. The manipulator arm 16 carries an end effector head mount 20 which is movably secured at a second end 16b of the manipulator arm.
A variety of end effector heads 22 may be reversibly fixed to the end effector head mount 20 depending on the desired application, including inspection, surface preparation and/or coating operations. A variety of end effector heads 22 may also be provided depending on the geometry of the surface to be treated. This is discussed further in relation to Figure 8 below.
It will be appreciated that the system may be used for a number of different applications including surface preparation, inspection, or coating operations. Surface preparation (e.g. water jetting) may be used for the purpose of preparing an area for non-destructive testing (NDT) and/or to prepare a surface for a coating operation. The NDT work could include ultrasonic (such as phased array) or radiography. The manipulator could also conduct the inspection work by use of an inspection head. Following the surface preparation and/or inspection operations, application of a coating may or may not be required.
The different mountable end effector heads 22 enable the system to conduct inspection, surface preparation and coating operations. The inspection operations include quality control checks of the treated surfaces such as blasted surfaces and painted surfaces. The surface preparation operations include dry ice blasting, grit blasting or water jetting.
The dry ice blasting, grit blasting or water jetting system may either be closed or open loop. In a closed loop system the end effector heads comprise at least a first conduit to deliver or dispense medium (dry ice, grit or water) to the surface to be treated and a second conduit for suction or removal of the waste medium and contaminants from the treated surface.
The closed loop end effector heads may comprise bristles or fibres around their surface-engaging periphery which act as a curtain or screen to assist in the controlled delivery of medium (dry ice, grit or water) from the end effector head to the surface to be treated and the containment and recirculation of the waste medium and contaminants to the second conduit in the end effector head.
The coating operations may include painting and/or spray painting.
As an example, Figure 1 shows the robotic manipulator assembly 11 set up for quality control as an inspection operation; water jetting, dry ice blasting and grit blasting as surface preparation operations; and spray painting as a coating operation.
The system 10 is connected to a dry ice reservoir 32, a grit reservoir 34, a water reservoir 36 and to a paint reservoir 38, via conduits 32a, 34a, 36a and 38a, respectively. Although these are shown in Figure 1 as individual conduits, these may be bundled together and housed into a single conduit called an umbilical.
The dry ice reservoir 32, grit reservoir 34, water reservoir 36 and paint reservoir 38 are connected to a compressor 40 via pressure lines 40a to enable the dry ice, grit, water and paint to be dispensed under pressure.
It will be appreciated that the system may not comprise or use all of a dry ice reservoir, grit reservoir, water reservoir and paint reservoir, but a selection or combination of these depending on the type of fabric maintenance application, work scope, client objectives, type of material to be worked on and the type of environment where the work is to be performed (e.g. inside an oil storage vessel, outdoors in a potentially explosive environment, on a helideck etc.).
It will be appreciated that the conduits 32a, 34a, 36a and 38a, either individually or as an umbilical when the conduits are bundled together, may be spooled on a reel to assist with conduit handling and management. The conduits or umbilical may be paid out when required and spare conduit or umbilical removed when not.
The conduits 32a, 34a, 36a and 38a, either individually or as an umbilical when the conduits are bundled together, may be connected to a plurality of small rollers to provide low friction on the surface when the conduit or umbilical follows the automated device. The umbilical rollers may be powered such that the umbilical can be moved to avoid collisions.
The robotic manipulator assembly 11 has a sensor system 50. The sensor system may be located on the robotic manipulator assembly 11, manipulator arm 16, end effector head mount 20 and/or the end effector head 22. It will be appreciated that components of the sensor system may be located on different parts of the robotic manipulator assembly 11, manipulator arm 16, end effector head mount 20 and the end effector head 22.
In this example the sensor system 50 is located on the end effector head 22. In this example the sensor system 50 comprises cameras 52 and a laser-based system 54. However, it will be appreciated that the camera 52 may be used without the laser-based system where accurate positioning is not required.
The camera and laser-based sensor system enables the robotic manipulator assembly 11 and the connected end effector head 22 to be precisely positioned relative to the surface of the structure or object being worked on.
The fabric maintenance system 10 autonomously creates a path for movement of the robotic manipulator assembly, manipulator arm and/or the end effector head to avoid potential obstacles in the surrounding environment and to undertake an effective treatment of the surface. The system may create a path plan based on Cartesian coordinates to control the overall movement of the robotic manipulator assembly, manipulator arm and/or the end effector head during the treatment operation and a "pick and place" plan of how the blasting sequence should be conducted. This may allow the robotic manipulator assembly to perform surface preparation and painting operations in congested areas safely.
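By way of illustration, a Cartesian head path over a flat rectangular work area could take the form of a simple back-and-forth raster of way-points at a fixed stand-off; the function and parameter names below are assumptions introduced for this sketch, not the disclosed planner:

```python
import numpy as np

def raster_head_path(width_m, height_m, pass_spacing_m, standoff_m, points_per_pass=20):
    """Boustrophedon coverage path over a flat rectangular work area.

    Returns an (N, 3) array of Cartesian way-points (x, y, z) in the work-area frame,
    where z is the constant stand-off of the end effector head from the surface.
    """
    ys = np.arange(0.0, height_m + 1e-9, pass_spacing_m)
    waypoints = []
    for i, y in enumerate(ys):
        xs = np.linspace(0.0, width_m, points_per_pass)
        if i % 2 == 1:            # reverse every other pass to avoid dead travel
            xs = xs[::-1]
        for x in xs:
            waypoints.append((x, y, standoff_m))
    return np.array(waypoints)

# Example: a 2 m x 1 m panel, 100 mm between passes, 300 mm blasting stand-off.
path = raster_head_path(2.0, 1.0, 0.1, 0.3)
```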
A handheld controller 60 may be connected to the robotic manipulator assembly 11 to control the operation and movement of the robotic manipulator assembly 11.
In this example the handheld controller 60 includes a position tracking system 62 which may include one or more sensors, in this example a camera 64, a rangefinder 66 and/or a lidar-based system 68. The position tracking system 62 enables the user to identify and track the position of objects in a 3D environment such that instructions for the robotic manipulator assembly 11 can be created. The tracked movements may be recorded as a series of stored instructions.
Creation of the work scope for the robot assembly may comprise some or all of the following steps:

a. use of an existing 3D model or scan, or generation of a 3D model;
b. the user selecting on the 3D model the area to be worked on;
c. running the path generation algorithm;
d. simulation of the process (such as to verify the path and define the speed, e.g. paint thickness modelling using the paint spray cone);
e. presentation of the planned path to the user for approval;
f. translation of the planned path into the robot coordinate frame based on robot localisation with respect to the work piece;
g. updating of the path during the process to ensure the correct distance and angle to the workpiece is maintained and/or re-planning to loop back over areas (e.g. during blasting) and/or changing of the velocity (e.g. if blasting time needs to be longer than expected); and
h. appending information to the location in the generated 3D model (e.g. visual confirmation of successful blasting).
A. Generation of 3D model

Where a 3D model is not available, it may need to be generated. This can be done either using laser based systems (lidar) or camera based systems (e.g. photogrammetry). A handheld controller (also known as a handheld scanner) may be used for the scanning. The handheld controller includes a position tracking system which may include a camera, rangefinder and/or a lidar based system. The position of the handheld controller will be calculated using a simultaneous localisation and mapping system. The system works by estimating the movement of the handheld controller between successive sensor measurements.

The handheld controller tracks key points (in 2D or 3D) that have a degree of rotational and translation invariance (approximately 20 degrees rotation and 30cm translation). These points are used to estimate the motion of the handheld scanner between sensor readings. This estimate is augmented with inertial measurement unit data of the rotational and linear accelerations to provide an estimate with less uncertainty. If the estimated position of the current sensor measurements is sufficiently different from the last recorded frame, a new frame will be inserted into a map.

The frame will have the six degree of freedom pose and will be linked to the previous recorded frame along with the uncertainty of the measurement. In addition to this a global system of identifying whether a loop has been made will operate. This will aim to find correspondences with frames recorded in other parts of the map which might (based on probability distribution) be part of a loop. When a loop is detected all of the poses in the map are updated to reflect the new information that a loop has been closed. The pose will then be used to fuse point clouds captured from individual positions together.
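A minimal sketch of the keyframe-insertion test described above is given below: a new frame is added to the map only when the estimated pose has moved sufficiently far from the last recorded frame. The thresholds follow the figures quoted in the description (approximately 20 degrees and 30 cm); the data structures and function names are illustrative assumptions, not part of the disclosed system.

```python
import numpy as np

# Assumed pose representation: (R, t) with R a 3x3 rotation matrix, t a 3-vector.
ROT_THRESHOLD_DEG = 20.0
TRANS_THRESHOLD_M = 0.30

def rotation_angle_deg(R_a, R_b):
    """Angle of the relative rotation between two rotation matrices."""
    R_rel = R_a.T @ R_b
    cos_angle = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle))

def should_insert_keyframe(last_pose, current_pose):
    """True when the scanner has rotated or translated past the thresholds."""
    R_last, t_last = last_pose
    R_cur, t_cur = current_pose
    moved = np.linalg.norm(t_cur - t_last) > TRANS_THRESHOLD_M
    rotated = rotation_angle_deg(R_last, R_cur) > ROT_THRESHOLD_DEG
    return moved or rotated

# Example usage: keep a list of keyframes as the scanner moves.
keyframes = [(np.eye(3), np.zeros(3))]
new_pose = (np.eye(3), np.array([0.4, 0.0, 0.0]))   # moved 0.4 m, no rotation
if should_insert_keyframe(keyframes[-1], new_pose):
    keyframes.append(new_pose)  # a full SLAM system would also link this frame
                                # to its predecessor with a measurement uncertainty
```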
B. User area selection

Prior to user selection the 3D points generated from the scan data, or where available from an existing model, are converted into a surface representation. An algorithm is run to identify standard geometries and sharp edges. These are used to segment the model into a series of sub models. A machine learning algorithm can optionally classify the surface to select optimal meshing parameters (such as the algorithm type and the smoothing parameters). The edges have lines or curves fitted to them to improve the quality of the mesh by preserving sharp edges.

The user can select points on the meshed object. The points are selected by generating a line from the user selection tool on top of a virtual surface. Where the line intersects the surface a point is generated and displayed.

A line across the surface of the object will be created connecting two successive points, either as the shortest path across the surface or using a plane with the third degree of freedom selected by the user. The user can select as many points as desired with a minimum of three points. The user can see the model in 3D and manipulate the model during the selection process.
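A small sketch of the point-selection step described above follows: a ray cast from the user's selection tool is intersected with a triangle of the surface mesh (the standard Moller-Trumbore test), and the closest intersection becomes the selected point. The triangle-list mesh representation is an assumption made for illustration.

```python
import numpy as np

def ray_triangle_intersection(origin, direction, v0, v1, v2, eps=1e-9):
    """Return the intersection point of a ray with a triangle, or None."""
    edge1, edge2 = v1 - v0, v2 - v0
    h = np.cross(direction, edge2)
    a = np.dot(edge1, h)
    if abs(a) < eps:                 # ray parallel to the triangle plane
        return None
    f = 1.0 / a
    s = origin - v0
    u = f * np.dot(s, h)
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, edge1)
    v = f * np.dot(direction, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * np.dot(edge2, q)
    return origin + t * direction if t > eps else None

def select_point(origin, direction, triangles):
    """Return the closest mesh intersection along the selection ray."""
    hits = []
    for v0, v1, v2 in triangles:
        p = ray_triangle_intersection(origin, direction, v0, v1, v2)
        if p is not None:
            hits.append(p)
    return min(hits, key=lambda p: np.linalg.norm(p - origin)) if hits else None

# Example: one triangle in the z = 1 plane, selection ray from the origin.
tri = (np.array([0., 0., 1.]), np.array([1., 0., 1.]), np.array([0., 1., 1.]))
print(select_point(np.zeros(3), np.array([0.3, 0.3, 1.0]), [tri]))
```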
C. Path generation algorithm

The selected surface may be segmented into smaller sub surfaces by extracting surfaces separated by edges (using an edge detection algorithm) or by splitting a large surface into two subclusters (and repeating this until the clusters are below a specified size, e.g. the maximum distance between any two points or similar).

A reference edge is either determined by a machine learning algorithm trained on simulated data to minimise total object path length and complexity, or provided by the user. User input is created by providing three degrees of freedom, such as two points on the surface and an angle. The reference path will have a corresponding plane. At regular distances along the path (user or machine learning algorithm selected) points will be generated, with planes generated at each point where the plane is perpendicular to the reference edge plane.

A separation of paths (e.g. for painting) will be generated by an algorithm based on the spray cone size or may be user selected. A distance along each of the new planes across the surface they intersect will be traversed until it is at the path separation. All of these new points will be linked together to generate a new path. The mid points of the new path straight sections will be used to generate new planes (one for each straight section of path), with the normal being the vector connecting the two points making up the straight section of path and the midpoint being on the plane. New paths are generated by traversing the path separation distance across the intersection of the new planes and the surface. The process is repeated until the surface is covered in paths. An optimisation algorithm may be used to reduce path complexity and time (e.g. reference path plane parameters, clustering parameters for separating the area into smaller separate surfaces).
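The sketch below illustrates the path-separation idea in a much-simplified form, assuming a flat rectangular patch rather than the general curved-surface case handled by the plane/surface intersections described above. Deriving the pass spacing from a spray cone half-angle and an overlap factor, and ordering passes back and forth, are assumptions for illustration only.

```python
import numpy as np

def path_separation(distance, cone_half_angle_deg, overlap=0.25):
    """Pass spacing from stand-off distance and spray cone half-angle."""
    spot_radius = distance * np.tan(np.radians(cone_half_angle_deg))
    return 2.0 * spot_radius * (1.0 - overlap)

def raster_paths(width, height, distance, cone_half_angle_deg):
    """Return (start, end) endpoints of parallel passes covering a flat patch."""
    sep = path_separation(distance, cone_half_angle_deg)
    n_passes = int(np.ceil(height / sep)) + 1
    passes = []
    for i in range(n_passes):
        y = min(i * sep, height)
        start, end = np.array([0.0, y]), np.array([width, y])
        if i % 2 == 1:                      # reverse alternate passes
            start, end = end, start
        passes.append((start, end))
    return passes

# Example: 2 m x 1 m patch, 0.3 m stand-off, 15 degree half-angle spray cone.
for start, end in raster_paths(2.0, 1.0, 0.3, 15.0):
    print(start, "->", end)
```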
D. Process simulation

Once the paths across the surface have been generated a simulation is run to ensure that there are no collisions (some path segments may need to be modified). This can be done by allowing a number of parameters to be changed within a range, including distance from the surface and angle of the tool head relative to the surface. An optimisation algorithm can be used to identify new parameters that do not result in a collision, where the loss parameter is based on the number of points on the path that result in collisions (the spacing between waypoints can be modified to ensure convergence). Where paths cannot be found these are excluded from further processing and are marked for the user.

Optionally, a further process simulation may be performed. For painting this may include a deposition model based on the spray tip selected (either by the user or the path optimisation algorithm in C.), the distance from the object and system pressure. For open loop grit blasting this is based on the spray cone, distance from the surface, type of blasting media and system pressure.

The velocity along the path is optimised to ensure that the required media is deposited on the surface. This may be modified while executing the path based on sensor feedback.
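A simple sketch of the velocity optimisation described above is shown below for the painting case: the traverse speed is chosen so the deposited film reaches a target wet thickness, and is then rescaled during execution from sensor feedback. The deposition model (delivered paint landing uniformly across the pass width, with a fixed transfer-efficiency factor) and all numeric values are assumptions for illustration, not the model disclosed in the specification.

```python
def traverse_speed(flow_rate_l_per_min, pass_width_m, target_thickness_um,
                   transfer_efficiency=0.65):
    """Nominal traverse speed (m/s) that yields the target wet film thickness."""
    flow_m3_per_s = flow_rate_l_per_min / 1000.0 / 60.0
    deposited_m3_per_s = flow_m3_per_s * transfer_efficiency
    thickness_m = target_thickness_um * 1e-6
    # volume deposited per metre of travel = thickness * pass width
    return deposited_m3_per_s / (thickness_m * pass_width_m)

def adjust_for_feedback(speed, measured_thickness_um, target_thickness_um):
    """Rescale the speed during execution from a wet film thickness reading
    (a thin reading means the head should move more slowly)."""
    return speed * measured_thickness_um / target_thickness_um

# Example: 0.9 l/min tip, 0.25 m pass width, 120 micron target wet film.
v = traverse_speed(0.9, 0.25, 120.0)
print(round(v, 3), "m/s nominal")
print(round(adjust_for_feedback(v, 95.0, 120.0), 3), "m/s after a thin reading")
```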
E. Presentation to user for approval

The user is presented with the path and associated statistics from the modelling.

F. Translation to robot

When the robot is on site the position that the robot arm needs to be located at to complete the path will be calculated by identifying the transform from the robot frame to the work piece frame. This will be generated using the simultaneous localisation and mapping system. Recommended robot base positions will be provided to the user to ensure all areas can be reached after the robot has worked from all of the recommended base positions.

G. Path adjustment/re-planning

The path may be adjusted or completely changed during the process. This will be based on sensor feedback (e.g. visual data run through a machine learning algorithm to identify blast/coating quality). For surface preparation, poorly performed work can be remedied by looping back over the path (automatically initiated where poor blast/coating quality is detected). The velocity may be changed (e.g. for blasting where there is more rust than anticipated).

H. Appending information

Process information may be appended to the generated path; this could include images and other sensor readings.
An inertial measurement unit 70 on the robot base enables measurements to be used to dynamically adjust the robot trajectory for instability/movement of the base. This becomes an important feature if the robot is mounted on a long reaching structure such as a hydraulic boom.

The position tracking system 62 can use the camera 64 or lidar based system 68, optionally combined with the inertial measurement unit 70, to accurately track and position the robotic manipulator assembly 11. The position tracking may be the same as described for the handheld controller. Optionally more than one position tracking system may be used: one for rough positioning (say within a 10cm sphere - fast and very robust but low accuracy), and the next more accurate position tracking system specific to the workpiece, using a surface fitting of the original scan/model used for path planning to the data being observed from a camera system or lidar generating a point cloud. An initial guess for the position may come from the first positional tracking system. The final position tracking system may use laser data to provide fine alignment to the workpiece to maximise quality and consistency.
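The coarse-to-fine idea described above can be illustrated with a textbook point-to-point ICP refinement: a rough initial pose (e.g. from the first tracking system) seeds an iterative fit of the original scan/model points onto the point cloud currently observed by the camera or lidar. This is a generic ICP sketch offered for illustration, not the specific algorithm used by the system.

```python
import numpy as np

def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst points."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(model_pts, observed_pts, R0, t0, iterations=20):
    """Refine a coarse pose (R0, t0) by iterative closest point."""
    R, t = R0, t0
    for _ in range(iterations):
        moved = model_pts @ R.T + t
        # brute-force nearest neighbours (fine for small illustrative clouds)
        d2 = ((moved[:, None, :] - observed_pts[None, :, :]) ** 2).sum(axis=2)
        matches = observed_pts[d2.argmin(axis=1)]
        dR, dt = best_fit_transform(moved, matches)
        R, t = dR @ R, dR @ t + dt
    return R, t

# Example: the observed cloud is the model rotated by 10 degrees and shifted.
rng = np.random.default_rng(0)
model = rng.uniform(-1, 1, size=(200, 3))
angle = np.radians(10)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
observed = model @ R_true.T + np.array([0.05, 0.02, 0.0])
R_est, t_est = icp(model, observed, np.eye(3), np.zeros(3))
print(np.round(t_est, 3))
```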
13 Information on the environment can be communicated between the robotic manipulator 14 assembly 11 and the handheld controller 60 such that the calculated position by the handheld controller 60 can be used by the robotic manipulator assembly 11.
17 Data gathered by the robotic manipulator assembly 11 and the handheld controller 60 can 18 be attached to the position of physical objects in the working environment such that useful 19 information can be displayed and accessed by the user.
The information can optionally be overlaid in augmented reality for the user and the user can then use the handheld controller 60 to change the position of the selected point, both in distance away and point in the user's vision.

In use, the robotic manipulator assembly 11 is able to be precisely moved and positioned using localisation visual data provided by the camera 52 and/or lidar 54. The visual input data from the camera is processed by a localisation algorithm which ensures the robot avoids collisions with the external environment and personnel.
In order to ensure that the manipulator and head do not collide with the environment, visual images and/or laser range measurements are used to identify points in the environment that have been confirmed to be occupied. These may be added to over time or refreshed frequently. When checking for collisions the path of the manipulator and head is broken up into a series of steps, usually at constant time steps.

The occupied volume of the head system and the manipulator at each time step in the planned path will be checked for collisions. This may be performed during the process simulation operation described in D above. The head and manipulator may be represented as a simpler geometry to improve processing time (e.g. a cuboid or a series of spheres of varying diameters). Path planning will use an optimisation approach where a cost function will be applied to a specific path.
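A minimal sketch of that collision check follows: the manipulator and head are approximated by a set of bounding spheres, and at each time step along the planned path every sphere is tested against the occupied points observed in the environment, with the number of colliding steps usable as a term in the path-planning cost function. The sphere decomposition and clearance value are illustrative assumptions.

```python
import numpy as np

def pose_collides(sphere_centres, sphere_radii, occupied_points, clearance=0.05):
    """True if any bounding sphere (plus clearance) contains an occupied point."""
    for centre, radius in zip(sphere_centres, sphere_radii):
        d = np.linalg.norm(occupied_points - centre, axis=1)
        if np.any(d < radius + clearance):
            return True
    return False

def path_collision_cost(path_poses, occupied_points):
    """Number of time steps along the path that are in collision."""
    cost = 0
    for centres, radii in path_poses:          # one sphere set per time step
        if pose_collides(centres, radii, occupied_points):
            cost += 1
    return cost

# Example: two time steps of a single-sphere "arm" moving past one occupied point.
occupied = np.array([[0.5, 0.0, 0.5]])
step_a = (np.array([[0.0, 0.0, 0.5]]), [0.2])   # clear of the point
step_b = (np.array([[0.45, 0.0, 0.5]]), [0.2])  # sweeps through it
print(path_collision_cost([step_a, step_b], occupied))   # -> 1
```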
8 Once in position the individual work item is identified by a user with the handheld controller 9 60. Using input data from a range of cameras 52 and the laser-based system 54 on the sensor system, the most efficient path to be taken in the surface preparation or coating 11 operation is calculated by a path planning algorithm.
It will be appreciated that the scanning operations and/or planning operations may be performed some time before the work is to be carried out. The scanning operations and/or planning operations may be performed hours, days, weeks, months or even years before the work is to be carried out. The scanning operations and/or planning operations may be performed in the absence or presence of the robot assembly in the work location or environment. This scanning operation and/or planning operation may be performed using available 3D modelling or by generating 3D models using the handheld controller. Alternatively the scanning operations and planning operations may be performed in the presence of the robot just before the work is due to be carried out.
23 The robotic manipulator assembly 11 is configured to conduct inspection, surface 24 preparation or coating operation to a three-dimensional surface of an object along a calculated path.
27 The proposed path is generated virtually and sent to the user for review and approval.
28 Through this method the robot is capable of executing complex paths on a range of 29 geometries including flat areas, pipes and curved surfaces.
31 The path generation process may be optimised during the process to reflect the level of 32 corrosion and capture the benefit of ricochets (optimising velocity along the path).
Referring to Figures 1A and 1B, when a different inspection, surface preparation or coating operation is required the desired operation may be selected on the handheld controller 60. An appropriate end effector head 22 for that specific application is mounted on the end effector head mount 20 of the manipulator arm 16. The end effector head 22 may be mounted manually or as part of the plan for a fabric maintenance operation.

The base may optionally include outriggers or extendable supports to provide support and stability to the device. Alternatively or additionally an electromagnet 80 may be connected to the endless tracks to optionally anchor and fix the position of the endless tracks on metal structures during an inspection, surface preparation or coating operation.

Although in the above example the base is mounted on endless tracks, additionally or alternatively the height of the base may be vertically adjustable such that it can be raised and lowered. The base may be connected to a series of sections by a geared system such that, when a change of height is required, the gear system climbs up or down the sections.

Alternatively the apparatus may be mounted on a rail system which is further described in Figures 10A to 12B.
18 Optionally a work basket may be installed in the assembly to enable a user to inspect and 19 support the work of the robotic manipulator assembly.
21 Figure 2 shows the different elements of a sensor system 50 of a robotic manipulator 22 assembly according to one embodiment of the invention.
The sensor system may include a 2D lidar/2D laser profiler, 3D lidar, load cell with one up to six degrees of freedom, IR projector for improved imaging (e.g. vertical-cavity surface-emitting laser), active stereo/structured light camera system, laser range finders, blast attachment, spectrometer, camera (mono or stereo), inertial measurement units, ultrasonic wall thickness measurement, paint pinhole/holiday detector with applied DC brushes, wet film thickness sensor (e.g. thermal transient analysis) and ultrasonic coating thickness measurement.
32 Although in this example the sensor system is described as being located or mounted on 33 the end effector head, it will be appreciated that the sensor system may be located or 34 mounted on the robotic manipulator assembly 11, manipulator arm 16 and/or the end effector head mount 20. It will also be appreciated that robotic manipulator assembly 11, 1 manipulator arm 16, end effector head mount and/or the end effector head may contain 2 different elements of the sensor system.
4 Figures 3A and 3B show a fabric maintenance system 100 set up for water jetting surface preparation according to one embodiment of the invention.
The system 100 is similar to the operation of the system 10 described above in relation to Figure 1. However the robotic manipulator assembly 111 has an end effector head 122 with a connected high-pressure water line 182 and a return vacuum line 184. During a water jetting surface preparation operation, water is transported from a water reservoir 136 via a pump 137 to form a high pressure water jet at the end effector head 122. The high pressure water jet is used to remove surface corrosion or contaminants on an object and to fluidise waste material such as sand or silt. The vacuum line 184 connected to a vacuum pump 185 is used to remove the debris, fluid and waste to a container 187. The water pump and vacuum pumps may be a rotating or a reciprocating pumping system.

A handheld controller 160 connected to the robotic manipulator assembly 111 controls the operation and movement of the robotic manipulator assembly 111.

It will be appreciated that a high-pressure steam system may be used instead of the high-pressure water system described in relation to Figures 3A and 3B above. It may be preferable to use steam in certain surface preparation operations to remove contamination such as paraffin wax precipitated from crude oil or other lubricating greases.
Figure 4A is a flow diagram 200 showing an example blasting surface preparation operation according to an embodiment of the invention.

The system has a compressor connected to blast equipment to enable grit and water to be dispensed under pressure via the umbilical/hose system which is connected to the end effector head.

An onboard computer controls the robotic manipulator assembly platform which controls the movement of the manipulator arm (robot manipulator), end effector system (including the end effector head) and electromagnet (magnetic/mechanical anchor) to optionally anchor and fix the position of the robotic manipulator assembly during the blasting operation.

A sensor system controls the movement of the end effector system. The sensor system comprises a camera system and a laser-based system to enable the robotic manipulator assembly and the connected end effector head to be precisely positioned relative to the surface of the structure or object being worked on. Optionally a spectrometer may be provided to assess and inspect the surface of the object being treated.

Optionally a load cell is connected to the end effector head to measure and confirm that the end effector head is being held with the correct pressure against the surface where surface contact is required. In the event of an over pressurisation, an ATEX over pressure system is activated.
12 The sensor system may assess or inspect the quality of the surface preparation after or 13 during the work is performed. The assessment or inspection of the quality of surface 14 preparation may be performed in real time as the surface is treated. The system may repeat or amend its plan for a fabric maintenance operation based on the results of the 16 assessment or inspection. For example, the blasting path may have been estimated using 17 an initial assessment of the level of rust (e.g. by an algorithm using visual data). If the rust 18 is worse than anticipated the path speed will be modified and sections may need to be 19 repeated.
Figure 4B is a flow diagram showing an example paint operation on a prepared surface according to an embodiment of the invention.

The flow diagram 220 shown in Figure 4B is similar to the flow diagram shown in Figure 4A described above. However the flow diagram shown in Figure 4B relates to a paint operation carried out by the robotic manipulator assembly. The compressor is therefore connected to a paint system to enable paint to be dispensed under pressure via the umbilical/hose system which is connected to the end effector head.

It will be appreciated that a compressor may not be required for some paint/blasting operations. For example, some tools do not require pneumatic or hydraulic systems, such as air-less paint tools and electric bristle blasting tools.

A sensor system controls the movement of the end effector system during the paint operation or blast operation. The sensor system provides data that is processed and allows either adjustment of the existing path or the generation of a new path for the end effector system during the paint or blast operation. The sensor system comprises a camera system and a laser-based system to enable the robotic manipulator assembly and the connected end effector head to be precisely positioned relative to the surface of the structure or object being worked on. A spectrometer is optionally provided to assess and inspect the surface of the object being treated and optionally parameters of the paint layer applied.
9 The sensor system may assess or inspect the quality of the paint coating after or during the work is performed. The assessment or inspection of the quality of paint may be 11 performed in real time as the surface is painted. The system may repeat or amend its plan 12 for a fabric maintenance operation based on the results of the assessment or inspection.
13 Inspection may include the processing of visual data, roughness from a probe or laser or 14 contamination level based on visual, ultrasonic or spectrometer data.
16 Figure 5A shows a flow diagram 250 showing an example process flow of a surface 17 preparation operation according to an embodiment of the invention. Figure 5A shows a 18 first stage where the work area is visually inspected by the user and a work order 19 generated and approved. A second stage is the setup of the robotic manipulator assembly in the work location and 3D scanning of the surrounding area to assess for obstacles.
22 A third stage is the user selects the surfaces of the work object to be treated. The robot 23 autonomously prepares a path for movement of the robotic manipulator assembly, 24 manipulator arm and/or the end effector head to avoid potential obstacles in the surrounding environment and to undertake an effective treatment of the surface. The path 26 is displayed to the user and approval requested.
A fourth stage is a technician attaching an appropriate end effector head, selected for the particular surface preparation application and the geometry of the surface to be treated. The robot autonomously positions itself relative to the surface to be treated and begins surface preparation.
33 During the surface preparation operation the sensor system including load cell, laser, 34 range finder monitors and instructs the adjustment of the position and rotation of the robotic manipulator assembly, manipulator arm and/or the end effector head. Cameras 1 and laser in the sensor system inspect the quality of the surface preparation after or during 2 the work is performed. Once the operation is complete the user is notified and a report is 3 generated.
Figure 5B shows a flow diagram 270 showing an example process flow of a painting operation according to an embodiment of the invention. The painting operation described in Figure 5B may be performed shortly after the surface preparation operation described in Figure 5A. Alternatively the surface to be painted may not require a surface preparation operation.

Figure 5B shows a first stage where the surface is assessed for contaminants and, if required, a pressure wash treatment to remove contaminants is planned. The robot autonomously prepares a path for movement of the robotic manipulator assembly, manipulator arm and/or the end effector head to avoid potential obstacles in the surrounding environment and to undertake an effective pressure wash treatment of the surface. The path is displayed to the user and approval requested.

A second stage is the user selecting the surfaces of the work object to be painted. The robot autonomously prepares a path for movement of the robotic manipulator assembly, manipulator arm and/or the end effector head to avoid potential obstacles in the surrounding environment and to undertake an effective paint treatment of the surface. The path is displayed to the user and approval requested.

A fourth stage is a technician attaching an appropriate end effector head, selected for the particular paint application and the geometry of the surface to be painted. The robot autonomously positions itself relative to the surface to be treated and begins painting.

During the painting operation the sensor system including load cell, laser and range finder monitors and instructs the adjustment of the position and rotation of the robotic manipulator assembly, manipulator arm and/or the end effector head. Cameras and laser in the sensor system inspect the quality of the paint layer after or during the work is performed. If the sensor system detects areas where the paint quality is poor, the user is notified of these areas and approval for a repeated paint operation is requested.

Once the operation is complete the user is notified and a quality report is generated.
1 Figure 6 shows a fabric maintenance system 300 set up for surface preparation according 2 to an embodiment of the invention.
4 The system 300 is similar to system 10 described in Figure 1 and will be understood from the description of Figure 1. The system 300 comprises a robotic manipulator assembly 311 6 having a base 312 mounted on endless tracks 314. Each of the endless tracks are 7 mounted on rollers 315. An articulated manipulator arm 316 is pivotally mounted at first 8 end 316a on the base 312. The manipulator arm 316 has joints 318 which may be 9 individually actuated to provide the arm 316 with at least six degrees of freedom. The manipulator arm 316 carries an end effector head mount 320 which is movably secured at 11 a second end of the manipulator arm 316b.
13 A variety of end effector heads may be reversibly fixed to the end effector head mount 320 14 depending on the desired inspection, surface preparation or coating operation. The end effector head may comprise a plurality of sensors including cameras, lasers, inductive 16 sensors and/or ultrasonic sensors to ensure correct mapping of the surface to be treated.
17 Only one end effector head is shown in Figure 6.
Using the sensor system the robotic manipulator assembly 311 autonomously prepares a path for movement of the robotic manipulator assembly, manipulator arm and/or the end effector head to avoid potential obstacles in the surrounding environment and to undertake an effective treatment of the surface.
24 The orientation of the manipulator arm and application head are maintained at all times.
Where three or more proximity sensors are used a rotation and translation from the current 26 position to the optimal position can be calculated and implemented on all of the servos 27 with a control loop to enable a timely adjustment.
29 To identify the precise location of the robotic manipulator assembly, manipulator arm and/or the end effector head a laser sensor and/or camera is used. Alternatively or 31 additionally structured light or two cameras in stereo can be used.
A projection of small laser dots using a VCSEL or similar can provide texture to the stereo camera such that an accurate surface geometry can be assessed.

An important part of this system is the continuous visual, spectrometry and/or laser surface profiling by the sensor system. The visual inspection involves capturing an image and then processing it such that each pixel is compared against a reference grading system. This enables the surface to be graded such that it is confirmed to meet the surface preparation specification or paint specification.

The system may comprise a strong external light (most likely one or more LEDs of more than 5000 lm or a VCSEL laser illumination in the IR spectrum) to provide consistent lighting for the assessment. The spectrometer will identify salt and other surface contamination and based on the algorithm will identify whether more blasting/washing is required.

A 2D laser profiler (most likely using a light-plane-intersecting method that triangulates the reflected light) will provide the distance to points on a 2D line projected onto the targeted surface such that the surface roughness can be estimated.

The algorithm uses the sensors to identify whether re-blasting is required. If it is, the area may be re-blasted and/or washed and then retested until satisfactory results are obtained, to determine whether there is any remaining corrosion, rust or contaminants.
21 The end effector can optionally be used as a handheld unit for assessment/ measurement 22 in areas that are difficult to access for the robot. The handheld unit or device may be used 23 by a user for logging data.
The sensor system may comprise one or more cameras which collect visual data on the 26 rust level of steel surfaces where this data is processed by an algorithm to determine the 27 grade of rust against industry standards. The sensor system may comprise a 2D laser 28 profiler to assess the level of surface preparation of steel surfaces where data collected by 29 the laser profiler is processed by an algorithm which is used to identify any areas of the surface which have not been blasted to a pre-set standard. The standard is defined by the 31 user.
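The sketch below illustrates how a line of heights from the 2D laser profiler could be reduced to a roughness estimate and checked against a user-set acceptance band. The Ra and Rz formulas are the standard surface-texture definitions; the acceptance thresholds and pass/fail logic are assumptions for illustration, not values from the specification or any particular industry standard.

```python
import numpy as np

def roughness(profile_heights_um):
    """Return (Ra, Rz) for one measured profile line, heights in microns."""
    z = np.asarray(profile_heights_um, dtype=float)
    z = z - z.mean()                          # remove the mean line
    ra = np.mean(np.abs(z))                   # arithmetic mean deviation
    rz = z.max() - z.min()                    # peak-to-valley height
    return ra, rz

def blast_profile_ok(profile_heights_um, ra_min_um=8.0, ra_max_um=20.0):
    """Check a blast profile against a user-set acceptance band for Ra."""
    ra, _ = roughness(profile_heights_um)
    return ra_min_um <= ra <= ra_max_um

# Example: a synthetic blasted profile passes, a nearly flat profile fails.
x = np.linspace(0, 25, 500)
blasted = 19.0 * np.sin(2 * np.pi * x) + np.random.default_rng(1).normal(0, 2, x.size)
flat = np.random.default_rng(2).normal(0, 0.5, x.size)
print(blast_profile_ok(blasted), blast_profile_ok(flat))   # True False
```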
In order to ensure that the manipulator and head do not collide with the environment, visual images and/or laser range measurements are used to identify points in the environment that have been confirmed to be occupied. These may be added to over time or refreshed frequently. When checking for collisions the path of the manipulator and head is broken up into a series of steps, usually at constant time steps.

The occupied volume of the head system and the manipulator at each time step in the planned path will be checked for collisions. The head and manipulator may be represented as a simpler geometry to improve processing time (e.g. a cuboid or a series of spheres of varying diameters). Path planning will use an optimisation approach where a cost function will be applied to a specific path.

The planned path once created may be displayed to the user for approval. The displayed path may consist of a virtual representation of the environment to be worked in, including the area to be treated for surface preparation and/or painted. It may also display how the robotic manipulator assembly, manipulator arm and/or the end effector head will move, as well as the base of the robotic manipulator assembly (if it is mobile). Key statistics may be calculated from the simulation including surface area blasted and/or painted per unit time. The number of robotic manipulator assembly/base station moves, expected volume of paint, grit, dry ice, along with areas that the manipulator will not be able to cover due to obstacles etc., will be flagged to the user.
As an example, Figure 6 shows the robotic manipulator assembly 311 set up for closed-loop grit blasting as a surface preparation operation. The end effector head is a closed loop grit blasting end effector head 322 connected to a grit reservoir (not shown) via a grit supply conduit 332a and a grit return conduit 334a. The closed loop end effector head has bristles or fibres 321 around its surface engaging periphery which act as a curtain or screen to assist in the controlled delivery of grit and water from the end effector head to the surface 325 to be treated and the containment and recirculation of the waste medium and contaminants to the return conduit 334a in the end effector head.

The fabric maintenance system autonomously prepares a path for movement of the robotic manipulator assembly 311, manipulator arm 316 and/or the end effector head 322 to avoid potential obstacles in the surrounding environment and to undertake an effective treatment of the surface 325. The system may create a Cartesian path plan to control the overall movement of the robotic manipulator assembly 311, manipulator arm 316 and/or the end effector head 322 during the treatment operation and a "pick and place" plan of how the blasting sequence 329 should be conducted.
Figures 7A and 7B show an enlarged view of a manipulator arm 416 of the robotic manipulator assembly 411. An end effector head 422 is reversibly mounted on the end effector head mount 420 connected to the manipulator arm 416. A sensor system 450 is located on the end effector head 422. In this example the sensor system has a camera 452 and a laser system 454 to ensure correct orientation of the robotic manipulator assembly 411, manipulator arm 416 and end effector head 422 relative to the work location and/or work object.

The camera 452 generates a first data set and the laser system 454 generates a second data set. The first and second data sets are processed to locate the manipulator arm 416 and end effector head 422 to a high resolution or accuracy.

In this example the sensor system 450 includes a load cell 460 on the end effector head 422 to measure and confirm that the end effector head 422 is being held with the required pressure against the surface where surface contact is required. Pressure data is sent from the load cells to an onboard computer via electric or fibre optic cables.

By providing a load cell, real time measurement of the load on the head can be monitored, which allows sufficient pressure to be placed on the equipment to ensure the surface is treated but not excess pressure which would damage the equipment or prevent it from moving across the surface.
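A simple sketch of how the load cell feedback could be used in a control loop to hold the head against the surface with a target contact force is given below. The controller structure, gains, force set point and actuator interface are illustrative assumptions, not values disclosed in the specification.

```python
class ContactForceController:
    """PID loop on measured contact force; output is a stand-off adjustment."""

    def __init__(self, target_force_n, kp=0.0004, ki=0.0001, kd=0.00002):
        self.target = target_force_n
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, measured_force_n, dt):
        """Return a stand-off adjustment in metres (negative = press harder)."""
        error = self.target - measured_force_n
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        # Too little force -> move the head towards the surface (negative output).
        return -(self.kp * error + self.ki * self.integral + self.kd * derivative)

# Example: head pressed too lightly (20 N measured, 35 N wanted) at 100 Hz.
controller = ContactForceController(target_force_n=35.0)
adjustment_m = controller.update(measured_force_n=20.0, dt=0.01)
print(round(adjustment_m * 1000, 2), "mm stand-off change (negative = press harder)")
```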
It will be appreciated that other sensor types may be included in the sensor system including cameras, lasers, a range sensor, spectrometers, wet film thickness sensors, load cells, inertial measurement units, ultrasound sensors, infra-red sensors, infra-red projectors, proximity sensors, inductive sensors, or a lidar sensor.
Figure 8 shows a selection of different end effector heads that may be removably attached to the end effector head mount 520 connected to the manipulator arm 516. The end effector head 522 type may be selected depending on the type of operation required and the shape or profile of the surface to be treated.
1 As an example end effector heads 540a, 540b and 540c are configured for open loop grit 2 blasting of three different surface shapes. The end effector heads 540a, 540b and 540c 3 have a single conduit 541 to supply grit to the surface to be treated.
End effector heads 542a, 542b and 542c are configured for closed loop grit blasting of 6 three different surface shapes. The end effector heads 542a, 542b and 542c have a 7 supply conduit 543 to supply grit to the surface to be treated and a return line 545 for 8 waste grit and rust etc. End effector heads 544a, 544b and 544c are configured for a different type of surface 11 preparation such as open loop water jetting blasting of three different surface shapes.
12 End effector heads 546a, 546b and 546c are configured for a painting structure or 13 surfaces having three different surface shapes.
The end effector heads may be selected to match the required surface treatment (surface preparation or coating), whether it is an open or closed loop operation, and the shape or profile of the surface to be treated. As an example, if a small diameter pipe is to be closed loop grit blasted then end effector head 542a would be selected. If, however, a flat surface was to be treated with an open loop grit blast operation then end effector head 540c would be selected.

Selecting an appropriate or corresponding end effector head shape for the surface to be treated enables close contact between the end effector head and the surface. This may allow improved treatment and may allow the sensors in the sensor system to take more accurate measurements. The end effector heads may have a different surface preparation or painting function. Each end effector head may have a different shape or profile or structure engaging surface.
Figure 9 shows a robotic manipulator assembly 611 with an end effector head 622 detached from the manipulator arm 616 of the robotic manipulator assembly 611. The end effector head 622 is reversibly mounted on the end effector head mount 620 connected to the manipulator arm 616.

A head attachment system is located at the end of the end effector head mount 620. The head attachment system comprises a rough alignment guidance system 625 which can correct for angular or translation offset when connecting a new or different end effector head. The head attachment system allows quick connection and disconnection of end effector heads from the robotic manipulator assembly 611.

In this example the rough alignment guidance system 625 comprises three or more tapered rods 627 which allow for rough then successively finer alignment of the end effector head mount 620 with the end effector head 622.

A load cell 640 in the end effector head mount 620 can measure the force exerted by the tapered alignment system 625 which can be used with a control loop (e.g. PID loop) to correct for makeup path errors.

A locking mechanism 630 allows for the end effector head 622 to be locked in position. This could consist of a ball bearing locking system where a cylinder is extended when air pressure is applied such that ball bearings are forced down a tapered surface, extending out in the process. They can then lock into a groove in the female part of the mating mechanism. This mechanism could also be electrically actuated. Additionally or alternatively a collet type connector, actuatable pins, balls or a simple j-slot may be used.

Pneumatic feedthrough conduits may be included which consist of a polymer seat and a cone tapered profile. Similarly, electrical connections can be made up through a male pin and female receptacle. A seal around the male pin can be included to ensure debris is excluded. Alternatively, cylindrical sprung electrical pins can be used which form a press fit when the male and female parts of the connector are pulled together.

Paint, pneumatic and vacuum conduits 632 are sealed with a polymer ring and a male part consisting of either a tapered polymer or metal part. The polymer ring is soft enough to allow for a small amount of float to allow for misalignment.
29 To ensure electrical safety in explosive environments, relays or similar switches can be used to isolate electrical pins during the period they are exposed to the environment.
31 Electrical pins may be flushed with an insulating fluid or gas to ensure that they do not 32 cause a spark potential in explosive environments.
34 The mechanical connection for the end effector head will include connectors to one or more conduits that are mounted on the robotic manipulator assembly. These may include 1 a high pressure (expected to be up to 12 bar though possibly higher pressure) conduit for 2 carrying one or more of grit, dry ice and sponge blasting media.
4 In a closed loop system a second conduit such as a vacuum line may be included which when grit or sponge media is used enables said media to be recovered minimising any 6 clean up required.
8 A paint line will enable paint to be applied which may also be accompanied by a flushing 9 line enabling the lines to be cleaned automatically. A water line may be included for washing surfaces that do not meet the cleanliness requirements.
Where closed loop blasting is being used a seal against the surface being blasted needs to be created to avoid grit escaping. This will be formed by a replaceable attachment that has a number of fibres extending from the head (bristles like on a brush). These fibres can be bent such that a seal is preserved even if the head is moved. A number of attachments will enable different surfaces to be blasted without media escaping; however the correct attachment needs to be identified for the area to be blasted and an area may require multiple heads to be used.

A sensor (camera or laser point cloud) identifies the local radius of curvature and then matches this to the required head attachment. Areas are segmented based on radius of curvature. The path is optimised to minimise time by grouping similar areas together.
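A small sketch of that attachment-selection step follows: the local radius of curvature estimated from the sensor point cloud is matched to the closed-loop head attachment able to seal against that geometry, and areas needing the same attachment are grouped so head changes are minimised. The attachment names and radius ranges are assumptions for illustration only.

```python
ATTACHMENTS = [
    # (name, minimum radius in metres, maximum radius in metres)
    ("small-pipe head", 0.02, 0.10),
    ("large-pipe head", 0.10, 0.50),
    ("flat-surface head", 0.50, float("inf")),
]

def select_attachment(local_radius_m):
    """Return the attachment whose sealing range covers the local curvature."""
    for name, r_min, r_max in ATTACHMENTS:
        if r_min <= local_radius_m < r_max:
            return name
    return None   # geometry outside the range any attachment can seal against

def segment_by_attachment(area_radii):
    """Group surface areas by the attachment they need, so similar areas can be
    blasted together and head changes are minimised."""
    groups = {}
    for area_id, radius in area_radii.items():
        groups.setdefault(select_attachment(radius), []).append(area_id)
    return groups

# Example: three areas - a 50 mm pipe, a 300 mm vessel and a flat deck plate.
print(segment_by_attachment({"pipe_A": 0.025, "vessel_B": 0.15, "deck_C": 5.0}))
```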
Figures 10A and 10B show perspective and side views of a fabric maintenance system 700 according to an embodiment of the invention. The fabric maintenance system 700 and robotic manipulator assembly 711 are similar to the system 10 and robotic manipulator assembly 11 described in Figure 1 and its operation will be understood from the description of Figures 1 to 9 above. However, the robot manipulator assembly 711 is mounted on a rail rather than a base with endless tracks.

In this example the rail system 720 comprises a rail 722 which is secured to a wall 724, using known fixing means such as bolts into stone, brick or concrete, or continuous or bolted fixings on to attachment points (e.g. tapped holes in a steel structure), or welds onto a metal structure (e.g. the rail attachment points welded on to a metal structure such as the side of a standard shipping container). It will be appreciated that the rail 722 could alternatively be mounted to the floor or ceiling.
4 The rail system 720 comprises wheels 726 connected to a central member 728. The wheels are configured to move along an internal profile of the rail 722 such as a channel or 6 track 730 of the rail 722. The rail 722 has a lip 732 configured to prevent the wheels 726 7 and central member 728 from exiting the track 730. A guide member 734 is aligned with a 8 recess 736 in the central member 728 to maintain the orientation and alignment of central 9 member during travel along the rail 722.
11 The wheels are made of a suitable material to resist bending, tension and compression 12 loads. Multiple wheels at different angles could be used. Alternatively or additionally 13 multiple rails may be used to provide additional degrees of freedom.
A support platform 740 is connected to central member 728. The robot manipulator 16 assembly 711 is mounted on or connected to the support platform. A soft outer cover may 17 be included such that dust and other debris is excluded from the assembly.
Figure 11A shows an alternative rail arrangement for a fabric maintenance system 750 which is similar to the system 700 described in Figure 10A and will be understood from the description of Figure 10A above. In this example the wheels 726 are configured to be driven along the channel or track 730 of the rail 722 by a lead screw shown in Figure 11B. A nut 752 is connected to the support platform 740 and the lead screw 754 is connected to a motor 756. As the motor turns the screw, the nut and connected support platform travel along the screw.

It will be appreciated that other methods of driving the wheels along the rail may include a roller screw, powered wheels or a pneumatic or hydraulic cylinder.

In the examples described in Figures 10A to 11B, wheels are configured to run along a track or channel within a rail. It will be appreciated that the rail system may be configured for wheels, tracks or bearings to travel over an outer surface of a rail.

As an example, Figures 12A and 12B show an alternative rail arrangement for a fabric maintenance system 800 which is similar to the system 700 described in Figure 10A and will be understood from the description of Figure 10A above. However, in this example, instead of using wheels with a track in the rail, the rail system 820 comprises a rail 822 and a robot assembly support frame 839. The robot assembly support frame comprises a recirculating linear bearing 826 and a support platform 840. The robot manipulator assembly 811 is mounted on or connected to the support platform. The recirculating linear bearing 826 is configured to engage an outer surface of the rail 822 to move the robot assembly support frame along the length of the rail 822. The recirculating linear bearing could be powered or unpowered.
The invention provides an apparatus for industrial fabric maintenance of a work object in a work location. The apparatus comprises a robotic manipulator assembly, a functional module at an operative end of the robotic manipulator assembly, a sensor system and a processing module. The sensor system is operable to collect data relating to the work location and/or work object and provide a data set to the processing module, and the processing module is operable to generate a plan for a fabric maintenance operation on a work area comprising at least a part of the work object. The plan comprises at least a head movement path plan for the functional module in relation to the work object.

The invention provides a sensor system which allows industrial fabric maintenance of a work object in a work location in a safe and effective manner. The invention may be used in a variety of locations such as an industrial site or paint shop. It may also be used in the inspection, surface preparation and coating of new equipment, equipment being refurbished and/or maintained.

By providing an apparatus comprising a robotic manipulator assembly, a functional module at an operative end of the robotic manipulator assembly, a sensor system and a processing module, data relating to the work location and/or work object may be collected to enable a plan to be generated for a fabric maintenance operation. This may allow industrial fabric maintenance of a work object to be conducted reliably, remotely and/or autonomously.
32 The foregoing description of the invention has been presented for the purposes of 33 illustration and description and is not intended to be exhaustive or to limit the invention to 34 the precise form disclosed. The described embodiments were chosen and described in order to best explain the principles of the invention and its practical application to thereby enable others skilled in the art to best utilise the invention in various embodiments and 2 with various modifications as are suited to the particular use contemplated. Therefore, 3 further modifications or improvements may be incorporated without departing from the 4 scope of the invention herein intended.
Claims (25)
- Claims 1. An apparatus for industrial fabric maintenance of a work object in a work location, the apparatus comprising: a robotic manipulator assembly; a functional module at an operative end of the robotic manipulator assembly; a sensor system; and one or more processing module; wherein the sensor system is operable to collect data relating to the work location and/or work object and provide a data set to the processing module, and the processing module is operable to generate a plan for a fabric maintenance operation on a work area comprising at least a part of the work object; and wherein the plan comprises at least a head movement path plan for the functional module in relation to the work object.
- 2. The apparatus according to claim 1 wherein the processing module is operable to generate a plan comprising a robotic movement sequence plan for the robotic manipulator assembly.
- 3. The apparatus according to claim 1 or claim 2 wherein the processing module is operable to generate a plan comprising a robot translation plan for the apparatus.
- 4. The apparatus according to any preceding claim wherein the processing module comprises a planning algorithm configured to optimise the plan in relation to one or more characteristics selected from the group consisting of duration of operation; time of operation; duration of translation of apparatus; distance of apparatus from items or objects in the work location; user interventions required during the operation and/or user priorities.
- 5. The apparatus according to any preceding claim wherein the functional module is selected from the group consisting of an inspection system, a surface treatment system, or a coating system.
- 6. The apparatus according to claim 5 wherein the surface treatment system is selected from the group consisting of a water jetting system, a dry ice blasting system, and an abrasive blasting system.
- 7. The apparatus according to claim 5 or claim 6 wherein the surface treatment system is closed loop or open loop.
- 8. The apparatus according to claim 5 wherein the coating system is a paint system.
- 9. The apparatus according to any preceding claim wherein the sensor system comprises at least one sensor.
- 10. The apparatus according to claim 9 wherein the at least one sensor is selected from the group consisting of camera, laser, a range sensor, ultrasonic sensor, spectrometer, wet film thickness sensor, load cell, inertial measurement unit, infrared sensor, proximity sensor, inductive sensor or a lidar sensor.
- 11. The apparatus according to any preceding claim wherein the sensor system is located on or in the robotic manipulator assembly, a functional module or a hand held unit.
- 12. The apparatus according to any preceding claim wherein the apparatus is configured to undertake fabric maintenance tasks remotely and/or autonomously.
- 13. The apparatus according to any preceding claim wherein the sensor system is configured to enable real time feedback to assess the operation of the functional module and/or the quality of a fabric maintenance operation.
- 14. The apparatus according to any preceding claim wherein the sensor system is configured to avoid obstacles and collisions between the robotic manipulator assembly and the surrounding environment and personnel.
- 15. A method for a fabric maintenance operation of a work object in a work location, the method comprising: providing a fabric maintenance apparatus comprising: a robotic manipulator assembly; a functional module at an operative end of the robotic manipulator assembly; a sensor system; and one or more processing module; collecting data relating to the work location and/or work object and providing a data set to the processing module; generating a plan for a fabric maintenance operation on a work area comprising at least a part of the work object; and carrying out the fabric maintenance operation on the work area in accordance with the plan.
- 16. The method according to claim 15 comprising carrying out the fabric maintenance operation of a work object in accordance with the plan in response to an approval signal input by an operator.
- 17. The method according to claim 15 or claim 16 comprising carrying out the fabric maintenance operation on the work area, collecting data relating to the performance of the operation over the work area, and identifying parts of the work area that require a further operation.
- 18. A method for planning a fabric maintenance operation of a work object in a work location, the method comprising: providing a sensor system and a processing module; collecting data relating to the work location and/or work object and providing a data set to the processing module; and generating a plan for a fabric maintenance operation on a work area comprising at least a part of the work object.
- 19. The method according to claim 18 comprising presenting the plan to an operator for approval of the plan.
- 20. The method according to claim 18 or claim 19 comprising providing a fabric maintenance apparatus comprising a robotic manipulator assembly.
- 21. The method according to any one of claims 18 to 20 comprising using an existing 3D model or generating a 3D model of the work location and/or work object.
- 22. The method according to claim 21 comprising selecting on the 3D model at least a part of the work area to be worked on.
- 23. The method according to any one of claims 18 to 22 comprising using an algorithm to generate the path for the robotic manipulator assembly or a functional module at an operative end of the robotic manipulator assembly.
- 24. The method according to any one of claims 18 to 23 comprising performing a process simulation to check for possible collisions or to assess the efficiency of the plan.
- 25. The method according to any one of claims 20 to 24 comprising collecting data from the sensor system to accurately position the robotic manipulator assembly and/or a functional module.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GBGB1911457.8A GB201911457D0 (en) | 2019-08-09 | 2019-08-09 | Fabric maintenance system and method of use |
GBGB1911466.9A GB201911466D0 (en) | 2019-08-09 | 2019-08-09 | Fabric maintenance system |
Publications (2)
Publication Number | Publication Date |
---|---|
GB202012374D0 GB202012374D0 (en) | 2020-09-23 |
GB2589418A true GB2589418A (en) | 2021-06-02 |
Family
ID=72520101
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB2012374.1A Withdrawn GB2589418A (en) | 2019-08-09 | 2020-08-10 | Fabric maintenance system and method of use |
Country Status (2)
Country | Link |
---|---|
GB (1) | GB2589418A (en) |
WO (1) | WO2021028672A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113450379B (en) * | 2021-07-02 | 2022-05-27 | 湖南国天电子科技有限公司 | Method and device for extracting and analyzing profile line of section of special-shaped workpiece |
CN113663835B (en) * | 2021-09-14 | 2022-04-22 | 兴三星云科技有限公司 | Spraying equipment for door and window machining and forming |
TWI795065B (en) * | 2021-11-10 | 2023-03-01 | 逢甲大學 | Masonry automation system and operation method thereof |
CN117455860B (en) * | 2023-10-26 | 2024-04-09 | 宁波市宇星水表有限公司 | Water meter delivery data monitoring management system |
CN118834368B (en) * | 2024-09-19 | 2024-12-24 | 东北电力大学 | Preparation method and system of cable insulating layer water branch shape nano-tracing detection liquid |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010091086A1 (en) * | 2009-02-03 | 2010-08-12 | Fanuc Robotics America, Inc. | Method of controlling a robotic tool |
US20140114459A1 (en) * | 2012-10-19 | 2014-04-24 | Kabushiki Kaisha Yaskawa Denki | Robot system and processed product producing method |
WO2014145471A1 (en) * | 2013-03-15 | 2014-09-18 | Carnegie Mellon University | A supervised autonomous robotic system for complex surface inspection and processing |
WO2016148743A1 (en) * | 2015-03-18 | 2016-09-22 | Irobot Corporation | Localization and mapping using physical features |
CN107363840A (en) * | 2017-07-25 | 2017-11-21 | 广东加德伟自动化有限公司 | One kind is used for body section spray painting intelligent robot integrated system |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015081013A1 (en) * | 2013-11-26 | 2015-06-04 | Elwha Llc | Structural assessment, maintenance, and repair apparatuses and methods |
US10814480B2 (en) * | 2017-06-14 | 2020-10-27 | The Boeing Company | Stabilization of tool-carrying end of extended-reach arm of automated apparatus |
-
2020
- 2020-08-10 GB GB2012374.1A patent/GB2589418A/en not_active Withdrawn
- 2020-08-10 WO PCT/GB2020/051903 patent/WO2021028672A1/en active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010091086A1 (en) * | 2009-02-03 | 2010-08-12 | Fanuc Robotics America, Inc. | Method of controlling a robotic tool |
US20140114459A1 (en) * | 2012-10-19 | 2014-04-24 | Kabushiki Kaisha Yaskawa Denki | Robot system and processed product producing method |
WO2014145471A1 (en) * | 2013-03-15 | 2014-09-18 | Carnegie Mellon University | A supervised autonomous robotic system for complex surface inspection and processing |
WO2016148743A1 (en) * | 2015-03-18 | 2016-09-22 | Irobot Corporation | Localization and mapping using physical features |
CN107363840A (en) * | 2017-07-25 | 2017-11-21 | 广东加德伟自动化有限公司 | One kind is used for body section spray painting intelligent robot integrated system |
Also Published As
Publication number | Publication date |
---|---|
GB202012374D0 (en) | 2020-09-23 |
WO2021028672A1 (en) | 2021-02-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
GB2589418A (en) | Fabric maintenance system and method of use | |
WO2021028673A1 (en) | Fabric maintenance sensor system | |
US11110611B2 (en) | Automatic detection and robot-assisted machining of surface defects | |
US11579097B2 (en) | Localization method and system for mobile remote inspection and/or manipulation tools in confined spaces | |
US10864640B1 (en) | Articulating arm programmable tank cleaning nozzle | |
US10449619B2 (en) | System for processing a workpiece | |
US20170305015A1 (en) | Semi-Autonomous Multi-Use Robot System and Method of Operation | |
CA2554992A1 (en) | Cost effective automated preparation and coating methodology for large surfaces | |
US20170120442A1 (en) | System and method for inspection and maintenance of hazardous spaces | |
US20210038045A1 (en) | Exterior Wall Maintenance Apparatus | |
EP2994248A1 (en) | Multifunction robot for maintenance in confined spaces of metal constructions | |
Le et al. | The spir: An autonomous underwater robot for bridge pile cleaning and condition assessment | |
US11731281B2 (en) | Automation in a robotic pipe coating system | |
WO2019094766A1 (en) | Portable structure cleaner/painter | |
Mateos et al. | Automatic in-pipe robot centering from 3D to 2D controller simplification | |
Paul et al. | A robotic system for steel bridge maintenance: Field testing | |
JP6735316B2 (en) | Surface treatment equipment | |
CN117140543A (en) | Intelligent water-cooled wall climbing maintenance operation robot and working method thereof | |
US12172266B1 (en) | System and method for media blasting a workpiece | |
Santos et al. | ROBBE–Robot-aided processing of assemblies during the dismantling of nuclear power plants | |
CN211654333U (en) | Device for cleaning radioactive contaminants | |
Tunawattana et al. | Design of an underwater positioning sensor for crawling ship hull maintenance robots | |
US11745309B1 (en) | Remotely operated abrasive blasting apparatus, system, and method | |
Mende et al. | Environment modeling and path planning for a semi-autonomous manipulator system for decontamination and release measurement | |
TWI801805B (en) | Coating system and its application method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |