WO2021028673A1 - Fabric maintenance sensor system - Google Patents

Fabric maintenance sensor system

Info

Publication number
WO2021028673A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
sensor system
algorithm
data
work
Prior art date
Application number
PCT/GB2020/051904
Other languages
English (en)
Inventor
Ben BAMFORD
Ben Stuart
Edward RAFIPAY
Anthony Wilson
Original Assignee
Quantum Leap Technologies Limited
Priority date
Filing date
Publication date
Priority claimed from GBGB1911458.6A external-priority patent/GB201911458D0/en
Priority claimed from GBGB1911466.9A external-priority patent/GB201911466D0/en
Application filed by Quantum Leap Technologies Limited
Publication of WO2021028673A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B05SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05BSPRAYING APPARATUS; ATOMISING APPARATUS; NOZZLES
    • B05B13/00Machines or plants for applying liquids or other fluent materials to surfaces of objects or other work by spraying, not covered by groups B05B1/00 - B05B11/00
    • B05B13/005Machines or plants for applying liquids or other fluent materials to surfaces of objects or other work by spraying, not covered by groups B05B1/00 - B05B11/00 mounted on vehicles or designed to apply a liquid on a very large surface, e.g. on the road, on the surface of large containers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B05SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05BSPRAYING APPARATUS; ATOMISING APPARATUS; NOZZLES
    • B05B13/00Machines or plants for applying liquids or other fluent materials to surfaces of objects or other work by spraying, not covered by groups B05B1/00 - B05B11/00
    • B05B13/02Means for supporting work; Arrangement or mounting of spray heads; Adaptation or arrangement of means for feeding work
    • B05B13/04Means for supporting work; Arrangement or mounting of spray heads; Adaptation or arrangement of means for feeding work the spray heads being moved during spraying operation
    • B05B13/0431Means for supporting work; Arrangement or mounting of spray heads; Adaptation or arrangement of means for feeding work the spray heads being moved during spraying operation with spray heads moved by robots or articulated arms, e.g. for applying liquid or other fluent material to 3D-surfaces
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B05SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05BSPRAYING APPARATUS; ATOMISING APPARATUS; NOZZLES
    • B05B9/00Spraying apparatus for discharge of liquids or other fluent material, without essentially mixing with gas or vapour
    • B05B9/03Spraying apparatus for discharge of liquids or other fluent material, without essentially mixing with gas or vapour characterised by means for supplying liquid or other fluent material
    • B05B9/04Spraying apparatus for discharge of liquids or other fluent material, without essentially mixing with gas or vapour characterised by means for supplying liquid or other fluent material with pressurised or compressible container; with pump
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B24GRINDING; POLISHING
    • B24CABRASIVE OR RELATED BLASTING WITH PARTICULATE MATERIAL
    • B24C3/00Abrasive blasting machines or devices; Plants
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/005Manipulators for mechanical processing tasks
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/0075Manipulators for painting or coating
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/088Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
    • B25J13/089Determining the position of the robot with reference to its environment
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • B25J19/022Optical sensing devices using lasers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • B25J19/023Optical sensing devices including video camera means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00Manipulators mounted on wheels or on carriages
    • B25J5/005Manipulators mounted on wheels or on carriages mounted on endless tracks or belts
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00Manipulators mounted on wheels or on carriages
    • B25J5/02Manipulators mounted on wheels or on carriages travelling along a guideway
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B24GRINDING; POLISHING
    • B24CABRASIVE OR RELATED BLASTING WITH PARTICULATE MATERIAL
    • B24C1/00Methods for use of abrasive blasting for producing particular effects; Use of auxiliary equipment in connection with such methods
    • B24C1/003Methods for use of abrasive blasting for producing particular effects; Use of auxiliary equipment in connection with such methods using material which dissolves or changes phase after the treatment, e.g. ice, CO2
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/45Nc applications
    • G05B2219/45066Inspection robot

Definitions

  • the present invention relates to inspection, surface preparation and coating of objects and structures in an industrial environment. Aspects of the invention relate to a sensor system for industrial fabric maintenance of work objects and structures in a work location.
  • the invention has particular application to the blasting and painting of steel surfaces.
  • An important part of surface preparation is ensuring that the prepared surface meets the specification required by the end user. This typically includes visual inspection to verify that any previous coating or rust has been removed and that the surface does not have excessive dust. It is also important that salt levels are sufficiently low and that the surface roughness is sufficient to hold the coating.
  • In a first aspect of the invention, there is provided a sensor system for a robotic apparatus for industrial fabric maintenance of a work object in a work location comprising: a first optical system for collecting a first data set relating to a work location and/or work object; a second optical system for collecting a second data set relating to a work location and/or work object; at least one processing module for processing the first and second data sets; wherein the first optical system comprises an optical camera, and the first data set comprises camera imaging data; wherein the second optical system comprises a laser positioning system, and the second data set comprises laser positioning data; wherein the sensor system is operable to process the first data set to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy, and wherein the sensor system is operable to process the second data set to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy, wherein the second resolution or accuracy is higher than the first resolution or accuracy.
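The coarse-then-fine localisation described above can be illustrated with a simple one-dimensional sketch: a lower-accuracy camera-derived position estimate is combined with a higher-accuracy laser-derived estimate by inverse-variance weighting, so the more precise measurement dominates. The function name and numeric values here are hypothetical illustrations, not taken from the specification.

```python
def fuse_estimates(coarse_pos, coarse_var, fine_pos, fine_var):
    """Combine two 1-D position estimates by inverse-variance weighting;
    the lower-variance (laser) measurement dominates the fused result."""
    w_coarse = 1.0 / coarse_var
    w_fine = 1.0 / fine_var
    fused = (w_coarse * coarse_pos + w_fine * fine_pos) / (w_coarse + w_fine)
    fused_var = 1.0 / (w_coarse + w_fine)
    return fused, fused_var

# Camera locates the apparatus to roughly 0.1 m; the laser system
# refines this to roughly 0.005 m (illustrative figures).
pos, var = fuse_estimates(coarse_pos=2.4, coarse_var=0.1 ** 2,
                          fine_pos=2.52, fine_var=0.005 ** 2)
```

In practice such fusion would operate on full 6-DOF poses with covariance matrices, but the same principle applies: the second (laser) data set refines, rather than replaces, the first (camera) estimate.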
  • the robotic apparatus may comprise a robotic manipulator assembly.
  • the robotic manipulator assembly may comprise at least one functional module at an operative end of the robotic manipulator assembly.
  • the sensor system may be configured to locate, orientate, control and/or position the robotic apparatus, robotic manipulator assembly and/or functional module.
  • the sensor system may be configured to locate, orientate, control and/or position the robotic apparatus, or at least a part thereof, during, before or after a fabric maintenance operation.
  • the sensor system may be configured to monitor the movement of the robotic apparatus, robotic manipulator assembly and/or functional module.
  • the sensor system may be configured to monitor translation movement of the robotic apparatus, robotic manipulator assembly and/or functional module.
  • the sensor system may be configured to locate, orientate, control and/or position the robotic apparatus, robotic manipulator assembly and/or functional module in relation to the work location and/or work object in accordance with a plan.
  • the sensor system may be configured to monitor the movement of the robotic apparatus, robotic manipulator assembly and/or functional module and the at least one processing module may compare the sensor data set with a plan.
  • the at least one processing module may be configured to compare the first data set and/or the second data set with a plan.
  • the at least one processing module may use an existing plan or may generate a plan to locate, orientate, control and/or position the robotic apparatus, or at least a part thereof during a fabric maintenance operation.
  • the plan may comprise a movement sequence for the robotic manipulator assembly or functional module.
  • the movement sequence may define the movement path for the functional module.
  • the plan may comprise at least a head movement path plan for the functional module in relation to the work object.
  • the apparatus may be configured to present the movement path to an operator, for example as a visual representation of the movement path in a virtual representation of the work location.
  • the functional module may be selected from the group consisting of an inspection system, a surface treatment system or a coating system.
  • the surface treatment system may be selected from the group consisting of a water jetting system, a dry ice blasting system and an abrasive blasting system such as a grit blasting system.
  • the dry ice blasting, grit blasting, water or steam jetting systems may be either closed loop or open loop.
  • the closed loop system may contain two or more hoses, with at least one hose configured to project the dry ice, grit, steam or water medium onto the required surface and at least one hose acting as a return line for the residual dry ice, grit, steam or water medium and treated-surface waste such as rust.
  • the coating device may be a spray paint device.
  • the coating may be a paint.
  • the paint may be a clear paint, a mica paint, a metallic paint, a water-based paint, a solvent-based paint and/or a multi-component paint.
  • the sensor system may comprise at least one sensor.
  • the at least one sensor may be selected from the group consisting of a camera, laser, a range sensor, spectrometer, wet film thickness sensor, load cell, inertial measurement unit, ultrasound sensor, infra-red sensor, proximity sensor, inductive sensor, or a lidar sensor.
  • the sensor system may comprise two or more sensors.
  • the sensor system data may be processed to localise the robotic manipulator assembly and/or functional module with respect to the work location and/or work object.
  • the sensor data may be processed using algorithms to recognize a workpiece and estimate its most likely position.
  • a model or a scan of the work location and/or work object may be used to estimate where the robotic manipulator assembly and/or functional module is in relation to the work location and/or work object. This may be a probabilistic estimate which may be updated using information gathered from the sensor system.
  • the at least one processing module may use sensor data to create a probabilistic map of free and occupied volumes. This map may be updated in real time and may be used in a simulated environment for path planning to ensure that there are no collisions.
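A probabilistic map of free and occupied volumes of this kind is commonly realised as an occupancy grid holding the log-odds that each cell is occupied, updated incrementally from sensor hits and misses. The following is a minimal single-cell sketch under that assumption; the inverse-sensor-model constants are illustrative, not from the specification.

```python
import math

# Assumed inverse-sensor-model log-odds increments: a "hit" raises the
# occupancy belief, a "miss" (ray passing through) lowers it.
L_HIT, L_MISS = 0.85, -0.4

def update_cell(log_odds, hit):
    """Bayesian log-odds update for one map cell from one observation."""
    return log_odds + (L_HIT if hit else L_MISS)

def probability(log_odds):
    """Convert log-odds back to an occupancy probability in [0, 1]."""
    return 1.0 - 1.0 / (1.0 + math.exp(log_odds))

cell = 0.0                       # unknown: probability 0.5
for observation in [True, True, True, False]:
    cell = update_cell(cell, observation)
p = probability(cell)            # confidently occupied after 3 hits, 1 miss
```

A path planner can then treat cells whose probability exceeds some threshold as obstacles, which is how the real-time map supports collision-free planning in a simulated environment.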
  • the sensor system may collect data relating to the performance of the operation over the area.
  • the sensor system may identify parts of the work area that require a further operation.
  • the at least one processing module may generate a second plan for carrying out a further fabric maintenance on the work area. For example, the sensor system may inspect the quality of the operation and identify areas that require a repeat operation.
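Generating a second plan of this kind reduces, at its simplest, to thresholding per-area inspection scores and keeping only the areas that fall short. The sketch below is purely illustrative; the threshold, area names and scores are hypothetical.

```python
# Assumed acceptance level for the inspected quality metric
# (e.g. coating coverage fraction); illustrative only.
QUALITY_THRESHOLD = 0.9

def plan_repeat_pass(area_scores):
    """Return the work areas whose measured quality requires a further
    fabric maintenance operation."""
    return [area for area, score in area_scores.items()
            if score < QUALITY_THRESHOLD]

# Hypothetical post-operation inspection results per work area.
scores = {"panel_A": 0.97, "panel_B": 0.82, "weld_seam": 0.88}
second_plan = plan_repeat_pass(scores)
```

The resulting list would then seed a second movement plan covering only the deficient areas, rather than repeating the whole operation.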
  • the work object may be located in a booth.
  • the fabric maintenance operation may be conducted in a paint and/or blast booth.
  • the booth may be a sealed area that allows inspection, painting or blasting. In the case of blasting, the booth may allow open blasting (grit fired at objects to remove rust and other contaminants).
  • the booth may have a grit recovery system that allows the grit to be reused.
  • the booth may comprise a standard 20ft or 40ft container.
  • the sensor system may use an existing plan or generate a plan.
  • the sensor system may compare the operation of the robotic apparatus, robotic manipulator assembly and/or functional module with the plan.
  • the sensor system may compare at least one factor with the plan.
  • the at least one factor may include location, orientation, position and/or velocity of the robotic apparatus, robotic manipulator assembly and/or functional module.
  • the at least one factor may include spray rate, surface condition, distance of functional module to object and/or angle of functional module to the work object.
  • the sensor system may adjust or change the plan.
  • the plan may be adjusted to the operator’s specifications.
  • the sensor system may adjust or change the plan in response to sensor data.
  • the sensor system may adjust or change the plan to adapt to changes in the environment where the work is to be performed and/or to avoid obstacles.
  • the sensor system may present the revised plan to an operator for approval.
  • the plan may be adjusted in real time or repeated after the first plan has been executed.
  • the apparatus may be configured to undertake fabric maintenance tasks remotely and/or autonomously.
  • the sensor system may be configured to enable real time feedback to assess the operation of the functional module and/or the quality of the fabric maintenance.
  • the sensor system may be configured to analyse data from the first data set, second data set and/or from the at least one sensor to assess the operation of the functional module and/or the quality of the fabric maintenance.
  • the sensor system may be configured to avoid obstacles and collisions with the external environment and/or personnel.
  • the sensor system may be configured to analyse data from the first data set, second data set and/or from the at least one sensor to avoid obstacles and collisions.
  • the sensor system may be mounted on or connected to the robotic apparatus, robotic manipulator assembly and/or functional module.
  • the sensor system, or at least a part thereof may be mounted or connected to a functional module, which may be removably connected to the robotic apparatus, such that the sensor system or part thereof is removably connected to the robotic apparatus with the functional module.
  • the sensor system, or at least a part thereof may be mounted on or connected to a portable and/or handheld unit.
  • In a second aspect of the invention, there is provided a method of operating a robotic device for industrial fabric maintenance of a work object in a work location comprising: providing a robotic apparatus with a sensor system, the sensor system comprising: a first optical system; a second optical system; at least one processing module for processing the first and second data sets; collecting a first data set relating to a work location and/or work object using the first optical system; collecting a second data set relating to a work location and/or work object using the second optical system; processing the first data set to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy; processing the second data set to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy, wherein the second resolution or accuracy is higher than the first resolution or accuracy.
  • An optical system may mean a collection of optical sensors or a single optical sensor.
  • the first optical system may comprise an optical camera.
  • the method may comprise collecting camera imaging data as a first data set.
  • the second optical system may comprise a laser positioning system.
  • the method may comprise collecting laser positioning data as a second data set.
  • the method may comprise locating, orientating, controlling and/or positioning the robotic apparatus, or at least a part thereof, in response to the first and/or second data set.
  • the method may comprise continually collecting and processing data from the first and/or second optical system during, before and/or after a fabric maintenance operation.
  • the method may comprise monitoring the movement of the robotic apparatus during, before or after a fabric maintenance operation.
  • the method may comprise comparing the location and/or movement of the robotic apparatus, or at least a part thereof, with a plan.
  • the plan may be an existing plan.
  • the method may comprise generating a new plan.
  • the method may comprise correcting the movement, location, orientation, and/or position of the robotic apparatus, or at least a part thereof to bring it into conformity with the plan.
  • the plan may comprise a movement sequence for the robotic manipulator assembly or functional module.
  • the movement sequence may define the movement path for the functional module.
  • the plan may comprise at least a head movement path plan for the functional module in relation to the work object.
  • Embodiments of the second aspect of the invention may include one or more features of the first aspect of the invention or its embodiments, or vice versa.
  • In a third aspect of the invention, there is provided a sensor system for a robotic apparatus for industrial fabric maintenance of a work object in a work location comprising: at least one optical system for collecting a data set relating to a work location and/or work object; at least one processing module for processing the data set; wherein the sensor system is operable to process the data set to locate and/or orientate the robotic apparatus in relation to the work location and/or work object.
  • the sensor system may comprise two or more optical systems.
  • the at least one processing module may be configured to process a data set from each of the optical systems.
  • the at least one processing module may process the collected data using an algorithm.
  • the at least one processing module may process the collected data using at least two algorithms.
  • the at least one processing module may process the collected data using at least two parts of one algorithm.
  • a first algorithm may process the data to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
  • a second algorithm may process the data to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy.
  • a second algorithm may process the data to locate the robotic apparatus in relation to the work location and/or work object to augment the resolution or accuracy of the first algorithm by combining the information from the second algorithm with that from the first algorithm.
  • the algorithms may be combined together such that they are sub-algorithms in a larger algorithm.
  • the at least two algorithms may be considered to be at least two separate algorithms or at least two parts of the same algorithm.
  • An algorithm is considered to be made up of at least one component, where a core part of the algorithm may be used to complete the intended purpose of the algorithm.
  • aspects of the core part of the algorithm may be augmented to improve the performance of the algorithm.
  • the core part of the algorithm may be augmented by a second component of the same algorithm or by another algorithm.
  • An algorithm comprising two (or more) components, where one of those components could be used independently as part of a system, without the second component, to perform a task such as estimating the position of a workpiece, is considered to be two algorithms.
  • An algorithm may have multiple components that work together to provide a single accuracy; each component may provide independent information that is complementary to the other components.
  • a core part of an algorithm may process small movements between each data frame that is captured (say at 30 frames per second) and another part of the algorithm may independently identify when the apparatus intersects with a previous position and closes a loop. Where a loop closure is detected, this may be used to improve the accuracy of the core part of the algorithm.
  • Multiple parts of one algorithm may be considered to be separate algorithms. For example, a first part of an algorithm which identifies a loop closure may be used independently of a second part of the algorithm which may be used to estimate the position of the workpiece. The two parts of the same algorithm may be used independently, albeit with lower accuracy, and may be considered as two different algorithms.
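The two cooperating algorithm parts described above can be sketched in one dimension: a core part integrates small frame-to-frame motions (and so accumulates drift), while an independent part detects that the apparatus has returned to a previous position and spreads the accumulated error back over the trajectory. All names and numbers below are hypothetical illustrations.

```python
def integrate(deltas):
    """Core part: accumulate per-frame motion estimates into poses."""
    poses = [0.0]
    for d in deltas:
        poses.append(poses[-1] + d)
    return poses

def close_loop(poses, start_idx, end_idx):
    """Second part: the apparatus revisits poses[start_idx], so any
    difference at poses[end_idx] is drift; spread the correction
    linearly over the loop."""
    drift = poses[end_idx] - poses[start_idx]
    n = end_idx - start_idx
    return [p - drift * max(0, min(i - start_idx, n)) / n
            for i, p in enumerate(poses)]

# Each frame reports motion with a small bias; a true loop
# (+1, +1, -1, -1) should yield zero net displacement after correction.
deltas = [1.02, 1.02, -0.98, -0.98]
poses = integrate(deltas)
corrected = close_loop(poses, 0, len(poses) - 1)
```

Either part is usable alone, as the text notes: pure integration works but drifts, and loop detection alone gives only topological information; together they yield a more accurate trajectory.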
  • the at least one processing module may be configured to process a first data set from a first optical system to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
  • the at least one processing module may be configured to process a second data set from a second optical system to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy.
  • the second resolution or accuracy may be higher than the first resolution or accuracy.
  • the optical system may comprise a camera and/or a laser system.
  • the at least one processing module may process data from the camera and/or a laser system.
  • the sensor system or at least a part thereof may be mounted to or located on the robotic apparatus.
  • the sensor system or at least a part thereof may be mounted to or located on a robotic manipulator assembly and/or a functional module at an operative end of the robotic manipulator assembly.
  • the sensor system may be configured to locate, orientate and/or position the robotic apparatus, robotic manipulator assembly and/or functional module in relation to the work location and/or work object in accordance with a plan.
  • the functional module may be selected from the group consisting of an inspection system, a surface treatment system or a coating system.
  • the sensor system may comprise at least one sensor.
  • the at least one sensor may be selected from the group consisting of a camera, laser, a range sensor, spectrometer, wet film thickness sensor, load cell, inertial measurement unit, ultrasound sensor, infra-red sensor, proximity sensor, inductive sensor, or a lidar sensor.
  • the sensor system may be configured to move, locate, orientate and/or position the robotic apparatus, robotic manipulator assembly and/or functional module according to an existing plan and/or generate a plan to move, locate, orientate and/or position the robotic manipulator assembly and/or a functional module relative to the work object in a work location.
  • the apparatus may be configured to undertake fabric maintenance tasks remotely and/or autonomously.
  • the sensor system may be configured to enable real time feedback to assess the operation of the functional module and/or the quality of the fabric maintenance.
  • the sensor system may be configured to avoid obstacles and collisions with the external environment and personnel.
  • Embodiments of the third aspect of the invention may include one or more features of the first or second aspects of the invention or their embodiments, or vice versa.
  • In a fourth aspect of the invention, there is provided an apparatus for industrial fabric maintenance of a work object in a work location comprising: a robotic manipulator assembly; a sensor system operable to collect data relating to the work location and/or work object; and a functional module at an operative end of the robotic manipulator assembly; wherein the functional module comprises: a first inlet and a first outlet for a surface treatment medium; a first conduit between the first inlet and first outlet; and a connector for removably connecting the functional module to the robotic manipulator assembly such that the first inlet is coupled to a source of the surface treatment medium; wherein the functional module comprises a shape or form selected for delivery of the surface treatment medium in dependence on the geometry of a work area on the work object.
  • the apparatus may be configured to move, orientate, locate and/or position the functional module in response to feedback from at least one sensor in the sensor system.
  • the sensor system may be configured to locate, orientate, control and/or position the robotic apparatus, robotic manipulator assembly and/or functional module in relation to the work location and/or work object in accordance with a plan.
  • the plan may comprise a robotic movement sequence plan for the robotic manipulator assembly.
  • the robotic movement sequence plan may define the head movement path for the functional module.
  • the functional module may comprise a surface preparation head, and the surface treatment medium may be a surface preparation medium.
  • the functional module may comprise a surface coating head, and the surface treatment medium may be a surface coating medium.
  • the surface coating medium may comprise a paint.
  • the shape or form of the functional module may comprise a head surface profile, which may be configured to be presented to the work area of the work object, and which may be configured to engage or otherwise interact with the work area of the work object.
  • the head surface profile may comprise a substantially flat or flat planar surface configured to be presented to the work area of the work object. Such a head surface profile may be particularly suitable for a substantially flat or flat planar work area surface.
  • the head surface profile may comprise a concave surface configured to be presented to the work area of the work object. Such a head surface profile may be particularly suitable for a convex work area surface.
  • the head surface profile may comprise a convex surface configured to be presented to the work area of the work object.
  • Such a head surface profile may be particularly suitable for a concave work area surface.
  • the head surface profile may comprise a cylindrical or part-cylindrical surface configured to be presented to the work area of the work object. Such a head surface profile may be particularly suitable for a cylindrical or part-cylindrical work area surface, for example the surface of a pipe.
  • the head surface profile may comprise a surface configured to be presented to the work area of the work object that is curved with respect to two orthogonal axes.
  • the head surface profile may comprise one or more surface projections configured to be presented to the work area of the work object. Such a head surface profile may be particularly suitable for a recess, groove or relief in a work area surface.
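The head surface profiles listed above follow a simple complementarity rule: the head profile is the geometric complement of the work-area surface it is presented to. A minimal sketch of such a selection rule is shown below; the geometry labels and profile names are hypothetical, chosen only to mirror the pairings in the text.

```python
# Hypothetical mapping from work-area geometry to the complementary
# head surface profile, mirroring the pairings described above.
PROFILE_FOR_SURFACE = {
    "flat": "flat planar head",
    "convex": "concave head",          # concave head mates with a convex surface
    "concave": "convex head",          # convex head mates with a concave surface
    "pipe": "part-cylindrical head",   # e.g. the surface of a pipe
    "groove": "head with surface projections",
}

def select_head(surface_geometry):
    """Return the head surface profile suited to the given geometry."""
    return PROFILE_FOR_SURFACE[surface_geometry]
```

In a modular system (see the interchangeable functional modules described later in this document), such a rule could drive automatic selection of which module to mount for a given work area.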
  • the apparatus may comprise one or more sensors. The one or more sensors may be mounted on the functional module, such that they are removably connected to the robotic manipulator assembly with the functional module.
  • the one or more sensors may be mounted on the robotic manipulator assembly, for example such that they remain on the robotic manipulator assembly when the functional module is removed.
  • a first subset of the sensors may be mounted on the functional module, and a second subset of sensors may be mounted on the robotic manipulator assembly.
  • the one or more sensors may form a part of a sensor system.
  • the sensor system may comprise a first optical system for collecting a first data set relating to a work location and/or work object and may comprise a second optical system for collecting a second data set relating to a work location and/or work object.
  • the sensor system may comprise at least one processing module for processing the first and second data sets.
  • the first optical system may comprise an optical camera, and the first data set may comprise camera imaging data.
  • the second optical system may comprise a laser positioning system, and the second data set may comprise laser positioning data.
• the sensor system is operable to process the first data set to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy. More preferably, the sensor system is operable to process the second data set to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy, wherein the second resolution or accuracy is higher than the first resolution or accuracy.
• Embodiments of the fourth aspect of the invention may include one or more features of the first to third aspects of the invention or their embodiments, or vice versa.
• In a fifth aspect of the invention there is provided a functional module for removable connection to the apparatus of the first or third aspect of the invention.
  • the apparatus may be configured to orientate, position, locate and/or move the functional module based on sensor data.
  • the apparatus may be configured to orientate, position, locate and/or move the functional module according to a plan.
  • the plan may comprise a robotic movement sequence plan for the robotic manipulator assembly.
  • the robotic movement sequence plan may define the head movement path for the functional module.
  • Embodiments of the fifth aspect of the invention may include one or more features of the first to fourth aspects of the invention or their embodiments, or vice versa.
• In a sixth aspect of the invention there is provided a modular system of components comprising the apparatus of the third aspect of the invention and a plurality of functional modules interchangeable on the robotic manipulator assembly of the apparatus, wherein each functional module comprises a shape or form selected for delivery of the surface treatment medium in dependence on the geometry of a work area on the work object.
  • Embodiments of the sixth aspect of the invention may include one or more features of the first to fifth aspects of the invention or their embodiments, or vice versa.
• In a seventh aspect of the invention there is provided a method of performing a fabric maintenance operation using the apparatus according to the first or third aspects of the invention or the system according to the fifth aspect of the invention, the method comprising removing a first functional module from the apparatus, and connecting a second functional module to the apparatus.
  • Embodiments of the seventh aspect of the invention may include one or more features of the first to sixth aspects of the invention or their embodiments, or vice versa.
• In an eighth aspect of the invention there is provided a sensor system for a robotic apparatus for industrial fabric maintenance of a work object in a work location, comprising: at least one sensor for collecting a data set relating to a work location and/or work object; at least one processing module for processing the data set; wherein the sensor system is operable to process the data set to locate and/or orientate the robotic apparatus in relation to the work location and/or work object.
  • the at least one sensor may be selected from the group consisting of a camera, laser, a range sensor, spectrometer, wet film thickness sensor, load cell, inertial measurement unit, ultrasound sensor, infra-red sensor, proximity sensor, inductive sensor, or a lidar sensor.
  • the sensor system may comprise two or more sensors.
  • the at least one processing module may be configured to process a data set from each of the sensors.
  • the at least one processing module may be configured to process a first data set from a first sensor to locate and/or orientate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
  • the at least one processing module may be configured to process a second data set from a second sensor to locate and/or orientate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy.
  • the second resolution or accuracy may be higher than the first resolution or accuracy.
  • the first sensor may be a camera.
  • the second sensor may be a laser system.
  • Embodiments of the eighth aspect of the invention may include one or more features of the first to seventh aspects of the invention or their embodiments, or vice versa.
• In a ninth aspect of the invention there is provided a method of operating a robotic device for industrial fabric maintenance of a work object in a work location, comprising: providing a robotic apparatus with a sensor system, the sensor system comprising: at least one sensor; at least one processing module for processing at least one data set from the at least one sensor; collecting at least one data set relating to a work location and/or work object using the at least one sensor; and processing the at least one data set to locate and/or orientate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
  • the at least one sensor may be selected from the group consisting of a camera, laser, a range sensor, spectrometer, wet film thickness sensor, load cell, inertial measurement unit, ultrasound sensor, infra-red sensor, proximity sensor, inductive sensor, or a lidar sensor.
  • the method may comprise providing two or more sensors.
• the method may comprise collecting a first data set relating to a work location and/or work object using a first sensor; collecting a second data set relating to a work location and/or work object using a second sensor; processing the first data set to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy; and processing the second data set to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy, wherein the second resolution or accuracy may be higher than the first resolution or accuracy.
  • the first sensor is an optical system such as an optical camera.
  • the second sensor is an optical system such as a laser system.
  • Embodiments of the ninth aspect of the invention may include one or more features of the first to eighth aspects of the invention or their embodiments, or vice versa.
• In a tenth aspect of the invention there is provided a sensor system for a robotic apparatus for industrial fabric maintenance of a work object in a work location, comprising: a first optical sensor for collecting data relating to a work location and/or work object; a second optical sensor for collecting data relating to a work location and/or work object; and one or more processing modules for processing the collected data; wherein the sensor system is operable to process the data to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
  • the sensor system may be operable to process the data to locate a robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy, wherein the second resolution or accuracy is higher than the first resolution or accuracy.
  • the first optical sensor may be an optical camera for collecting a first data set comprising camera imaging data.
• the second optical sensor may be an optical camera for collecting a second data set comprising camera imaging data, or the second optical sensor may be a laser positioning system for collecting a second data set comprising laser positioning data.
  • the industrial fabric maintenance may be an inspection operation, surface preparation operation and/or coating operation.
  • the sensor system may be configured to locate, orientate, control and/or position the robotic apparatus, or at least a part thereof, during, before or after a fabric maintenance operation.
  • the sensor system may comprise at least one further sensor, wherein the at least one further sensor may be selected from the group consisting of a camera, laser, a range sensor, spectrometer, wet film thickness sensor, load cell, inertial measurement unit, ultrasound sensor, infra-red sensor, proximity sensor, inductive sensor, or a lidar sensor.
  • the sensor system, or at least a part thereof may be mounted on the robotic apparatus.
  • the sensor system, or at least a part thereof, may be mounted on a functional module of the robotic apparatus.
• the at least one processing module may use an existing plan or generate a plan to locate, orientate, control and/or position the robotic apparatus, or at least a part thereof, during a fabric maintenance operation.
  • the at least one processing module may process the collected data using at least two algorithms.
  • a first algorithm may process the data to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
  • a second algorithm may process the data to locate the robotic apparatus in relation to the work location and/or work object to provide a second resolution or accuracy.
• the second algorithm may process the data to augment the resolution or accuracy of the first algorithm by combining information from the second algorithm with that from the first algorithm.
  • the algorithms may be combined together such that they are sub-algorithms in a larger algorithm.
  • Data from the first optical system, second optical system and/or at least one further sensor may be processed to follow a movement sequence which defines the movement path for a functional module of the robotic apparatus.
  • the functional module may be selected from the group consisting of an inspection system, a surface treatment system or a coating system.
  • the surface treatment system may be selected from the group consisting of a water jetting system, a dry ice blasting system and an abrasive blasting system.
  • the coating system is a paint system.
  • the sensor system may be configured to undertake fabric maintenance tasks remotely and/or autonomously.
  • the sensor system may be configured to assess the operation of the functional module and/or the quality of the fabric maintenance.
  • the sensor system may be configured to avoid obstacles and collisions with the external environment and/or personnel.
  • Embodiments of the tenth aspect of the invention may include one or more features of the first to ninth aspects of the invention or their embodiments, or vice versa.
• In an eleventh aspect of the invention there is provided a method of operating a robotic apparatus for industrial fabric maintenance of a work object in a work location, comprising: providing a robotic apparatus with a sensor system, the sensor system comprising: a first optical sensor for collecting data relating to a work location and/or work object; a second optical sensor for collecting data relating to a work location and/or work object; and at least one processing module for processing collected data; and processing the data to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
  • the method may comprise processing the data to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy; wherein the second resolution or accuracy is higher than the first resolution or accuracy.
  • the method may comprise locating, orientating, controlling and/or positioning the robotic apparatus, or at least a part thereof, in response to the processed data.
  • the method may comprise processing the collected data using at least two algorithms.
  • the method may comprise processing the data with a first algorithm to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
  • the method may comprise processing the data with a second algorithm to locate the robotic apparatus in relation to the work location and/or work object to provide a second resolution or accuracy.
  • the method may comprise processing the data with a second algorithm to augment the resolution or accuracy of the first algorithm by combining information from the second algorithm with that from the first algorithm.
  • the method may comprise processing the collected data using at least two independent parts of one algorithm.
  • the method may comprise processing the data with a first part of one algorithm to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
  • the method may comprise processing the data with a second part of one algorithm to locate the robotic apparatus in relation to the work location and/or work object to provide a second resolution or accuracy.
  • the method may comprise processing the data with a second part of one algorithm to augment the resolution or accuracy of the first algorithm by combining information from the second part of one algorithm with that from the first part of one algorithm.
  • the algorithms may be combined together such that they are sub-algorithms in a larger algorithm.
  • the method may comprise monitoring the movement of the robotic apparatus during, before and/or after a fabric maintenance operation.
  • the method may comprise comparing the location of the robotic apparatus, or at least a part thereof, with a plan.
  • Embodiments of the eleventh aspect of the invention may include one or more features of the first to tenth aspects of the invention or their embodiments, or vice versa.
• Figures 1A and 1B are a sketch and a schematic view of a fabric maintenance system for inspection, surface preparation and coating of steel structures according to the invention;
• Figure 2 is a schematic view of different elements of a sensor system which may be incorporated into the fabric maintenance system of Figure 1;
• Figures 3A and 3B are a sketch and an enlarged schematic view of a manipulator arm of a fabric maintenance system with a sensor system according to the invention;
• Figure 4A is a flow diagram showing an example blasting operation for surface preparation according to an embodiment of the invention;
• Figure 4B is a flow diagram showing an example painting operation on a prepared surface according to an embodiment of the invention;
• Figure 5 shows a fabric maintenance system set up for surface preparation according to an embodiment of the invention;
• Figure 6 is a schematic view of a selection of different end effector heads that may be removably attached to the fabric maintenance system of Figure 1A;
• Figure 7 is a schematic view of a fabric maintenance system with an end effector head attachment system according to an embodiment of the invention;
• Figures 8A and 8B are side and end profile schematic views of a fabric maintenance system mounted on a rail according to the invention;
• Figure 9A is a schematic side view of a fabric maintenance system mounted on a rail with a powered mechanism according to the invention;
• Figure 9B is a perspective schematic view of a component of the powered mechanism described in Figure 9A; and
• Figures 10A and 10B are side and perspective views of a fabric maintenance system mounted on a rail according to the invention.
  • FIGS 1A and 1B show a fabric maintenance system 10 for inspection, surface preparation and coating of steel structures according to the invention.
• the system 10 comprises a robotic manipulator assembly 11 having a base 12 mounted on endless tracks 14. Each of the endless tracks is mounted on rollers 15.
  • An articulated manipulator arm 16 is pivotally mounted at first end 16a on the base 12.
  • the manipulator arm 16 has joints 18 which may be individually actuated to provide the arm 16 with at least six degrees of freedom.
• the manipulator arm 16 carries an end effector head mount 20 which is movably secured at a second end 16b of the manipulator arm.
  • end effector heads 22 may be reversibly fixed to the end effector head mount 20 depending on the desired application including inspection, surface preparation and/or coating operation.
  • a variety of end effector heads 22 may also be provided depending on the geometry of the surface to be treated. This is discussed further in relation to Figure 6 below.
  • the system may be used for a number of different applications including surface preparation, inspection, or coating operation.
• Surface preparation, e.g. water jetting;
• NDT (non-destructive testing).
  • the NDT work could include ultrasonic (such as phased array) or radiography.
  • the manipulator could also conduct the inspection work by use of an inspection head. Following the surface preparation and/or inspection operations, application of a coating may or may not be required.
  • the different mountable end effector heads 22 enable the system to conduct inspection, surface preparation and coating operations.
  • the inspection operations include quality control checks of the treated surfaces such as blasted surfaces and painted surfaces.
  • the surface preparation operations include dry ice blasting, grit blasting or water jetting.
  • the dry ice blasting, grit blasting or water jetting system may either be closed or open loop.
• the end effector heads comprise at least a first conduit to deliver or dispense medium (dry ice, grit or water) to the surface to be treated and a second conduit for suction or removal of the waste medium and contaminants from the treated surface.
• the closed loop end effector heads may comprise bristles or fibres around their surface-engaging periphery which act as a curtain or screen, assisting in the controlled delivery of medium (dry ice, grit or water) from the end effector head to the surface to be treated, and in the containment and recirculation of the waste medium and contaminants to the second conduit in the end effector head.
  • the coating operations may include painting and spray painting.
  • Figure 1 shows the robotic manipulator assembly 11 is set up for quality control for inspection operations; water jetting, dry ice blasting and grit blasting as surface preparation operations; and spray painting as a coating operation.
  • the system 10 is connected to a dry ice reservoir 32, a grit reservoir 34, a water reservoir 36 and to a paint reservoir 38, via conduits 32a, 34a, 36a and 38a respectively. Although these are shown in Figure 1A as individual conduits, these may be bundled together and housed into a single conduit called an umbilical.
  • the dry ice reservoir 32, grit reservoir 34, water reservoir 36 and paint reservoir 38 are connected to a compressor 40 via pressure lines 40a to enable the dry ice, grit, water and paint to be dispensed under pressure.
• the system may not comprise or use a dry ice reservoir, grit reservoir, water reservoir and paint reservoir, but rather a selection or combination of these depending on the type of fabric maintenance application, work scope, client objectives, type of material to be worked on and the type of environment where the work is to be performed (e.g. inside an oil storage vessel, outdoors in a potentially explosive environment, on a helideck etc.).
  • conduits 32a, 34a, 36a and 38a either individually or an umbilical when the conduits are bundled together, may be spooled on a reel to assist with conduit handling and management.
  • the conduits or umbilical may be paid out when required and spare conduit or umbilical removed when not.
  • the conduits 32a, 34a, 36a and 38a either individually or an umbilical when the conduits are bundled together, may be connected to a plurality of small rollers to provide low friction on the surface when the conduit or umbilical follows the automated device.
  • the umbilical rollers may be powered such that the umbilical can be moved to avoid collisions.
  • the robotic manipulator assembly 11 has a sensor system 50.
  • the sensor system may be located on the robotic manipulator assembly 11, manipulator arm 16, end effector head mount 20 and/or the end effector head 22.
• components of the sensor system may be located on different parts of the robotic manipulator assembly 11, manipulator arm 16, end effector head mount 20 and the end effector head 22.
  • the sensor system 50 is located on the end effector head 22.
  • the sensor system 50 comprises cameras 52 and a laser-based system 54.
  • the camera 52 may be used without the laser-based system where accurate positioning is not required.
  • the camera and laser-based sensor system enables the robotic manipulator assembly 11 and the connected end effector head 22 to be precisely positioned relative to the surface of the structure or object being worked on.
• the fabric maintenance system 10 autonomously creates a path for movement of the robotic manipulator assembly, manipulator arm and/or the end effector head to avoid potential obstacles in the surrounding environment and to undertake an effective treatment of the surface.
• the system may create a path plan based on cartesian coordinates to control the overall movement of the robotic manipulator assembly, manipulator arm and/or the end effector head during the treatment operation, and a “pick and place” plan of how the blasting sequence should be conducted.
  • a handheld controller 60 may be connected to the robotic manipulator assembly 11 to control the operation and movement of the robotic manipulator assembly 11.
  • the handheld controller 60 includes a position tracking system 62 which may include a camera 64, rangefinder 66 and/or a lidar based system 68.
  • the position tracking system 62 enables the user to identify and track the position of objects in a 3D environment such that instructions for the robotic manipulator assembly 11 can be created.
  • the tracked movements may be recorded as a series of stored instructions.
• Creation of a work scope for the robot assembly may comprise some or all of the following steps: a. use of an existing 3D model or scan, or generation of a 3D model; b. the user selecting on the 3D model the area to be worked on; c. running the path generation algorithm; d. simulation of the process, such as to verify the path and define the speed (e.g. paint thickness modelling using the paint spray cone); e. presentation of the planned path to the user for approval; f. translation of the planned path into the robot coordinate frame based on robot localisation with respect to the work piece; g. updating of the path during the process to ensure the correct distance and angle to the workpiece is maintained, and/or re-planning to loop back over areas (e.g. during blasting), and/or changing of the velocity (e.g. if blasting time needs to be longer than expected); and h. appending information to the location in the generated 3D model (e.g. visual confirmation of successful blasting).
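The work-scope creation steps a to h above can be sketched as a simple pipeline. The following is an illustrative sketch only: all function names, the one-dimensional point representation, the placeholder path ordering, statistics and frame transform are assumptions made for clarity and are not part of the disclosure.

```python
# Illustrative sketch of the work-scope creation pipeline (steps a-h).
# All names and data structures are assumptions, not part of the invention.

def create_work_scope(selected_area, approve):
    """Run path generation, simulation, user approval and translation
    of the approved path into the robot coordinate frame."""
    path = generate_path(selected_area)             # step c
    stats = simulate_process(path)                  # step d
    if not approve(path, stats):                    # step e: user approval
        return None
    return [to_robot_frame(p) for p in path]        # step f

def generate_path(area):
    # Placeholder: order the selected points into a simple raster path.
    return sorted(area)

def simulate_process(path):
    # Placeholder statistic: total travel between successive waypoints.
    length = sum(abs(b - a) for a, b in zip(path, path[1:]))
    return {"length": length}

def to_robot_frame(point, offset=10.0):
    # Placeholder rigid transform from workpiece frame to robot frame.
    return point + offset
```

In use, the approval callback stands in for step e's user interaction; rejecting the plan aborts the remaining steps.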
  • a handheld controller (also known as a handheld scanner) may be typically used for the scanning.
  • the handheld controller includes a position tracking system which may include a camera, rangefinder and/or a lidar based system.
  • the position of the handheld controller will be calculated using a simultaneous localisation and mapping system. The system works by estimating between successive sensor measurements the movement of the handheld controller.
• the handheld controller tracks key points (in 2D or 3D) that have a degree of rotational and translational invariance (approximately 20 degrees rotation and 30cm translation).
• the frame will have the six-degree-of-freedom pose and will be linked to the previous recorded frame along with the uncertainty of the measurement.
• a global system for identifying whether a loop has been made will operate. This will aim to find correspondences with frames recorded in other parts of the map which might (based on a probability distribution) be part of a loop.
  • all of the poses in the map are updated to reflect the new information that a loop has been closed.
  • the pose will then be used to fuse point clouds captured from individual positions together.
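The chaining of successive motion estimates and the subsequent loop-closure update described above can be illustrated in one dimension. This is a minimal sketch: real systems update full six-degree-of-freedom poses weighted by measurement uncertainty, and the even redistribution of the loop error, like all names below, is a simplifying assumption.

```python
# Minimal 1-D sketch of pose-chain estimation with loop closure.
# Names and the uniform error distribution are illustrative assumptions.

def integrate_motion(odometry):
    """Chain relative motion estimates into absolute poses,
    starting from an origin pose of zero."""
    poses = [0.0]
    for step in odometry:
        poses.append(poses[-1] + step)
    return poses

def close_loop(poses, loop_error):
    """Distribute an observed loop-closure error back along the chain
    so that every pose reflects the new information."""
    n = len(poses) - 1
    return [p - loop_error * (i / n) for i, p in enumerate(poses)]
```

For example, if the controller is known to have returned to its start but the chained poses drifted to 4.0, distributing that error pulls every pose back into consistency before the point clouds are fused.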
  • the 3D points generated from the scan data or where available from an existing model are converted into a surface representation.
  • An algorithm is run to identify standard geometries and sharp edges. These are used to segment the model into a series of sub models.
  • a machine learning algorithm can optionally classify the surface to select optimal meshing parameters (such as the algorithm type and the smoothing parameters). The edges have lines or curves fitted to them to improve the quality of the mesh by preserving sharp edges.
  • the user can select points on the meshed object.
  • the points are selected by generating a line from the user selection tool on top of a virtual surface. Where the line intersects the surface a point is generated and displayed.
• a line across the surface of the object will be created connecting two successive points, following either the shortest path across the surface or the intersection with a plane whose third degree of freedom is selected by the user.
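The point-selection step above, where a line cast from the user's selection tool intersects the virtual surface, can be sketched for a planar surface patch. The function name and the plane representation are illustrative assumptions; a real implementation would intersect the ray with the mesh.

```python
# Sketch of selecting a surface point by casting a ray from the user's
# selection tool onto a planar surface patch. Names are illustrative.

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Return the point where the ray meets the plane, or None when the
    ray is parallel to the surface or points away from it."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None                      # ray parallel to the surface
    t = dot([p - o for p, o in zip(plane_point, origin)], plane_normal) / denom
    if t < 0:
        return None                      # intersection behind the tool
    return tuple(o + t * d for o, d in zip(origin, direction))
```

Where the returned point is not None it is displayed to the user, as described above.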
  • the user can select as many points as desired with a minimum of three points.
  • the user can see the model in 3D and manipulate the model during the selection process.
• the selected surface may be segmented into smaller sub-surfaces by extracting surfaces separated by edges (using an edge detection algorithm) or by splitting a large surface into two sub-clusters (and repeating this until the clusters are below a specified size - max distance between any two points or similar).
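The recursive splitting described above, repeated until each cluster is below a specified size, can be sketched with a one-dimensional point set standing in for surface points. The function name, the midpoint split and the 1-D extent measure are illustrative assumptions.

```python
# Sketch of splitting a selected surface into sub-clusters until each
# cluster's extent (max distance between any two points) is under a
# limit. A 1-D point set stands in for surface points; names are
# illustrative assumptions.

def split_surface(points, max_extent):
    points = sorted(points)
    extent = points[-1] - points[0]
    if extent <= max_extent or len(points) < 2:
        return [points]                       # cluster is small enough
    mid = (points[0] + points[-1]) / 2.0      # split about the midpoint
    left = [p for p in points if p <= mid]
    right = [p for p in points if p > mid]
    return split_surface(left, max_extent) + split_surface(right, max_extent)
```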
  • a reference edge is either determined by a machine learning algorithm trained on simulated data to minimise total object path length and complexity or provided by the user.
• User input is created by providing three degrees of freedom, such as two points on the surface and an angle.
  • the reference path will have a corresponding plane.
  • points will be generated with planes generated at each point where the plane is perpendicular to the reference edge plane.
• a separation of paths (e.g. for painting) will be generated by an algorithm based on the spray cone size, or may be user selected. A distance along each of the new planes across the surface they intersect will be traversed until it is at the path separation. All of these new points will be linked together to generate a new path. The mid points of the straight sections of the new path will be used to generate new planes (one for each straight section of path), with the normal being the vector connecting the two points making up the straight section of path and the midpoint lying on the plane. New paths are generated by traversing the path separation distance across the intersection of the new planes and the surface. The process is repeated until the surface is covered in paths. An optimisation algorithm may be used to reduce path complexity and time (e.g. reference path plane parameters, clustering parameters for separating the area into smaller separate surfaces).
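For a flat rectangular surface the repeated offsetting above reduces to generating parallel raster paths separated by the spray cone coverage width. The sketch below makes that simplifying assumption; on a real curved surface each offset would be traversed across the mesh, and all names are illustrative.

```python
# Sketch of generating parallel spray paths across a flat rectangular
# surface, separated by the spray cone coverage width, alternating
# direction like a raster scan. Names are illustrative assumptions.

def raster_paths(width, height, separation):
    """Return a list of (start, end) straight paths, one per offset
    from the reference edge along y."""
    paths = []
    y = separation / 2.0         # first pass half a coverage width in
    forward = True
    while y <= height:
        start, end = (0.0, y), (width, y)
        paths.append((start, end) if forward else (end, start))
        forward = not forward    # alternate direction each pass
        y += separation
    return paths
```

Reversing every other pass keeps the travel between successive paths short, one simple way of reducing path complexity and time.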
• a simulation is run to ensure that there are no collisions. Some path segments may need to be modified; this can be done by allowing a number of parameters to be changed within a range, including distance from the surface and angle of the tool head relative to the surface.
• An optimisation algorithm can be used to identify new parameters that do not result in a collision, where the loss parameter is based on the number of points on the path that result in collisions (the spacing between waypoints can be modified to ensure convergence). Where paths cannot be found, these are excluded from further processing and are marked for the user.
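The collision-avoidance search above, with a loss based on the number of colliding waypoints, can be sketched as a search over one adjustable parameter (the stand-off distance from the surface). The obstacle model, the exhaustive search over candidate offsets and all names are illustrative assumptions.

```python
# Sketch of the collision-avoidance optimisation: vary the stand-off
# distance over an allowed range and keep the setting with the fewest
# colliding waypoints. Obstacle model and names are illustrative.

def count_collisions(path_heights, obstacle_height):
    """Loss parameter: number of waypoints passing below an obstacle."""
    return sum(1 for h in path_heights if h < obstacle_height)

def optimise_standoff(base_heights, obstacle_height, offsets):
    best_offset, best_loss = None, None
    for off in offsets:
        loss = count_collisions([h + off for h in base_heights],
                                obstacle_height)
        if best_loss is None or loss < best_loss:
            best_offset, best_loss = off, loss
    return best_offset, best_loss
```

If no candidate reaches zero collisions, the segment would be excluded and marked for the user, as described above.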
• a further process simulation may be performed. For painting this may include a deposition model based on the spray tip selected (either by the user or the path optimisation algorithm in C.), the distance from the object and the system pressure. For open loop grit blasting this is based on the spray cone, distance from the surface, type of blasting media and system pressure.
  • the velocity along the path is optimised to ensure that the required media is deposited on the surface. This may be modified while executing the path based on sensor feedback.
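Under a simple deposition model in which the media deposited per unit length scales with flow rate divided by traverse velocity, the velocity optimisation above reduces to inverting that relation per path segment. The model, the cap at a maximum head velocity and all names are illustrative assumptions.

```python
# Sketch of setting the traverse velocity along each path segment so
# the required media is deposited: deposition per unit length is
# assumed to scale with flow_rate / velocity, so
# velocity = flow_rate / required_deposition. Names are illustrative.

def segment_velocities(required_deposition, flow_rate, v_max):
    """Return one velocity per segment, capped at the head's maximum.
    Segments needing more media (e.g. more rust) are traversed slower."""
    return [min(flow_rate / d, v_max) for d in required_deposition]
```

Sensor feedback during execution would update `required_deposition` and hence the velocities, consistent with the in-process modification described above.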
  • the user is presented with the path and associated statistics from the modelling.
• the position that the robot arm needs to be located at to complete the path will be calculated by identifying the transform from the robot frame to the work piece frame. This will be generated using the simultaneous localisation and mapping system. Recommended robot base positions will be provided to the user to ensure all areas can be reached after the robot has worked from all of the recommended base positions.
  • the path may be adjusted or completely changed during the process. This will be based on sensor feedback (e.g. visual data run through a machine learning algorithm to identify blast/coating quality). For surface preparation, poorly performed work can be remedied by looping back over the path (automatically initiated where poor blast/coating quality is detected). The velocity may be changed (e.g. for blasting where there is more rust than anticipated).
• Process information may be appended to the generated path; this could include images and other sensor readings.
• a first stage is where the work area is visually inspected by the user and a work order is generated and approved.
  • a second stage is the setup of the robotic manipulator assembly in the work location and 3D scanning of the surrounding area to assess for obstacles.
• a third stage is where the user selects the surfaces of the work object to be treated.
  • the robot autonomously prepares a path for movement of the robotic manipulator assembly, manipulator arm and/or the end effector head to avoid potential obstacles in the surrounding environment and to undertake an effective treatment of the surface.
  • the path is displayed to the user and approval requested.
• a fourth stage is where a technician attaches an appropriate end effector head, selected for the particular surface preparation application and the geometry of the surface to be treated.
• the robot autonomously positions itself relative to the surface to be treated and begins surface preparation.
• the sensor system, including load cell, laser and range finder, monitors and instructs the adjustment of the position and rotation of the robotic manipulator assembly, manipulator arm and/or the end effector head. Cameras and lasers in the sensor system inspect the quality of the surface preparation during or after the work is performed. Once the operation is complete the user is notified and a report is generated.
• An inertial measurement unit 70 on the robot base enables measurements to be used to dynamically adjust the robot trajectory for instability/movement of the base. This becomes an important feature if the robot is mounted on a long-reaching structure such as a hydraulic boom.
• the position tracking system 62 can use the camera 64 or lidar based system 68, optionally combined with the inertial measurement unit 70, to accurately track and position the robotic manipulator assembly 11.
  • the position tracking may be the same as described for the handheld controller.
• more than one position tracking system may be used: one for rough positioning (say within a 10cm sphere - fast and very robust but low accuracy), and a next, more accurate position tracking system specific to the workpiece, using a surface fitting of the original scan/model used for path planning to the data being observed from a camera system or lidar generating a point cloud.
  • An initial guess for the position may come from the first positional tracking system.
  • the final position tracking system may use laser data to provide fine alignment to the workpiece to maximise quality and consistency.
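The staged tracking described in the bullets above, where a fast coarse tracker supplies an initial guess that a fine alignment then refines against the workpiece data, can be sketched in one dimension. A single offset stands in for a full pose, the closed-form least-squares refinement replaces an iterative surface fitting, and all names and values are illustrative assumptions.

```python
# Sketch of coarse-to-fine workpiece localisation: a robust but
# low-accuracy estimate seeds a fine alignment of model points to
# observed points. 1-D translation only; names are illustrative.

def coarse_estimate(rough_offset, error=0.08):
    """Fast, robust, low-accuracy stage (e.g. within ~10 cm)."""
    return rough_offset + error

def fine_align(initial_guess, model_points, observed_points):
    """Refine the guess so model + offset best matches the observation
    (closed-form least squares for a pure 1-D translation)."""
    residuals = [o - (m + initial_guess)
                 for m, o in zip(model_points, observed_points)]
    return initial_guess + sum(residuals) / len(residuals)
```

The coarse stage only needs to land near the true alignment; the fine stage then maximises quality and consistency against the laser or point-cloud data.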
  • Information on the environment can be communicated between the robotic manipulator assembly 11 and the handheld controller 60 such that the calculated position by the handheld controller 60 can be used by the robotic manipulator assembly 11.
  • Data gathered by the robotic manipulator assembly 11 and the handheld controller 60 can be attached to the position of physical objects in the working environment such that useful information can be displayed and accessed by the user.
  • the information can optionally be overlaid in augmented reality for the user, and the user can then use the handheld controller 60 to change the position of the selected point, both in distance away and in its position in the user's field of vision.
  • the robotic manipulator assembly 11 is able to be precisely moved and positioned using localisation visual data provided by the camera 52 and/or lidar 54.
  • the visual input data from the camera is processed by a localisation algorithm which ensures the robot avoids collisions with the external environment and personnel.
  • visual images and/or laser range measurements are used to identify points in the environment that have been confirmed to be occupied. These may be added to over time or refreshed frequently.
  • the path of the manipulator and head is broken up into a series of steps, usually at constant time steps.
  • the occupied volume of the head system and the manipulator at each time step in the planned path will be checked for collisions. This may be performed during the process simulation operation described in D) above.
  • the head and manipulator may be represented as a simpler geometry to improve processing time (e.g. cuboid or a series of spheres of varying diameters).
  • Path planning will use an optimisation approach where a cost function will be applied to a specific path.
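The time-stepped collision check and cost-based path evaluation described above can be sketched as follows, assuming the manipulator and head volumes are approximated by spheres; the weights and the clearance penalty are hypothetical values, not taken from the disclosure.

```python
import numpy as np

def spheres_collide(centers_a, radii_a, centers_b, radii_b):
    """True if any sphere in set A overlaps any sphere in set B."""
    d = np.linalg.norm(centers_a[:, None, :] - centers_b[None, :, :], axis=2)
    return bool((d < radii_a[:, None] + radii_b[None, :]).any())

def path_is_clear(path_poses, arm_spheres, obstacle_centers, obstacle_radii):
    """Check every time step of a planned path for collisions.

    path_poses: list of (R, t) poses, one per time step.
    arm_spheres: (centers, radii) approximating manipulator + head."""
    centers, radii = arm_spheres
    for R, t in path_poses:
        world = centers @ R.T + t        # arm spheres at this time step
        if spheres_collide(world, radii, obstacle_centers, obstacle_radii):
            return False
    return True

def path_cost(path_points, clearance_fn, w_len=1.0, w_clear=5.0):
    """Simple cost: path length plus a penalty for low obstacle clearance."""
    seg = np.diff(path_points, axis=0)
    length = np.linalg.norm(seg, axis=1).sum()
    min_clear = min(clearance_fn(p) for p in path_points)
    return w_len * length + w_clear * max(0.0, 0.1 - min_clear)
```

An optimiser would then compare candidate paths by `path_cost`, discarding any for which `path_is_clear` is false.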
  • the scanning operations and/or planning operations may be performed some time before the work is to be carried out.
  • the scanning operations and/or planning operations may be performed hours, days, weeks, months or even years before the work is to be carried out.
  • the scanning operations and/or planning operations may be performed in the absence or presence of the robot assembly in the work location or environment. This scanning operation and/or planning operation may be performed using available 3D modelling or by generating 3D models using the handheld controller.
  • the scanning operations and planning operations may be performed in the presence of the robot just before the work is due to be carried out.
  • the robotic manipulator assembly 11 is configured to conduct inspection, surface preparation or coating operation to a three-dimensional surface of an object along a calculated path.
  • the proposed path is generated virtually and sent to the user for review and approval.
  • the robot is capable of executing complex paths on a range of geometries including flat areas, pipes and curved surfaces.
  • the desired operation may be selected on the handheld controller 60.
  • An appropriate end effector head 22 for that specific application is mounted on the end effector head mount 20 of the manipulator arm 16.
  • the end effector head 22 may be mounted manually or by as part of the plan for a fabric maintenance operation.
  • the base may optionally include outriggers or extendable supports to provide support and stability to the device.
  • an electromagnet 80 may be connected to the endless tracks to optionally anchor and fix the position of the endless tracks on metal structures during an inspection, surface preparation or coating operation.
  • the base is mounted on endless tracks; additionally or alternatively, the height of the base may be vertically adjustable such that it can be raised and lowered.
  • the base may be connected to a series of sections by a geared system such that, when a change of height is required, the gear systems climb up or down the sections.
  • the apparatus may be mounted on a rail system which is further described in Figure 8A to 10B.
  • a work basket may be installed in the assembly to enable a user to inspect and support the work of the robotic manipulator assembly.
  • Figure 2 shows the different elements of a sensor system 50 of a robotic manipulator assembly according to one embodiment of the invention.
  • the sensor system may include: a 2D lidar/2D laser profiler; a 3D lidar; a load cell with one to six degrees of freedom; an IR projector for improved imaging (e.g. a vertical-cavity surface-emitting laser); an active stereo/structured light camera system; laser range finders; a blast attachment; a spectrometer; a camera (mono or stereo); inertial measurement units; ultrasonic wall thickness measurement; a paint pinhole/holiday detector with applied DC brushes; and a wet film thickness sensor (e.g. thermal transient analysis and ultrasonic coating thickness measurement).
  • although the sensor system is described as being located or mounted on the end effector head, it will be appreciated that the sensor system may be located or mounted on the robotic manipulator assembly 11, manipulator arm 16 and/or the end effector head mount 20. It will also be appreciated that the robotic manipulator assembly 11, manipulator arm 16, end effector head mount and/or the end effector head may contain different elements of the sensor system.
  • Figures 3A and 3B show an enlarged view of a manipulator arm 116 of the robotic manipulator assembly 111.
  • An end effector head 122 is reversibly mounted on the end effector head mount 120 connected to the manipulator arm 116.
  • a sensor system 150 is located on the end effector head 122.
  • the sensor system has a camera 152 and a laser system 154 to ensure correct orientation of the robotic manipulator assembly 111, manipulator arm 116 and end effector head 122 relative to the work location and/or work object.
  • the camera 152 generates a first data set and a laser system 154 generates a second data set.
  • the first and second data sets are processed to locate the manipulator arm 116 and end effector head 122 to a high resolution or accuracy.
  • the sensor system 150 includes a load cell 160 on the end effector head 122 to measure and confirm that the end effector head 122 is being held with the required pressure against the surface where surface contact is required. Pressure data is sent from the load cells to an onboard computer via electric or fibre optic cables.
  • various sensor types may be included in the sensor system, including cameras, lasers, range sensors, spectrometers, wet film thickness sensors, load cells, inertial measurement units, ultrasound sensors, infra-red sensors, infra-red projectors, proximity sensors, inductive sensors, or lidar sensors.
  • Figure 4A is a flow diagram 200 showing an example blasting operation surface preparation according to an embodiment of the invention.
  • the system has a compressor connected to blast equipment to enable grit and water to be dispensed under pressure via the umbilical/hose system which is connected to the end effector head.
  • An onboard computer controls the robotic manipulator assembly platform, which controls the movement of the manipulator arm (robot manipulator), the end effector system (including the end effector head) and an electromagnet (magnetic/mechanical anchor) to optionally anchor and fix the position of the robotic manipulator assembly during the blasting operation.
  • a sensor system controls the movement of the end effector system.
  • the sensor system comprises a camera system and a laser-based system that enable the robotic manipulator assembly and the connected end effector head to be precisely positioned relative to the surface of the structure or object being worked on.
  • a spectrometer may be provided to assess and inspect the surface of the object being treated.
  • a load cell is connected to the end effector head to measure and confirm that the end effector head is being held with the correct pressure against the surface where surface contact is required.
  • an ATEX over pressure system is activated.
  • the sensor system may assess or inspect the quality of the surface preparation during or after the work is performed.
  • the assessment or inspection of the quality of surface preparation may be performed in real time as the surface is treated.
  • the system may repeat or amend its plan for a fabric maintenance operation based on the results of the assessment or inspection. For example, the blasting path may have been estimated using an initial assessment of the level of rust (e.g. by an algorithm using visual data). If the rust is worse than anticipated the path speed will be modified and sections may need to be repeated.
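The adaptive behaviour described above might look like the following sketch. The grade letters loosely follow the ISO 8501-1 rust grade convention (A = light to D = severe), but the speed factors, base speed and function names are invented for illustration.

```python
# illustrative mapping, loosely following ISO 8501-1 rust grades
# (A = light ... D = severe); the speed factors are invented values
BASE_SPEED_MM_S = 50.0
SPEED_FACTOR = {"A": 1.0, "B": 0.8, "C": 0.55, "D": 0.35}

def adjust_pass(planned_grade, observed_grade, speed=BASE_SPEED_MM_S):
    """Slow the traverse, and flag a repeat pass, when the observed rust
    grade is worse than the grade the path was planned for."""
    new_speed = speed * SPEED_FACTOR[observed_grade]
    repeat = observed_grade > planned_grade   # later letter = worse rust
    return new_speed, repeat
```

For example, a path planned for grade B rust that encounters grade D would drop to 35 % of the base traverse speed and be queued for a repeat pass.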
  • Figure 4B is a flow diagram 220 showing an example paint operation of a prepared surface according to an embodiment of the invention.
  • the flow diagram shown in Figure 4B is similar to the flow diagram shown in Figure 4A described above.
  • flow diagram shown in Figure 4B relates to a paint operation carried out by the robotic manipulator assembly.
  • the compressor is therefore connected to a paint system to enable paint to be dispensed under pressure via the umbilical/hose system which is connected to the end effector head.
  • a compressor may not be required for some paint/blasting operations.
  • some tools do not require pneumatic or hydraulic systems such as air-less paint tools and electric bristle blasting tools.
  • a sensor system controls the movement of the end effector system during the paint operation or blast operation.
  • the sensor system provides data that is processed and allows either adjustment of the existing path or the generation of a new path for the end effector system during the paint or blast operation.
  • the sensor system comprises a camera system and a laser-based system that enable the robotic manipulator assembly and the connected end effector head to be precisely positioned relative to the surface of the structure or object being worked on.
  • a spectrometer is optionally provided to assess and inspect the surface of the object being treated and optionally parameters of the paint layer applied.
  • the sensor system may assess or inspect the quality of the paint coating during or after the work is performed.
  • the assessment or inspection of the quality of paint may be performed in real time as the surface is painted.
  • the system may repeat or amend its plan for a fabric maintenance operation based on the results of the assessment or inspection. Inspection may include the processing of visual data, roughness from a probe or laser or contamination level based on visual, ultrasonic or spectrometer data.
  • FIG. 5 shows a fabric maintenance system 300 set up for imaging a surface to be treated according to an embodiment of the invention.
  • the system 300 is similar to system 10 described in Figure 1 and will be understood from the description of Figure 1.
  • the system 300 comprises a robotic manipulator assembly 311 having a base 312 mounted on endless tracks 314. Each of the endless tracks is mounted on rollers 315.
  • An articulated manipulator arm 316 is pivotally mounted at first end 316a on the base 312.
  • the manipulator arm 316 has joints 318 which may be individually actuated to provide the arm 316 with at least six degrees of freedom.
  • the manipulator arm 316 carries an end effector head mount 320 which is movably secured at a second end of the manipulator arm 316b.
  • end effector heads may be reversibly fixed to the end effector head mount 320 depending on the desired inspection, surface preparation or coating operation.
  • the end effector head may comprise a plurality of sensors including cameras, lasers, inductive sensors and/or ultrasonic sensors to ensure correct mapping of the surface to be treated. Only one end effector head is shown in Figure 5.
  • the robotic manipulator assembly 311 autonomously prepares a path for movement of the robotic manipulator assembly, manipulator arm and/or the end effector head to avoid potential obstacles in the surrounding environment and to undertake an effective treatment of the surface.
  • the system may create a Cartesian path plan to control the overall movement of the robotic manipulator assembly 311, manipulator arm 316 and/or the end effector head 322 during the treatment operation, and a "pick and place" plan of how the blasting sequence 329 should be conducted.
  • the orientation of the manipulator arm and application head is maintained at all times. Where three or more proximity sensors are used, a rotation and translation from the current position to the optimal position can be calculated and implemented on all of the servos, with a control loop to enable a timely adjustment.
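One way to compute the rotation and translation from three or more proximity readings is to fit a plane to the implied contact points. The sketch below assumes the sensors measure range along the head's z axis and that the head frame's z axis points toward the surface; the names and the 10 mm target standoff are hypothetical.

```python
import numpy as np

def standoff_correction(sensor_positions, distances, target=10.0):
    """From three or more proximity readings, estimate the surface plane
    under the head and return (axis, angle, dz): the rotation squaring
    the head to the surface and the travel along the normal needed to
    reach the target standoff.

    sensor_positions: (N, 3) sensor locations in the head frame, with
    the z axis pointing toward the surface; distances: ranges along z."""
    pts = np.array(sensor_positions, dtype=float)
    pts[:, 2] += np.asarray(distances, dtype=float)   # contact points
    centroid = pts.mean(axis=0)
    _, _, Vt = np.linalg.svd(pts - centroid)
    normal = Vt[2]                    # plane normal (smallest sing. vec.)
    if normal[2] < 0:
        normal = -normal              # orient toward the head z axis
    z = np.array([0.0, 0.0, 1.0])
    axis = np.cross(z, normal)        # rotation axis for the tilt
    angle = float(np.arccos(np.clip(z @ normal, -1.0, 1.0)))
    dz = float(centroid @ normal) - target
    return axis, angle, dz
```

The returned rotation and translation would then be fed to the servo control loop as the correction toward the optimal position.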
  • a laser sensor and/or camera is used to identify the precise location of the robotic manipulator assembly.
  • Alternatively or additionally structured light or two cameras in stereo can be used.
  • a projection of small laser dots using a VCSEL or similar can provide texture to the stereo camera such that an accurate surface geometry can be assessed.
  • the visual inspection involves capturing an image and then processing it such that each pixel is compared against a reference grading system. This enables the surface to be graded such that it is confirmed to meet the surface preparation specification or paint specification.
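A minimal sketch of the per-pixel grading idea, assuming grades are separated by intensity thresholds on a grayscale image. A real system would compare against reference imagery (e.g. the ISO 8501-1 photographic standards), so the thresholds and pass criterion here are illustrative assumptions only.

```python
import numpy as np

def grade_surface(image, reference_levels, pass_fraction=0.95):
    """Grade each pixel of a grayscale inspection image against reference
    intensity bands and check the surface meets the specification.

    reference_levels: ascending thresholds separating grades, e.g.
    [60, 120, 180] gives grades 0..3 with 0 the worst."""
    grades = np.digitize(image, reference_levels)   # per-pixel grade
    best = len(reference_levels)
    fraction_ok = (grades >= best - 1).mean()       # top two grades pass
    return grades, fraction_ok >= pass_fraction
```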
  • the system may comprise a strong external light (most likely one or more LEDs of more than 5000 lm, or VCSEL laser illumination in the IR spectrum) to provide consistent lighting for the assessment.
  • the spectrometer will identify salt and other surface contamination and based on the algorithm will identify whether more blasting/ washing is required.
  • a 2D laser profiler (most likely using a light-plane-intersecting method that triangulates the reflected light), will provide the distance to points on a 2D line projected onto the targeted surface such that the surface roughness can be estimated.
  • the algorithm uses the sensor data to identify whether re-blasting is required, i.e. to determine whether there is any remaining corrosion, rust or contaminants. If it is, the area may be re-blasted and/or washed and then retested until satisfactory results are obtained.
  • the end effector can optionally be used as a handheld unit for assessment/ measurement in areas that are difficult to access for the robot.
  • the handheld unit or device may be used by a user for logging data.
  • the sensor system may comprise one or more cameras which collect visual data on the rust level of steel surfaces where this data is processed by an algorithm to determine the grade of rust against industry standards.
  • the sensor system may comprise a 2D laser profiler to assess the level of surface preparation of steel surfaces where data collected by the laser profiler is processed by an algorithm which is used to identify any areas of the surface which have not been blasted to a pre-set standard. The standard is defined by the user.
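Once per-sample roughness values are available along the scanned line, identifying areas not blasted to the user-defined standard reduces to grouping below-threshold samples into contiguous spans. A small sketch, with the sample spacing assumed:

```python
def unblasted_regions(ra_values, min_ra, spacing_mm=1.0):
    """Group below-standard roughness samples into contiguous regions to
    re-blast; returns (start_mm, end_mm) spans along the scanned line."""
    regions, start = [], None
    for i, ra in enumerate(ra_values):
        if ra < min_ra and start is None:
            start = i                                 # region opens
        elif ra >= min_ra and start is not None:
            regions.append((start * spacing_mm, i * spacing_mm))
            start = None                              # region closes
    if start is not None:
        regions.append((start * spacing_mm, len(ra_values) * spacing_mm))
    return regions
```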
  • the planned path once created may be displayed to the user for approval.
  • the displayed path may consist of a virtual representation of the environment to be worked in, including the area to be treated for surface preparation and/or painted. It may also display how the robotic manipulator assembly, manipulator arm and/or the end effector head will move, as well as the base of the robotic manipulator assembly (if it is mobile). Key statistics may be calculated from the simulation, including surface area blasted and/or painted per unit time.
  • the number of robotic manipulator assembly/base station moves and the expected volume of paint, grit or dry ice, along with areas that the manipulator will not be able to cover due to obstacles, will be flagged to the user.
  • Figure 6 shows a selection of different end effector heads that may be removably attached to the end effector head mount 120 connected to the manipulator arm 116.
  • the end effector head 522 type may be selected depending on the type of operation required and the shape or profile of the surface to be treated.
  • end effector heads 540a, 540b and 540c are configured for open loop grit blasting of three different surface shapes.
  • the end effector heads 540a, 540b and 540c have a single conduit 541 to supply grit to the surface to be treated.
  • End effector heads 542a, 542b and 542c are configured for closed loop grit blasting of three different surface shapes.
  • the end effector heads 542a, 542b and 542c have a supply conduit 543 to supply grit to the surface to be treated and a return line 545 for waste grit and rust etc.
  • End effector heads 544a, 544b and 544c are configured for a different type of surface preparation such as open loop water jetting blasting of three different surface shapes.
  • End effector heads 546a, 546b and 546c are configured for a painting structure or surfaces having three different surface shapes.
  • the end effector heads may be selected to match the required surface treatment (surface preparation or coating), whether it is an open or closed loop operation; and the shape or profile of the surface to be treated. As an example, if a small diameter pipe is to be closed looped grit blasted then end effector head 542a would be selected. If however, a flat surface was to be treated with an open looped grit blast operation then end effector head 540c would be selected.
  • selecting an appropriate or corresponding end effector head shape for the surface to be treated enables close contact between the end effector head and the surface. This may allow improved treatment and may allow the sensors in the sensor system to take more accurate measurements.
  • the end effector heads may have a different surface preparation or painting function. Each end effector heads may have a different shape or profile or structure engaging surface.
  • Figure 7 shows a robotic manipulator assembly 611 with an end effector head 622 detached from the manipulator arm 616 of the robotic manipulator assembly.
  • An end effector head 622 is reversibly mounted on the end effector head mount 620 connected to the manipulator arm 616.
  • a head attachment system 623 is located at the end of end effector head mount 620.
  • the head attachment system comprises a rough alignment guidance system 625 which can correct for angular or translational offset when connecting a new or different end effector head.
  • the head attachment system allows quick connection and disconnection of end effector heads from the robotic manipulator assembly 611.
  • the rough alignment guidance system 625 comprises three or more tapered rods 627 which allow for rough, then successively finer, alignment of the end effector head mount 620 with the end effector head 622.
  • a load cell 640 in the end effector head mount 620 can measure the force exerted by the tapered alignment system 625 which can be used with a control loop (e.g. PID loop) to correct for makeup path errors.
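The load-cell-driven correction can be illustrated with a minimal PID loop that nulls the lateral force measured during makeup. The gains, sample time and the linear contact model below are invented for the sketch and would need tuning against the real head.

```python
class PID:
    """Minimal PID controller, e.g. for nulling the lateral force measured
    by the load cell while the tapered rods guide the head into place."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error + self.ki * self.integral
                + self.kd * derivative)

# usage sketch: nudge the makeup path sideways until lateral force ~ 0;
# the gains and the linear contact model (800 N per metre of offset) are
# invented for illustration
pid = PID(kp=0.0005, ki=0.0002, kd=0.0, dt=0.01)
force, offset = 12.0, 0.0                 # N, m
for _ in range(200):
    correction = pid.update(0.0 - force)  # setpoint: zero lateral force
    offset += correction
    force = 12.0 + 800.0 * offset
```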
  • a locking mechanism 630 allows for the end effector head 622 to be locked in position.
  • This mechanism could also be electrically actuated. Additionally or alternatively a collet type connector, actuatable pins, balls or a simple J-slot may be used.
  • Pneumatic feedthrough conduits may be included, which consist of a polymer seat and a tapered cone profile.
  • electrical connections can be made up through a male pin and female receptacle. A seal around the male pin can be included to ensure debris is excluded.
  • cylindrical sprung electrical pins can be used which form a press fit when the male and female parts of the connector are pulled together.
  • Paint, pneumatic and vacuum conduits 632 are sealed with a polymer ring and a male part consisting of either a tapered polymer or metal part.
  • the polymer ring is soft enough to allow for a small amount of float to accommodate misalignment.
  • relays or similar switches can be used to isolate electrical pins during the period they are exposed to the environment. Electrical pins may be flushed with an insulating fluid or gas to ensure that they do not cause a spark potential in explosive environments.
  • the mechanical connection for the end effector head will include connectors to one or more conduits that are mounted on the robotic manipulator assembly. These may include a high pressure (expected to be up to 12 bar though possibly higher pressure) conduit for carrying one or more of grit, dry ice and sponge blasting media. In a closed loop system a second conduit such as a vacuum line may be included which when grit or sponge media is used enables said media to be recovered minimising any clean up required.
  • a paint line will enable paint to be applied which may also be accompanied by a flushing line enabling the lines to be cleaned automatically.
  • a water line may be included for washing surfaces that do not meet the cleanliness requirements.
  • a seal against the surface being blasted needs to be created to avoid grit escaping.
  • This will be formed by a replaceable attachment that has a number of fibres extending from the head (bristles like on a brush). These fibres can be bent such that a seal is preserved even if the head is moved.
  • a number of attachments will enable different surfaces to be blasted without media escaping; however, the correct attachment needs to be identified for the area to be blasted, and an area may require multiple heads to be used.
  • a sensor (camera or laser point cloud) identifies the local radius of curvature and matches this to the required head attachment. Areas are segmented based on radius of curvature, and the path is optimised to minimise time by grouping similar areas together.
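The curvature-based head selection might be sketched as an algebraic circle fit (Kasa method) on a 2D section through the point cloud, matched against a head catalogue. The head names and radius ratings below are hypothetical values for illustration.

```python
import numpy as np

# hypothetical head catalogue: largest surface radius (mm) each blast
# attachment can seal against; inf stands for flat surfaces
HEADS = {"542a": 50.0, "542b": 200.0, "542c": float("inf")}

def fit_radius(profile_pts):
    """Local radius of curvature from 2D section points (x, z) using an
    algebraic circle fit (Kasa method)."""
    x, z = profile_pts[:, 0], profile_pts[:, 1]
    A = np.column_stack([x, z, np.ones_like(x)])
    b = x**2 + z**2          # circle: 2*cx*x + 2*cz*z + c = x^2 + z^2
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cz = sol[0] / 2.0, sol[1] / 2.0
    return float(np.sqrt(sol[2] + cx**2 + cz**2))

def select_head(profile_pts):
    """Pick the most closely fitting attachment whose rated radius covers
    the measured curvature."""
    r = fit_radius(profile_pts)
    for name, rating in sorted(HEADS.items(), key=lambda kv: kv[1]):
        if r <= rating:
            return name
    return None
```

Segmenting the surface by fitted radius and then grouping segments that share a head gives the time-minimising visit order described above.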
  • Figures 8A and 8B show perspective and side views of the fabric maintenance system 700 according to an embodiment of the invention.
  • the fabric maintenance system 700 and robotic manipulator assembly 711 are similar to the system 10 and robotic manipulator assembly 11 described in Figure 1 and its operation will be understood from the description of Figures 1 to 7 above.
  • the robot manipulator assembly 711 is mounted on a rail rather than a base with endless tracks.
  • the rail system 720 comprises a rail 722 which is secured to a wall 724, with the fixings to the wall using known fixing means such as bolts into stone, brick or concrete; continuous or bolted fixings onto attachment points (e.g. tapped holes in a steel structure); or welds onto a metal structure (e.g. the rail attachment points welded onto a metal structure such as the side of a standard shipping container).
  • the rail 722 could alternatively be mounted to the floor or ceiling.
  • the rail system 720 comprises wheels 726 connected to a central member 728. The wheels are configured to move along an internal profile of the rail 722 such as a channel or track 730 of the rail 722.
  • the rail 722 has a lip 732 configured to prevent the wheels 726 and central member 728 from exiting the track 730.
  • a guide member 734 is aligned with a recess 736 in the central member 728 to maintain the orientation and alignment of central member during travel along the rail 722.
  • the wheels are made of a suitable material to resist bending, tension and compression loads. Multiple wheels at different angles could be used. Alternatively or additionally multiple rails may be used to provide additional degrees of freedom.
  • a support platform 740 is connected to central member 728.
  • the robot manipulator assembly 711 is mounted on or connected to the support platform.
  • a soft outer cover may be included such that dust and other debris is excluded from the assembly.
  • Figure 9A shows an alternative rail arrangement for fabric maintenance system 750 which is similar to the system 700 described in Figure 8A and will be understood from the description of Figure 8A above.
  • the wheels 726 are configured to be driven along the channel or track 730 of the rail 722 by a lead screw shown in Figure 9B.
  • a nut 752 is connected to the support platform 740 and the lead screw 754 is connected to a motor 756. As the motor turns the screw the nut and connected support platform travel along the screw.
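The lead screw kinematics are straightforward: linear travel equals the screw lead times the number of revolutions. A trivial sketch, with the 5 mm lead being an assumed value:

```python
def carriage_travel_mm(motor_revs, lead_mm=5.0):
    """Axial travel of the nut (and attached support platform) for a
    given number of motor revolutions; lead = advance per revolution."""
    return motor_revs * lead_mm

def revs_for_travel(distance_mm, lead_mm=5.0):
    """Revolutions the motor must turn to move the platform a distance."""
    return distance_mm / lead_mm
```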
  • other methods of driving the wheels along the rail may include a roller screw, powered wheels or a pneumatic or hydraulic cylinder.
  • wheels are configured to run along a track or channel within a rail. It will be appreciated that the rail system may be configured for wheels, tracks or bearings to travel over an outer surface of a rail.
  • Figures 10A and 10B show an alternative rail arrangement for fabric maintenance system 800 which is similar to the system 700 described in Figure 8A and will be understood from the description of Figure 8A above.
  • the rail system 820 comprises a rail 822 and a robot assembly support frame 839.
  • the robot assembly support frame comprises a recirculating linear bearing 826 and a support platform 840.
  • the robot manipulator assembly 811 is mounted on or connected to the support platform.
  • the recirculating linear bearing 826 is configured to engage an outer surface of rail 822 to move the robot assembly support frame along the length of the rail 822.
  • the recirculating linear bearing could be powered or unpowered.
  • the invention provides a sensor system for a robotic apparatus for industrial fabric maintenance of a work object in a work location.
  • the sensor system comprises a first optical system for collecting a first data set relating to a work location and/or work object, a second optical system for collecting a second data set relating to a work location and/or work object and at least one processing module for processing the first and second data sets.
  • the first optical system comprises an optical camera, and the first data set comprises camera imaging data.
  • the second optical system comprises a laser positioning system, and the second data set comprises laser positioning data.
  • the sensor system is operable to process the first data set to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
  • the sensor system is operable to process the second data set to locate a robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy.
  • the second resolution or accuracy is higher than the first resolution or accuracy.
  • the invention provides a sensor system which allows industrial fabric maintenance of a work object in a work location to be carried out in a safe and effective manner.
  • the invention may be used in a variety of locations such as an industrial site or paint shop. It may also be used in the inspection, surface preparation and coating of new equipment, and of equipment being refurbished and/or maintained.
  • data relating to the work object and/or the fabric maintenance operation may be collected to enable the robot apparatus to be positioned and moved accurately relative to a work object in a work environment.
  • This may allow a wide range of industrial fabric maintenance of a work object to be conducted including inspection operations, surface preparation operations and/or coating operations. This may also allow fabric maintenance of a work object to be conducted reliably, remotely and/or autonomously to a high standard.
  • the accurate positioning of a robot apparatus and the provision of a sensor system may allow real time measurement and feedback on a robot end effector on the surface to be treated or inspected. This may ensure that sufficient pressure is placed on the equipment to allow effective treatment or measurement whilst avoiding over-pressurisation, which may prevent the equipment from moving across the surface.
  • Another advantage of the sensor system is the quality of the fabric maintenance operation may be assessed using the sensors in real time. This ensures that the surface meets the specification required by the end user before the robot assembly is moved to a different location or piece of work.
  • Another advantage of the invention arises because fabric maintenance operations are often required to be conducted within complex and often congested areas.
  • by providing a sensor system for a robotic apparatus, it may be positioned and moved accurately around obstacles such as infrastructure or other equipment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a sensor system for a robotic apparatus for industrial fabric maintenance of a work object in a work location. The sensor system comprises a first optical system for collecting a first data set relating to a work location and/or work object, a second optical system for collecting a second data set relating to a work location and/or work object, and at least one processing module for processing the first and second data sets. The first optical system comprises an optical camera and the first data set comprises camera imaging data. The second optical system comprises a laser positioning system and the second data set comprises laser positioning data. The sensor system is operable to process the first data set to locate the robotic apparatus relative to the work location and/or work object to a first resolution or accuracy. The sensor system is operable to process the second data set to locate the robotic apparatus relative to the work location and/or work object to a second resolution or accuracy. The second resolution or accuracy is higher than the first resolution or accuracy.
PCT/GB2020/051904 2019-08-09 2020-08-10 Système de capteur de maintenance de tissu WO2021028673A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB1911466.9 2019-08-09
GBGB1911458.6A GB201911458D0 (en) 2019-08-09 2019-08-09 Fabric maintenance sensor system
GBGB1911466.9A GB201911466D0 (en) 2019-08-09 2019-08-09 Fabric maintenance system
GB1911458.6 2019-08-09

Publications (1)

Publication Number Publication Date
WO2021028673A1 true WO2021028673A1 (fr) 2021-02-18

Family

ID=72470550

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2020/051904 WO2021028673A1 (fr) 2019-08-09 2020-08-10 Système de capteur de maintenance de tissu

Country Status (2)

Country Link
GB (1) GB2589419A (fr)
WO (1) WO2021028673A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114603562A (zh) * 2022-04-19 2022-06-10 南方电网电力科技股份有限公司 Live-line lead-connection device and method for a distribution network
WO2023121528A1 (fr) * 2021-12-25 2023-06-29 Husqvarna Ab Improved navigation for a robotic work tool system

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112934543A (zh) * 2021-01-27 2021-06-11 武船重型工程股份有限公司 Spray-painting robot
DE102022100389A1 (de) 2022-01-10 2023-07-13 Bayerische Motoren Werke Aktiengesellschaft System and method for real-time acquisition of application data
DE102022000701A1 (de) 2022-02-25 2023-08-31 Visevi Robotics GmbH Autonomous manipulation system for maintenance and inspection work on railway track installations

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018119450A1 (fr) * 2016-12-23 2018-06-28 Gecko Robotics, Inc. Inspection robot
EP3415285A2 (fr) * 2017-06-14 2018-12-19 The Boeing Company Stabilization of the tool-carrying end of an extended-reach arm of an automated apparatus

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3635076A1 (de) * 1986-10-15 1988-04-28 Messerschmitt Boelkow Blohm Robot installation with movable manipulators
CA2089017C (fr) * 1992-02-13 1999-01-19 Yasurou Yamanaka Method of mounting a wheel on a vehicle
EP1675709A2 (fr) * 2003-10-20 2006-07-05 Isra Vision Systems AG Method for effecting the movement of a handling device, and image processing device
US20090057373A1 (en) * 2007-08-30 2009-03-05 Gm Global Technology Operations, Inc. Multi-Purpose End Effector for Welder
US9694498B2 (en) * 2015-03-30 2017-07-04 X Development Llc Imager for detecting visual light and projected patterns
CN108025435A (zh) * 2015-07-17 2018-05-11 艾沛克斯品牌公司 Vision system with automatic calibration
JP2017035754A (ja) * 2015-08-10 2017-02-16 ファナック株式会社 Robot system comprising a vision sensor and a plurality of robots

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018119450A1 (fr) * 2016-12-23 2018-06-28 Gecko Robotics, Inc. Inspection robot
EP3415285A2 (fr) * 2017-06-14 2018-12-19 The Boeing Company Stabilization of the tool-carrying end of an extended-reach arm of an automated apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
I. MAURTUA ET AL: "MAINBOT - Mobile Robots for Inspection and Maintenance in Extensive Industrial Plants", ENERGY PROCEDIA, vol. 49, 1 January 2014 (2014-01-01), NL, pages 1810 - 1819, XP055754509, ISSN: 1876-6102, DOI: 10.1016/j.egypro.2014.03.192 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023121528A1 (fr) * 2021-12-25 2023-06-29 Husqvarna Ab Improved navigation for a robotic work tool system
CN114603562A (zh) * 2022-04-19 2022-06-10 南方电网电力科技股份有限公司 Live-line lead-connection device and method for a distribution network
CN114603562B (zh) * 2022-04-19 2024-04-30 南方电网电力科技股份有限公司 Live-line lead-connection device and method for a distribution network

Also Published As

Publication number Publication date
GB202012375D0 (en) 2020-09-23
GB2589419A (en) 2021-06-02

Similar Documents

Publication Publication Date Title
WO2021028673A1 (fr) Fabric maintenance sensor system
GB2589418A (en) Fabric maintenance system and method of use
US10718119B2 (en) Automated drywall sanding system and method
US10449619B2 (en) System for processing a workpiece
US11579097B2 (en) Localization method and system for mobile remote inspection and/or manipulation tools in confined spaces
US10576627B1 (en) System and method for inspection and maintenance of hazardous spaces with track and roller
CA2554992A1 (fr) Preparation automatisee rentable et procede de revetement de grandes surfaces
US10864640B1 (en) Articulating arm programmable tank cleaning nozzle
US20210038045A1 (en) Exterior Wall Maintenance Apparatus
US11731281B2 (en) Automation in a robotic pipe coating system
EP2994248A1 (fr) Multifunction robot for maintenance in confined spaces of metal constructions
US20190134820A1 (en) Tank Cleaner
Paul et al. A robotic system for steel bridge maintenance: Field testing
Mateos et al. Automatic in-pipe robot centering from 3D to 2D controller simplification
Tunawattana et al. Design of an underwater positioning sensor for crawling ship hull maintenance robots
JP6735316B2 (ja) Surface treatment apparatus
US20240033920A1 (en) Apparatus and a Method for Removing Coatings by using Laser
Sehestedt et al. Prior-knowledge assisted fast 3d map building of structured environments for steel bridge maintenance
US11745309B1 (en) Remotely operated abrasive blasting apparatus, system, and method
Notheis et al. Towards an autonomous manipulator system for decontamination and release measurement
Santos et al. ROBBE–Robot-aided processing of assemblies during the dismantling of nuclear power plants
Kesler Assessment of a velocity-based robot motion planner for surface preparation with geometric uncertainty
CN117021132A (zh) Control system for an intelligent wall-climbing inspection and maintenance robot for water-cooled walls
CN117140543A (zh) Intelligent wall-climbing inspection and maintenance robot for water-cooled walls, and working method thereof
BR102021019970A2 (pt) Autonomous painting robot

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 20771340

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the EP bulletin as the address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 18.05.2022)

122 Ep: pct application non-entry in european phase

Ref document number: 20771340

Country of ref document: EP

Kind code of ref document: A1