GB2581013A - Improvements in and relating to control apparatus - Google Patents

Improvements in and relating to control apparatus

Info

Publication number
GB2581013A
GB2581013A
Authority
GB
United Kingdom
Prior art keywords
environment
task set
robotic
data
operator
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1917716.1A
Other versions
GB201917716D0 (en)
GB2581013B (en)
Inventor
Olner William
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cavendish Nuclear Ltd
Original Assignee
Cavendish Nuclear Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GBGB1819806.9A external-priority patent/GB201819806D0/en
Priority claimed from GBGB1819805.1A external-priority patent/GB201819805D0/en
Priority claimed from GBGB1910943.8A external-priority patent/GB201910943D0/en
Application filed by Cavendish Nuclear Ltd filed Critical Cavendish Nuclear Ltd
Publication of GB201917716D0 publication Critical patent/GB201917716D0/en
Publication of GB2581013A publication Critical patent/GB2581013A/en
Application granted granted Critical
Publication of GB2581013B publication Critical patent/GB2581013B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1671Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1689Teleoperation
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01TMEASUREMENT OF NUCLEAR OR X-RADIATION
    • G01T7/00Details of radiation-measuring instruments
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00Systems controlled by a computer
    • G05B15/02Systems controlled by a computer electric
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • GPHYSICS
    • G21NUCLEAR PHYSICS; NUCLEAR ENGINEERING
    • G21CNUCLEAR REACTORS
    • G21C17/00Monitoring; Testing; Maintaining
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1615Programme controls characterised by special kind of manipulator, e.g. planar, scara, gantry, cantilever, space, closed chain, passive/active joints and tendon driven manipulators
    • B25J9/1625Truss-manipulator for snake-like motion
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40131Virtual reality control, programming of manipulator
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02EREDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
    • Y02E30/00Energy generation of nuclear origin
    • Y02E30/30Nuclear fission reactors

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Engineering & Computer Science (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Plasma & Fusion (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Molecular Biology (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Manipulator (AREA)

Abstract

Apparatus for generating control signals for one or more robotic units 7, 7’ receives at least one data set 3, 3’, which may include spatial information and radiometric information regarding an operating environment, displays a representation of the data set(s) 3, 3’ to an operator, which it may do using virtual reality control software 9, receives a proposed task set from the operator, such as a desired cutting operation, evaluates the proposed task set against predetermined operating constraints to ensure it is permissible and, if so, provides the evaluated task set to be communicated to the robotic units. If the task set is not permissible, the software 9 seeks a revised set of operations from the operator. The use of VR enables improved positioning of the robotics within the environment and improved positioning for the operations, such as laser cutting, provided by the robotics at positions within the environment. This is useful in nuclear decommissioning operations and other situations where the operating environment is hazardous or hard to access.

Description

IMPROVEMENTS IN AND RELATING TO CONTROL APPARATUS
This invention concerns improvements in and relating to control apparatus, apparatus for generating controls for robotic units, apparatus for configuring such apparatus, computer implemented methods for controlling apparatus, computer implemented methods for generating control signals for robotic units, computer implemented methods for configuring such apparatus and computer programs comprising instructions for such purposes.
The present invention is particularly applicable where the operating environment is hazardous or otherwise hard to access and where a variety of different data types are important to informing on the actions to be taken in the environment by robotic units.
According to a first aspect of the invention there is provided control apparatus for one or more robotic units, the apparatus comprising: 1) optionally one or more databases for storing one or more data sets; 2) one or more first data processors adapted to: a. receive at least one data set from a database; b. display a representation of the data set to an operator; c. receive a proposed task set from the operator; d. evaluate one or more characteristics of the proposed task set against one or more test characteristics; e. provide an evaluated task set to a second processor; 3) a second processor adapted to: a. receive at least one evaluated task set; and b. communicate said evaluated task set to a selected interface via a telecommunications network; 4) optionally an interface for a robotic unit, the interface being adapted to: a. receive the evaluated task set via the telecommunications network; and b. provide operating instructions to the robotic unit according to the content of the evaluated task set; 5) optionally a robotic unit adapted to operate in the environment according to the operating instructions.
According to a second aspect of the invention there is provided control apparatus for one or more robotic units, the apparatus comprising: 1) one or more databases for storing one or more data sets; 2) one or more first data processors adapted to: a. receive at least one data set from a database; b. display a representation of the data set to an operator; c. receive a proposed task set from the operator; d. evaluate one or more characteristics of the proposed task set against one or more test characteristics; e. provide an evaluated task set to a second processor; 3) a second processor adapted to: a. receive at least one evaluated task set; and b. communicate said evaluated task set to a selected interface via a telecommunications network; 4) an interface for a robotic unit, the interface being adapted to: a. receive the evaluated task set via the telecommunications network; and b. provide operating instructions to the robotic unit according to the content of the evaluated task set; 5) a robotic unit adapted to operate in the environment according to the operating instructions.
According to a third aspect of the invention there is provided apparatus for generating control signals for one or more robotic units, the control signals being in the form of a task set, the apparatus comprising: 1) one or more first data processors adapted to: a. receive at least one data set from a database; b. display a representation of the data set to an operator; c. receive a proposed task set from the operator; d. evaluate one or more characteristics of the proposed task set against valid characteristics; e. provide an evaluated task set to a second processor, the second processor being adapted to communicate with the one or more robotic units. The apparatus of the third aspect of the invention may be a part of apparatus for generating and communicating a task set to one or more robotic units, the apparatus further comprising: 1) the second processor, the second processor being adapted to: a. receive at least one evaluated task set; and b. communicate said evaluated task set to a selected interface via a telecommunications network.
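The evaluate-then-forward flow of the first processor described above can be sketched in code. This is a minimal, hypothetical illustration only: the function names, the dictionary-based task representation, and the example bounds-style tests (a reach limit, a permitted tool list) are all assumptions, not details taken from the patent.

```python
# Hypothetical sketch: each characteristic of a proposed task set is checked
# against a test characteristic; only a fully permissible set is passed on
# to the second processor. All names and checks here are illustrative.

def evaluate_task_set(proposed_tasks, test_characteristics):
    """Return (is_valid, evaluated_tasks). The set is valid only if every
    checked characteristic passes its test predicate."""
    evaluated = []
    for task in proposed_tasks:
        for name, predicate in test_characteristics.items():
            if name in task and not predicate(task[name]):
                return False, []  # reject the whole set; operator must revise
        evaluated.append(task)
    return True, evaluated

# Example test characteristics: a reach limit and a permitted tool list.
tests = {
    "reach_m": lambda v: v <= 2.5,
    "tool": lambda v: v in {"laser_cutter", "gripper"},
}

ok, evaluated = evaluate_task_set(
    [{"tool": "laser_cutter", "reach_m": 1.8}], tests)
print(ok)  # True: this task set may be communicated onward
```

Rejecting the whole set, rather than individual tasks, mirrors the behaviour in the abstract where an impermissible task set sends the operator back to propose a revised set of operations.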
The apparatus of the third aspect of the invention may be part of apparatus for generating and communicating a task to one or more robotic units, the apparatus further comprising: 1) an interface for a robotic unit, the interface being adapted to: a. receive the evaluated task set via the telecommunications network; and b. provide operating instructions to the robotic unit according to the content of the evaluated task set.
The apparatus of the third aspect of the invention may be part of control apparatus for one or more robotic units, the apparatus further comprising: a robotic unit adapted to operate in the environment according to the operating instructions.
The apparatus of the third aspect of the invention may be part of control apparatus for one or more robotic units, the apparatus further comprising: one or more databases for storing one or more data sets.
According to a fourth aspect, the invention provides apparatus for configuring apparatus for generating a task set for one or more robotic units, the apparatus comprising: 1) a configuring processor adapted to: a. receive inputs from an operator and generate control signals for a robotic unit from amongst the one or more robotic units from those inputs; b. provide the control signals to a second processor, the second processor being adapted to communicate with the one or more robotic units; 2) a receiver for one or more observed data sets from the one or more robotic units provided with the control signals, the receiver being adapted to communicate a data set from an observed data set to one or more databases.
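The configuring loop of the fourth aspect can be sketched as follows. This is a hypothetical illustration under stated assumptions: operator inputs drive the robotic unit, and each observed data set returned by the unit is communicated to a database. The function names and the stub robot are inventions for the example only.

```python
# Hypothetical sketch of the fourth-aspect configuring loop: operator inputs
# become control signals for a robotic unit, and the observed data sets the
# unit returns are stored to a database. All names here are illustrative.

def configure(operator_inputs, drive_robot, database):
    """Drive the robot from each operator input; store each observed data set."""
    for command in operator_inputs:
        observed = drive_robot(command)  # e.g. move to a position, then scan
        database.append(observed)        # the receiver-to-database step
    return database

# Stub robot: each command yields a labelled observation.
db = configure(["move_to_A", "scan"], lambda cmd: {"cmd": cmd, "data": []}, [])
print(len(db))  # 2
```

In a real system `drive_robot` would be the interface to the unit over the telecommunications network; the list stands in for the one or more databases.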
The configuring processor may be a first processor as defined in the first and/or second and/or third aspect of the invention or may be a further processor.
The receiver may be a first processor as defined in the first and/or second and/or third aspect of the invention.
The apparatus for configuring apparatus for generating a task set for one or more robotic units may provide a determination of the spatial form of the environment. The determination of the spatial form of the environment may include a scan of the environment from one or more positions, preferably positions of the robotic unit, more particularly positions of the spatial scanner provided on the robotic unit.
The spatial form of the environment may include a first spatial scan of the environment, for instance from a first spatial scan position, preferably to give a first spatial scan data set.

The apparatus for configuring apparatus for generating a task set for one or more robotic units may provide a determination of the radiometric form of the environment. The determination of the radiometric form of the environment may include a scan of the environment from one or more positions, preferably positions of the robotic unit, more particularly positions of the radiometric scanner provided on the robotic unit.
The radiometric form of the environment may include a first radiometric scan of the environment, for instance from a first radiometric scan position, preferably to give a first radiometric scan data set.
The configuring processor may provide a user interface for an operator, preferably adapted to control the position of the robotic unit in the environment, such as during a determination of the spatial form of the environment and/or during a determination of the radiometric form of the environment. The configuring processor may control one or more characteristics of the robotic unit, for instance the position of a scanner or detector provided on the robotic unit, within the environment.
The robotic unit may be provided at a series of positions within the environment with one or more determinations of one or more types being made at one or more or all of those positions.
The spatial form of the environment may include n spatial scans of the environment, for instance from an nth spatial scan position, preferably to give n spatial scan data sets. The radiometric form of the environment may include r radiometric scans of the environment, for instance from an rth radiometric scan position, preferably to give r radiometric scan data sets. The positions used for one or more spatial scans may match those used for one or more radiometric scans. The number of spatial scans may be less than, equal to or greater than the number of radiometric scans.
Where two or more spatial scan data sets are generated or obtained for the same spatial scan position, then the two or more spatial scan data sets are preferably combined into one spatial scan data set.
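A minimal sketch of the combining step above: when two spatial scan data sets share a scan position, they can be merged into one by concatenating the point clouds and dropping duplicate points. The tuple-of-coordinates representation is an assumption for illustration; the patent does not prescribe one.

```python
# Illustrative only: combine two spatial scan data sets taken from the same
# scan position into a single deduplicated point cloud.

def combine_scans(scan_a, scan_b):
    """Merge two point clouds from the same position, preserving order."""
    seen = set()
    combined = []
    for point in list(scan_a) + list(scan_b):
        if point not in seen:
            seen.add(point)
            combined.append(point)
    return combined

merged = combine_scans([(0.0, 0.0, 1.0), (1.0, 0.0, 1.0)],
                       [(1.0, 0.0, 1.0), (2.0, 0.5, 1.0)])
print(len(merged))  # 3: the shared point appears once
```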
The spatial form of the environment may be determined using a spatial scanner, for instance a LIDAR type scanner. The spatial form of the environment may be represented by a point cloud type set of data points, preferably for a position of the scanner. The point clouds for different positions of the scanner may be provided with point set registration to provide positional alignment between them. Preferably the point clouds are utilised without surface model creation or conversion to 3D surfaces.
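The point set registration mentioned above can be illustrated in its simplest closed form. The sketch below assumes 2D points with known correspondences, where the rigid alignment (rotation plus translation) has an exact solution via centroids; a real LIDAR pipeline would use a full 3D method such as ICP on unpaired clouds. All names are assumptions for the example.

```python
# Illustrative 2D rigid point set registration with known correspondences:
# the rotation comes from summed cross/dot products of centred point pairs,
# and the translation maps the rotated source centroid onto the target one.
import math

def register_2d(source, target):
    """Return (theta, tx, ty) aligning `source` onto `target` (paired points)."""
    n = len(source)
    scx = sum(p[0] for p in source) / n
    scy = sum(p[1] for p in source) / n
    tcx = sum(p[0] for p in target) / n
    tcy = sum(p[1] for p in target) / n
    s_cross = s_dot = 0.0
    for (sx, sy), (px, py) in zip(source, target):
        ax, ay = sx - scx, sy - scy
        bx, by = px - tcx, py - tcy
        s_cross += ax * by - ay * bx
        s_dot += ax * bx + ay * by
    theta = math.atan2(s_cross, s_dot)  # optimal rotation angle
    tx = tcx - (scx * math.cos(theta) - scy * math.sin(theta))
    ty = tcy - (scx * math.sin(theta) + scy * math.cos(theta))
    return theta, tx, ty

# A scan rotated 90 degrees and shifted by (2, 3) is recovered exactly.
theta, tx, ty = register_2d([(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)],
                            [(2.0, 3.0), (2.0, 4.0), (1.0, 3.0)])
```

Operating directly on registered point clouds, without building surfaces, matches the preference stated above for avoiding surface model creation.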
Prior to one or more radiometric scans, the radiometric scanner may be calibrated for the environment. The calibration may be provided in the environment and/or by a simulation of the environment.
The radiometric investigation of the environment and/or radiometric instrument/scanner may be provided according to the contents of EP2691791 and/or GB2502501. In particular, the investigation may provide a measurement of one or more emitted characteristics of the activity source(s) from the one or more detected characteristics of the activity source(s) by applying a factor accounting for the detector device efficiency, for instance the intrinsic detector device efficiency, and/or a calibration factor, for instance a calibration efficiency, such as a calibration efficiency accounting for the location. The investigation may provide a measurement of the total activity for one or more or all of the activity sources in the location, potentially with an uncertainty range therefor and/or an upper and/or lower uncertainty value. A single detector device may be provided or used.
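The efficiency-correction step just described amounts to dividing the detected count rate by the product of the efficiency factors. A minimal sketch, with illustrative numbers and names that are assumptions rather than values from the cited patents:

```python
# Illustrative only: a detected count rate is converted to an emitted-activity
# estimate by dividing out the intrinsic detector efficiency and a
# location-dependent calibration efficiency.

def estimated_activity(count_rate_cps, intrinsic_eff, calibration_eff):
    """Emitted activity (events/s) implied by a detected count rate."""
    return count_rate_cps / (intrinsic_eff * calibration_eff)

# e.g. 50 counts/s seen by a detector that registers 25% of incident events,
# with a geometry/location calibration efficiency of 50%:
activity = estimated_activity(50.0, 0.25, 0.5)
print(activity)  # 400.0 events per second
```

An uncertainty range, as mentioned above, could be produced by evaluating the same expression at the upper and lower bounds of the efficiency factors.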
The single detector device may be provided at one or more, and preferably at a plurality, of measurement positions. Preferably the single detector device is moved between measurement positions during the method's performance and/or during repeats thereof. The movement may be provided by moving the detector device and/or by moving the location, for instance by rotation. A plurality of detector devices may be provided or used.
Preferably each detector device is provided at a different measurement position. A plurality of measurement positions may be provided. Preferably each detector device remains at the measurement position during the method's performance and/or during repeats thereof. The detector device may be sensitive to and/or detect one or more forms of emission from the activity source, for instance neutrons, alpha particles, beta particles or gamma rays. The detector device may be sensitive to and/or detect one or more different energies or ranges of energies. The detector device may be sensitive to and/or detect uranium and/or plutonium and/or one or more isotopes thereof. The measurement data set may include one or more of: count rate at an energy; count rate at a range of energies; total count rate, for instance at all energies the detector device is sensitive to. The measurement data set may include for a plurality, and preferably for all, measurement positions, separate values for one or more of: count rate at an energy for a measurement position; count rate at a range of energies for a measurement position; total count rate for a measurement position, for instance at all energies the detector device is sensitive to. The measurement data set may include a total count rate for all measurement positions combined.
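The measurement data set described above (per-position count rates at individual energies, per-position totals, and a combined total across all positions) can be sketched as a small data structure. The class and field names are assumptions for illustration.

```python
# Illustrative structure for a measurement data set: count rates recorded per
# measurement position and energy, with per-position and overall totals.
from collections import defaultdict

class MeasurementDataSet:
    def __init__(self):
        # position -> {energy_keV: count_rate_cps}
        self._rates = defaultdict(dict)

    def record(self, position, energy_kev, count_rate_cps):
        self._rates[position][energy_kev] = count_rate_cps

    def total_for_position(self, position):
        """Total count rate at one measurement position (all energies)."""
        return sum(self._rates[position].values())

    def total_all_positions(self):
        """Combined total count rate across every measurement position."""
        return sum(self.total_for_position(p) for p in self._rates)

ds = MeasurementDataSet()
ds.record("P1", 662, 120.0)   # e.g. a Cs-137 line seen at position P1
ds.record("P1", 1332, 40.0)
ds.record("P2", 662, 75.0)
print(ds.total_for_position("P1"))  # 160.0
print(ds.total_all_positions())     # 235.0
```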
The configuring of the apparatus may include testing and/or configuration of the use of one or more operations within the environment. For instance, the configuring of the apparatus may include calibration of a laser cutter.
The apparatus of the first aspect and/or second and/or third and/or fourth aspect of the invention may be part of apparatus for conducting a spatial survey of the environment and/or for detection of radiation emissions in the environment and/or for conducting a radiometric survey of the environment.
The apparatus of the first aspect and/or second and/or third and/or fourth aspect of the invention may be part of apparatus for altering the environment. The apparatus for altering the environment may include one or more of the following: a manipulation device for moving one or more items in the environment and/or parts of the environment; a lifting device for lifting one or more items in the environment and/or parts of the environment; a cutting device, for instance a laser cutter, for cutting one or more items in the environment and/or parts of the environment.
The apparatus of the first aspect of the invention and/or the second aspect and/or third and/or fourth aspect of the invention may be apparatus for an evaluated task set which is conducted in a hazardous environment, such as a radioactive environment and/or an environment containing one or more sources of alpha and/or beta and/or gamma and/or neutron emissions.
The apparatus of the first aspect and/or second aspect and/or third and/or fourth aspect of the invention may be apparatus for an evaluated task set which is to be conducted in an environment that an operator cannot physically enter and/or cannot safely enter.
According to a fifth aspect of the invention there is provided a computer implemented method of controlling one or more robotic units, the method comprising: 1) optionally obtaining one or more data sets from one or more databases storing the one or more data sets; 2) by one or more first data processors: a. receiving at least one data set from a database; b. displaying a representation of the data set to an operator; c. receiving a proposed task set from the operator; d. evaluating one or more characteristics of the proposed task set against one or more test characteristics; e. providing an evaluated task set to a second processor; 3) by a second processor: a. receiving at least one evaluated task set; and b. communicating said evaluated task set to a selected interface via a telecommunications network; 4) optionally by an interface for a robotic unit, the interface: a. receiving the evaluated task set via the telecommunications network; and b. providing operating instructions to the robotic unit according to the content of the evaluated task set; 5) optionally operating a robotic unit in the environment according to the operating instructions.
According to a sixth aspect of the invention there is provided a computer implemented method of controlling one or more robotic units, the method comprising: 1) obtaining one or more data sets from one or more databases storing the one or more data sets; 2) by one or more first data processors: a. receiving at least one data set from a database; b. displaying a representation of the data set to an operator; c. receiving a proposed task set from the operator; d. evaluating one or more characteristics of the proposed task set against one or more test characteristics; e. providing an evaluated task set to a second processor; 3) by a second processor: a. receiving at least one evaluated task set; and b. communicating said evaluated task set to a selected interface via a telecommunications network; 4) by an interface for a robotic unit, the interface: a. receiving the evaluated task set via the telecommunications network; and b. providing operating instructions to the robotic unit according to the content of the evaluated task set; 5) operating a robotic unit in the environment according to the operating instructions.
According to a seventh aspect of the invention there is provided a computer implemented method for generating control signals for one or more robotic units, the control signals being in the form of a task set, the method comprising: 1) by one or more first data processors: a. receiving at least one data set from a database; b. displaying a representation of the data set to an operator; c. receiving a proposed task set from the operator; d. evaluating one or more characteristics of the proposed task set against valid characteristics; e. providing an evaluated task set to a second processor, the second processor being adapted to communicate with the one or more robotic units.
The method of the seventh aspect of the invention may be a part of a method for generating and communicating a task set to one or more robotic units, the method further comprising: 1) by the second processor, the second processor: a. receiving at least one evaluated task set; and b. communicating said evaluated task set to a selected interface via a telecommunications network.

The method of the seventh aspect of the invention may be part of a method for generating and communicating a task to one or more robotic units, the method further comprising: 1) by an interface for a robotic unit, the interface: a. receiving the evaluated task set via the telecommunications network; and b. providing operating instructions to the robotic unit according to the content of the evaluated task set.
The method of the seventh aspect of the invention may be part of a control method for one or more robotic units, the method further comprising: operating a robotic unit in the environment according to the operating instructions.
The method of the seventh aspect of the invention may be part of a control method for one or more robotic units, the method further comprising: using one or more databases for storing one or more data sets.
According to an eighth aspect, the invention provides a method for configuring apparatus for generating a task set for one or more robotic units, the method comprising: 1) by a configuring processor: a. receiving inputs from an operator and generating control signals for a robotic unit from amongst the one or more robotic units from those inputs; b. providing the control signals to a second processor, the second processor communicating with the one or more robotic units; 2) by a receiver, receiving one or more observed data sets from the one or more robotic units provided with the control signals, the receiver communicating a data set from an observed data set to one or more databases.
The method of the fifth and/or sixth and/or seventh and/or eighth aspect of the invention may be part of a method for conducting a spatial survey of the environment and/or for detection of radiation emissions in the environment and/or for conducting a radiometric survey of the environment.
The method of the fifth and/or sixth and/or seventh and/or eighth aspect of the invention may be part of a method for altering the environment. The method for altering the environment may include one or more of the following: using a manipulation device for moving one or more items in the environment and/or parts of the environment; using a lifting device for lifting one or more items in the environment and/or parts of the environment; using a cutting device, for instance a laser cutter, for cutting one or more items in the environment and/or parts of the environment.
The method of the fifth and/or sixth and/or seventh and/or eighth aspect of the invention may be part of a method for an evaluated task set which is conducted in a hazardous environment, such as a radioactive environment and/or an environment containing one or more sources of alpha and/or beta and/or gamma and/or neutron emissions.
The method of the fifth and/or sixth and/or seventh and/or eighth aspect of the invention may be part of a method for an evaluated task set which is to be conducted in an environment that an operator cannot physically enter and/or cannot safely enter.
According to a ninth aspect, the invention provides a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of a method of controlling one or more robotic units, the method comprising: 1) optionally obtaining one or more data sets from one or more databases storing the one or more data sets; 2) by one or more first data processors: a. receiving at least one data set from a database; b. displaying a representation of the data set to an operator; c. receiving a proposed task set from the operator; d. evaluating one or more characteristics of the proposed task set against one or more test characteristics; e. providing an evaluated task set to a second processor; 3) by a second processor: a. receiving at least one evaluated task set; and b. communicating said evaluated task set to a selected interface via a telecommunications network; 4) optionally by an interface for a robotic unit, the interface: a. receiving the evaluated task set via the telecommunications network; and b. providing operating instructions to the robotic unit according to the content of the evaluated task set; 5) optionally operating a robotic unit in the environment according to the operating instructions.
According to a tenth aspect, the invention provides a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of a method of controlling one or more robotic units, the method comprising: 1) obtaining one or more data sets from one or more databases storing the one or more data sets; 2) by one or more first data processors: a. receiving at least one data set from a database; b. displaying a representation of the data set to an operator; c. receiving a proposed task set from the operator; d. evaluating one or more characteristics of the proposed task set against one or more test characteristics; e. providing an evaluated task set to a second processor; 3) by a second processor: a. receiving at least one evaluated task set; and b. communicating said evaluated task set to a selected interface via a telecommunications network; 4) by an interface for a robotic unit, the interface: a. receiving the evaluated task set via the telecommunications network; and b. providing operating instructions to the robotic unit according to the content of the evaluated task set; 5) operating a robotic unit in the environment according to the operating instructions.
According to an eleventh aspect, the invention provides a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of a method for generating control signals for one or more robotic units, the control signals being in the form of a task set, the method comprising: 1) by one or more first data processors: a. receiving at least one data set from a database; b. displaying a representation of the data set to an operator; c. receiving a proposed task set from the operator; d. evaluating one or more characteristics of the proposed task set against valid characteristics; e. providing an evaluated task set to a second processor, the second processor being adapted to communicate with the one or more robotic units.
The computer program of the eleventh aspect of the invention may carry out the steps of a part of a method for generating and communicating a task set to one or more robotic units, the method further comprising: 1) by the second processor, the second processor: a. receiving at least one evaluated task set; and b. communicating said evaluated task set to a selected interface via a telecommunications network.
The computer program of the eleventh aspect of the invention may carry out the steps of a part of a method for generating and communicating a task to one or more robotic units, the method further comprising: 1) by an interface for a robotic unit, the interface: a. receiving the evaluated task set via the telecommunications network; and b. providing operating instructions to the robotic unit according to the content of the evaluated task set.
The computer program of the eleventh aspect of the invention may carry out the steps of a part of a control method for one or more robotic units, the method further comprising: operating a robotic unit in the environment according to the operating instructions.
The computer program of the eleventh aspect of the invention may carry out the steps of a part of a control method for one or more robotic units, the method further comprising: using one or more databases for storing one or more data sets.
According to a twelfth aspect, the invention provides a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of a method for configuring apparatus for generating a task set for one or more robotic units, the method comprising: 1) by a configuring processor: a. receiving inputs from an operator and generating control signals for a robotic unit from amongst the one or more robotic units from those inputs; b. providing the control signals to a second processor, the second processor communicating with the one or more robotic units; 2) by a receiver, receiving one or more observed data sets from the one or more robotic units provided with the control signals, the receiver communicating a data set from an observed data set to one or more databases.
The computer program of the ninth and/or tenth and/or eleventh and/or twelfth aspect of the invention may carry out part of a method for conducting a spatial survey of the environment and/or for detection of radiation emissions in the environment and/or for conducting a radiometric survey of the environment.
The computer program of the ninth and/or tenth and/or eleventh and/or twelfth aspect of the invention may carry out part of a method for altering the environment. The method for altering the environment may include one or more of the following: using a manipulation device for moving one or more items in the environment and/or parts of the environment; using a lifting device for lifting one or more items in the environment and/or parts of the environment; using a cutting device, for instance a laser cutter, for cutting one or more items in the environment and/or parts of the environment.
The computer program of the ninth and/or tenth and/or eleventh and/or twelfth aspect of the invention may carry out part of a method for an evaluated task set which is conducted in a hazardous environment, such as a radioactive environment and/or an environment containing one or more sources of alpha and/or beta and/or gamma and/or neutron emissions.
The computer program of the ninth and/or tenth and/or eleventh and/or twelfth aspect of the invention may carry out part of a method for an evaluated task set which is to be conducted in an environment that an operator cannot physically enter and/or cannot safely enter.
Any of the aspects of the invention may include any one or more of the possibilities, options and features of the other aspects of the invention and/or any one or more of the following possibilities, options and features.
One or more or all of the robotic units may be provided with a camera. The camera may be a still image and/or video capable camera. One or more or all of the robotic units may be provided with a light. One or more or all of the robotic units may be provided with a direction indicator, for instance a laser pointer, potentially to show the centre of the field of view of the camera and/or the radiometric scanner.
One or more or all of the robotic units may be provided with a spatial scanner, for instance a LIDAR type scanner.
One or more or all of the robotic units may be provided with a radiometric scanner, for instance the radiometric investigation of the environment and/or radiometric instrument/scanner may be provided according to the contents of EP2691791 and/or GB2502501. In particular, the investigation may provide a measurement of one or more emitted characteristics of the activity source(s) from the one or more detected characteristics of the activity source(s) by applying a factor accounting for the detector device efficiency, for instance the intrinsic detector device efficiency, and/or a calibration factor, for instance a calibration efficiency, such as a calibration efficiency accounting for the location. The investigation may provide a measurement of the total activity for one or more or all of the activity sources in the location, potentially with an uncertainty range therefor and/or an upper and/or lower uncertainty value. A single detector device may be provided or used. The single detector device may be provided at one or more, and preferably at a plurality, of measurement positions. Preferably the single detector device is moved between measurement positions during the method's performance and/or during repeats thereof. The movement may be provided by moving the detector device and/or by moving the location, for instance by rotation. A plurality of detector devices may be provided or used. Preferably each detector device is provided at a different measurement position. A plurality of measurement positions may be provided. Preferably each detector device remains at the measurement position during the method's performance and/or during repeats thereof. The detector device may be sensitive to and/or detect one or more forms of emission from the activity source, for instance neutrons, alpha particles, beta particles or gamma rays. The detector device may be sensitive to and/or detect one or more different energies or ranges of energies.
The detector device may be sensitive to and/or detect uranium and/or plutonium and/or one or more isotopes thereof. The measurement data set may include one or more of: count rate at an energy; count rate at a range of energies; total count rate, for instance at all energies the detector device is sensitive to. The measurement data set may include for a plurality, and preferably for all, measurement positions, separate values for one or more of: count rate at an energy for a measurement position; count rate at a range of energies for a measurement position; total count rate for a measurement position, for instance at all energies the detector device is sensitive to. The measurement data set may include a total count rate for all measurement positions combined.
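The efficiency and calibration corrections described above can be illustrated with a short sketch. The function names, the example factor values and the ±10% uncertainty band are illustrative assumptions only, not values taken from EP2691791 or GB2502501:

```python
def emitted_activity(count_rate, intrinsic_efficiency, calibration_efficiency):
    """Estimate source activity (events/s) from a detected count rate (counts/s)
    by dividing out the intrinsic detector device efficiency and a
    location-dependent calibration efficiency."""
    return count_rate / (intrinsic_efficiency * calibration_efficiency)

def total_activity(measurements):
    """Sum per-position activity estimates into a total for the location,
    returning (total, lower, upper) with an illustrative +/-10% band."""
    total = sum(
        emitted_activity(m["count_rate"], m["intrinsic_eff"], m["calibration_eff"])
        for m in measurements
    )
    return total, 0.9 * total, 1.1 * total

# Example: two measurement positions with identical (assumed) efficiencies.
measurements = [
    {"count_rate": 120.0, "intrinsic_eff": 0.25, "calibration_eff": 0.8},
    {"count_rate": 80.0, "intrinsic_eff": 0.25, "calibration_eff": 0.8},
]
total, lower, upper = total_activity(measurements)
```

Each position contributes its own corrected estimate, so a detector device moved between positions (or a plurality of fixed devices) feeds the same combination step.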
One or more or all of the robotic units may be provided with apparatus for altering the environment or a part thereof, for instance a laser cutter.
Two or more different types of robotics unit may be the subject of the aspects of the invention. Robotics units may differ in type due to the capabilities they provide, for instance due to the tools they are provided with, for instance a spatial scanner compared with a radiometric scanner. Robotics units may differ in type due to having different manufacturers. Robotics units may differ in type due to different operating mechanisms, for instance snake arm robotic units, compared with jointed arm robotic units, compared with crane robotic units etc. The robotic units preferably use Cartesian type definitions for their characteristics, such as position. The tip of the robotic unit and/or the position of the scanner may be so defined. The definition of characteristics is preferably stated relative to five degrees of freedom, such as position in 3D [for instance x, y, z] and orientation of the robotic unit in 2D.
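The five degree of freedom Cartesian definition may be represented in software along the following lines; the class and field names are assumptions for illustration, with the 2D orientation taken as pitch and heading (roll not being a free degree of freedom):

```python
from dataclasses import dataclass

@dataclass
class TipPose:
    """Cartesian definition of a robotic unit tip: 3D position [x, y, z]
    plus 2D orientation (pitch and heading; roll is not free)."""
    x: float
    y: float
    z: float
    pitch: float    # orientation about the horizontal axis, degrees
    heading: float  # orientation about the vertical axis, degrees

# Example pose for a scanner-carrying tip.
pose = TipPose(x=1.2, y=0.4, z=2.5, pitch=-10.0, heading=35.0)
```

A common definition of this kind allows robotics units of different types and manufacturers to be driven through one interface.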
The one or more data sets may include data sets of more than one type. The one or more data sets may include one or more spatial data sets for the environment, for instance as a first type. The one or more data sets may include radiometric data sets for the environment, for instance as a second type. The one or more data sets may include one or more combined data sets, for instance as a third type. One or more combined data sets may be provided in which spatial data and radiometric data are combined in the data set, for instance with the radiometric data overlain on spatial data.
The one or more data sets may be generated by previous use of the apparatus. The one or more data sets may be generated by previous use of other apparatus, for instance spatial scanning and/or radiometric scanning apparatus.
One or more or all of the data sets may be stored as PLY format files.
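A minimal sketch of the ASCII variant of the PLY format, as might be used to store a point cloud data set; real data sets would contain many more vertices and may carry additional per-vertex properties (for instance radiometric values):

```python
def write_ply(points):
    """Serialise (x, y, z) points as a minimal ASCII PLY point cloud."""
    header = [
        "ply",
        "format ascii 1.0",
        f"element vertex {len(points)}",
        "property float x",
        "property float y",
        "property float z",
        "end_header",
    ]
    body = [f"{x} {y} {z}" for x, y, z in points]
    return "\n".join(header + body) + "\n"

ply_text = write_ply([(0.0, 0.0, 0.0), (1.0, 2.0, 3.0)])
```

The self-describing header is what lets a loader discover the element count and properties before reading the vertex data.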
The one or more databases may be provided by the same device. Preferably the one or more databases containing data sets of the same type are provided by the same device. The one or more databases may be provided by different devices, with data sets of the different types being separated from one another on different devices.
A database for spatial data sets may be provided. A database for radiometric data sets may be provided. A database for combined data sets may be provided.
The one or more or all of the databases may be directly connected to a first processor or may be indirectly connected via a further unit.
The one or more first data processors may be computer programs, devices or a combination of computer programs and devices.
The functions performed by the one or more first data processors may be provided by a single first data processor.
The one or more first data processors adapted to receive at least one data set from a database may receive one type of data set, for instance a combined data set, for instance a combined data set in which spatial data and radiometric data are combined in the data set, for instance with the radiometric data overlain on spatial data.
The one or more first data processors adapted to receive at least one data set from a database may receive two types of data set, for instance a spatial data set and a radiometric data set. The first processor may generate a combined data set in which spatial data and radiometric data are combined in the data set, for instance with the radiometric data overlain on spatial data. Preferably spatial data sets and radiometric data sets are combined when they were obtained at the same position within the environment. Preferably a position definition, such as a source identifier, is associated with each data set to allow a determination of whether data sets can be combined or not.
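The position-matched combination of spatial and radiometric data sets might be sketched as follows; the dictionary keys and the rule that only identically positioned data sets are combined are assumptions drawn from the description above:

```python
def combine_data_sets(spatial_sets, radiometric_sets):
    """Overlay radiometric data on spatial data for data sets that share
    the same position identifier; unmatched sets are left uncombined."""
    by_position = {s["position_id"]: s for s in spatial_sets}
    combined = []
    for r in radiometric_sets:
        s = by_position.get(r["position_id"])
        if s is not None:
            combined.append({
                "position_id": r["position_id"],
                "spatial": s["points"],
                "radiometric": r["readings"],
            })
    return combined

combined = combine_data_sets(
    [{"position_id": "P1", "points": [(0.0, 0.0, 0.0)]}],
    [{"position_id": "P1", "readings": [4.2]},
     {"position_id": "P2", "readings": [1.1]}],  # no matching spatial set
)
```

Only the "P1" pair combines here, mirroring the preference that data sets are merged only when obtained at the same position within the environment.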
The one or more first data processors adapted to receive at least one data set from a database may process the one or more data sets received, for instance to apply a conversion to the data sets. The conversion may enable the converted data set(s) to be used in the display of the representation of the data set to the operator provided by one of the one or more first processors.
The display of the representation of the data set to the operator may be provided by computer software, for instance VR control software, preferably operating on one of the first data processors.
The VR control software is preferably adapted to display a VR representation of the environment to the operator and/or to allow the operator to move within the VR representation of the environment and/or to define the position of elements, such as robotic units, within the VR representation of the environment and/or to provide the operator with radiometric information on parts of the environment within the VR representation of the environment and/or to define positions for tasks, such as moving and cutting, within the VR representation of the environment.
The apparatus may further include a VR headset for the operator.
The apparatus may be provided with a graphical user interface.
The apparatus may further include one or more user input devices. The user input devices are preferably adapted to allow the operator to build a proposed task set.
The proposed task set may be built up taking into account spatial data shared with the operator and/or radiometric data shared with the operator and/or the combination of those and/or information on the materials within the environment and/or forming parts of the environment.
The proposed task set may be built of one or more movements of the robotic unit, for instance one or more extensions and/or one or more retractions and/or one or more rotations and/or one or more twists applied to one or more of the parts of the robotic unit. The proposed task set may be built of one or more operations by the robotic unit within the environment, for instance one or more spatial scans and/or one or more radiometric scans and/or one or more cutting operations and/or one or more movements applied to a part of the environment.
The proposed task set may be partially built by the operator, for instance an end position at which an action is to be taken, and/or may be partially built by the computer software, for instance to transition from a position to the end position sought by the operator.
The one or more first data processors may evaluate one or more characteristics of the proposed task set against one or more test characteristics by obtaining a set of test characteristics, for instance from a database.
One or more test characteristics may be provided which are permissible characteristics.
One or more test characteristics may be provided which are impermissible characteristics. One or more test characteristics may be defined for multiple environments, for instance permissions and/or limitations on the rate of movement of a robotic unit, for instance permissions and/or limitations on the laser power useable. One or more test characteristics may be defined for a robotic unit type, for instance permissible and/or impermissible movements or combinations of movement. One or more test characteristics may be defined for a specific environment, for instance impermissible positions for a part of the robotic unit, for instance due to that position colliding with a part of the environment. One or more test characteristics may be amended during the conduct of a task set, for instance to reflect the changes on what is permissible and/or impermissible due to the changes in the environment by the conduct of the task set so far.
The operator may make a visual assessment within the VR representation of the environment of whether a proposed task set should be an evaluated task set, for instance by considering the position of the robotic unit relative to the environment in the VR representation of the environment.
A proposed task set which includes one or more characteristics which do not pass the evaluation may be declined as a task set for use and/or may not be classified as an evaluated task set. A proposed task set which includes one or more impermissible characteristics may be declined as a task set for use and/or may not be classified as an evaluated task set.
A proposed task set which includes only characteristics which pass the evaluation may be accepted as a task set for use and/or may be classified as an evaluated task set. A proposed task set which includes only permissible characteristics may be accepted as a task set for use and/or may be classified as an evaluated task set.
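The accept/reject behaviour described above can be sketched as a simple evaluator; the particular test characteristics used here (a maximum rate of movement and a set of impermissible positions) are illustrative examples only:

```python
def evaluate_task_set(proposed, max_speed, forbidden_positions):
    """Classify a proposed task set: it is accepted as an evaluated task
    set only if every step passes every test characteristic."""
    for step in proposed:
        if step["speed"] > max_speed:              # impermissible rate of movement
            return False
        if step["target"] in forbidden_positions:  # would collide with the environment
            return False
    return True

# All steps permissible: accepted as an evaluated task set.
ok = evaluate_task_set(
    [{"speed": 0.1, "target": (1, 0, 0)}, {"speed": 0.2, "target": (1, 1, 0)}],
    max_speed=0.5,
    forbidden_positions={(2, 2, 2)},
)

# One impermissible characteristic: the whole proposed task set is declined.
bad = evaluate_task_set(
    [{"speed": 0.9, "target": (1, 0, 0)}],
    max_speed=0.5,
    forbidden_positions=set(),
)
```

A single failing characteristic is enough to decline the whole proposed task set, which is the "only characteristics which pass" rule stated above.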
One or more evaluated task sets may be stored for later use. They may be stored in the first data processor or may be stored in the second processor. An operator may generate one or more evaluated task sets the time taken for completion of which is at least twice the time taken to establish the evaluated task sets, more preferably at least three times that time and more preferably at least five times that time.
The evaluated task set may be provided in a queue relative to one or more other evaluated task sets for the same and/or for different robotic units.
The operator may provide a series of evaluated task sets for sequential use in an environment.
The evaluated task set may be provided to an output interface, for instance an application programming interface or API. The output interface may convert the evaluated task set into operating instructions compatible with the interface for the robotic unit and/or for the robotic unit to which the evaluated task set is directed.
The second processor may be a computer program and/or a device. The second processor may be a server. The second processor may be a communications server. The second processor may have one or more first data processors as clients, for instance greater than three clients. The second processor may have one or more interfaces for robotic units as clients, for instance greater than 10 clients.
The second processor may be a request-response server. A request may be for the communication of an evaluated task to an interface for a robotic unit. A response may be a confirmation that the communication of the evaluated task to the interface for the robotic unit has been made. A request may be for the communication of a robotic data set from the robotic unit to a first processor. A response may be that the communication of the robotic data set to the first processor has been made.
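The request-response behaviour of the second processor might be sketched, in-process and much simplified, as follows; a real broker would communicate over a telecommunications network (for instance using a messaging library such as ZeroMQ), and the class and method names are assumptions:

```python
class BrokerServer:
    """Toy request-response broker: queues evaluated task sets per
    interface and returns a confirmation for each communication."""

    def __init__(self):
        self.queues = {}  # interface name -> list of evaluated task sets

    def request_send(self, interface, evaluated_task_set):
        """Request: communicate an evaluated task set to a named interface.
        Response: confirmation that the communication has been made."""
        self.queues.setdefault(interface, []).append(evaluated_task_set)
        return {"status": "communicated", "interface": interface}

    def next_task_set(self, interface):
        """Deliver the oldest queued evaluated task set for an interface."""
        queue = self.queues.get(interface, [])
        return queue.pop(0) if queue else None

broker = BrokerServer()
response = broker.request_send("Sirius", ["extend", "scan"])
task_set = broker.next_task_set("Sirius")
```

Queuing per interface is what allows task sets to be executed in real time or held for later execution, and lets several first data processors feed many robotic unit interfaces through one broker.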
The second processor may be informed by the first processor to communicate with a particular interface and/or for the interface to communicate with a particular robotic unit for a given evaluated task set and/or robotic data set.
The interface for a robotic unit may convert the evaluated task set received into operating instructions for the robotic unit.
The apparatus may further provide for the interface for a robotic unit to apply a correction to the operating instructions arising from the evaluated task set. The correction may be for the robotic unit and/or for the environment the robotic unit is operative in.
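One possible form for the correction applied by the interface is a fixed positional offset derived from calibration; the additive form of the correction, and the function name, are assumptions for illustration:

```python
def apply_correction(instructions, offset):
    """Apply a calibration offset (dx, dy, dz) to each positional
    operating instruction before it reaches the robotic unit."""
    dx, dy, dz = offset
    return [(x + dx, y + dy, z + dz) for x, y, z in instructions]

# Example: correct a single target position for a per-unit calibration offset.
corrected = apply_correction([(1.0, 0.0, 2.0)], offset=(0.05, -0.02, 0.0))
```

Applying the correction inside the interface keeps it transparent to the rest of the system: the evaluated task set itself is expressed in uncorrected coordinates.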
The apparatus may further provide for the robotic unit to provide positional information to the interface. The apparatus may further provide for the interface to provide the positional information to the second processor and potentially hence to a first processor. The positional information may include positional information on one or more or all parts of the robotic unit, preferably to fully define the positions it occupies within the environment.
The positional information may enable a first processor to provide visualisation of the position of the robotic unit in the environment to an operator.
Various embodiments of the invention will now be described, by way of example only, and with reference to the accompanying drawings in which:
Figure 1 is a schematic illustration of a robotics operating system, according to an embodiment of the invention, showing the interaction of data and components within the system;
Figure 2a details the components for a first robot arm useful for the invention, particularly for data acquisition stage(s);
Figure 2b details the components for a second robot arm useful for the invention, particularly for operation stage(s);
Figure 3 is a schematic illustration of the layered architecture used in an embodiment of the invention for the VR interface;
Figure 4 is a schematic illustration of the VR hardware interface architecture;
Figure 5 is a schematic illustration of the steps involved in the spatial scanning of the environment;
Figure 6 is a schematic illustration of the steps involved in the radiometric data collection for the environment;
Figure 7 is a schematic illustration of the steps involved in the overlaying of spatial scan and radiometric data collection results;
Figure 8 is a schematic illustration of the steps involved in verifying the spatial scan operations planned;
Figure 9 is a schematic illustration of the steps involved in verifying the radiometric data collection operations planned;
Figure 10 is a schematic illustration of the steps involved in verifying the cutting operations planned;
Figures 11, 12, 13 and 14 show experimental test results from an embodiment according to the invention;
Figure 15 is an alternative schematic illustration of a robotics operating system, according to a further embodiment of the invention, showing the interaction of data and components within the system; and
Figures 16 and 17 show further experimental test results from an embodiment according to the invention.
Background to the Invention
Various hazardous environments exist which limit or preclude a person visually inspecting a location directly at which work needs to be conducted. The hazard might be due to radiation, chemicals, bio-hazards or the like. The restriction may simply be one of difficulty of access due to physical constraints. In either event, the lack of direct visual inspection makes the conduct of any work at the location harder to plan, verify and implement. The work might be cleaning, characterisation, removal, cutting or the like, for instance in a decommissioning operation. Cells and other chambers involved in the storage and/or processing of nuclear material are particularly problematic when decommissioning is needed.
Key Improvements

Previous visualisation assistance is limited in the extent of the information provided and in terms of the ease of use of that information. The present invention seeks to provide a greater extent of information and/or to provide for easier use of that information, for instance through improved visualisation of the situation to the operator.
Previous approaches also tended to focus on visualisation in isolation from other aspects of the conduct of the work needed. Thus information on what needed to be treated/handled would be provided but little or no guidance on how that should be done. For instance, a location might be shown to have radiometric contamination of this type and level and to be positioned at a location. However, the ability to access that location and conduct the work there using robotics was not evaluated. The present invention seeks to integrate analysis with viability of doing the work for robotic type systems.
In general, site specific approaches are also taken. The scanners, analysis software, operators and experts are all gathered at or very near the location to be worked upon.
Whilst there they decide what needs to be done, before returning to their normal work location whilst the long tasks are actually performed. It is not commercially viable to have their expertise on site at all times. This can cause delays if the initial plans prove not to be viable. The present invention seeks to enable centralised expertise to be used and shared to locations all over the world so that the best solutions are provided and can be very quickly updated or modified.
Previous approaches have also tended to determine the work needed and propose the next steps to perform that as they advance through the works. This may mean delays before the next decision can be taken on the next step in the works, whilst time consuming radiometric scans for instance are completed. By evaluating and testing the sequence of works in a VR environment, the present invention seeks to stack up a series of viable steps to form a program of works that can be conducted without expert consideration as they progress, whilst still being confident that the full sequence of the steps can be conducted without interruption.
Overview of the Invention

The overarching aim of the invention is to integrate one or more data acquisition stages for an environment, with Virtual Reality (VR) consideration of that data to improve one or more operation stages applied to the environment subsequently.
In particular, the data acquisition usefully includes spatial information on the environment, together with radiometric information on the contents of that environment. When these two data sets are overlain with one another using VR, the user has far more useful information on the environment. This enables, amongst other benefits, improved planning and verification of the operation stages to be applied to the environment. More particularly, the use of VR enables improved positioning of robotics within the environment and improved positioning for the operations, such as laser cutting, provided by the robotics at positions within the environment. This is useful in nuclear decommissioning operations and other situations.
As illustrated in Figure 1, the robotics operating system includes a database 3 holding a series of 3D data sets on the environment 5 within which the robotics 7, 7' will be operating. As an example, a computer [database] can hold the results of a spatial scan [the 3D data set] of a room [the environment] in which the snake-arm [robotics] is to be controlled to provide cutting [operations].
In use, one or more 3D data sets are imported from the database 3, via a conversion process, into the VR control software 9. The conversion process applied to the 3D data sets allows the automatic conversion of point cloud type data into a VR compatible format.
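The conversion of point cloud type data into a VR compatible format might, at its simplest, re-centre and scale the points into the coordinate conventions of the VR scene; the bounding box normalisation shown here is an illustrative assumption, not the actual conversion process:

```python
def to_vr_points(points, target_extent=1.0):
    """Normalise a point cloud into a cube of side target_extent centred
    on the origin, a common step before loading into a VR scene."""
    xs, ys, zs = zip(*points)
    mins = (min(xs), min(ys), min(zs))
    maxs = (max(xs), max(ys), max(zs))
    span = max(hi - lo for lo, hi in zip(mins, maxs)) or 1.0
    scale = target_extent / span
    centre = [(lo + hi) / 2 for lo, hi in zip(mins, maxs)]
    return [tuple((p[i] - centre[i]) * scale for i in range(3)) for p in points]

# Example: a 10 m cube of scan data normalised to a unit cube for the VR scene.
vr_points = to_vr_points([(0.0, 0.0, 0.0), (10.0, 10.0, 10.0)])
```

Uniform scaling (one factor for all three axes) preserves the shape of the environment, which matters when the operator judges clearances for the robotics inside the VR representation.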
After the conversion of the 3D data sets, the VR control software 9 will run calculations to determine where that 3D data set can be utilized safely before it is shown to the VR operator; the person using the VR control software 9 to set up operations. The VR operator is then able to define robotics operations with an almost infinite level of precision.
The VR operator inputs a desired series of operations to be performed by the robotics 7 or 7' utilizing the VR equipment which operations are then considered by the VR control software 9. The VR control software 9 verifies that these operations are permissible under predetermined operating constraints. If they are not, then a revised series of operations is sought from the VR operator. If they are permissible, then they are sent to a robotics control broker server 11.
From the robotics control broker server 11, these operations can then be executed in real time or queued to be sent at a later date or exported to a file for further analysis.
When these operations are to be executed, then they are sent from the robotics control broker server 11 to an interface for the relevant robotics to be controlled. In Figure 1, an interface 13 [named "Sirius"] is used to apply the operations to robotics 7 in the form of snake-arm hardware 15. A 3rd party robotics interface 17 is used to provide the interface to apply the operations to robotics 7' in the form of 3rd party hardware 19 when that is the chosen robotics 7'.
The robotics control broker server 11 enables interfacing with a wide variety of robotics 7, 7' etc from a wide variety of different suppliers. Only two robotics 7, 7' are shown in Figure 1 for the sake of clarity. This robotic control broker server 11 is capable of receiving multiple operations from multiple VR software environments and forwarding them on to the appropriate robotics platform being used at the time. Only two VR control software 9, 9' environments are shown in Figure 1 for the purposes of clarity.
The system architecture selected is beneficial as it allows a workflow of taking 3D data sets and radiometric data sets from a remote site, importing them into a centralised VR control software 9 at a location so they can be used to define the best robotic operations, whilst then sending those robotic operations out to a remotely deployed robotics 7, 7' in real time, anywhere in the world. The centralised VR control software 9 at its location enables the best radiometric, VR, spatial, structural and other technical specialists to be provided at that location facilitating cooperation but still allowing the benefits of their knowledge to be readily applied at the actual environment where robotic operations are necessary across the world.
During initial scanning and calibration manual control of the robotics 7, 7' is required and so manual control software 21 is provided for that purpose. This can of course be used at any stage where manual control of the robotics 7, 7' is required.
As illustrated in Figure 15, the robotics operating system has been modified slightly. In this instance, the VR control software 9 has been modified through the use of alternative software and the manual control software 21 has also been revised.
The key parts of the system and their operation are now discussed in more detail.
Spatial Scanning and Radiometric Scanning

This section of the system is responsible for investigating the environment 5, collecting the data sets and providing the data which allows the spatial and radiometric overlay to be constructed and accessed.
The two different data set types may be collected separately during passes of the robotics 7, 7', such as a robotic arm, through the environment 5.
In the embodiment shown in Figure 2a, the robotic arm 7a is provided with a light 23 to illuminate the environment 5, a camera 25 to provide visual images [still or video] of the environment 5, a laser pointer 27 to show which part of the environment 5 the camera 25 is pointing at and a 3D laser scanner 29 to collect the spatial data set on the environment 5 from a given position from which a data set is collected.
LIDAR [light detection and ranging] is one potential approach to the collection of the spatial data set as it allows quick, accurate and detailed remote sensing of the environment 5.
The first pass of the robotic arm 7a generally collects the spatial scan results. Each data set is in effect a point cloud. The 3D data set is provided as a point cloud data set in PLY format. The 3D data set provides an accurate representation of the 3D space defining the environment 5.
The radiometric data for the overlay is collected in a separate second pass; a radiometric scanner 31 replaces the 3D laser scanner 29 on the robotic arm 7a. A suitable radiometric approach is detailed in EP2691791 and GB2502501, the contents of which are incorporated herein by reference, particularly with respect to the radiometric detector, radiometric detection, processing and presentation of radiometric results as detailed therein. Other radiometric detection approaches can be used by mounting a suitable radiometric detector and/or through different processing of the signals detected.
The radiometric data is provided in PLY format too.
If the robotic arm can accommodate both 3D laser scanner 29 and radiometric scanner 31 then both data sets could be collected in a single pass through the environment 5.
Robotics Interface

This section of the system, referring again to Figure 1, is responsible for directly driving the robotics 7, 7' and feeding back into the VR control software 9 valid poses and transforms of the robotics 7, 7'.
The robotics 7, 7' will generally be operating in Cartesian mode. In this mode, the position of the tip of the robotic arm 15 is defined with respect to 5 degrees of freedom. Those are the 3D position of the tip of the robotic arm 15 in space and the 2D orientation definition of the tip (the heading; roll is not free).
Input demands received by the robotics 7, 7' will be interpreted as 'correct'; the robotics 7, 7' will apply the relevant compensation (based on a calibration process mentioned below) transparently and internally to the interface 13 named "Sirius". All output from the interface 13 named "Sirius" will also be corrected.
The frame of reference for the demands on the robotic arm 15 is right-handed with positive Z in the direction the robotic arm 15 is pointing and Y is vertically up with X towards the left as the arm 15 is viewed from arm base to arm tip. The origin is the position of the tip of the robotic arm 15 when fully retracted.
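The handedness of this frame of reference can be checked with a cross product: in a right-handed frame, X crossed with Y must give Z. Here Z is along the direction the arm is pointing, Y is vertically up and X is towards the left as the arm is viewed from base to tip:

```python
def cross(a, b):
    """Right-handed cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

X = (1.0, 0.0, 0.0)  # towards the left, viewed from arm base to arm tip
Y = (0.0, 1.0, 0.0)  # vertically up
Z = (0.0, 0.0, 1.0)  # along the direction the robotic arm is pointing

assert cross(X, Y) == Z  # confirms the frame is right-handed
```

The same check (Y crossed with Z giving X, and Z crossed with X giving Y) holds for any cyclic pairing of the axes, which is a quick sanity test when wiring demands from the VR scene into the arm's frame.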
The interface 13 named "Sirius" will stream the current estimated joint positions and linear introduction position of the robotic arm 15, via the robotics control broker server 11, to the VR control software 9 for rendering the robotic arm 15 in VR.
The interface 13 named "Sirius" will also provide current estimated robotic arm 15 tip transforms (post-compensation) to manual control software 21 and radiometric instrument software, such as RadScan software.
Cartesian scanning moves will be demanded one-by-one.
For cutting operations different robotics 7, 7' may often be used. For instance, in the Figure 2b embodiment a different robotic arm 7b is used, together with a mounted laser cutter 33. A camera 25, a laser pointer 27 and a light 23 may still be provided.
Layered System Architecture

As shown in Figure 3, the system architecture has four different layers in relation to the VR interface architecture. These are a DATA layer, COMMS layer, CONTROL layer and PRESENTATION layer.
The 3D data sets and radiometric datasets are both stored in the database(s) as PLY files 35 in the DATA layer. Those PLY files 35 go through the COMMS layer to the CONTROL layer where they are received into the conversion process and the VR control software 9 which act to provide the Loader 37. The Loader 37 provides the combined representation of the data to the VR operator in the PRESENTATION layer in Unity 39.
The VR operator interacts with the system via Unity 39 to determine what operations to take where and in what sequence within the environment 5. Those operations are then provided to the UI Control 41 in the CONTROL layer; the UI Control 41 can be a part of the robotic control broker server 11. The UI Control 41 interacts with the API 43 [application programming interface] to ensure that the operations to be sent out through the COMMS layer via Comms 45 are compatible with the Robotics 47 in the DATA layer. This enables the Robotics 47 to perform the operations desired at remote locations.
VR User Interface

This unit contains all the software for VR visualisation by the VR operator and for the interfaces between that and, via the robotics control broker server 11, the interfaces 13, 17 etc to the robotics 7, 7' in Figure 1.
The VR software and interface architecture is illustrated with reference to Figure 4. The VR operator wears an HTC Vive headset 51 which is provided with a wireless connection to the VR GUI [graphical user interface] unit 53, referenced as Unity 39 in Figure 3 and the system architecture above. The VR GUI unit 53 is in communication with the PLY file loading interface 55 for receiving the overlay PLY files when generated for a given scene identity.
The VR operator is provided with a working space 57 in which to visualise and interact and suitable input devices for selecting operations to be considered and other inputs.
Windows 10 is one potential approach, with direct connection to the VR Goggles via HDMI as well as an Ethernet connection for communication with both the radiometric instruments and the robotics.
Using this information the VR operator can provide the sequence of movements for the robotic arm and the cutting location and pattern required. These are considered by the VR GUI unit 53 to ensure that they are physically possible and acceptable sequences or other actions. If so, then the operations selected by the VR operator and approved by them and/or the VR control software 9 are expressed in C++ [or other general purpose programming language] and/or ZeroMQ [or other high performance messaging library], collated, and provided as an operation queue 55.
The operation queue 55 passes through the API wrapper 57 for the particular robotics 7, 7' that is intended to receive the operations so as to allow seamless communication between the VR operator and the robotics 7, 7' which may be from one of many suppliers. The operations are then passed out to the robotics control broker server 11 and on to the relevant robotics 7, 7' for implementation.
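The queue-plus-wrapper idea can be sketched as follows. This is a hedged illustration only: the vendor command format, the supplier name and the operation fields are invented for the example, not taken from any real supplier API.

```python
from collections import deque

class ApiWrapper:
    """Translates vendor-neutral operations into supplier-specific
    commands, so the VR side need not know the supplier's format."""
    def __init__(self, supplier):
        self.supplier = supplier

    def translate(self, op):
        # Hypothetical command format for a hypothetical supplier.
        if self.supplier == "snake_arm":
            return {"cmd": op["kind"].upper(), "xyz": op["target"]}
        raise ValueError(f"unknown supplier: {self.supplier}")

# Approved operations are collated into a queue in a neutral form.
operation_queue = deque()
operation_queue.append({"kind": "move", "target": (0.2, 0.3, 1.4)})
operation_queue.append({"kind": "scan", "target": (0.2, 0.3, 1.4)})

# The wrapper drains the queue into supplier-ready commands.
wrapper = ApiWrapper("snake_arm")
commands = [wrapper.translate(op) for op in operation_queue]
```

Swapping the wrapper is then enough to target robotics from a different supplier without changing the queue contents.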
Method of Operation

A typical sequence of operation is now provided to illustrate the use of the system.
Phase 1 - Spatial scanning of the environment

A robotics package of the type illustrated in Figure 2a is provided in the environment. The operator controls the robot arm 7a, for instance a snake-arm, and uses the images from the video camera 25 to place the laser scanner 29 in a first measurement position in the environment 5. The first measurement position may be the position of the robot arm 7a when first introduced to the environment 5 or after minimal movement.
At the first measurement position, a first scan is initiated by a start signal and the laser scanner 29 conducts its scan; a point cloud acquisition is achieved. The first scan thus generates the first scan data set and this is provided with a scene identity so as to allow subsequent association of point clouds and radiometric data for the same measurement position. The completion of the scan indicates that movement of the robotic arm can occur again to the operator.
The first scan data set is created using the data from the laser scanner transformed to the coordinate system used by the robotic arm and in particular the laser scanner position for that data set. The first scan data set is stored off device using the communications interface and is a PLY file.
The operator then manually controls movement of the robot arm to a second measurement position and a second scan is performed there to generate a second scan data set which is also communicated to the remote store.
If a scan data set already exists for that position, considered in terms of the scene identity, then the old and new data sets are combined to give a combined data set. If there is no previous scene point cloud for this scene identity, then the current position point cloud becomes the scene point cloud by default. All new position point clouds are registered to the coordinate system of the existing scene point cloud.
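The scene-identity bookkeeping just described can be sketched briefly. This is an illustrative reduction: registration of a new point cloud to the scene coordinate system is replaced by simple concatenation, and the dictionary-based store is an assumption rather than the actual database design.

```python
# Scene point clouds keyed by scene identity (illustrative sketch).
scene_clouds = {}

def add_position_cloud(scene_id, points):
    """First cloud for a scene identity becomes the scene point cloud;
    later clouds for the same identity are merged into it."""
    if scene_id not in scene_clouds:
        # No previous scene point cloud: new cloud is the default.
        scene_clouds[scene_id] = list(points)
    else:
        # Combine old and new data sets (real code would first register
        # the new points to the existing scene coordinate system).
        scene_clouds[scene_id].extend(points)
    return scene_clouds[scene_id]

add_position_cloud("scene-1", [(0, 0, 0)])
combined = add_position_cloud("scene-1", [(1, 1, 1)])
```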
The process is repeated through n further measurement positions to give n scans and n scan data sets.
Each data set is processed in the same way using offline stitching of points to generate the n processed data sets still stored remotely. The n processed data sets are read for importation into the VR model when needed.
The sequence and interrelationship between the components for this part of the process is shown in Figure 5 for one embodiment.
Phase 2 - Calibration of Radiometric Information

In this phase, the Figure 2a robot arm 7a is provided with a radiometric scanner 31 of the desired type and calibration of the radiometric scanner 31 to the environment 5 is performed. This involves the operator manually moving the robot arm 7a to a first calibration position and gathering a first calibration data set there. The process is repeated for the c calibration positions to give c calibration data sets.
The calibration data sets are used to correct the results from the radiometric scanner 31 for influencing factors arising from that environment 5 and at that scene identity for the position.
Phase 3 - Radiometric Data Collection

Having calibrated the radiometric scanner 31 or other instrument format, it can now be used to collect the radiometric data.
The robot arm 7a is provided with a series of movement instructions via the VR operator and VR control software 9. These movement instructions are the sequential movements that the robot arm 7a will make from its starting position to reach each of the positions at which radiometric data collection is required. These are preferably the same positions as are used in the spatial scanning and so have common scene identities.
As the radiometric data collection at a position takes a material amount of time, the series of movements allows the system to operate without direct user involvement, for instance overnight, such that the operator is not required to be involved throughout this time-consuming part of the process.
Again the robot arm 7a will make one or more of the series of movements to reach a first radiometric collection position. Whilst stationary at that position, a first radiometric measurement is made to form a first radiometric measurement data set. The data set is transformed according to the calibration results and a first corrected radiometric measurement data set is generated. A gamma spectrum is a preferred form of result. The data set has a scene identity associated with it. The first corrected radiometric measurement data set is then outputted to a remote location for storage. This is repeated for r radiometric measurements at r positions to give r radiometric measurement data sets and r corrected ones. As desired, one or more of the r corrected data sets can be recalled and reimported to the VR control software 9.
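The transformation of a measurement according to the calibration results can be illustrated with a simple model. The linear gain/offset model below is an assumption chosen for illustration; the real correction depends on the instrument and the influencing factors measured in Phase 2.

```python
# Illustrative calibration sketch: fit measured = gain * true + offset
# from (true, measured) calibration pairs, then invert it to correct
# later measurements. The linear model is assumed, not specified.
def fit_calibration(calibration_pairs):
    """Least-squares fit of a gain and offset from calibration data."""
    n = len(calibration_pairs)
    sx = sum(t for t, m in calibration_pairs)
    sy = sum(m for t, m in calibration_pairs)
    sxx = sum(t * t for t, m in calibration_pairs)
    sxy = sum(t * m for t, m in calibration_pairs)
    gain = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    offset = (sy - gain * sx) / n
    return gain, offset

def correct(measured, gain, offset):
    """Invert the fitted model to recover the corrected value."""
    return (measured - offset) / gain

# Two hypothetical calibration points: true 0.0 reads 1.0, true 10.0
# reads 21.0, i.e. gain 2 and offset 1.
gain, offset = fit_calibration([(0.0, 1.0), (10.0, 21.0)])
corrected = correct(21.0, gain, offset)
```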
The sequence and interrelationship between the components for this part of the process is shown in Figure 6 for one embodiment.
Phase 4 - Laser Calibration

Having completed the scan of the environment 5 and the radiometric measurement of the environment 5, the system can proceed to the alteration of the environment 5 using the robotics 7b. As an example, this may feature use of a different arm 7b provided with a laser cutter 33 for laser cutting of a part of the environment 5 to remove a piece of the environment to another location, inside or outside the environment 5. A robotics package of the type shown in Figure 2b could be used, with the robotic arm 7b provided at the end with a laser cutter 33.
Before cutting operations begin, laser calibration steps may be conducted so as to ensure effective and controlled operation of the laser within the environment 5.
Phase 5 - Cutting Planning

Having calibrated the laser cutter 33 correctly, actual cutting of the environment 5 can be planned and then commence. For this, a series of movements of the robotic arm 7b need to be defined so as to allow the robotic arm 7b to provide a series of laser cuttings of the right form in the right locations. The definition of the desired cut(s) is provided in the VR environment by the VR operator making use of the overlay of information.
The overlay is initiated for a position by referencing the scene identity for that position. The associated point cloud data and radiometric data are retrieved from the storage database(s) 3 using the common scene identity. Using both data sets the radiometric overlay is calculated and this is saved as a PLY file too. As a formed radiometric overlay file this can then be transferred to the VR control software 9 for visualisation and use by the VR operator.
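The overlay calculation combining the two data sets can be sketched as a mapping from each spatial point to the nearest radiometric sample. The nearest-sample mapping and the blue/red colouring are illustrative assumptions (the text later notes that blue corresponds to scanned space with no radiation detected).

```python
# Illustrative radiometric overlay: colour each point in the spatial
# cloud according to the nearest radiometric sample.
def nearest_dose(point, samples):
    """samples: list of ((x, y, z), dose). Return the nearest dose."""
    def d2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(samples, key=lambda s: d2(point, s[0]))[1]

def overlay(cloud, samples):
    """Combine a spatial cloud with radiometric samples for display."""
    out = []
    for p in cloud:
        dose = nearest_dose(p, samples)
        colour = "blue" if dose == 0 else "red"  # zero dose shown blue
        out.append((p, dose, colour))
    return out

result = overlay([(0, 0, 0)], [((0, 0, 1), 0.0), ((5, 5, 5), 2.0)])
```

The coloured points would then be written back out as a PLY overlay file for the VR control software.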
The sequence and interrelationship between the components for this part of the process is shown in Figure 7 for one embodiment.
The planning of the cuts takes into account the scan information displayed through the VR system to the VR operator and the radiometric data that is overlain with that scan information. Thus the VR operator can see where in the environment 5 to make the cuts based upon the physical representation achieved through the scan and can take into account radiometric information relative to those cuts, for instance the presence or absence of radiometric material or the level of radiometric material present and/or characteristics thereof. The overlay and the VR operator access to it is a powerful tool for effective design of the following phases, for instance in decommissioning.
A significant aspect of the cutting set up is that the VR control software 9 checks that the movements planned for the robotic arm 7b and the cuts planned are permissible under predefined criteria before allowing them to be stacked and the sequence started. In this way impossible or undesirable movements of the robot arm 7b are avoided and undesirable cutting is also avoided. This functionality comes from VR control software 9 based checks and also from the VR operator being able to see the movements and other operations before they occur to ensure that there are no collisions or other clashes.
In more detail, the VR operator defines a calibration point within the VR environment and then the VR operator initiates the verification process from the VR environment. A simulation, using the VR control software 9, is run to produce a set of transforms the robotic arm 7b will need to perform to reach that point within the environment 5. The resulting transforms are rendered and displayed in the VR environment to the VR operator. The VR operator visually inspects the resulting robotic arm 7b render to ensure that no collisions are likely. Once verified the VR operator can initiate the move in the real world system, either at that time or in a future sequence to be conducted. Once that movement to that calibration point has been dealt with then the next, its overlay and the transforms to reach it can be considered through to the completion of all the movements and other actions desired.
When working in real time, after the first movement is completed, the VR control software 9 then requests the camera view from the robotic arm 7b and renders it so that it is visible to the VR operator. The VR operator then re-aligns the camera view within the VR environment to get the best match with the point cloud model. The translation offsets required for this match are then passed back through the controller and stored for use in future runs.
The sequence and interrelationship between the components for this part of the process is shown in Figure 8 for one embodiment.
A similar process applies for the VR consideration of the radiometric scanning approach. The VR operator defines all the move points within the VR environment and then the VR operator initiates the verification process from the VR environment. The software runs a simulation to produce a set of transforms the arm will require to reach the first point. The resulting transforms are rendered and displayed in the VR environment. The VR operator visually inspects the resulting robotic arm 7a render to ensure that no collisions are likely; once verified they approve that point. These steps are then repeated for all points to be defined. Once all points have been verified the VR operator initiates the move to the first point. The point information is then passed through the controller to the robotic arm 7a itself and the move begins. Once the move is confirmed as finished by polling the robotic arm 7a, the controller initiates the radiometric scan. When the scan is complete, the controller initiates the move to the next point. These measurement steps are then repeated until all scans have completed; this completion status is then passed back to the VR environment.
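The move/poll/scan cycle just described can be sketched end to end with the robot arm replaced by a stub. The stub's method names (`move_to`, `poll_finished`) are invented for illustration and are not a real supplier API; only the control flow (move, poll until stationary, then scan) mirrors the text.

```python
# Illustrative controller loop for the radiometric scan sequence.
class ArmStub:
    """Stand-in for the robotic arm; a move 'finishes' after two polls."""
    def __init__(self):
        self.position = None
        self._busy = 0

    def move_to(self, point):
        self.position = point
        self._busy = 2  # pretend the move takes two polls to complete

    def poll_finished(self):
        if self._busy > 0:
            self._busy -= 1
        return self._busy == 0

def run_scan_sequence(arm, points, do_scan):
    """Move to each point, poll until the move completes, then scan."""
    completed = []
    for point in points:
        arm.move_to(point)
        while not arm.poll_finished():   # poll the arm for completion
            pass
        completed.append(do_scan(point))  # scan only while stationary
    return completed

arm = ArmStub()
scans = run_scan_sequence(arm, [(0, 0, 1), (0, 0, 2)],
                          lambda p: f"scan@{p}")
```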
The sequence and interrelationship between the components for this part of the process is shown in Figure 9 for one embodiment.
The process works in a similar way in relation to cutting operations. First the VR operator defines all the cut points within the Virtual Reality environment. The VR operator then initiates the verification process from the Virtual Reality environment. The controller runs a simulation to produce a set of transforms the arm will require to reach the first point.
The resulting transforms are rendered and displayed in the Virtual Reality environment. The VR operator visually inspects the resulting arm render to ensure that no collisions are likely; once verified they approve that cut. These approval steps are then repeated for all cuts defined. Once all cuts have been verified the VR operator initiates the move to the first cut. The cut information is then passed through the controller to the robotic arm and its cutter and the move and then the cut begins. The controller then polls the arm to ascertain if the cut has finished and when it has the controller initiates the next movement and/or cut combination. These cutting steps are then repeated until all cuts have completed.
The sequence and interrelationship between the components for this part of the process is shown in Figure 10 for one embodiment.
Phase 6 - Cutting

Again the operations can be stacked in advance and the robotics left to perform the sequence over the time period necessary to make the movements and make the relevant cuts. The user need not be occupied by the system throughout, saving user time.
Cartesian cut moves are supplied to the robotic arm 7b as a series of Cartesian waypoints. Each point (except the start and end) can individually be specified as a control point for a curve that intersects that point (using Catmull-Rom spline interpolation) or as a stopping point where the robotic arm 7b decelerates to and then accelerates off in a different direction towards the next point. With the exception of near the stopping points, the robotic arm 7b maintains a constant speed for the laser focal-point.
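Catmull-Rom interpolation, as named above, can be sketched for one segment. Uniform parameterisation is assumed for illustration; the key property for waypoint following is that the curve passes exactly through its inner control points.

```python
# Uniform Catmull-Rom spline segment (illustrative sketch).
def catmull_rom(p0, p1, p2, p3, t):
    """Point at parameter t in [0, 1] on the segment between p1 and p2;
    p0 and p3 shape the tangents. The curve interpolates p1 and p2."""
    return tuple(
        0.5 * ((2 * b) + (-a + c) * t
               + (2 * a - 5 * b + 4 * c - d) * t ** 2
               + (-a + 3 * b - 3 * c + d) * t ** 3)
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

# The spline passes through its inner control points exactly:
start = catmull_rom((0, 0), (1, 0), (2, 1), (3, 1), 0.0)  # equals p1
end = catmull_rom((0, 0), (1, 0), (2, 1), (3, 1), 1.0)    # equals p2
```

Sampling t densely between waypoints yields the smooth path along which the arm can hold a constant focal-point speed.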
For laser cutting, the cutting speed should be constant, to cut consistently and to avoid objects behind the cut receiving larger than desired laser illumination.
Experimental Demonstration

Testing Criteria

To truly test the functionality of the system two sets of points were agreed upon: three Valid and three Invalid scan points to be tested for the scanning operations, and 16 Valid and 16 Invalid cut points for the cutting operations.
A "Point" consists of five values, the first three being the X, Y and Z position of the demanded point, the final two P and R (pitch and roll) being the angle the robotic arm should approach the aforementioned point.
A Valid point is one that is technically reachable by the snake arm, e.g. 1 m directly in front of the arm with no rotation (0,0,1,0,0).
An Invalid point is one that is impossible to reach by the robotic arm due to its technical constraints, e.g. 5 m directly in front with no rotation (0,0,5,0,0). This is deemed unreachable as it is beyond the reach of the arm.
These constraints are all calculated by the interface and form a key component of what is to be tested. These points then go into validation in the VR interface; this validation process consists of viewing the robotic arm's path for the now validated point sent into the interface. If this path does not hit anything on the 3D scan, the operator will then also deem the operation valid and the cutting or scanning operation will take place.
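The interface-side reachability test implied by the examples can be sketched as a predicate on (X, Y, Z, P, R). The 3 m reach limit and the pitch limit below are assumed values chosen so that the worked examples in the text (1 m reachable, 5 m unreachable, a point with pitch 3.14 rejected) behave as described; the real constraints come from the arm's kinematics.

```python
import math

REACH_M = 3.0          # assumed maximum extension of the snake arm
MAX_PITCH_RAD = 1.0    # assumed approach-angle (pitch) limit

def is_valid(x, y, z, p, r):
    """Interface-side validity check on a demanded point (sketch)."""
    within_reach = math.sqrt(x * x + y * y + z * z) <= REACH_M
    pitch_ok = abs(p) <= MAX_PITCH_RAD
    return within_reach and pitch_ok

ok = is_valid(0, 0, 1, 0, 0)        # 1 m straight ahead: reachable
too_far = is_valid(0, 0, 5, 0, 0)   # 5 m straight ahead: out of reach
bad_pitch = is_valid(-0.25, 1.5, 1.4, 3.14, 0)  # pitch beyond limit
```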
Dataset  Point X  Point Y  Point Z  Point P  Point R  Type
1        0.2      0.3      1.4      0        0        Valid
2        0.2      1.4      1.4      0        0        Valid
3        -0.25    1.5      1.4      0.45     -0.26    Valid
4        0.2      0.3      4.5      0        0        Invalid
5        0.2      1.4      -4.4     0        0        Invalid
6        -0.25    1.5      1.4      3.14     0        Invalid
7        0.2      -0.2     1.3      0        0        Valid
         0.2      0.2      1.3      0        0        Valid
         -0.2     0.2      1.3      0        0        Valid
8        -0.2     0.2      2.3      0        0        Valid
         0.2      0.2      2.4      0        0        Valid
         0.2      -0.2     2.3      0        0        Valid
         -0.2     0.2      1.3      0        0        Valid
9        -0.2     0.5      2.3      0.5      -0.26    Valid
         0.2      0.5      2.4      0.5      -0.26    Valid
         0.2      -0.2     2.3      0.5      -0.26    Valid
         -0.2     0.5      2.3      0.5      -0.26    Valid
10       -0.2     0.5      4.3      3.14     0        Invalid
         0.2      0.5      4.2      3.14     0        Invalid
         0.2      -0.2     4.8      3.14     0        Invalid
         -0.2     0.5      4.1      3.14     0        Invalid
11       -0.2     0.5      5.3      0        0        Invalid
         0.2      0.5      5.2      0        0        Invalid
         0.2      -0.2     5.8      0        0        Invalid
         -0.2     0.5      5.1      0        0        Invalid
12       -0.2     0.5      1.3      0.5      0        Invalid
         0.2      0.5      1.2      5        0        Invalid
         0.2      -0.2     1.8      5        0        Invalid
         -0.2     0.5      1.1      5        0        Invalid

Table 1 - Chart illustrating failure conditions (red) and success conditions (green)

Table 1 provides a list of predefined points selected to test the criteria outlined above.
Dataset       VR Validation  Expected Result               Actual
1 - Valid     TRUE           Scanning Operation Commences  Scanning Operation Commences
2 - Valid     TRUE           Scanning Operation Commences  Scanning Operation Commences
3 - Valid     TRUE           Scanning Operation Commences  Scanning Operation Commences
4 - Invalid   TRUE           No Action Taken               No Action Taken
5 - Invalid   TRUE           No Action Taken               No Action Taken
6 - Invalid   TRUE           No Action Taken               No Action Taken
1 - Valid     FALSE          No Action Taken               No Action Taken
2 - Valid     FALSE          No Action Taken               No Action Taken
3 - Valid     FALSE          No Action Taken               No Action Taken
4 - Invalid   FALSE          No Action Taken               No Action Taken
5 - Invalid   FALSE          No Action Taken               No Action Taken
6 - Invalid   FALSE          No Action Taken               No Action Taken

Table 2 - Scanning Operation Results

7 - Valid     TRUE           Cutting Operation Commences   Cutting Operation Commences
8 - Valid     TRUE           Cutting Operation Commences   Cutting Operation Commences
9 - Valid     TRUE           Cutting Operation Commences   Cutting Operation Commences
10 - Invalid  TRUE           No Action Taken               No Action Taken
11 - Invalid  TRUE           No Action Taken               No Action Taken
12 - Invalid  TRUE           No Action Taken               No Action Taken
7 - Valid     FALSE          No Action Taken               Cutting Operation Commences
8 - Valid     FALSE          No Action Taken               Cutting Operation Commences
9 - Valid     FALSE          No Action Taken               Cutting Operation Commences
10 - Invalid  FALSE          No Action Taken               No Action Taken
11 - Invalid  FALSE          No Action Taken               No Action Taken
12 - Invalid  FALSE          No Action Taken               No Action Taken

Table 3 - Cutting Operation Results

Tables 2 and 3 comprise a list of operations to run through to test that the logic for the validation of both scan operations and cut operations is fully tested.
As a summary of that logic, if the operation is deemed valid by the interface and then by the VR operator a cutting or scanning operation should take place. Otherwise the robotic arm should take no action.
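That summarised logic is a simple conjunction: an operation proceeds only when the interface deems the point valid and the VR operator's validation passes. A one-line sketch:

```python
# The operation gate from Tables 2 and 3: both the interface verdict
# and the VR validation must pass for any action to be taken.
def operation_commences(interface_valid, vr_validation):
    return interface_valid and vr_validation

assert operation_commences(True, True)       # both pass: action taken
assert not operation_commences(True, False)  # operator rejects: no action
assert not operation_commences(False, True)  # interface rejects: no action
```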
The actual results of the test workflows are recorded in the "Actual" column. As can be seen, they all executed as expected.
Results

A dry-run test was conducted to provide a full end to end test of the system and so this tested the full lifecycle of the solution using the following workflow:

1. Fitting of the Faro 3D scanner to the snake arm and creating a point cloud.
2. Point cloud importing from the Faro 3D scanner into the VR environment.
3. Definition of radiometric measurement points in the VR environment.
4. Sending of the defined radiometric scan point in the VR.
5. Importing of the new radiometric overlaid point cloud.
6. Moving of the snake arm and execution of radiometric scan from the defined point.
7. Definition of a cut point and path in VR.
8. Visualization of the snake arm within the VR.
Figure 11 shows the results obtained from fitting the Faro 3D scanner to the snake arm, conducting a scan to create a point cloud and then importing that into the VR environment for visualisation. This is before any radiometric data is added. The result was successful.
Figure 12 shows the results from the subsequent radiometric measurements. Following the method of the invention, the radiometric instrument was mounted on the snake arm and was positioned appropriately in the environment and a scan was completed. The results were saved and then combined with the spatial scan results to provide the overlay for inspection. The radiometric scan overlay is visible in blue, corresponding to radiometrically scanned space with no radiation detected, as no source was present. Other colours would reflect different levels of radiation. Again the result was successful.
Following these stages, the movement of the snake arm and the conduct of radiometric scans and spatial scans at different positions was demonstrated.
In the next stage, the VR operator selected a part of the environment, an upright pipe, and set up a series of cuts on that part. The visualisation of the cuts is shown in Figure 13. The cut paths were successfully defined and executed.
Finally, the snake arm was visualised within the VR environment at the position required to conduct the cut points in Figure 13. Figure 14 shows the snake arm correctly visualised.
Overall the dry run was completed successfully with all component parts working as expected. This dry run constitutes evidence that all systems are communicating as expected.
Figure 16 shows a further set of experimental results featuring visualisation of physical constructions and radiometric data within that environment.
In the case of Figure 17, the environment and radiometric information are visualised together with a series of cuts on part of the equipment (green circles).

Claims (26)

  1. Control apparatus for one or more robotic units, the apparatus comprising: 1) one or more databases for storing one or more data sets; 2) one or more first data processors adapted to: a. receive at least one data set from a database; b. display a representation of the data set to an operator; c. receive a proposed task set from the operator; d. evaluate one or more characteristics of the proposed task set against one or more test characteristics; e. provide an evaluated task set to a second processor; 3) a second processor adapted to: a. receive at least one evaluated task set; and b. to communicate said evaluated task set to a selected interface via a telecommunications network; 4) an interface for a robotic unit, the interface being adapted to: a. receive the evaluated task set via the telecommunications network; and b. provide operating instructions to the robotic unit according to the content of the evaluated task set; 5) a robotic unit adapted to operate in the environment according to the operating instructions.
  2. Apparatus for generating control signals for one or more robotic units, the control signals being in the form of a task set, the apparatus comprising: 1) one or more first data processors adapted to: a. receive at least one data set from a database; b. display a representation of the data set to an operator; c. receive a proposed task set from the operator; d. evaluate one or more characteristics of the proposed task set against valid characteristics; e. provide an evaluated task set to a second processor, the second processor being adapted to communicate with the one or more robotic units.
  3. Apparatus for generating and communicating a task set to one or more robotic units, the apparatus including apparatus according to claim 2 and further comprising: 1) the second processor, the second processor being adapted to: a. receive at least one evaluated task set; and b. to communicate said evaluated task set to a selected interface via a telecommunications network.
  4. Apparatus according to claim 2 or claim 3, the apparatus further comprising: 1) an interface for a robotic unit, the interface being adapted to: a. receive the evaluated task set via the telecommunications network; and b. provide operating instructions to the robotic unit according to the content of the evaluated task set.
  5. Apparatus according to any of claims 2 to 4, wherein: a) the apparatus is a part of control apparatus for one or more robotic units, the apparatus further comprising: a robotic unit adapted to operate in the environment according to the operating instructions; and/or b) the apparatus is a part of control apparatus for one or more robotic units, the apparatus further comprising: one or more databases for storing one or more data sets.
  6. Apparatus according to claim 1 or according to any one of claims 2 to 5, wherein the apparatus is a part of apparatus for one or more of: 1) conducting a spatial survey of the environment; and/or 2) for detection of radiation emissions in the environment; and/or 3) for conducting a radiometric survey of the environment; and/or 4) for altering the environment, the apparatus for altering the environment including one or more of: a) a manipulation device for moving one or more items in the environment and/or parts of the environment; b) a lifting device for lifting one or more items in the environment and/or parts of the environment; c) a cutting device, for instance a laser cutter, for cutting one or more items in the environment and/or parts of the environment.
  7. Apparatus according to claim 1 or according to any of claims 2 to 5 or according to claim 6, wherein the apparatus is for an evaluated task set which is conducted in a hazardous environment, such as: a) a radioactive environment; and/or b) an environment containing one or more sources of alpha and/or beta and/or gamma and/or neutron emissions.
  8. Apparatus according to any preceding claim, wherein one or more or all of the robotic units are provided with a camera and/or with a light and/or with a direction indicator and/or with a spatial scanner and/or with a radiometric scanner and/or with apparatus for altering the environment or a part thereof.
  9. Apparatus according to any preceding claim, wherein the one or more data sets include data sets of more than one type, the one or more data sets include one or more spatial data sets for the environment and/or the one or more data sets include radiometric data sets for the environment.
  10. Apparatus according to any preceding claim, wherein the one or more first data processors adapted to receive at least one data set from a database receive two types of data set, a spatial data set and a radiometric data set, and the first processor generates a combined data set in which spatial data and radiometric data are combined in the data set.
  11. Apparatus according to any preceding claim, wherein the display of the representation of the data set to the operator is provided by computer software, for instance VR control software adapted to display a VR representation of the environment to the operator and/or to allow the operator to move within the VR representation of the environment and/or to define the position of elements, such as robotic units, within the VR representation of the environment and/or to provide the operator with radiometric information on parts of the environment within the VR representation of the environment and/or to define positions for tasks within the VR representation of the environment.
  12. Apparatus according to any preceding claim, wherein the apparatus further includes one or more user input devices, the user input devices being adapted to allow the operator to build a proposed task set.
  13. Apparatus according to any preceding claim, wherein the proposed task set is built: a) by taking into account spatial data shared with the operator and/or radiometric data shared with the operator and/or the combination of those and/or information on the materials within the environment and/or forming parts of the environment; and/or b) of one or more movements of the robotic unit, for instance one or more extensions and/or one or more retractions and/or one or more rotations and/or one or more twists applied to one or more of the parts of the robotic unit; c) of one or more operations by the robotic unit within the environment, for instance one or more spatial scans and/or one or more radiometric scans and/or one or more cutting operations and/or one or more movements applied to a part of the environment.
  14. Apparatus according to any preceding claim, wherein the one or more first data processors evaluate one or more characteristics of the proposed task set against one or more test characteristics by obtaining a set of test characteristics.
  15. Apparatus according to claim 14, wherein: a) the one or more test characteristics provided are permissible characteristics; and/or b) the one or more test characteristics provided are impermissible characteristics; and/or c) the one or more test characteristics are defined for multiple environments, for instance: i) permissions and/or limitations on the rate of movement of a robotic unit; ii) permissions and/or limitations on the laser power useable; d) the one or more test characteristics are defined for a robotic unit type, for instance: i) permissible and/or impermissible movements or combinations of movement; e) the one or more test characteristics are defined for a specific environment, for instance: i) impermissible positions for a part of the robotic unit; f) the one or more test characteristics are amended during the conduct of a task set, for instance: i) to reflect the changes on what is permissible and/or impermissible due to the changes in the environment by the conduct of the task set so far.
  16. Apparatus according to any preceding claim, wherein a proposed task set which includes one or more characteristics which do not pass the evaluation is declined as a task set for use and/or is not classified as an evaluated task set.
  17. Apparatus according to any preceding claim, wherein a proposed task set which includes only characteristics which pass the evaluation is accepted as a task set for use and/or is classified as an evaluated task set.
  18. Apparatus according to any preceding claim, wherein one or more evaluated task sets are stored for later use and/or the evaluated task set is provided in a queue relative to one or more other evaluated task sets for the same and/or for different robotic units.
  19. Apparatus according to any preceding claim, wherein the evaluated task set is provided to an output interface and the output interface converts the evaluated task set into operating instructions compatible with the interface for the robotic unit and/or for the robotic unit to which the evaluated task set is directed.
  20. Apparatus according to claim 1 or claim 3 or any claim depending through those claims, wherein the second processor has one or more first data processors as clients, for instance greater than three clients, for instance the second processor has one or more interfaces for robotic units as clients, for instance greater than 10 clients.
  21. Apparatus according to claim 1 or claim 3 or any claim depending through those claims, wherein the second processor is informed by the first processor to communicate with a particular interface and/or for the interface to communicate with a particular robotic unit for a given evaluated task set and/or robotic data set.
22. Apparatus according to claim 1 or claim 3 or any claim depending through those claims, wherein the robotic unit provides positional information to the interface, the positional information being on one or more or all parts of the robotic unit, the positional information enabling the first processor to provide visualisation of the position of the robotic unit in the environment to an operator.
23. Apparatus for configuring apparatus for generating a task set for one or more robotic units, the apparatus comprising: 1) a configuring processor adapted to: a. receive inputs from an operator and generate control signals for a robotic unit from amongst the one or more robotic units from those inputs; b. provide the control signals to a second processor, the second processor being adapted to communicate with the one or more robotic units; 2) a receiver for one or more observed data sets from the one or more robotic units provided with the control signals, the receiver being adapted to communicate a data set from an observed data set to one or more databases.
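The configuring apparatus of claim 23 can be sketched as two cooperating components: one turning operator inputs into control signals, the other forwarding observed data sets to a database. The class and method names below are illustrative assumptions, not terms from the patent:

```python
# Hypothetical sketch of claim 23. The configuring processor generates
# control signals from operator inputs and hands them to a second processor;
# a receiver forwards observed data sets from the robotic units to databases.

class ConfiguringProcessor:
    def __init__(self, second_processor):
        self.second_processor = second_processor

    def handle_operator_input(self, unit_id, operator_input):
        # a) generate a control signal for the chosen robotic unit
        signal = {"unit": unit_id, "command": operator_input}
        # b) provide the control signal to the second processor, which
        #    communicates with the one or more robotic units
        return self.second_processor.send(signal)

class Receiver:
    def __init__(self, databases):
        self.databases = databases

    def on_observed_data(self, observed):
        # communicate a data set from the observed data set to each database
        for db in self.databases:
            db.append(observed)
```

In this sketch the databases are plain lists; in the claimed apparatus they would be the databases from which the first data processors later retrieve data sets.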
24. Apparatus according to claim 23, wherein the apparatus provides: a) a determination of the spatial form of the environment by one or more of: i) the determination of the spatial form of the environment including a scan of the environment from one or more positions; ii) the determination of the spatial form of the environment including a first spatial scan of the environment from a first spatial scan position to give a first spatial scan data set; and/or b) a determination of the radiometric form of the environment by one or more of: i) the determination of the radiometric form of the environment including a scan of the environment from one or more positions; ii) the determination of the radiometric form of the environment including a first radiometric scan of the environment from a first radiometric scan position to give a first radiometric scan data set.
25. Apparatus according to claim 23 or claim 24, wherein the configuring processor provides a user interface for an operator adapted to control the position of the robotic unit in the environment and/or the robotic unit is provided at a series of positions within the environment with one or more determinations of one or more types being made at one or more or all of those positions.
26. A computer implemented method of controlling one or more robotic units, the method comprising: 1) obtaining one or more data sets from one or more databases storing the one or more data sets; 2) by one or more first data processors: a. receiving at least one data set from a database; b. displaying a representation of the data set to an operator; c. receiving a proposed task set from the operator; d. evaluating one or more characteristics of the proposed task set against one or more test characteristics; e. providing an evaluated task set to a second processor; 3) by a second processor: a. receiving at least one evaluated task set; and b. communicating said evaluated task set to a selected interface via a telecommunications network; 4) by an interface for a robotic unit, the interface: a. receiving the evaluated task set via the telecommunications network; and b. providing operating instructions to the robotic unit according to the content of the evaluated task set; 5) operating a robotic unit in the environment according to the operating instructions.
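The end-to-end method of claim 26 can be sketched as a short pipeline. The function names, the dictionary-based task representation, and the use of an in-process queue to stand in for the telecommunications network are all illustrative assumptions:

```python
# A minimal, hypothetical walk-through of the method of claim 26:
# first processor evaluates the proposal, second processor communicates it,
# and the robot interface converts it to operating instructions.

import queue

def first_processor(data_set, proposed_task, test_characteristics):
    # 2d) evaluate each characteristic of the proposed task set against
    #     the test characteristics (data_set is what was displayed to the
    #     operator; it plays no further role in this simplified sketch)
    for key, limit in test_characteristics.items():
        if proposed_task.get(key, 0) > limit:
            return None              # proposal declined (cf. claim 16)
    return proposed_task             # 2e) an evaluated task set (cf. claim 17)

def second_processor(evaluated_task, network):
    # 3b) communicate the evaluated task set to the selected interface
    network.put(evaluated_task)

def robot_interface(network):
    # 4a/4b) receive the evaluated task set and provide operating
    #        instructions to the robotic unit according to its content
    task = network.get()
    return [("move", task["speed_mm_s"]), ("laser", task["laser_power_w"])]
```

Here a `queue.Queue` stands in for the telecommunications network between the second processor and the interface; in the claimed method these would be physically separate nodes.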
GB1917716.1A 2018-12-04 2019-12-04 Improvements in and relating to control apparatus Active GB2581013B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
GBGB1819806.9A GB201819806D0 (en) 2018-12-04 2018-12-04 Improvements in and relating to control system
GBGB1819805.1A GB201819805D0 (en) 2018-12-04 2018-12-04 Improvements in and relating to control systems
GBGB1910943.8A GB201910943D0 (en) 2019-07-31 2019-07-31 Improvements in and related to control apparatus

Publications (3)

Publication Number Publication Date
GB201917716D0 GB201917716D0 (en) 2020-01-15
GB2581013A true GB2581013A (en) 2020-08-05
GB2581013B GB2581013B (en) 2023-04-26

Family

ID=68887432

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1917716.1A Active GB2581013B (en) 2018-12-04 2019-12-04 Improvements in and relating to control apparatus

Country Status (5)

Country Link
EP (1) EP3890927A1 (en)
JP (1) JP2022511504A (en)
CA (1) CA3121735A1 (en)
GB (1) GB2581013B (en)
WO (1) WO2020115479A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220108046A1 (en) * 2020-10-05 2022-04-07 Autodesk, Inc. Generative design techniques for soft robot manipulators

Citations (2)

Publication number Priority date Publication date Assignee Title
EP0674976A1 (en) * 1994-03-29 1995-10-04 General Electric Company Maintenance system
US20150239121A1 (en) * 2014-02-27 2015-08-27 Fanuc Corporation Robot simulation device for generating motion path of robot

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
US6815687B1 (en) * 1999-04-16 2004-11-09 The Regents Of The University Of Michigan Method and system for high-speed, 3D imaging of optically-invisible radiation
GB201105450D0 (en) 2011-03-31 2011-05-18 Babcock Nuclear Ltd Improvements in and relating to methods and systems for investigating radioactive sources in locations
US9283674B2 (en) * 2014-01-07 2016-03-15 Irobot Corporation Remotely operating a mobile robot
KR101585502B1 (en) * 2014-04-14 2016-01-22 한국원자력연구원 Cutting process simulation method with cad kernel and system thereof
WO2016040862A2 (en) * 2014-09-12 2016-03-17 Chizeck Howard Jay Integration of auxiliary sensors with point cloud-based haptic rendering and virtual fixtures
JP2017047519A (en) * 2015-09-04 2017-03-09 Rapyuta Robotics株式会社 Cloud robotics system, information processor, program, and method for controlling or supporting robot in cloud robotics system
JP6457421B2 (en) * 2016-04-04 2019-01-23 ファナック株式会社 Machine learning device, machine system, manufacturing system, and machine learning method for learning using simulation results


Also Published As

Publication number Publication date
CA3121735A1 (en) 2020-06-11
JP2022511504A (en) 2022-01-31
EP3890927A1 (en) 2021-10-13
GB201917716D0 (en) 2020-01-15
GB2581013B (en) 2023-04-26
WO2020115479A1 (en) 2020-06-11
