WO2017198299A1 - Method of simulating a robotic system - Google Patents

Method of simulating a robotic system

Info

Publication number
WO2017198299A1
Authority
WO
WIPO (PCT)
Prior art keywords
tracking
representation
computer
end effector
path
Prior art date
Application number
PCT/EP2016/061171
Other languages
French (fr)
Inventor
Tommy Svensson
Original Assignee
Abb Schweiz Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Abb Schweiz Ag filed Critical Abb Schweiz Ag
Priority to PCT/EP2016/061171 priority Critical patent/WO2017198299A1/en
Publication of WO2017198299A1 publication Critical patent/WO2017198299A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1671Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/406Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by monitoring or safety
    • G05B19/4069Simulating machining process on screen
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39031Use of model for robot and for measuring device
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40121Trajectory planning in virtual space
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40131Virtual reality control, programming of manipulator
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40314Simulation of program locally before remote operation
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40515Integration of simulation and planning
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • the present disclosure generally relates to industrial robots. In particular, it relates to a method of simulating a robotic system.
  • Robotic systems are commonly used in many industrial applications, for example for welding, assembly and pick and place.
  • a robotic system typically includes a robot having an end effector including a tool such as a welding tool or a gripper.
  • the robotic system is controlled by software and is programmed to perform a number of operations, such as moving between predetermined positions, and carrying out operations such as gripping and releasing a workpiece in certain positions.
  • Robot simulators have been used in order to write, test and debug the software controlling a robotic system prior to it being tested on the robotic system.
  • US 2007/282485 A1 discloses a robot simulation apparatus capable of creating and executing a robot program including a virtual space creating unit for creating a virtual space, a workpiece model layout unit for automatically arranging at least one workpiece model in an appropriate posture at an appropriate position in a workpiece accommodation unit model defined in the virtual space.
  • the camera model acquires the virtual images of the workpiece models in the visual field of the camera model.
  • a virtual camera means displays the acquired virtual image as a screen on a display unit. Based on the virtual image displayed, the teaching points in the robot program are corrected by the correcting means.
  • the correcting means first selects an appropriate workpiece model from the virtual image and calculates the posture and position thereof.
  • the robot program describes the method by which the hand of the robot grasps a workpiece in a predetermined posture at a predetermined position.
  • the teaching points in the robot program are thus changed based on the calculated posture and position of the workpiece model so that the hand of the robot can grasp the workpiece model.
  • a computer-implemented method of simulating a robotic system in a virtual environment by means of a model of the robotic system including a representation of an end effector to which a representation of a tracking system is mounted, and a representation of a robot controller wherein the method comprises: a) obtaining a programmed nominal path, b) obtaining a tracking path in the virtual environment, to be followed by the representation of the end effector, c) detecting a tracking point along the tracking path by means of the representation of the tracking system, d) determining a location of the tracking point in the virtual environment, e) updating the programmed nominal path based on the location of the tracking point, and f) controlling movement of the representation of the end effector according to the updated programmed nominal path by means of the representation of the robot controller.
  • An effect obtainable thereby is that a tracking system may be tested in a virtual environment before use in a real environment by means of the representation thereof. Moreover, different mounting possibilities of a tracking system on the end effector may be tested, to optimise the position of the tracking system for a certain application. Furthermore, the reachability of the end effector and tracking system, with respect to a workpiece with which the end effector is to interact, may be verified.
  • One embodiment comprises repeating steps c) to f) until a final position of the programmed nominal path has been reached.
  • step d) involves d1) determining the coordinates of the tracking point in a coordinate system of the representation of the tracking system, and d2) converting the coordinates of the tracking point in the coordinate system of the representation of the tracking system to the coordinates of the tracking point in a coordinate system of the representation of the end effector, thereby determining the location of the tracking point in the virtual environment.
  • the representation of the tracking system is a model of a camera.
  • the model of the camera may advantageously correspond to an actual camera used in robot systems.
  • One embodiment comprises, prior to step c), providing a plurality of user-selectable representations of tracking systems, each being a model of an existing type of tracking system, and receiving a user-input of a selection of a representation of a tracking system of the plurality of representations of tracking systems.
  • the end effector comprises a tool.
  • the tool is a welding tool. It is especially advantageous to follow a welding tool by a tracking system in order to ensure that welding is properly performed along an entire programmed nominal path along which the end effector is to move.
  • a computer program comprising computer-executable components which when executed by processing circuitry causes a robot simulator system to perform the method of the first aspect.
  • a computer program product comprising a computer program according to the second aspect, and a storage medium on which the computer program is stored.
  • a robot simulator system for simulating a robotic system in a virtual environment by means of a model of the robotic system including a representation of an end effector to which a representation of a tracking system is mounted, and a representation of a robot controller, wherein the robot simulator system comprises: a display device, processing circuitry, and a storage medium comprising computer code, which when run on the processing circuitry causes the robot simulator system to perform the method according to the first aspect presented herein.
  • Fig. 1 schematically shows a block diagram of a robot simulation device
  • Fig. 2 schematically shows a perspective view of a screen shot of a simulated robotic system
  • Fig. 3 is a flowchart of a method of simulating a robotic system; and Figs 4a-d schematically show top views of a tracking path.
  • the present disclosure relates to a computer-implemented method of simulating a robotic system in a virtual environment, typically a three- dimensional environment.
  • the method is based on simulation utilising a model of the robotic system including a model of an industrial robot having an end effector, a model of a tracking system including a tracking sensor, and a representation of a robot controller.
  • the method is hence based on simulations of representations of an industrial robot having an end effector and a representation of the tracking system, and of the robot controller.
  • the representation of the tracking system is mounted to the end effector. To this end, simulated movement of the end effector results in concurrent movement of the tracking system.
  • the representation of the tracking system models the behaviour and operation of a real tracking system.
  • the tracking system includes a sensor.
  • the representation of the tracking system includes a representation of a sensor.
  • the tracking system may for example be an optical tracking system.
  • the model of the optical tracking system may include a representation of an electromagnetic wave emitter configured to emit simulated or virtual electromagnetic waves onto a workpiece and a representation of a sensor configured to detect simulated electromagnetic waves emitted by the representation of the electromagnetic wave emitter reflected by the
  • another alternative of a tracking system is a sonar tracking system.
  • by means of the present method, realistic simulation of a robotic system may be provided. Examples of a robot simulator system and a computer-implemented method of simulating a robotic system will now be described with reference to Figs 1 to 4d.
  • Fig. 1 shows a block diagram of a robot simulator system 1 configured to perform the computer-implemented method disclosed herein.
  • the robot simulator system 1 is configured to simulate a robotic system in a virtual environment by means of a model of the robotic system including a representation of an industrial robot having an end effector, a representation of a tracking system mounted to the end effector, and a representation of a robot controller.
  • the robot simulator system 1 may for example be a personal computer, a workstation or a dedicated simulator device solely for robot simulator use.
  • the robot simulator system 1 comprises processing circuitry 3 and storage medium 5.
  • the processing circuitry 3 is configured to communicate with the storage medium 5, for example to retrieve computer-executable components therein.
  • the storage medium 5 includes computer-executable components, i.e. a computer code, which when executed by the processing circuitry 3 causes the robot simulator system 1 to perform the computer-implemented method disclosed herein.
  • the storage medium 5 may for example be a Random Access Memory (RAM), a Flash memory or a hard disk drive.
  • the processing circuitry 3 uses any combination of one or more of a suitable central processing unit (CPU), multiprocessor, microcontroller, digital signal processor (DSP), application specific integrated circuit (ASIC), field programmable gate arrays (FPGA) etc., capable of executing any herein disclosed operations.
  • the robot simulator system 1 may furthermore comprise a display device 7.
  • the processing circuitry 3 may be configured to communicate with the display device 7.
  • the display device 7 is configured to display graphics related to the simulation of the robotic system and provided by the processing circuitry 3.
  • the display device 7 may be configured to display a virtual environment and representations of an industrial robot, an optical tracking system and workpieces associated with the simulation.
  • the simulation is based on a model of a robotic system including a model of an industrial robot, a model of a tracking system, and a model of a robot controller.
  • by means of these models, which are coded in the computer program, realistic interaction between the representations of the tracking system, the robot controller and the industrial robot can be obtained.
  • the computer-executable components or computer code contained in the storage medium 5 form the robot simulator software.
  • the robot simulator system 1 may be configured to display a graphical user interface of the robot simulator software on the display device 7. A user may thus be allowed to interact with the robot simulator software, for example by selecting certain parameters to be tested in a simulation.
  • Fig. 2 shows a view in a virtual environment 10 of a representation of a robotic system 9 as displayed by the display device 7, including a representation of an industrial robot 11 having an end effector 13, which according to the present example is a welding tool, and a representation of an optical tracking system 15, here exemplified by a camera.
  • the virtual environment 10 also includes a workpiece 17 with which the end effector 13 is interacting, in this example performing seam welding, and a tracking path 19 along which the seam welding is to be performed.
  • in a step a) a programmed nominal path is obtained.
  • the programmed nominal path may be obtained in a number of known ways, for example by manually programming the path, i.e. providing the coordinates in the coordinate system of the representation of the robot, in particular the end effector thereof.
  • in a step b) a tracking path 19 is determined in the virtual environment. The tracking path may for example be visualised on a representation of a workpiece.
  • This tracking path is to be tracked or followed by the end effector 13, which is a representation of an end effector.
  • the tracking path 19 may for example be determined by the user via the graphical user interface, where a user may select suitable parameters concerning the geographical location of the tracking path in the virtual environment, or it may be computer-generated, for example randomly.
  • the tracking path 19 may for example initially be set to be the same as the programmed nominal path. A user may then manually alter the tracking path 19 so that it deviates from the programmed nominal path before the simulation commences. Alternatively, the tracking path 19 could be left unaltered, i.e. to be identical to the programmed nominal path.
  • the tracking path 19 may differ from the programmed nominal path as the programmed nominal path may be specifically adapted to one specific workpiece that is to interact with the end effector 13.
  • the tracking path 19 can thus be utilised to simulate the differences that can occur between a programmed nominal path and an actual tracking path in a real-world situation.
  • the workpiece which the robot is to interact with may be a structure on which welding is to be performed.
  • a following or subsequent workpiece for which the same welding procedure is to be performed may differ slightly from the original workpiece in structural terms and/or the following workpiece may be placed slightly differently on the support structure in the robot cell.
  • the simulation provided by the present computer-implemented method may according to one variation be visualised, i.e. on display device 7.
  • the user would be able to see the accuracy of the tracking/following of the tracking path in real-time.
  • the tracking path is visualised in the virtual environment when it has been determined in step b).
  • the simulation may be carried out without visualisation on a display device.
  • the result of the simulation may be presented to the user, for example on a display device after the simulation has been performed. The user could then be presented with data regarding the accuracy of the tracking/following of the tracking path.
  • in a step c) a tracking point 21 is detected along the tracking path 19 by means of the representation of the tracking system 15.
  • the tracking point 21 may be any point identified along the tracking path 19 from the current position of the representation of the tracking system 15.
  • in case the tracking system 15 is an optical tracking system, image analysis of the virtual image obtained by the representation of the optical tracking system 15 is utilised to determine where the tracking path 19 is located in the virtual image captured by the representation of the optical tracking system 15. A tracking point 21 may thus be detected.
  • a plurality of user-selectable representations of tracking systems may be provided. These may be displayed in the graphical user interface on the display device 7.
  • the representations of tracking systems may be models of existing types of tracking systems, for example of different types of cameras.
  • the method may hence include a step of receiving a user-input of a selection of a representation of a tracking system of the plurality of representations of tracking systems.
  • different types of tracking systems, for example models of different types of cameras, may be tested to determine which one may best fit a specific application.
  • a user may determine and set the placement of the representation of the tracking system 15 on the end effector 13. Different placements of a tracking system may provide different results as to how surfaces of a workpiece may be efficiently scanned by the tracking system.
  • in step d) a location of the tracking point 21 is determined in the virtual environment.
  • Step d) may involve a step d1) of determining the coordinates of the tracking point 21 in a coordinate system of the representation of the tracking system, and step d2) of converting the coordinates of the tracking point 21 in the coordinate system of the representation of the tracking system 15 to the coordinates of the tracking point 21 in a coordinate system of the end effector 13. In this manner the location of the tracking point 21 may be determined in step d).
  • in a step e) the programmed nominal path is updated based on the location of the tracking point 21, and on the location of the end effector 13.
  • the programmed nominal path is thus corrected in the event that the tracking point 21 deviates from the programmed nominal path.
  • Fig. 4a shows an example of an initial programmed nominal path 23 extending between a start position A and a final position B, and a tracking path 19.
  • the programmed nominal path 23 may for example have been programmed having a specific workpiece in mind.
  • the tracking path 19 is adapted to a corresponding path to be followed, of a subsequent workpiece and deviates from the initial programmed nominal path 23 by a curved portion 19a.
  • Fig. 4b shows a field of view 25 of the representation of the tracking system 15, which according to the example is an optical tracking system, and a tracking point 21 detected along the curved portion 19a.
  • the robot controller will thus update the initial programmed nominal path 23 so that the end effector 13 follows the tracking path 19 along the curved portion 19a instead of the initial programmed nominal path 23.
  • the initial programmed nominal path 23 will thus be updated each time a virtual image is captured by the representation of the tracking system, whereby the end effector 13 is able to follow the tracking path 19 from the start position A to the final position B. This is further shown in Figs 4c and d.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • Numerical Control (AREA)

Abstract

The present disclosure relates to a computer-implemented method of simulating a robotic system in a virtual environment (10) by means of a model of the robotic system including a representation of an industrial robot (11) having an end effector (13), a representation of a tracking system (15) mounted to the end effector (13), and a representation of a robot controller. The method comprises: a) obtaining a programmed nominal path, b) visualising a tracking path (19) in the virtual environment (10), to be tracked by the end effector (13), c) detecting a tracking point (21) along the tracking path (19) by means of the representation of the tracking system (15), d) determining a location of the tracking point (21) in the virtual environment (10), e) updating the programmed nominal path based on the location of the tracking point (21), and f) controlling movement of the end effector (13) according to the updated programmed nominal path by means of the representation of the robot controller.

Description

METHOD OF SIMULATING A ROBOTIC SYSTEM
TECHNICAL FIELD
The present disclosure generally relates to industrial robots. In particular, it relates to a method of simulating a robotic system.
BACKGROUND
Robotic systems are commonly used in many industrial applications, for example for welding, assembly and pick and place.
A robotic system typically includes a robot having an end effector including a tool such as a welding tool or a gripper. The robotic system is controlled by software and is programmed to perform a number of operations, such as moving between predetermined positions, and carrying out operations such as gripping and releasing a workpiece in certain positions.
Robot simulators have been used in order to write, test and debug the software controlling a robotic system prior to it being tested on the robotic system. US 2007/282485 A1, for example, discloses a robot simulation apparatus capable of creating and executing a robot program including a virtual space creating unit for creating a virtual space, a workpiece model layout unit for automatically arranging at least one workpiece model in an appropriate posture at an appropriate position in a workpiece
accommodation unit model defined in the virtual space, a virtual camera unit for acquiring a virtual image of workpiece models in the range of a designed visual field as viewed from a designated place in the virtual space, a correction unit for correcting the teaching points in the robot program based on the virtual image, and a simulation unit for simulating the operation of the robot handling the workpieces. Thus, the camera model acquires the virtual images of the workpiece models in the visual field of the camera model. A virtual camera means displays the acquired virtual image as a screen on a display unit. Based on the virtual image displayed, the teaching points in the robot program are corrected by the correcting means. The correcting means first selects an appropriate workpiece model from the virtual image and calculates the posture and position thereof. The robot program describes the method by which the hand of the robot grasps a workpiece in a
predetermined posture at a predetermined position. The teaching points in the robot program are thus changed based on the calculated posture and position of the workpiece model so that the hand of the robot can grasp the workpiece model.
SUMMARY
The system disclosed in US 2007/282485 A1 is not suitable for certain robot simulation applications, in particular for those applications for which it is critical that the camera detects a feature such as a weld seam at all times. This is not possible by means of the system disclosed in US 2007/282485 A1, where the camera or virtual camera has a fixed position relative to the manipulator at all times. An object of the present disclosure is to provide a method of simulating a robotic system which solves or at least mitigates problems of the prior art.
There is hence according to a first aspect of the present disclosure provided a computer-implemented method of simulating a robotic system in a virtual environment by means of a model of the robotic system including a representation of an end effector to which a representation of a tracking system is mounted, and a representation of a robot controller, wherein the method comprises: a) obtaining a programmed nominal path, b) obtaining a tracking path in the virtual environment, to be followed by the representation of the end effector, c) detecting a tracking point along the tracking path by means of the representation of the tracking system, d) determining a location of the tracking point in the virtual environment, e) updating the programmed nominal path based on the location of the tracking point, and f) controlling movement of the representation of the end effector according to the updated programmed nominal path by means of the representation of the robot controller. An effect obtainable thereby is that a tracking system may be tested in a virtual environment before use in a real environment by means of the representation thereof. Moreover, different mounting possibilities of a tracking system on the end effector may be tested, to optimise the position of the tracking system for a certain application. Furthermore, the reachability of the end effector and tracking system, with respect to a workpiece with which the end effector is to interact, may be verified.
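As an illustration of how steps a) to f) could be exercised in software, the sketch below runs a deliberately simplified two-dimensional version of the cycle: paths are lists of (x, y) points, the tracking system is reduced to a nearest-point lookup, and the controller simply steers each target onto the detected tracking point. It is a minimal sketch under these assumptions, not the simulator described in the disclosure; all function names are hypothetical.

```python
# Minimal 2-D sketch of the cycle a)-f); all names are illustrative assumptions,
# not an API from the disclosure.

def detect_tracking_point(position, tracking_path):
    """Steps c)/d): return the tracking-path point closest to the given position."""
    return min(tracking_path,
               key=lambda p: (p[0] - position[0]) ** 2 + (p[1] - position[1]) ** 2)

def simulate(nominal_path, tracking_path):
    """Steps a)-f): follow the nominal path while correcting it towards the tracking path."""
    followed = []
    for target in nominal_path:                               # f) move along the programmed path
        point = detect_tracking_point(target, tracking_path)  # c), d) detect and locate a tracking point
        followed.append(point)                                # e) corrected target, f) move there
    return followed

# a) a straight programmed nominal path; b) a tracking path with a small lateral deviation.
nominal = [(float(x), 0.0) for x in range(6)]
tracking = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.3), (3.0, 0.3), (4.0, 0.0), (5.0, 0.0)]
print(simulate(nominal, tracking))
```

In this toy version the simulated end effector ends up following the deviating portion of the tracking path rather than the original straight path, mirroring the behaviour later illustrated in Figs 4a-d.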
It may thereby be possible to reduce costs during the phase of robot cell planning and design, and to save time during commissioning of the robot cell.
One embodiment comprises repeating steps c) to f) until a final position of the programmed nominal path has been reached.
According to one embodiment step d) involves d1) determining the coordinates of the tracking point in a coordinate system of the representation of the tracking system, and d2) converting the coordinates of the tracking point in the coordinate system of the representation of the tracking system to the coordinates of the tracking point in a coordinate system of the
representation of the end effector, thereby determining the location of the tracking point in the virtual environment. According to one embodiment the representation of the tracking system is a model of a camera. The model of the camera may advantageously correspond to an actual camera used in robot systems.
One embodiment comprises, prior to step c), providing a plurality of user-selectable representations of tracking systems, each being a model of an existing type of tracking system, and receiving a user-input of a selection of a representation of a tracking system of the plurality of representations of tracking systems. Hereby, a number of different types of tracking system may be tested and evaluated in order to determine an optimal tracking system type for a certain application. According to one embodiment the end effector comprises a tool.
According to one embodiment the tool is a welding tool. It is especially advantageous to follow a welding tool by a tracking system in order to ensure that welding is properly performed along an entire programmed nominal path along which the end effector is to move.
There is according to a second aspect of the present disclosure provided a computer program comprising computer-executable components which when executed by processing circuitry causes a robot simulator system to perform the method of the first aspect. There is according to a third aspect provided a computer program product comprising a computer program according to the second aspect, and a storage medium on which the computer program is stored.
There is according to a fourth aspect provided a robot simulator system for simulating a robotic system in a virtual environment by means of a model of the robotic system including a representation of an end effector to which a representation of a tracking system is mounted, and a representation of a robot controller, wherein the robot simulator system comprises: a display device, processing circuitry, and a storage medium comprising computer code, which when run on the processing circuitry causes the robot simulator system to perform the method according to the first aspect presented herein.
Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to "a/an/the element, apparatus, component, means, etc." are to be interpreted openly as referring to at least one instance of the element, apparatus, component, means, etc., unless explicitly stated otherwise.
BRIEF DESCRIPTION OF THE DRAWINGS
The specific embodiments of the inventive concept will now be described, by way of example, with reference to the accompanying drawings, in which: Fig. 1 schematically shows a block diagram of a robot simulation device;
Fig. 2 schematically shows a perspective view of a screen shot of a simulated robotic system;
Fig. 3 is a flowchart of a method of simulating a robotic system; and Figs 4a-d schematically show top views of a tracking path. DETAILED DESCRIPTION
The inventive concept will now be described more fully hereinafter with reference to the accompanying drawings, in which exemplifying
embodiments are shown. The inventive concept may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the inventive concept to those skilled in the art. Like numbers refer to like elements throughout the description. The present disclosure relates to a computer-implemented method of simulating a robotic system in a virtual environment, typically a three-dimensional environment. The method is based on simulation utilising a model of the robotic system including a model of an industrial robot having an end effector, a model of a tracking system including a tracking sensor, and a representation of a robot controller. The method is hence based on simulations of representations of an industrial robot having an end effector and a representation of the tracking system, and of the robot controller. The representation of the tracking system is mounted to the end effector. To this end, simulated movement of the end effector results in concurrent movement of the tracking system.
The representation of the tracking system models the behaviour and operation of a real tracking system. The tracking system includes a sensor. To be more precise, the representation of the tracking system includes a representation of a sensor. The tracking system may for example be an optical tracking system. In this case, the model of the optical tracking system may include a representation of an electromagnetic wave emitter configured to emit simulated or virtual electromagnetic waves onto a workpiece and a representation of a sensor configured to detect simulated electromagnetic waves emitted by the representation of the electromagnetic wave emitter reflected by the
workpiece. Another alternative of a tracking system is a sonar tracking system. By means of the present method, realistic simulation of a robotic system may be provided. Examples of a robot simulator system and a computer-implemented method of simulating a robotic system will now be described with reference to Figs 1 to 4d.
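To make the emitter-and-sensor idea above more concrete, the fragment below sketches one way a virtual optical tracking sensor could be modelled: a simulated ray is cast from an emitter onto a planar workpiece surface, and the intersection point is what the simulated sensor "sees". The planar workpiece, the numeric values and the function name are assumptions made for illustration; the disclosure does not prescribe a particular sensor model.

```python
import numpy as np

# Hedged sketch of a virtual optical tracking sensor: an emitter casts a simulated
# ray onto a planar workpiece and the sensor reports the hit point (or None).

def reflect_off_plane(ray_origin, ray_direction, plane_point, plane_normal):
    """Return the point where a simulated ray hits a planar workpiece, or None."""
    ray_direction = ray_direction / np.linalg.norm(ray_direction)
    denom = np.dot(plane_normal, ray_direction)
    if abs(denom) < 1e-9:                      # ray parallel to the surface: no reflection
        return None
    t = np.dot(plane_normal, plane_point - ray_origin) / denom
    return ray_origin + t * ray_direction if t >= 0 else None

# Emitter 0.2 m above the workpiece plane z = 0, pointing straight down.
hit = reflect_off_plane(np.array([0.0, 0.0, 0.2]), np.array([0.0, 0.0, -1.0]),
                        np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]))
print(hit)   # -> [0. 0. 0.]
```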
Fig. 1 shows a block diagram of a robot simulator system 1 configured to perform the computer-implemented method disclosed herein. The robot simulator system 1 is configured to simulate a robotic system in a virtual environment by means of a model of the robotic system including a
representation of an industrial robot having an end effector, a representation of a tracking system mounted to the end effector, and a representation of a robot controller. The robot simulator system 1 may for example be a personal computer, a workstation or a dedicated simulator device solely for robot simulator use.
The robot simulator system 1 comprises processing circuitry 3 and storage medium 5. The processing circuitry 3 is configured to communicate with the storage medium 5, for example to retrieve computer-executable components therein. The storage medium 5 includes computer-executable components, i.e. a computer code, which when executed by the processing circuitry 3 causes the robot simulator system 1 to perform the computer-implemented method disclosed herein.
The storage medium 5 may for example be a Random Access Memory (RAM), a Flash memory or a hard disk drive. The processing circuitry 3 uses any combination of one or more of a suitable central processing unit (CPU), multiprocessor, microcontroller, digital signal processor (DSP), application specific integrated circuit (ASIC), field programmable gate arrays (FPGA) etc., capable of executing any herein disclosed operations. The robot simulator system 1 may furthermore comprise a display device 7. The processing circuitry 3 may be configured to communicate with the display device 7. The display device 7 is configured to display graphics related to the simulation of the robotic system and provided by the processing circuitry 3. In particular, the display device 7 may be configured to display a virtual environment and representations of an industrial robot, an optical tracking system and workpieces associated with the simulation.
The simulation is based on a model of a robotic system including a model of an industrial robot, a model of a tracking system, and a model of a robot controller. By means of these models, which are coded in the computer program, realistic interaction between the representations of the tracking system, the robot controller and the industrial robot can be obtained.
The computer-executable components or computer code contained in the storage medium 5 form the robot simulator software. The robot simulator system 1 may be configured to display a graphical user interface of the robot simulator software on the display device 7. A user may thus be allowed to interact with the robot simulator software, for example by selecting certain parameters to be tested in a simulation.
Fig. 2 shows a view in a virtual environment 10 of a representation of a robotic system 9 as displayed by the display device 7, including a
representation of an industrial robot 11 having an end effector 13 which according to the present example is a tool, namely a welding tool, and a representation of an optical tracking system 15, which in the present illustration is exemplified by a camera. The virtual environment 10 also includes a workpiece 17 with which the end effector 13 is interacting, in this example performing seam welding, and a tracking path 19 along which the seam welding is to be performed.
A computer-implemented method of simulating a robotic system by means of the robot simulator system 1 will now be described with reference to Fig. 3. In a step a) a programmed nominal path is obtained. The programmed nominal path may be obtained in a number of known ways, for example by manually programming the path, i.e. providing the coordinates in the coordinate system of the representation of the robot, in particular the end effector thereof. In a step b) a tracking path 19 is determined in the virtual environment. The tracking path may for example be visualised on a representation of a workpiece.
This tracking path is to be tracked or followed by the end effector 13, which is a representation of an end effector. The tracking path 19 may for example be determined by the user via the graphical user interface, where a user may select suitable parameters concerning the geographical location of the tracking path in the virtual environment, or it may be computer-generated, for example randomly. The tracking path 19 may for example initially be set to be the same as the programmed nominal path. A user may then manually alter the tracking path 19 so that it deviates from the programmed nominal path before the simulation commences. Alternatively, the tracking path 19 could be left unaltered, i.e. to be identical to the programmed nominal path.
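The paragraph above notes that the tracking path can start out identical to the programmed nominal path and then be altered, manually or by the computer, so that it deviates. A minimal sketch of such a computer-generated deviation, under the assumption that paths are simple lists of 2-D points, could look as follows; the function name and offset magnitude are illustrative only.

```python
import random

# Illustrative sketch only: derive a deviating tracking path from the programmed
# nominal path by adding a small random lateral offset to each point (units: metres).

def perturb_path(nominal_path, max_offset=0.005, seed=None):
    """Return a tracking path that deviates laterally from the nominal path."""
    rng = random.Random(seed)
    return [(x, y + rng.uniform(-max_offset, max_offset)) for x, y in nominal_path]

nominal = [(0.1 * i, 0.0) for i in range(10)]   # straight 2-D programmed nominal path
tracking = perturb_path(nominal, seed=42)       # deviating tracking path for the simulation
print(tracking[:3])
```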
In view of the above, the tracking path 19 may differ from the programmed nominal path as the programmed nominal path may be specifically adapted to one specific workpiece that is to interact with the end effector 13. The tracking path 19 can thus be utilised to simulate the differences that can occur between a programmed nominal path and an actual tracking path in a real-world situation. As an example, in a real-world situation, the workpiece which the robot is to interact with may be a structure on which welding is to be performed. However, a following or subsequent workpiece for which the same welding procedure is to be performed may differ slightly from the original workpiece in structural terms and/or the following workpiece may be placed slightly differently on the support structure in the robot cell. As a result, the programmed nominal path must be corrected during the welding procedure utilising a tracking system and a robot controller, and this correction and updating occurs each time a new image is captured by the tracking system. This procedure is simulated by the present method, as will be understood from the following.
It is further to be noted that the simulation provided by the present computer-implemented method may according to one variation be visualised, i.e. on display device 7. In this case, the user would be able to see the accuracy of the tracking/following of the tracking path in real-time. According to this variation, the tracking path is visualised in the virtual environment when it has been determined in step b). According to another variation, the simulation may be carried out without visualisation on a display device. In this case, the result of the simulation may be presented to the user, for example on a display device after the simulation has been performed. The user could then be presented with data regarding the accuracy of the tracking/following of the tracking path. In a step c) a tracking point 21 is detected along the tracking path 19 by means of the representation of the tracking system 15. The tracking point 21 may be any point identified along the tracking path 19 from the current position of the representation of the tracking system 15. In case the tracking system 15 is an optical tracking system, image analysis of the virtual image obtained by the representation of the optical tracking system 15 is utilised to determine where the tracking path 19 is located in the virtual image captured by the representation of the optical tracking system 15. A tracking point 21 may thus be detected.
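The image-analysis step described above can be pictured with the small sketch below: given a synthetic binary "virtual image" in which seam pixels are set to 1, the tracking point is taken as the seam pixel nearest the image centre. This is an assumed, much-simplified stand-in for the image analysis a seam-tracking camera would actually perform; the function name and the nearest-to-centre criterion are illustrative choices, not part of the disclosure.

```python
import numpy as np

# Hedged sketch: locate the tracking path in a synthetic binary "virtual image"
# (seam pixels = 1) by picking the seam pixel closest to the image centre.

def detect_tracking_point(virtual_image):
    """Return (row, col) of the seam pixel nearest the image centre, or None if unseen."""
    rows, cols = np.nonzero(virtual_image)
    if rows.size == 0:
        return None                                  # seam outside the field of view
    centre_r, centre_c = virtual_image.shape[0] / 2.0, virtual_image.shape[1] / 2.0
    d = (rows - centre_r) ** 2 + (cols - centre_c) ** 2
    i = int(np.argmin(d))
    return int(rows[i]), int(cols[i])

image = np.zeros((8, 8), dtype=np.uint8)
image[:, 5] = 1                                      # a vertical seam in image column 5
print(detect_tracking_point(image))                  # -> (4, 5)
```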
Prior to step c), for example before or after any of steps a) and b), a plurality of user-selectable representations of tracking systems may be provided. These may be displayed in the graphical user interface on the display device 7. According to this variation, the representations of tracking systems may be models of existing types of tracking systems, for example of different types of cameras. The method may hence include a step of receiving a user-input of a selection of a representation of a tracking system of the plurality of representations of tracking systems. Thereby, different types of tracking systems, for example models of different types of cameras, may be tested to determine which one may best fit a specific application. Moreover, according to one variation, a user may determine and set the placement of the representation of the tracking system 15 on the end effector 13. Different placements of a tracking system may provide different results as to how surfaces of a workpiece may be efficiently scanned by the tracking system. In step d) a location of the tracking point 21 is determined in the virtual environment.
Step d) may involve a step d1) of determining the coordinates of the tracking point 21 in a coordinate system of the representation of the tracking system, and step d2) of converting the coordinates of the tracking point 21 in the coordinate system of the representation of the tracking system 15 to the coordinates of the tracking point 21 in a coordinate system of the end effector 13. In this manner the location of the tracking point 21 may be determined in step d).
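A common way to implement the conversion in steps d1) and d2) is a fixed homogeneous transform describing where the tracking system is mounted on the end effector. The sketch below assumes such a 4x4 transform and a 5 cm mounting offset purely for illustration; neither the numbers nor the function name come from the disclosure.

```python
import numpy as np

# Sketch of steps d1)/d2): convert a tracking point from the tracking-system frame
# to the end-effector frame with a fixed homogeneous mounting transform (assumed values).

def to_end_effector_frame(point_in_sensor_frame, T_ee_from_sensor):
    """Express a 3-D point, given in the tracking-system frame, in the end-effector frame."""
    p = np.append(point_in_sensor_frame, 1.0)        # homogeneous coordinates
    return (T_ee_from_sensor @ p)[:3]

T_ee_from_sensor = np.eye(4)                         # assumed mounting: 5 cm offset, no rotation
T_ee_from_sensor[0, 3] = 0.05
print(to_end_effector_frame(np.array([0.0, 0.0, 0.2]), T_ee_from_sensor))   # -> [0.05 0.   0.2 ]
```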
In a step e) the programmed nominal path is updated based on the location of the tracking point 21, and on the location of the end effector 13. The programmed nominal path is thus corrected in the event that the tracking point 21 deviates from the programmed nominal path.
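The correction in step e) can be illustrated with the simple strategy below: the remaining part of the programmed nominal path is shifted by the offset between the detected tracking point and the nearest remaining nominal point. A real robot controller would apply a more elaborate correction; the function and the offset-shift strategy are assumptions for illustration.

```python
import numpy as np

# Illustrative sketch of step e): shift the remaining nominal path by the deviation
# between the detected tracking point and the nearest remaining nominal point.

def update_nominal_path(nominal_path, tracking_point, current_index):
    """Return a corrected copy of the nominal path from current_index onwards."""
    path = np.asarray(nominal_path, dtype=float)
    remaining = path[current_index:]
    nearest = remaining[np.argmin(np.linalg.norm(remaining - tracking_point, axis=1))]
    corrected = path.copy()
    corrected[current_index:] += np.asarray(tracking_point, dtype=float) - nearest
    return corrected

nominal = [(0.0, 0.0), (0.1, 0.0), (0.2, 0.0), (0.3, 0.0)]
print(update_nominal_path(nominal, (0.2, 0.02), current_index=2))
```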
In a step f) movement of the end effector 13 is controlled by the
representation of the robot controller according to the updated programmed nominal path. This procedure, in particular steps c) to f), is repeated until the final position of the programmed nominal path has been reached.
With reference to Figs 4a-d a simple example of the computer-implemented method will now be described. Fig. 4a shows an example of an initial programmed nominal path 23 extending between a start position A and a final position B, and a tracking path 19. The programmed nominal path 23 may for example have been programmed having a specific workpiece in mind. According to the example the tracking path 19 is adapted to a corresponding path to be followed, of a subsequent workpiece and deviates from the initial programmed nominal path 23 by a curved portion 19a. Fig. 4b shows a field of view 25 of the representation of the tracking system 15, which according to the example is an optical tracking system, and a tracking point 21 detected along the curved portion 19a. The robot controller will thus update the initial programmed nominal path 23 so that the end effector 13 follows the tracking path 19 along the curved portion 19a instead of the initial programmed nominal path 23. The initial programmed nominal path 23 will thus be updated each time a virtual image is captured by the representation of the tracking system, whereby the end effector 13 is able to follow the tracking path 19 from the start position A to the final position B. This is further shown in Figs 4c and d.
The inventive concept has mainly been described above with reference to a few examples. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the inventive concept, as defined by the appended claims.

Claims

1. A computer-implemented method of simulating a robotic system in a virtual environment (10) by means of a model of the robotic system including a representation of an end effector (13) to which a representation of a tracking system (15) is mounted, and a representation of a robot controller, wherein the method comprises: a) obtaining a programmed nominal path, b) determining a tracking path (19) in the virtual environment (10), to be followed by the representation of the end effector (13), c) detecting a tracking point (21) along the tracking path (19) by means of the representation of the tracking system (15), d) determining a location of the tracking point (21) in the virtual environment (10), e) updating the programmed nominal path based on the location of the tracking point (21), and f) controlling movement of the representation of the end effector (13) according to the updated programmed nominal path by means of the representation of the robot controller.
2. The computer-implemented method as claimed in claim 1, comprising repeating steps c) to f) until a final position of the programmed nominal path has been reached.
3. The computer-implemented method as claimed in claim 1 or 2, wherein step d) involves: d1) determining the coordinates of the tracking point (21) in a coordinate system of the representation of the tracking system (15), and d2) converting the coordinates of the tracking point (21) in the coordinate system of the representation of the tracking system (15) to the coordinates of the tracking point in a coordinate system of the representation of the end effector (13), thereby determining the location of the tracking point (21) in the virtual environment (10).
4. The computer-implemented method as claimed in any of the preceding claims, wherein the representation of the tracking system (15) is a model of a camera.
5. The computer-implemented method as claimed in any of the preceding claims, comprising, prior to step c), providing a plurality of user-selectable representations of tracking systems (15), each being a model of an existing type of tracking system, and receiving a user-input of a selection of a representation of a tracking system (15) of the plurality of representations of tracking systems (15).
6. The computer-implemented method as claimed in any of the preceding claims, wherein the representation of the end effector (13) comprises a tool.
7. The computer-implemented method as claimed in claim 6, wherein the tool is a welding tool.
8. A computer program comprising computer-executable components which when executed by processing circuitry causes a robot simulator system (1) to perform the method as claimed in any of claims 1-7.
9. A computer program product comprising a computer program
according to claim 8, and a storage medium (5) on which the computer program is stored.
10. A robot simulator system (1) for simulating a robotic system in a virtual environment (10) by means of a model of the robotic system including a representation of an end effector (13) to which a representation of a tracking system (15) is mounted, and a representation of a robot controller, wherein the robot simulator system (1) comprises: a display device (7), processing circuitry (3), and a storage medium (5) comprising computer code, which when run on the processing circuitry (3) causes the robot simulator system (1) to perform the method as claimed in any of claims 1-7.
PCT/EP2016/061171 2016-05-19 2016-05-19 Method of simulating a robotic system WO2017198299A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2016/061171 WO2017198299A1 (en) 2016-05-19 2016-05-19 Method of simulating a robotic system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2016/061171 WO2017198299A1 (en) 2016-05-19 2016-05-19 Method of simulating a robotic system

Publications (1)

Publication Number Publication Date
WO2017198299A1 true WO2017198299A1 (en) 2017-11-23

Family

ID=56024294

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/061171 WO2017198299A1 (en) 2016-05-19 2016-05-19 Method of simulating a robotic system

Country Status (1)

Country Link
WO (1) WO2017198299A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109807891A (en) * 2019-01-31 2019-05-28 北京华航唯实机器人科技股份有限公司 Equipment moving processing method and processing device
EP3741515A1 (en) * 2019-05-20 2020-11-25 Siemens Aktiengesellschaft Method for controlling a robot with a simulated robot
CN112440018A (en) * 2019-09-04 2021-03-05 中冶赛迪技术研究中心有限公司 Welding system and welding method
CN112638594A (en) * 2018-09-10 2021-04-09 发纳科美国公司 Zero teaching of a continuous path of a robot
CN112703092A (en) * 2018-07-16 2021-04-23 快砖知识产权私人有限公司 Backup tracking for interactive systems
WO2022170572A1 (en) * 2021-02-10 2022-08-18 Abb Schweiz Ag Method and apparatus for tuning robot path for processing workpiece
WO2023202031A1 (en) * 2022-04-22 2023-10-26 奇瑞新能源汽车股份有限公司 Welding method and apparatus, and electronic device and computer-readable storage medium
EP4205920A4 (en) * 2020-08-27 2024-10-23 Kyocera Corp Robot control device, robot control system, and robot control method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4380696A (en) * 1980-11-12 1983-04-19 Unimation, Inc. Method and apparatus for manipulator welding apparatus with vision correction for workpiece sensing
US5570458A (en) * 1994-03-29 1996-10-29 Nippon Telegraph And Telephone Corporation Manipulator tracking apparatus and method for accurately tracking a path for sensors in low reliability conditions
EP1527850A2 (en) * 2003-10-31 2005-05-04 Fanuc Ltd Simulation apparatus

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112703092A (en) * 2018-07-16 2021-04-23 快砖知识产权私人有限公司 Backup tracking for interactive systems
CN112638594A (en) * 2018-09-10 2021-04-09 发纳科美国公司 Zero teaching of a continuous path of a robot
CN112638594B (en) * 2018-09-10 2024-05-24 发纳科美国公司 Zero teaching of continuous paths of robots
CN109807891A (en) * 2019-01-31 2019-05-28 北京华航唯实机器人科技股份有限公司 Equipment moving processing method and processing device
CN109807891B (en) * 2019-01-31 2020-02-07 北京华航唯实机器人科技股份有限公司 Equipment motion processing method and device
EP3741515A1 (en) * 2019-05-20 2020-11-25 Siemens Aktiengesellschaft Method for controlling a robot with a simulated robot
CN112440018A (en) * 2019-09-04 2021-03-05 中冶赛迪技术研究中心有限公司 Welding system and welding method
CN112440018B (en) * 2019-09-04 2023-08-11 中冶赛迪技术研究中心有限公司 Welding system and welding method
EP4205920A4 (en) * 2020-08-27 2024-10-23 Kyocera Corp Robot control device, robot control system, and robot control method
WO2022170572A1 (en) * 2021-02-10 2022-08-18 Abb Schweiz Ag Method and apparatus for tuning robot path for processing workpiece
WO2023202031A1 (en) * 2022-04-22 2023-10-26 奇瑞新能源汽车股份有限公司 Welding method and apparatus, and electronic device and computer-readable storage medium

Similar Documents

Publication Publication Date Title
WO2017198299A1 (en) Method of simulating a robotic system
JP4153528B2 (en) Apparatus, program, recording medium and method for robot simulation
Makhal et al. Reuleaux: Robot base placement by reachability analysis
US9387589B2 (en) Visual debugging of robotic tasks
US9352467B2 (en) Robot programming apparatus for creating robot program for capturing image of workpiece
JP6449534B2 (en) Teaching point program selection method for robot simulator
US11648683B2 (en) Autonomous welding robots
US20070293986A1 (en) Robot simulation apparatus
JP5113666B2 (en) Robot teaching system and display method of robot operation simulation result
JP5561384B2 (en) Recognition program evaluation apparatus and recognition program evaluation method
CN114599488B (en) Machine learning data generation device, machine learning device, work system, computer program, machine learning data generation method, and work machine manufacturing method
US11813754B2 (en) Grabbing method and device for industrial robot, computer storage medium, and industrial robot
CN107848117B (en) Robot system and control method
Andersson et al. AR-enhanced human-robot-interaction-methodologies, algorithms, tools
Fang et al. Robot programming using augmented reality
WO2023205209A1 (en) Autonomous assembly robots
JP2004280635A (en) Simulation device, simulation method, and simulation program
Gong et al. Projection-based augmented reality interface for robot grasping tasks
EP4088883A1 (en) Method and system for predicting a collision free posture of a kinematic system
WO2022119652A1 (en) Generating robotic control plans
Bulej et al. Simulation of manipulation task using iRVision aided robot control in Fanuc RoboGuide software
US20220301209A1 (en) Device and method for training a neural network for controlling a robot
JP2023505322A (en) Method and system for programming robots
WO2017032407A1 (en) An industrial robot system and a method for programming an industrial robot
US20220043455A1 (en) Preparing robotic operating environments for execution of robotic control plans

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16723741

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 16723741

Country of ref document: EP

Kind code of ref document: A1