WO2016193058A1 - Method for determining a path point - Google Patents

Method for determining a path point (original title: Procédé pour déterminer un point de trajectoire)

Info

Publication number
WO2016193058A1
WO2016193058A1 (PCT/EP2016/061667)
Authority
WO
WIPO (PCT)
Prior art keywords
environment
determining
temperature
temperature profile
point
Prior art date
Application number
PCT/EP2016/061667
Other languages
German (de)
English (en)
Inventor
Jacob Saoumi
Original Assignee
Kuka Roboter Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kuka Roboter Gmbh filed Critical Kuka Roboter Gmbh
Priority to CN201680031527.8A priority Critical patent/CN107666988A/zh
Priority to EP16725497.8A priority patent/EP3302895A1/fr
Publication of WO2016193058A1 publication Critical patent/WO2016193058A1/fr

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/35 Nc in input of data, input till input file format
    • G05B2219/35444 Gesture interface, controlled machine observes operator, executes commands
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/37 Measurements
    • G05B2219/37266 Infrared
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/37 Measurements
    • G05B2219/37426 Detected with infrared sensor
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/37 Measurements
    • G05B2219/37567 3-D vision, stereo vision, with two cameras
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G05B2219/39441 Voice command, camera detects object, grasp, move

Definitions

  • the present invention relates to a method for determining a path point for robot programming, and further to a corresponding system for determining such a path point.
  • Robots (and in particular industrial robots) are freely programmable machines.
  • the actual robot mechanism is commonly referred to as a manipulator, which may consist of a plurality of movable members or axes.
  • Targeted control of motors (and in particular servomotors) allows the individual axes to be moved.
  • a corresponding manipulator typically travels a trajectory or path, with the individual points of the path dictated by a robot program or path planning.
  • path planning generally describes the determination of a trajectory or path for the manipulator.
  • For programming a robot program, the positions of individual path points, which are to be approached by the manipulator, must first be recorded or specified. The remaining points, or the detailed path movement of the manipulator, are then calculated by means of an algorithm, for example by means of point-to-point path planning or so-called spline path planning. This setting of individual path points, the "teaching" of path points, or the setting of a trajectory, is a time-consuming step in robot programming.
  • In principle, a distinction can be made between two different approaches: online and offline programming.
  • In offline programming, individual path points can be generated by means of a CAD model or a simulation tool.
  • the manipulator itself is not required for program creation, because the movements of the manipulator can be determined using virtual three-dimensional simulation environments.
  • In online programming, on the other hand, programming takes place directly on the manipulator. For this purpose, for example, special handheld programming devices can be used to guide a manipulator directly from one point to another while the approached path points are recorded and stored. However, this procedure is very time-consuming and not intuitive. Furthermore, a playback method is known as another variant of online programming. In this case, the programmer moves the manipulator by directly guiding it along the later trajectory, while individual points are recorded.
  • This is also referred to as programming by demonstration (German: "Programmieren durch Vormachen", PdV).
  • the present invention relates to a method for determining a path point for a robot program or for robot programming.
  • the inventive method can thus be used to determine one or more path points that can be used for robot programming.
  • robot programming here comprises predetermining a sequence of movements of a robot or manipulator corresponding to one or more path points.
  • the determined path points can, for example, be integrated directly into a robot program or a path planning, or be used to create one.
  • the method can preferably be used during offline or online programming of a manipulator.
  • the method includes detecting a first temperature profile of an environment, and determining a path point using the detected first temperature profile. Preferably, more than one path point can be determined, for example if a path consisting of several path points is to be determined.
  • the detection of the temperature profile preferably takes place by means of a temperature measuring device, and further preferably by means of a thermal imaging camera.
  • a thermal signature which is present on a surface of the environment can be detected, and this thermal signature used to determine the path point.
  • a complex simulation or direct guidance of a manipulator is advantageously not necessary.
  • the term "environment" encompasses objects in the surroundings of the manipulator; for example, workpieces which are to be processed by means of a manipulator (and a corresponding robot program) fall under the concept of "environment".
  • the environment may also include planar or textured surfaces, such as walls, floors, tabletops, molds, and so forth.
  • the environment may also include one or more manipulators.
  • the method further comprises generating a temperature change on the environment, wherein said step of generating is preferably performed by a user, and in particular by a human. Further preferably, the temperature change is generated by touching the environment with a finger. In particular, the detected first temperature profile of the environment is preferably characteristic of the temperature change generated by the user.
  • the method according to the invention thus makes it possible to define corresponding path points by touching the environment.
  • traces of heat can be detected and processed in order to determine a path point.
  • These traces of heat are generated, for example, by a user touching a surface. Since the body temperature of a human (about 37 °C) usually differs from the ambient temperature, touching the environment locally heats (or locally cools) it.
  • the method according to the invention thus provides a simple and intuitive method for determining path points.
  • determining the path point preferably comprises determining the temperature change generated on the environment. For example, simple detection algorithms can be used to detect heat traces or single heat points. Depending on the components used (such as the thermal imaging camera) and the properties of the touched surface, the finger touch should be sufficiently long to allow an unambiguous determination of the path point.
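As a sketch of such a simple detection algorithm: assuming the thermal imaging camera delivers a 2-D array of per-pixel temperatures, heat points can be found by thresholding the deviation from ambient temperature and grouping the remaining pixels into blobs. Function name, thresholds, and the median-based ambient estimate are illustrative, not from the patent:

```python
import numpy as np

def detect_heat_points(thermal, ambient=None, delta=2.0, min_pixels=4):
    """Find locally heated (or cooled) spots in a thermal image.

    thermal: 2-D array of temperatures in deg C (one value per pixel).
    ambient: assumed background temperature; estimated as the image
             median if not given.
    delta:   minimum deviation from ambient to count as "touched".
    Returns a list of (row, col) centroids of connected hot/cold blobs.
    """
    if ambient is None:
        ambient = float(np.median(thermal))
    mask = np.abs(thermal - ambient) >= delta

    # Simple 4-connected flood fill to group marked pixels into blobs.
    visited = np.zeros_like(mask, dtype=bool)
    centroids = []
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not visited[r, c]:
                stack, blob = [(r, c)], []
                visited[r, c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                if len(blob) >= min_pixels:  # reject single-pixel noise
                    ys, xs = zip(*blob)
                    centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centroids
```

A longer touch enlarges the blob, which is why the text notes the touch should last long enough for an unambiguous detection.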
  • generating the temperature change preferably includes drawing a sign or gesture on the environment.
  • determining the path point preferably comprises associating it with a predefined robot action.
  • the signs or gestures can be recognized and assigned to a corresponding robot action in the robot program.
  • the user can intuitively define certain robotic actions, such as starting or stopping a measurement.
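The assignment of a recognized sign or gesture to a predefined robot action can be as simple as a lookup table; the gesture names and action identifiers below are purely hypothetical, chosen only to mirror the "start or stop a measurement" example:

```python
# Hypothetical mapping from recognized touch gestures to robot actions;
# neither the gesture names nor the action identifiers come from the patent.
GESTURE_ACTIONS = {
    "single_tap": "set_path_point",
    "double_tap": "start_measurement",
    "cross":      "stop_measurement",
}

def action_for_gesture(gesture):
    """Return the predefined robot action for a recognized gesture,
    or None if the gesture has no assigned action."""
    return GESTURE_ACTIONS.get(gesture)
```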
  • the method further comprises detecting a second temperature profile of the environment.
  • the determination of the path point preferably takes place using the detected second temperature profile.
  • the second temperature profile is different from the first temperature profile, for example due to a user acting on the environment.
  • For example, a first temperature profile of the environment can be detected, followed by a temperature change on the environment caused by a user, for example by the user touching a point of the environment with his finger.
  • Then a second temperature profile of the environment is detected. Based on the two recorded temperature profiles, or using them, the path point is determined which corresponds to the location that was touched with the finger. Further preferably, determining the path point comprises comparing the first and second temperature profiles.
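The comparison of the two temperature profiles can be sketched as a simple frame difference: the pixel with the largest temperature change marks the touched location. A minimal sketch, assuming both profiles are 2-D temperature arrays of equal shape (function name and threshold are illustrative):

```python
import numpy as np

def path_point_from_profiles(profile_before, profile_after, delta=2.0):
    """Locate the touched spot by comparing two temperature profiles.

    profile_before / profile_after: 2-D temperature arrays captured
    before and after the user touched the environment.
    Returns the (row, col) of the pixel with the largest absolute
    temperature change, or None if no pixel changed by at least delta.
    """
    diff = np.abs(profile_after - profile_before)
    idx = np.unravel_index(np.argmax(diff), diff.shape)
    if diff[idx] < delta:
        return None  # no significant touch detected
    return (int(idx[0]), int(idx[1]))
```

Differencing also suppresses static warm spots (lamps, motors) that are present in both profiles.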
  • the method further comprises capturing a three-dimensional image of the environment, and determining a three-dimensional structural profile of the environment based on the captured three-dimensional image.
  • a temperature profile and a three-dimensional image of the environment are detected.
  • the three-dimensional structure profile contains information about a spatial arrangement of the environment, such as a height profile.
  • the determination of the path point takes place using the determined three-dimensional structure profile of the environment. This makes it possible to detect heat traces even on surfaces whose geometry is not known beforehand. With the help of the three-dimensional image, it can be recognized which three-dimensional coordinates of the space or environment are to be used for determining the path point. Further preferably, the three-dimensional image is detected by means of a 3D camera.
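To turn a heat point detected in the image into a three-dimensional coordinate, the pixel can be back-projected using the depth measured by the 3D camera. The sketch below assumes a standard pinhole camera model with known intrinsics (fx, fy, cx, cy); the patent does not specify a camera model, so this is only one plausible realisation:

```python
def pixel_to_3d(u, v, depth, fx, fy, cx, cy):
    """Back-project an image pixel (u, v) with measured depth (metres)
    into camera-frame 3-D coordinates using a pinhole camera model.

    fx, fy: focal lengths in pixels; cx, cy: principal point.
    (Names follow the usual pinhole-intrinsics convention; concrete
    values depend on the 3D camera used and its calibration.)
    """
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

A hand-eye calibration between the thermal and depth cameras is additionally needed in practice so both images refer to the same pixel grid.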
  • the method further comprises providing the determined path point for a path planning.
  • the determined path point can preferably be provided in particular for spline path planning or point-to-point path planning.
  • the determined path point can also be used directly to control a manipulator accordingly. For example, a user can touch an object, and this object can then be gripped by a manipulator by means of the method according to the invention.
  • Complex teaching of the robot movement is not necessary; rather, the present invention allows intuitive setting of path points.
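Once a few path points have been taught, point-to-point or spline planning expands them into a dense trajectory. A minimal sketch of linear point-to-point interpolation, assuming path points are given as (x, y, z) tuples in metres (function name and step size are illustrative; a spline planner would replace the linear segments with smooth curves):

```python
import numpy as np

def interpolate_path(waypoints, step=0.05):
    """Expand a short list of taught path points into a dense
    point-to-point (linear) trajectory.

    waypoints: sequence of (x, y, z) tuples, e.g. the touched points.
    step:      maximum distance between consecutive output points.
    """
    pts = [np.asarray(waypoints[0], dtype=float)]
    for a, b in zip(waypoints[:-1], waypoints[1:]):
        a, b = np.asarray(a, float), np.asarray(b, float)
        dist = np.linalg.norm(b - a)
        n = max(int(np.ceil(dist / step)), 1)  # segments per leg
        for i in range(1, n + 1):
            pts.append(a + (b - a) * i / n)
    return np.array(pts)
```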
  • the method further comprises recording a voice command, and determining the path point further using the recorded voice command.
  • a programmer can thus use his voice to control the method intuitively, for example by triggering the detection of a temperature profile by means of a corresponding voice command.
  • the voice command is recorded by means of a microphone.
  • one method mode comprises determining individual path points based on, for example, individual heat points, and another method mode comprises determining a path based on, for example, a heat trace.
  • Preferably, the environment has a substantially planar surface, for example a plate or glass surface. Furthermore, the detection of the temperature profile preferably comprises detecting a temperature profile of this substantially planar surface.
  • A programmer can impose a particular trajectory on the substantially planar surface with his finger, for example by tracing it with his index finger, and the heat trace resulting from that contact can then be detected and used to determine the path points.
  • This method is very robust, since, for example, fragile touch-sensitive handheld devices for programming are not necessary.
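A detected heat trace is initially just an unordered set of warm pixels; to travel it with a manipulator it must be ordered into a polyline. One simple illustrative approach (not specified in the patent) is greedy nearest-neighbour chaining from the left-most pixel:

```python
import numpy as np

def order_trace(mask_pixels):
    """Order the pixels of a detected heat trace into a polyline by
    greedy nearest-neighbour chaining, starting from the left-most
    pixel. Assumes at least one pixel is given.

    mask_pixels: list of (row, col) pixels belonging to the trace.
    Returns the same pixels as an ordered list, suitable for converting
    into a sequence of path points.
    """
    remaining = [np.asarray(p, dtype=float) for p in mask_pixels]
    remaining.sort(key=lambda p: (p[1], p[0]))  # left-most pixel first
    path = [remaining.pop(0)]
    while remaining:
        last = path[-1]
        j = min(range(len(remaining)),
                key=lambda k: np.linalg.norm(remaining[k] - last))
        path.append(remaining.pop(j))
    return [tuple(p) for p in path]
```

Greedy chaining works well for thin, non-self-intersecting traces; crossing traces would need a more careful graph-based ordering.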
  • The skilled person understands how to choose, according to the application, a suitable material with a suitable thermal conductivity: for example, a material with high thermal conductivity (such as steel), a material with low thermal conductivity (such as wood), or a surface (for example a wood panel) provided with a corresponding coating (e.g. a copper coating).
  • the present invention also relates to a system for determining a path point for a robot program or for robot programming.
  • the system comprises a temperature measuring device, configured to detect a first temperature profile of an environment, and a controller, which is set up to determine the path point using the first temperature profile detected by means of the temperature measuring device.
  • the temperature measuring device preferably comprises a thermal imaging camera.
  • the system further comprises a camera configured to capture a three-dimensional image of the environment, and the controller is further configured to determine the path point using the captured three-dimensional image, the camera preferably comprising a 3D camera.
  • the system comprises a microphone which is adapted to record a voice command.
  • the controller is further adapted to determine the path point using the recorded voice command.
  • the individual components of the system may be implemented (in part) separately or provided (partially) in a single device.
  • the controller and microphone may be provided in a single device, such as a portable computer.
  • the controller can be implemented together with a robot controller, and/or the temperature measuring device can be mounted on a corresponding robot or manipulator.
  • temperature profile in the sense of the present invention may include a temperature change generated by a user on the environment.
  • the temperature change produced on the environment may preferably comprise a single heat point, and / or a heat trace consisting of several heat points.
  • the term "heat point" encompasses a single point on the environment which is distinctive due to its (relative) temperature; for example, a heat point on a surface has a higher or lower temperature than the other points on the surface.
  • a "heat trace", on the other hand, does not result from touching a single point but, for example, from tracing a line on the surface.
  • In this way, a path line can be determined which is to be traveled by the manipulator, for example during a welding process, in order to create a specific weld.
  • FIG. 1 shows schematically the determination of a path point according to an embodiment
  • Fig. 2 shows schematically the determination of a path point according to another embodiment
  • FIG. 3 schematically shows the sequence of a method for determining a path point according to an embodiment
  • FIG. 5 schematically shows the sequence of a method for determining a path point according to a further embodiment.
  • FIG. 1 schematically shows the determination of a path point, according to an embodiment of the present invention.
  • a camera system 10 comprising a thermal imaging camera is directed onto a surface 12 that is substantially planar.
  • a programmer moves his finger 11 over the surface 12, thereby locally heating this surface 12.
  • the resulting heat trace 13 is characterized by a locally elevated temperature and can be detected by the thermal imaging camera of the camera system 10.
  • the programmer can press a shutter button (not shown) once it is ensured that the thermal imaging camera does not capture the programmer himself.
  • the programmer may also pronounce a voice command which is recorded by a microphone (not shown) and triggers the detection of the heat track 13.
  • for example, images captured by means of the thermal imaging camera before and after the finger 11 touches the surface 12 can be compared.
  • the illustration in FIG. 2 essentially corresponds to that of FIG. 1, but there are two objects 23, 24 on the surface 22.
  • Here, no path is determined from a heat trace; instead, only two individual path points are determined based on two individual heat points 25, 26.
  • the programmer can, for example, press a shift key (not shown) with which it is possible to switch between the modes "detect a path line" and "detect individual path points".
  • the camera housing 20 additionally comprises a 3D camera.
  • the programmer touches one point on each of the objects 23, 24 with his finger 21, whereby the objects experience a localized increase in temperature.
  • the thermal imaging camera captures these locally elevated temperatures, i.e. the heat points 25, 26. In addition, the 3D camera acquires a three-dimensional image, which makes it possible to determine the three-dimensional positions of the touched points on the objects 23, 24. Subsequently, two path points are determined based on the data acquired by the thermal imaging camera and the 3D camera.
  • FIG. 3 shows a method 30 for determining a path point.
  • a temperature profile of an environment is detected.
  • a path point is determined using the detected temperature profile.
  • FIG. 4 shows a further method 40 for determining a path point.
  • a first temperature profile of an environment is detected.
  • a user, by touching the environment with his finger, creates a heat trace on that environment.
  • a second temperature profile of the environment is detected.
  • FIG. 5 shows a further method 50 for determining a path point.
  • in step 51, a thermal image of an environment is captured by means of a thermal imaging camera.
  • in step 52, a heat trace is detected in this thermal image.
  • in step 53, a three-dimensional image of the same environment is captured, and in step 54, a three-dimensional structure profile of that environment is determined based on the acquired three-dimensional image.
  • in step 55, the path point is determined using the heat trace recognized in step 52 and the three-dimensional structure profile determined in step 54.
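The steps of method 50 can be sketched end-to-end: difference two thermal frames, pick the strongest heat point, and back-project it through the depth image. This is an illustrative reconstruction, not the patent's implementation; the pinhole intrinsics (fx, fy, cx, cy) and the temperature threshold are assumptions:

```python
import numpy as np

def determine_path_point(thermal_before, thermal_after, depth,
                         fx, fy, cx, cy, delta=2.0):
    """Minimal sketch of method 50: difference two thermal frames,
    take the strongest heat point, and back-project it through the
    depth image (pinhole model) into a 3-D path point in camera
    coordinates. Intrinsics are assumed known from calibration.
    """
    diff = np.abs(thermal_after - thermal_before)
    v, u = np.unravel_index(np.argmax(diff), diff.shape)
    if diff[v, u] < delta:
        return None                     # nobody touched the environment
    z = float(depth[v, u])              # metres at the touched pixel
    return ((u - cx) * z / fx, (v - cy) * z / fy, z)
```

The resulting 3-D point can then be handed to the path planning or directly to a robot controller, as described above.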

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a method for determining a path point for robot programming, comprising the steps of detecting a first temperature profile of an environment and determining the path point using the detected first temperature profile. The invention also relates to a corresponding system for determining a path point for robot programming.
PCT/EP2016/061667 2015-05-29 2016-05-24 Procédé pour determiner un point de trajectoire WO2016193058A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201680031527.8A CN107666988A (zh) 2015-05-29 2016-05-24 用于确定路径点的方法
EP16725497.8A EP3302895A1 (fr) 2015-05-29 2016-05-24 Procédé pour déterminer un point de trajectoire

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102015209900.8 2015-05-29
DE102015209900.8A DE102015209900A1 (de) 2015-05-29 2015-05-29 Verfahren zum Bestimmen eines Bahnpunktes

Publications (1)

Publication Number Publication Date
WO2016193058A1 true WO2016193058A1 (fr) 2016-12-08

Family

ID=56084025

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/061667 WO2016193058A1 (fr) 2015-05-29 2016-05-24 Procédé pour determiner un point de trajectoire

Country Status (4)

Country Link
EP (1) EP3302895A1 (fr)
CN (1) CN107666988A (fr)
DE (1) DE102015209900A1 (fr)
WO (1) WO2016193058A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018189417A1 (fr) * 2017-04-11 2018-10-18 University Of Helsinki Procédé et système pour déterminer une interaction homme objet
JP6826069B2 (ja) * 2018-04-18 2021-02-03 ファナック株式会社 ロボットの動作教示装置、ロボットシステムおよびロボット制御装置

Citations (7)

Publication number Priority date Publication date Assignee Title
EP1537959A2 (fr) * 2003-11-24 2005-06-08 Abb Research Ltd. Méthode et système de programmation d'un robot industriel
US20110288964A1 (en) * 2010-05-24 2011-11-24 Massachusetts Institute Of Technology Kinetic Input/Output
DE102012015056A1 (de) * 2012-07-28 2014-02-13 Bsautomatisierung Gmbh Robotersteuerungsvorrichtung
JP2014104527A (ja) * 2012-11-27 2014-06-09 Seiko Epson Corp ロボットシステム、プログラム、生産システム及びロボット
WO2014093822A1 (fr) * 2012-12-14 2014-06-19 Abb Technology Ag Apprentissage de trajet de robot à mains nues
US20140201112A1 (en) 2013-01-16 2014-07-17 Kabushiki Kaisha Yaskawa Denki Robot teaching system and robot teaching method
US20150002391A1 (en) * 2013-06-28 2015-01-01 Chia Ming Chen Systems and methods for controlling device operation according to hand gestures

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
DE102009035121A1 (de) * 2009-07-29 2011-02-03 Tekfor Cologne Gmbh Verfahren zur Verarbeitung von Werkstücken

Non-Patent Citations (2)

Title
ELLIOT N SABA ET AL: "Dante vision: In-air and touch gesture sensing for natural surface interaction with combined depth and thermal cameras", EMERGING SIGNAL PROCESSING APPLICATIONS (ESPA), 2012 IEEE INTERNATIONAL CONFERENCE ON, IEEE, 12 January 2012 (2012-01-12), pages 167 - 170, XP032116223, ISBN: 978-1-4673-0899-1, DOI: 10.1109/ESPA.2012.6152472 *
ERIC LARSON ET AL: "HeatWave", HUMAN FACTORS IN COMPUTING SYSTEMS, ACM, 2 PENN PLAZA, SUITE 701 NEW YORK NY 10121-0701 USA, 7 May 2011 (2011-05-07), pages 2565 - 2574, XP058041548, ISBN: 978-1-4503-0228-9, DOI: 10.1145/1978942.1979317 *

Also Published As

Publication number Publication date
CN107666988A (zh) 2018-02-06
EP3302895A1 (fr) 2018-04-11
DE102015209900A1 (de) 2016-12-01

Similar Documents

Publication Publication Date Title
DE102019009313B4 (de) Robotersteuerung, Verfahren und Computerprogramm unter Verwendung von erweiterter Realität und gemischter Realität
DE102018116053B4 (de) Robotersystem und Roboterlernverfahren
DE102019002898B4 (de) Robotersimulationsvorrichtung
WO2014117895A1 (fr) Procédé et dispositif pour commander un appareil d'atelier
DE102012212754B4 (de) Verfahren zum Betreiben eines Sensorsystems sowie Sensorsystem
DE102019119319B4 (de) Abtastsystem, Arbeitssystem, Verfahren zum Anzeigen von Augmented-Reality-Bildern, Verfahren zum Speichern von Augmented-Reality-Bildern und Programme dafür
EP2223191B1 (fr) Robot industriel et procédé de programmation d'un robot industriel
DE102019122865B4 (de) Erfassungssystem, Arbeitssystem, Anzeigeverfahren für ein Erweiterte-Realität-Bild und Programm
DE102015107436B4 (de) Lernfähige Bahnsteuerung
EP3098034B1 (fr) Selection d'un appareil ou d'un objet a l'aide d'une camera
DE102014118001A1 (de) Verfahren zur Bewegungssimulation eines Manipulators
DE102019109624B4 (de) Roboterbewegungseinlernvorrichtung, Robotersystem und Robotersteuerung
DE102007026299B4 (de) Industrieroboter und Verfahren zum Programmieren eines Industrieroboters
EP3366434B1 (fr) Procédé de vérification d'une fonction d'un véhicule et/ou d'au moins un dispositif de commande
EP3929675B1 (fr) Système de surveillance et de commande pour un poste de travail de production et procédé de fabrication d'un produit ou d'un sous-produit
DE102018112403A1 (de) Robotersystem, das informationen zur unterweisung eines roboters anzeigt
EP2216144A2 (fr) Système et procédé pour vérifier des composants et/ou des unités fonctionnelles avec un dispositif de test
DE102020129967A1 (de) Simulationsvorrichtung und Robotersystem mit erweiterter Realität
WO2020229028A1 (fr) Dispositif de saisie, procédé de fourniture d'instructions de mouvement à un actionneur et système actionneur
EP3302895A1 (fr) Procédé pour déterminer un point de trajectoire
DE102020102160B3 (de) Verfahren zum Erzeugen eines Eingabebefehls für einen Roboterarm und Roboterarm
DE102018124671B4 (de) Verfahren und Vorrichtung zur Erstellung eines Robotersteuerprogramms
DE102016221193B3 (de) Verfahren zum Steuern eines Manipulators basierend auf Handerkennung
DE102015200319A1 (de) Einmessverfahren aus Kombination von Vorpositionierung und Handführen
DE102019118012B3 (de) Verfahren und Vorrichtung zum Steuern eines Robotersystems mittels menschlicher Bewegung

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16725497

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2016725497

Country of ref document: EP