WO2007093413A1 - Robot comprenant une unité de commande pour commander un mouvement entre une position de début et une position de fin - Google Patents

Robot comprenant une unité de commande pour commander un mouvement entre une position de début et une position de fin Download PDF

Info

Publication number
WO2007093413A1
Authority
WO
WIPO (PCT)
Prior art keywords
map
robot
control unit
pose
movement
Prior art date
Application number
PCT/EP2007/001310
Other languages
German (de)
English (en)
Inventor
Michael Müller
Original Assignee
Kuka Roboter Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kuka Roboter GmbH
Publication of WO2007093413A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J9/1666Avoiding collision or forbidden zones
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/37Measurements
    • G05B2219/37277Inductive proximity sensor
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40442Voxel map, 3-D grid map
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40455Proximity of obstacles
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40465Criteria is lowest cost function, minimum work path
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40544Detect proximity of object
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40564Recognize shape, contour of object, extract position and orientation
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40607Fixed camera to observe workspace, object, workpiece, global

Definitions

  • The invention relates to a robot, in particular an industrial or articulated robot, having at least two degrees of freedom of movement and a control unit for controlling a movement between an initial pose and an end pose, and to a method for controlling a movement of a robot with at least two degrees of freedom of movement between an initial pose and an end pose, in which a movement path between the initial pose and the end pose is generated.
  • The prior art discloses a robot, for example an industrial or articulated robot, having at least two degrees of freedom of movement.
  • The robot includes a controller for controlling movement between an initial pose and an end pose along a preprogrammed motion path composed of piecewise linear or arcuate path segments.
  • In a teach-in procedure, a user can guide a gripper arm of the robot manually along the path while avoiding interference contours within the reach of the gripper arm.
  • The control unit stores individual blending points of the path and approximately reproduces the motion by a piecewise rectilinear movement connecting the blending points. Alternatively, the individual blending points can be entered when programming the control unit.
  • The invention is based in particular on the object of providing a generic robot with a control unit that effectively prevents an unintended collision of the robot with objects in its working range.
  • This object is achieved by a robot, in particular an industrial or articulated robot, having at least two degrees of freedom of movement and a control unit for controlling a movement between an initial pose and an end pose, wherein the control unit comprises a memory unit for storing a map in which at least one interference contour can be recorded, and by a method for controlling such a robot.
  • The invention is based on a robot, in particular an industrial or articulated robot, having at least two degrees of freedom of movement and a control unit for controlling a movement between an initial pose and an end pose.
  • The control unit comprises a memory unit for storing a map, in which at least one interference contour can be recorded.
  • The control unit can check, for any point or pose of the movement, whether the point or pose lies inside or outside an interference contour. If the control unit detects that the point or pose is within the interference contour, it may generate a warning signal or block the movement. As a result, a collision with an object represented by the interference contour can be reliably avoided.
  • A control unit designed according to the invention can in principle be used to control any robot that moves in the range of static or at least only slowly changing objects.
  • The solution according to the invention can be used particularly advantageously in connection with articulated robots or SCARA robots in industrial applications.
  • A "map" is to be understood as a multidimensional data structure, which may also be an overlay or projection of one or more one-dimensional data structures.
  • The map is a mapping that assigns a value to each pose of the robot, from which it can be deduced whether the robot or one of its gripper arms is located inside or outside the interference contour. Quick access to the map can be achieved if the dimension of the map is less than or equal to the number of degrees of freedom of movement of the robot.
  • The map is particularly advantageously generated in an initialization process of the robot in order to minimize the computing time required to determine a movement path.
  • The control unit is provided to automatically generate a movement path between the initial pose and the end pose as a function of the map.
  • Ease of use can be significantly increased, since only a start point and an end point of the movement have to be programmed, provided that the environment of the robot, and thus the interference contour to be taken into account, does not change.
  • A systematic separation can be achieved between the application-specific movement data, i.e. the coordinates of the poses to be approached, and the frequently application-independent interference contours.
  • In this context, "provided" should also be understood to mean "designed" and "equipped".
  • If the control unit is programmed in such a way that, for generating the movement path, it divides the map into at least two areas with different levels of detail, the computing time needed to generate the movement path can be significantly reduced, since superfluous detail can be dispensed with in areas that are largely free of interference contours. In such areas the control unit can assemble the motion path from long segments, while in critical areas with highly structured interference contours a small-scale motion sequence can be generated. The generation of the movement path can be further accelerated if the movement is preferably guided through those areas which are not critical with regard to the level of detail, and is guided into areas in which a high level of detail is necessary for collision avoidance only in cases where this is unavoidable or advantageous in an overall context.
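As a purely illustrative reading of this two-level idea (a minimal sketch, not taken from the patent: the grid size, the coarsening factor and names such as build_coarse_map are assumptions), a collision query can first consult a down-sampled copy of the map and fall back to the full-resolution map only near interference contours:

    import numpy as np

    def build_coarse_map(fine_map: np.ndarray, factor: int = 10) -> np.ndarray:
        # A coarse cell is marked occupied (1) if any fine cell inside it is
        # occupied; the map dimensions must be divisible by the factor.
        h, w = fine_map.shape
        return fine_map.reshape(h // factor, factor, w // factor, factor).max(axis=(1, 3))

    def pose_is_free(fine_map, coarse_map, a1, a2, factor=10):
        # Cheap check in the coarse map first; only coarse cells that touch an
        # interference contour require the detailed, full-resolution lookup.
        if coarse_map[a1 // factor, a2 // factor] == 0:
            return True
        return fine_map[a1, a2] == 0

With a 360 x 360 map and a factor of 10, most queries in contour-free regions would be answered from a 36 x 36 array.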
  • If the control unit is programmed in such a way that, for generating the movement path, it divides the map into several sub-maps with predefined path parts, the generation of the path can be further accelerated on account of the predefined path parts.
  • The predefined path parts preferably connect edge points of the sub-maps to one another in a straight line.
  • A transformation from a spatial coordinate system into an axis-related coordinate system can be moved forward into the initialization process of the control unit if the memory unit is provided for storing the map in the axis-related coordinate system. A coordinate transformation is then no longer necessary when the motion path is generated.
  • A reference run of the robot can be shortened or eliminated altogether if the robot comprises at least one absolute value encoder for detecting an absolute value of at least one coordinate of a current pose.
  • The control unit can detect the interference contour automatically if the robot comprises at least one sensor for detecting the interference contour.
  • The sensor is part of a camera, in particular a CCD camera, which captures images of an environment of the robot.
  • The control unit can determine the interference contour from the images captured by the camera and enter it in the map.
  • The control unit is provided to generate the map as a function of a predetermined interference contour.
  • The interference contour can then be specified particularly conveniently, and without complicated conversion operations, in a world coordinate system of the robot.
  • The computing time required to generate the map can be reduced if the control unit is provided to use, for generating the map, a model of an outer contour of a gripper arm that is limited to a maximum of 20 parameters. Owing to the computing-time advantages gained by the model approximation, a fast processor and a large main memory can be dispensed with, which ultimately also results in cost advantages in the production of the robot.
  • A particularly fast and simple determination of the interference contour can be achieved if the model represents the outer contour of the gripping arm by at least one polygon.
  • The model may represent the outer contour of the gripper arm by at least one circle. Combinations of circles and polygons or other simple geometric shapes are also conceivable.
  • The model generally does not have to reproduce the full three-dimensional outer contour of the gripper arm, but can be limited to a projection onto the dimension of the map.
  • A sufficiently precise modeling of a gripper arm of the robot can be achieved with little effort if each independent arm part of the gripper arm is described by a single elementary geometric shape, for example by a polygon.
  • The invention further relates to a method for controlling a movement of a robot, in particular a robot of the type described above.
  • FIG. 1 shows a robot with a gripper arm in a plan view with a schematically illustrated first model of two lines for an outer contour of the gripper arm;
  • FIG. 2 shows the robot with the gripping arm from FIG. 1 with a schematically illustrated second model of circles for an outer contour of the gripping arm;
  • FIG. 3 shows the robot with the gripping arm from FIGS. 1 and 2 with a schematically illustrated third model of two rectangles for an outer contour of the gripping arm;
  • FIG. 4 shows the robot with the gripping arm from FIGS. 1-3 with a schematically illustrated further model of two polygons for an outer contour of the gripping arm;
  • FIG. 5 shows a working space of the robot from FIGS. 1-4 with objects which generate an interference contour in a plan view in world coordinates;
  • FIG. 6 shows a map with the interference contours of the arrangement of the objects from FIG. 5 in accordance with the first model for the outer contour of the gripping arm;
  • FIG. 7 shows a map with an interference contour in accordance with the third model for the outer contour of the gripping arm, together with an initial pose, an end pose and a movement path;
  • FIG. 8 shows a schematic representation of the manual generation of a movement path in a map with interference contours;
  • FIG. 9 shows a schematic representation of the autonomous generation of a movement path in a map with interference contours;
  • FIG. 10 shows a schematic representation of the autonomous generation of a movement path in a map with interference contours according to an alternative algorithm;
  • FIG. 11 shows a schematic representation of the autonomous generation of a movement path in a map with interference contours according to a further alternative algorithm, which divides the map into areas with different levels of detail;
  • FIG. 12 shows a first sub-map of the map from FIG. 11;
  • FIG. 13 shows a second sub-map of the map from FIG. 11;
  • FIG. 14 shows a third sub-map of the map from FIG. 11;
  • FIG. 15 shows a fourth sub-map of the map from FIG. 11; and
  • FIG. 16 shows a schematic representation of the autonomous generation of a movement path in a map with interference contours according to an algorithm that divides the map into rectangular sub-maps.
  • FIG. 1 shows a robot, here embodied merely by way of example as an industrial robot of the SCARA type, having at least two degrees of freedom of movement with the rotational coordinates α1, α2, and a control unit 12, shown only schematically, for controlling a movement between an initial pose 14 and an end pose 16 (FIG. 7).
  • The robot comprises a gripping arm 10, which is pivotable about a first, vertically extending rotational axis A1.
  • The gripping arm 10 is mounted on a base 22 and consists of two arm parts 60, 62 connected by a hinge 24.
  • The arm parts 60, 62 are pivotable via the hinge 24 about a second rotational axis A2 extending parallel to the first rotational axis A1. Other axes of the robot are not explicitly shown here.
  • The gripper arm 10 of the robot comprises two absolute value encoders 26, 28, shown here only schematically, for detecting an absolute value of the rotational coordinates α1, α2 of a current pose.
  • The invention would in principle also be applicable if the gripper arm 10 only had incremental sensors for detecting changes in the values of the rotational coordinates α1, α2. A reference run to reference marks with a known position in space would then be necessary.
  • The control unit 12 comprises a memory unit 30 for storing a map 32, in which at least one interference contour 34 can be recorded.
  • The memory unit 30 is provided for storing the map 32 in an axis-related coordinate system.
  • For this purpose, the control unit 12 must enter the contours of the objects 64 in the axis-related coordinate system in such a way that a risk of collision between the gripping arm 10 and one of the objects 64 can be read from the map 32.
  • A memory space is reserved for a 360 × 360 binary matrix forming the map 32. If an entry has the value zero, the point represented by the corresponding entry lies outside the interference contour 34; otherwise the point lies within the interference contour 34 and is therefore not accessible to the gripping arm 10 without a collision with an object 64.
  • The accuracy of the map corresponds to the accuracy of the pose determination of the robot, or vice versa.
  • The control unit 12 measures the values of the coordinates α1, α2, i.e. the angles, and reads out the value assigned to the pose via the integer indices of the matrix from the memory unit 30.
  • The matrix therefore forms a discrete, two-dimensional map 32 which assigns an index pair, and thereby a value, to each pose or each pair of values of the coordinates α1, α2.
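To make the lookup described above concrete, a minimal sketch could look as follows (it assumes one-degree resolution and integer angles in the range 0-359; the function and variable names are illustrative, not taken from the patent):

    import numpy as np

    # 360 x 360 binary matrix forming the map: 0 = pose outside the
    # interference contour, 1 = pose inside it (not reachable collision-free).
    interference_map = np.zeros((360, 360), dtype=np.uint8)

    def pose_collides(alpha1_deg: int, alpha2_deg: int) -> bool:
        # Read out the value assigned to the pose via the integer indices.
        return interference_map[int(alpha1_deg) % 360, int(alpha2_deg) % 360] == 1

    def command_pose(alpha1_deg: int, alpha2_deg: int) -> None:
        # A controller could refuse the move or raise a warning for such poses.
        if pose_collides(alpha1_deg, alpha2_deg):
            raise RuntimeError("target pose lies inside an interference contour")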
  • FIGS. 1-16 show several alternative embodiments of the invention. Functionally identical features are provided with the same reference numerals. In each case, the description deals in particular with differences from the exemplary embodiments described above, while reference is made to the description of the previously described exemplary embodiments with regard to features which remain the same.
  • FIG. 2 shows a robot in an alternative embodiment of the invention, in which the gripping arm 10 comprises sensors 46-46''', shown here only in sketch form, for detecting an interference contour 34.
  • The sensors 46-46''' are designed as inductive proximity sensors.
  • The control unit 12 uses a model, limited to a few parameters, of an outer contour 58 of the gripper arm 10 for generating the map 32.
  • Various models are conceivable, which are shown schematically in FIGS. 1-4 and will be explained below.
  • A control program approximates the outer contour 58 or the volume of the gripping arm 10 by mass points, voxels or finite elements. Because of the necessarily high number of mass points, the computational effort for creating the map 32 is very high and also requires a similarly complex modeling of the objects 64 in the region of the gripper arm 10. Therefore, models are described below which require a reduced computational effort compared with modeling by mass points.
  • In the first model, the outer contour 58 of the gripping arm 10 is represented by two lines 48, 50, which each depict one arm part 60, 62 of the gripping arm 10.
  • The coordinates of the end points of the lines 48, 50 can be calculated trigonometrically with knowledge of the lengths of the lines 48, 50 and of the rotational coordinates α1, α2.
  • Because a width of the gripper arm 10 and of the arm parts 60, 62 is not taken into account in this model, safety zones must be taken into account between the lines 48, 50 and the objects 64 described by edge contours, which zones are at least as large as the maximum distance of the outer contour 58 from the lines 48, 50.
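The trigonometric calculation of the line end points can be sketched as follows (a sketch only; it assumes that the base sits at the origin, that α2 is measured relative to the first arm part, and that angles are given in degrees — conventions the description does not spell out):

    from math import cos, sin, radians

    def line_model_endpoints(alpha1_deg, alpha2_deg, len1, len2):
        # End points of the two lines standing in for the arm parts of a planar
        # two-joint arm: base -> hinge (first line), hinge -> tip (second line).
        a1 = radians(alpha1_deg)
        a12 = a1 + radians(alpha2_deg)
        hinge = (len1 * cos(a1), len1 * sin(a1))
        tip = (hinge[0] + len2 * cos(a12), hinge[1] + len2 * sin(a12))
        return (0.0, 0.0), hinge, tip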
  • A map 32 resulting from the interference contour constellation of FIG. 5 according to the first implementation is shown in FIG. 6.
  • The objects 64 arranged to the left of the robot in FIG. 5 generate a common connected component of the interference contour 34.
  • The object arranged centrally in front of the robot blocks an angular range with respect to the first rotational coordinate α1. Bulges protruding out of this angular range on the left and right are due to the fact that a rear end of the front arm part 62 of the gripping arm 10 abuts against the object.
  • An object arranged radially further to the right of the robot generates a third, island-like connected component of the interference contour 34.
  • In a second implementation, the model or the control unit 12 represents the outer contour 58 of the gripping arm 10 by means of seven circles 56 of different sizes (FIG. 2).
  • The surfaces of the circles 56 completely cover the outer contour 58.
  • The centers of the circles 56 lie on the lines 48, 50 (FIG. 1), and their coordinates are calculated trigonometrically with knowledge of the lengths of the lines 48, 50 and of the rotational coordinates α1, α2.
  • The contours of the objects 64 are likewise covered by circles.
  • The control unit 12 checks, for each value pair of the coordinates α1, α2, whether one of the circles 56 covering the gripping arm 10 intersects one of the circles covering the objects 64. If this is the case, the control unit 12 enters a point of the interference contour 34 into the map 32.
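The circle-against-circle test described above reduces to a distance comparison; the following minimal sketch assumes circles are given as (x, y, radius) tuples and that the arm circles have already been placed along the lines 48, 50 for the pose being checked (names are illustrative):

    from math import hypot

    def circles_intersect(c1, c2):
        # Two circles (x, y, r) overlap if the centre distance does not
        # exceed the sum of their radii.
        (x1, y1, r1), (x2, y2, r2) = c1, c2
        return hypot(x1 - x2, y1 - y2) <= r1 + r2

    def mark_pose_if_colliding(interference_map, a1, a2, arm_circles, object_circles):
        # Any overlap between an arm circle and an object circle marks the pose
        # (a1, a2) as part of the interference contour in the map.
        if any(circles_intersect(ca, co) for ca in arm_circles for co in object_circles):
            interference_map[a1, a2] = 1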
  • In a third implementation, the model or the control unit 12 represents the outer contour 58 of the gripping arm 10 by two polygons 52, 54 (FIG. 3), namely by rectangles which each cover one of the arm parts 60, 62 and represent the corresponding arm part 60, 62.
  • To generate the map 32, the 16 edge lines of the two polygons 52, 54, as well as the edge lines of the objects 64 in the region of the gripping arm 10, are represented by the control unit 12 as straight lines and are parameterized by the two coordinates α1, α2 as well as by two lengths and two widths.
  • Two further parameters describe the position of the axes A1, A2 within the polygons 52, 54. Overall, the model therefore has fewer than 20 parameters.
  • The control unit 12 checks, for each pair of values of the coordinates α1, α2, whether and at which point one of the edge lines of the polygons 52, 54 intersects one of the edge lines of the objects 64. If the point of intersection lies in each case between the end points of the edge lines, the control unit 12 enters a point of the interference contour 34 into the map 32 at the corresponding pose.
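The edge-against-edge test can be written with the usual orientation (cross-product) check; the following is a minimal sketch under the assumption that edges are given as pairs of 2-D end points (names are illustrative, not from the patent):

    def _orient(p, q, r):
        # Sign of the cross product (q - p) x (r - p).
        return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

    def edges_cross(p1, p2, q1, q2):
        # True only if the intersection point lies strictly between the end
        # points of both edges, which is the condition used above.
        d1, d2 = _orient(q1, q2, p1), _orient(q1, q2, p2)
        d3, d4 = _orient(p1, p2, q1), _orient(p1, p2, q2)
        return d1 * d2 < 0 and d3 * d4 < 0

    def polygon_pose_collides(arm_edges, object_edges):
        # The pose is marked as part of the interference contour as soon as
        # any arm edge crosses any object edge.
        return any(edges_cross(a, b, c, d)
                   for (a, b) in arm_edges for (c, d) in object_edges)

Note that a pure edge-crossing test does not detect the case in which the polygons completely enclose an object; this is exactly the effect behind the free inclusions mentioned below.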
  • FIG. 7 shows a map 32 obtained according to the third implementation with an interference contour 34.
  • Free inclusions in the interference contour 34 result when the polygons 52, 54 surround an object 64 in the mathematical model in such a way that none of the edge lines of the polygons 52, 54 intersects one of the edge lines of the object 64.
  • In reality, the corresponding poses are nevertheless not accessible to the gripping arm 10.
  • the model represents the outer contour 58 of the gripper arm 10 by two polygons 52 ', 54' ( Figure 4), through
  • Rectangle 52 'and by an irregular hexagon 54' which takes into account a conical shape of the front arm portion 62.
  • the polygons 52 ', 54' are described by their edge lines analogously to the previously described embodiment, and the map 32 is generated by checking the intersections of the straight lines in the manner described above.
  • The outer contour 58 of the gripper arm 10 and the contours of the objects 64 may also be modeled by further geometric shapes, such as triangles, other polygons, ellipses, or combinations of such elementary figures.
  • A higher level of detail of the model may be worthwhile in particular for the front regions of the gripping arm 10 in the vicinity of the tool center point.
  • The control unit 12 is designed, by means of implemented control software, to automatically generate a movement path 36 (FIG. 7) between the initial pose 14 and the end pose 16 as a function of the map 32 during operation.
  • For this purpose, the control unit 12 uses the map 32, or the interference contour 34 recorded in the map 32, and generates the movement path 36 as a function of the predetermined interference contour 34 or of the objects 64, whose coordinates can either be entered manually or be detected automatically by the robot via the sensors 46-46''' (FIG. 2).
  • FIG. 8 shows a path generated manually as a function of the map 32.
  • A programmer places blending points 66, 66' in the free regions outside the interference contour 34 in such a way that the connecting straight lines between the blending points 66, 66', between the starting pose 14 and the first blending point 66, and between the last blending point and the end pose 16 lie completely outside the interference contour 34.
  • FIG. 9 shows a movement path 36 generated autonomously by the control unit 12 by an algorithm designed for maximum safety. The algorithm is based on the first arm part 60 sweeping through its angular range while the rotational coordinate α2 of the second arm part 62 is set in each step such that a maximum distance from all interference contours 34 is maintained.
  • For this purpose, the method checks the possible values of the coordinate α2 at a predetermined coordinate α1 in a contiguous region, determines the mean of the collision-free poses in this region, increases the value of the coordinate α1 by 1, and generates the motion path 36 by setting blending points 66. In the event that the mean of the collision-free poses deviates too strongly from the coordinate α2 of the last blending point 66, the fluctuations are limited to a maximum value or another intelligent smoothing method that appears appropriate to a person skilled in the art is used.
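Read literally, this maximum-safety strategy can be sketched as a column-wise sweep over the map; the choice of the contiguous region closest to the current α2 and the smoothing limit max_step below are assumptions not fixed by the description:

    import numpy as np

    def max_safety_path(interference_map, a1_start, a1_end, a2_start, max_step=5):
        # Sweep alpha1 degree by degree; at each step choose alpha2 near the mean
        # of a contiguous collision-free range, limiting its change to max_step.
        path = [(a1_start % 360, a2_start % 360)]
        a2 = a2_start
        step = 1 if a1_end >= a1_start else -1
        for a1 in range(a1_start + step, a1_end + step, step):
            free = np.flatnonzero(interference_map[a1 % 360] == 0)  # free alpha2 values
            if free.size == 0:
                raise RuntimeError(f"no collision-free pose at alpha1={a1}")
            # split the free values into contiguous regions, keep the one nearest to a2
            groups = np.split(free, np.flatnonzero(np.diff(free) > 1) + 1)
            region = min(groups, key=lambda g: abs(int(g.mean()) - a2))
            target = int(region.mean())
            a2 += max(-max_step, min(max_step, target - a2))  # limited fluctuation
            path.append((a1 % 360, a2 % 360))                 # blending point
        return path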
  • FIG. 10 shows a movement path 36 generated autonomously by the control unit 12, which is generated by an alternative algorithm according to the minimum distance requirement.
  • Starting from the starting pose 14 or from a blending point 66, the value of the coordinate α1 is initially shifted as far as possible in the direction of the end pose 16.
  • The control unit 12 sets a blending point 66 at the last collision-free pose with the value X of the coordinate α1.
  • Starting from this blending point 66, the value of the coordinate α2 is shifted until the pose with the value X + 1 or X - 1 of the coordinate α1 no longer causes a collision.
  • There, the control unit 12 sets a further blending point 66.
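Since the description of this variant breaks off at this point, the following is only a loose sketch of such a greedy strategy (the direction in which α2 is shifted, the termination handling and the helper is_free are assumptions for illustration):

    def greedy_axis_path(is_free, start, goal):
        # is_free(a1, a2) is assumed to return True for collision-free poses.
        a1, a2 = start
        g1, g2 = goal
        points = [start]
        step = 1 if g1 >= a1 else -1
        while a1 != g1:
            while a1 != g1 and is_free(a1 + step, a2):
                a1 += step                              # shift alpha1 towards the goal
            if a1 == g1:
                break
            points.append((a1, a2))                     # blending point at last free pose X
            d2, tries = (1 if g2 >= a2 else -1), 0
            while not is_free(a1 + step, a2):           # shift alpha2 until X + step is free
                a2 += d2
                tries += 1
                if tries > 360 or not is_free(a1, a2):
                    raise RuntimeError("no free detour found in this simple sketch")
            points.append((a1, a2))                     # further blending point
        points.append((g1, g2))                         # final alpha2 move assumed free here
        return points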
  • FIG. 11 shows, in a schematized manner, an alternative embodiment of the control software, which distinguishes between a critical area 38 of the map 32 and an uncritical area 40 of the map 32.
  • The control unit 12 is programmed so that, for generating the movement path 36, it divides the map 32 into two areas 38, 40 with different levels of detail.
  • A first, critical area 38 has a high areal density of edge lines of the interference contour 34, while a second area 40 is free of interference contours 34.
  • The control unit 12 divides the map 32, for generating the movement path 36, into a plurality of sub-maps 42-42''' with predetermined path parts 44.
  • FIG. 12 shows a first sub-map 42 of the map 32, which is free of interference contours 34.
  • The path parts 44 connect the corner points of the sub-map 42 with one another along the edge of the sub-map 42.
  • A sub-map 42' shown in FIG. 13 shows an interference contour 34 projecting into the sub-map 42' at a left edge.
  • The start and end points of the predetermined path parts 44 lie outside the interference contour 34 on an irregular quadrilateral at the edge of the sub-map 42'.
  • A further sub-map 42'' shown in FIG. 14 shows a section of the interference contour 34 which divides the sub-map 42'' into two parts by a vertical strip that is not accessible to the gripping arm 10.
  • A first group of start and end points is arranged on an irregular quadrilateral at the edge of the first part of the sub-map 42'', and a second group of start and end points lies on a triangle at the edge of the second part.
  • A sub-map 42''' shown in FIG. 15 is cut into at two places by an interference contour 34, namely by a first part of the interference contour 34 projecting into the sub-map 42''' at an upper left corner and by a second part of the interference contour 34 projecting into the sub-map 42''' as a tip.
  • A total of seven start and end points are connected in pairs by path parts 44.
  • FIG. 16 shows an alternative possibility for path generation from a map 32, in which a distinction is made between critical and uncritical regions.
  • The control unit 12 divides the map 32 into rectangles of different sizes. The rectangles are each free of the interference contour 34. In FIG. 16, six such rectangles are designated by Roman numerals I-VI. The division of the map 32 into the rectangles is carried out by search and insertion methods known from image processing.
  • The control unit 12 uses the following behavior rules: Within a rectangle, a direct point-to-point (PTP) movement can be performed. If, for example, a pose in the rectangle I has been reached and the end pose 16 is likewise located in the rectangle I, the gripper arm 10 can move to the end pose 16 without further blending points.
  • Two adjacent rectangles with a common side can always be regarded as one rectangle with respect to the preceding behavior rule. If, for example, a pose in the rectangle II has been reached and the end pose 16 lies in the rectangle V, the gripping arm 10 can move to the end pose 16 without further blending points.
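These two behavior rules can be captured in a few lines; the rectangle representation as an axis-angle box and the simplified adjacency test below are illustrative assumptions, not the patent's own data structures:

    from typing import List, NamedTuple, Tuple

    class Rect(NamedTuple):
        a1_min: int
        a1_max: int
        a2_min: int
        a2_max: int

        def contains(self, pose: Tuple[int, int]) -> bool:
            a1, a2 = pose
            return self.a1_min <= a1 <= self.a1_max and self.a2_min <= a2 <= self.a2_max

    def share_a_side(r: Rect, s: Rect) -> bool:
        # Touching along alpha1 or alpha2 with overlapping extent on the other axis.
        touch_a1 = r.a1_max == s.a1_min or s.a1_max == r.a1_min
        touch_a2 = r.a2_max == s.a2_min or s.a2_max == r.a2_min
        overlap_a1 = r.a1_min <= s.a1_max and s.a1_min <= r.a1_max
        overlap_a2 = r.a2_min <= s.a2_max and s.a2_min <= r.a2_max
        return (touch_a1 and overlap_a2) or (touch_a2 and overlap_a1)

    def direct_ptp_possible(free_rects: List[Rect], start, goal) -> bool:
        # Rule 1: same contour-free rectangle. Rule 2: two rectangles with a common side.
        rs = next((r for r in free_rects if r.contains(start)), None)
        rg = next((r for r in free_rects if r.contains(goal)), None)
        return rs is not None and rg is not None and (rs == rg or share_a_side(rs, rg))

For the example in FIG. 16, a start pose in rectangle II and the end pose 16 in rectangle V would satisfy the second rule.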
  • In this way, the map 32 can be drawn upon particularly quickly as an information source for path generation.
  • In a further alternative embodiment, the robot is not controlled by an integrated control unit 12 but by an external computing unit designed as a universally programmable computer.
  • The computing unit uses a method for controlling a movement of the robot between an initial pose 14 and an end pose 16 in at least two degrees of freedom of movement α1, α2, wherein a movement path 36 between the initial pose 14 and the end pose 16 is generated by the computing unit.
  • In the manner described above, the computing unit uses a map 32 in which at least one interference contour 34 is recorded.

Abstract

The invention relates to a robot, in particular an industrial or articulated robot, comprising at least two degrees of freedom of movement (α1, α2) and a control unit (12) for controlling a movement between an initial pose (14) and an end pose (16). In order to equip a generic robot with a control unit (12) that effectively prevents an unintended collision of the robot with objects (64) in its working range, it is proposed that the control unit (12) comprise at least one memory unit (30) for storing a map (32) in which at least one interference contour (34) can be recorded.
PCT/EP2007/001310 2006-02-18 2007-02-15 Robot comprenant une unité de commande pour commander un mouvement entre une position de début et une position de fin WO2007093413A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102006007623.0A DE102006007623B4 (de) 2006-02-18 2006-02-18 Roboter mit einer Steuereinheit zum Steuern einer Bewegung zwischen einer Anfangspose und einer Endpose
DE102006007623.0 2006-02-18

Publications (1)

Publication Number Publication Date
WO2007093413A1 true WO2007093413A1 (fr) 2007-08-23

Family

ID=38042995

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2007/001310 WO2007093413A1 (fr) 2006-02-18 2007-02-15 Robot comprenant une unité de commande pour commander un mouvement entre une position de début et une position de fin

Country Status (2)

Country Link
DE (1) DE102006007623B4 (fr)
WO (1) WO2007093413A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2409816A2 (fr) 2010-07-19 2012-01-25 KUKA Roboter GmbH Commande de manipulateur
GB2542892A (en) * 2015-07-28 2017-04-05 Harris Corp Path-optimized manipulator reversing controller
DE102007059480B4 (de) 2007-12-11 2018-07-05 Kuka Roboter Gmbh Verfahren und Vorrichtung zur Posenüberwachung eines Manipulators
US11358278B2 (en) 2016-08-24 2022-06-14 Siemens Aktiengesellschaft Method for collision detection and autonomous system

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007060653A1 (de) * 2007-12-15 2009-06-18 Abb Ag Positionsermittlung eines Objektes
DE102008013400B4 (de) * 2008-03-06 2016-03-10 Voith Engineering Services Gmbh Verfahren zur Ermittlung von Verriegelungsbereichen wenigstens eines im Raum bewegbaren ersten Objekts
DE102008015779A1 (de) 2008-03-26 2009-10-01 Fpt Systems Gmbh Fahrerloses Transportsystem zum Transport, Aufnehmen und Absetzen von Lasten
DE102008057142B4 (de) 2008-04-29 2016-01-28 Siemens Aktiengesellschaft Verfahren zur rechnergestützten Bewegungsplanung eines Roboters
DE102016120763B4 (de) * 2016-10-31 2019-03-14 Pilz Gmbh & Co. Kg Verfahren zur kollisionsfreien Bewegungsplanung
IT201800004698A1 (it) * 2018-04-19 2019-10-19 Procedimento per il ripristino dello stato funzionale di una macchina automatica per la produzione di articoli dell’industria del tabacco
DE102018129727A1 (de) 2018-11-26 2020-05-28 Beckhoff Automation Gmbh Vorrichtung und Verfahren zum Vermeiden einer Kollision beim Antreiben von wenigstens zwei Movern auf einer Antriebsfläche
DE102019105817A1 (de) * 2019-03-07 2020-09-10 Liebherr-Mischtechnik Gmbh Gelenkarm-Steuerung einer Betonpumpe
IT202000003482A1 (it) 2020-02-20 2021-08-20 Gd Spa Procedimento per la gestione selettiva degli allarmi di una macchina automatica per la produzione o l’impacchettamento di articoli di consumo.

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0179252A3 (fr) * 1984-09-14 1987-07-15 Siemens Aktiengesellschaft Procédé et dispositif de protection des personnes qui se trouvent dans la zone de travail d'un organe mobile d'une machine déplaçable ou orientable, en particulier d'un robot industriel
EP0317020B1 (fr) * 1987-11-20 1995-04-19 Koninklijke Philips Electronics N.V. Méthode et dispositif pour l'établissement de route
EP0439655A1 (fr) * 1990-01-31 1991-08-07 Siemens Aktiengesellschaft Méthode de commande d'un robot afin d'éviter les collisions entre un robot à programmation orientée sur les tâches et des objets ayant chacun des degrés de mobilité différents
JPH05127718A (ja) * 1991-11-08 1993-05-25 Fujitsu Ltd マニピユレータの手先軌道自動生成装置
US5347459A (en) * 1993-03-17 1994-09-13 National Research Council Of Canada Real time collision detection
DE19810341C2 (de) * 1998-03-10 2000-10-12 Deutsch Zentr Luft & Raumfahrt Verfahren zur automatischen Kollisionsvermeidung eines Manipulators in einem durch Hindernisse beschränkten Arbeitsraum
JP3975959B2 (ja) * 2003-04-23 2007-09-12 トヨタ自動車株式会社 ロボット動作規制方法とその装置およびそれを備えたロボット
DE10324517A1 (de) * 2003-05-28 2004-12-16 Daimlerchrysler Ag Roboter und Anlernverfahren dafür

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6535794B1 (en) * 1993-02-23 2003-03-18 Faro Technologoies Inc. Method of generating an error map for calibration of a robot or multi-axis machining center
US20050022273A1 (en) * 2003-07-23 2005-01-27 Hitachi, Ltd. Location aware automata

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007059480B4 (de) 2007-12-11 2018-07-05 Kuka Roboter Gmbh Verfahren und Vorrichtung zur Posenüberwachung eines Manipulators
EP2409816A2 (fr) 2010-07-19 2012-01-25 KUKA Roboter GmbH Commande de manipulateur
DE102010027572A1 (de) 2010-07-19 2012-04-19 Kuka Roboter Gmbh Manipulatorsteuerung
GB2542892A (en) * 2015-07-28 2017-04-05 Harris Corp Path-optimized manipulator reversing controller
GB2542892B (en) * 2015-07-28 2020-01-22 Harris Corp Path-optimized manipulator reversing controller
US11358278B2 (en) 2016-08-24 2022-06-14 Siemens Aktiengesellschaft Method for collision detection and autonomous system

Also Published As

Publication number Publication date
DE102006007623B4 (de) 2015-06-25
DE102006007623A1 (de) 2007-08-30

Similar Documents

Publication Publication Date Title
DE102006007623B4 (de) Roboter mit einer Steuereinheit zum Steuern einer Bewegung zwischen einer Anfangspose und einer Endpose
DE102015001527B4 (de) Robotersystem, das visuelle Rückmeldung verwendet
DE102012007254B4 (de) Verfahren und Vorrichtung zum Vorausberechnen einer Behinderung zwischen einem Zielteil eines Roboters und einem peripheren Objekt
DE112017002498B4 (de) Robotervorgang-auswertungseinrichtung, robotervorgang-auswertungsverfahren und robotersystem
DE102021107453A1 (de) Schnelle roboterbewegungsoptimierung mit distanzfeld
DE102015015093B4 (de) Roboterprogrammiervorrichtung zum Instruieren eines Roboters für eine Bearbeitung
DE102013008755B4 (de) Offline-Programmiersystem
DE102019118637B4 (de) Automatische pfadgenerierungsvorrichtung
DE102017127950B4 (de) Robotersteuerung, die automatisch eine störzone für einen roboter vorgibt
DE102017222057A1 (de) Robotersystem
EP3225366B1 (fr) Surveillance de la position d&#39;un système cinématique
DE10393527T5 (de) Systeme und Verfahren zur Darstellung komplexer n-Kurven für die Direktsteuerung einer Werkzeugbewegung
EP3484672A1 (fr) Résolution redondante pour manipulateur redondant
DE102022122663A1 (de) Verfahren zur dynamischen geschwindigkeitsändrung für ein robotiksystem
DE112019007889T5 (de) Bearbeitungsprogramm-umwandlungseinrichtung, numerische-steuereinrichtung und bearbeitungsprogramm-umwandlungsverfahren
DE102012022190B4 (de) Inverse Kinematik
DE102017116788B4 (de) Roboter-Steuerungsvorrichtung und Verfahren zur Steuerung derselben
WO2018091141A1 (fr) Mesure d&#39;un axe de déplacement d&#39;un robot
EP0449039B1 (fr) Méthode pour le contrôle des systèmes de positionnement
DE102021204148B3 (de) Verfahren und System zum koordinierten Abfahren vorgegebener Roboterbahnen
DE102022130341A1 (de) Punktmengen-störungsprüfung
DE102012010856A1 (de) Verfahren und Mittel zur Überwachung einer Roboteranordnung
DE102018209870B3 (de) Verfahren und System zum Überführen eines Endeffektors eines Roboters zwischen einer Endeffektorpose und einer weiteren Endeffektorpose
DE102021125628B3 (de) Geschwindigkeitsvorgaben zur Trajektorienbestimmung von Kinematiken
DE112018007703B4 (de) Robotersteuerung

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase

Ref document number: 07722822

Country of ref document: EP

Kind code of ref document: A1