EP2122424A2 - Industrial robot and method for determining the position of an industrial robot relative to an object - Google Patents
Industrial robot and method for determining the position of an industrial robot relative to an object
- Publication number
- EP2122424A2 (application EP08708188A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- model
- industrial robot
- images
- relative
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/18—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
- G05B19/408—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by data handling or data format, e.g. reading, buffering or conversion of data
- G05B19/4083—Adapting programme, configuration
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/35—Nc in input of data, input till input file format
- G05B2219/35506—Camera images overlayed with graphics, model
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/37—Measurements
- G05B2219/37572—Camera, tv, vision
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39393—Camera detects projected image, compare with reference image, position end effector
Definitions
- the invention relates to industrial robots and methods for determining the location of an industrial robot relative to an object.
- DE 102 49 786 A1 discloses a method for referencing a robot to a patient, from which an image is created at least from two positions with at least one camera attached to the robot.
- a reference point of the patient is selected in one of the images, the position of the selected reference point in three-dimensional space is established via position data of the reference point in both images, and the position of the robot is correlated to the patient.
- WO 2004/071717 A1 discloses registering and storing a plurality of measuring points on the surface of the object, determining the orientation and position of a CAD model of the object relative to a coordinate system of the industrial robot by correlating the measuring points with the model, and determining a resulting deviation for at least some of the measuring points and the corresponding points in the model.
- the industrial robot approaches the measuring points with a measuring tip, which comprises e.g. a contactless sensor.
- determining the position of an industrial robot relative to an object by registering measuring points on the surface of the object can be relatively difficult if, for example, required measuring points are relatively difficult or impossible to reach with the industrial robot.
- the object of the present invention is therefore to specify a simpler method for determining the position of an industrial robot relative to an object and a corresponding industrial robot.
- the object of the invention is achieved by a method for determining the position of an industrial robot relative to an object, comprising the following method steps:
- generating two-dimensional images of the object by means of a 2D camera from at least two different positions, displaying the images with a display device and superimposing a graphical model of the object on the displayed images, the graphical model being at least a partial model of the object and described in coordinates relative to coordinates of the industrial robot, manually assigning model points of the graphical model to corresponding pixels in the two images, and determining the position of the industrial robot relative to the object on the basis of the assigned points.
- when processing the object, e.g. a workpiece, with the industrial robot, knowledge of the position of the object with respect to the industrial robot, e.g. with respect to its base, is required.
- the position of the object relative to the industrial robot is in particular its position and orientation with respect to the industrial robot.
- images of the object are generated from at least two different positions of the industrial robot, i.e. at least two different image data records assigned to the object are generated by means of the 2D camera.
- the images can only partially or completely depict the object.
- more than two images of the object can also be generated with the camera from different positions.
- the 2D camera is attached to the industrial robot, e.g. to the flange or to an axis of the industrial robot, and is thus brought into the at least two positions by moving the industrial robot, i.e. by moving its axes.
- the placement of the camera on the industrial robot is known, so that the coordinates of the camera relative to the industrial robot at the positions are also known, or can be calculated, from the axis positions of the axes of the industrial robot.
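The chaining of known poses described here can be sketched with homogeneous transformation matrices. The following is a minimal illustration, not from the patent: the flange pose (a hypothetical, hand-written pose standing in for the result of forward kinematics over the axis positions) is composed with the known camera mounting offset to obtain the camera pose in robot base coordinates.

```python
import numpy as np

def transform(rotation_deg_z, translation):
    """Build a 4x4 homogeneous transform: rotation about z, then translation.
    (Illustrative only; a real flange pose comes from full forward kinematics.)"""
    a = np.radians(rotation_deg_z)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(a), -np.sin(a), 0],
                 [np.sin(a),  np.cos(a), 0],
                 [0,          0,         1]]
    T[:3, 3] = translation
    return T

# Hypothetical poses: flange pose from the robot's axis positions, and the
# known, fixed mounting of the camera on the flange.
T_base_flange = transform(90, [0.8, 0.2, 1.1])   # assumed forward-kinematics result
T_flange_camera = transform(0, [0.0, 0.0, 0.1])  # known camera mounting offset

# Chaining the transforms yields the camera pose in robot base coordinates.
T_base_camera = T_base_flange @ T_flange_camera
camera_position = T_base_camera[:3, 3]
```

With this composition, the camera coordinates for each image can be derived from the axis positions at the moment the image was taken.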
- the corresponding images are displayed with the display device and the graphical model of the object is superimposed on the images. It is possible that initially only one of the images is displayed and the model is superimposed on this image. It is, however, also possible to display all images at the same time and to superimpose the model on all of them.
- the graphical model need not necessarily be a complete model of the object.
- the model can also be only a partial model of the object.
- the model may also be a so-called graphical wireframe model or a partial graphical wireframe model of the object.
- a wireframe model, referred to in English as a "wireframe" model, is used in CAD especially to model three-dimensional objects; surfaces of the object are shown in the wireframe model as lines, and it is also possible to visualize only edges. If the wireframe model is merely a partial wireframe model, then it includes e.g. only some of these lines, in particular particularly prominent lines such as prominent edges or corners of the object.
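As an illustration only (the vertices and edges below are hypothetical, not taken from the patent), such a partial wireframe model can be represented as a list of corner points plus the subset of prominent edges:

```python
import math

# A partial wireframe model reduced to a few prominent corners and edges,
# as described above: vertices in object coordinates, edges as index pairs.
# (Hypothetical numbers for illustration.)
partial_model = {
    "vertices": [
        (0.0, 0.0, 0.0),   # corner 0
        (0.4, 0.0, 0.0),   # corner 1
        (0.4, 0.3, 0.0),   # corner 2
        (0.0, 0.3, 0.0),   # corner 3
        (0.0, 0.0, 0.25),  # corner 4, above corner 0
    ],
    # only the prominent edges are kept: a *partial* wireframe model
    "edges": [(0, 1), (1, 2), (2, 3), (3, 0), (0, 4)],
}

def edge_length(model, edge):
    """Euclidean length of one wireframe edge."""
    a = model["vertices"][edge[0]]
    b = model["vertices"][edge[1]]
    return math.dist(a, b)
```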
- model points of the model are assigned to corresponding pixels in the images.
- the assignment of model points and pixels can be done, for example, by means of a pointing device with which, for example, a corner point of the wireframe or partial wireframe model is selected.
- the selection is made, for example, by means of the "object snap" method known from CAD techniques.
- the pointing device is for example a computer mouse. If the model point is selected in the model, then it can be manually dragged with the pointing device to the corresponding pixel in one of the images.
- a manual assignment has the advantage that a human can detect corner points depicted in the images relatively easily, for example on the basis of converging edges or shading.
- the possibility of enlarging an image section comprising the relevant pixel, or of highlighting edges in the image, can support the accuracy of the assignment, as a result of which a possible error in the assignment can be reduced.
- the assignment of the model and pixels can also be done with a so-called six-dimensional mouse (Spacemouse).
- a space mouse is an input device that can be used to change six degrees of freedom simultaneously. It can be used in the assignment of the model and pixels, for example, to move the position of the displayed model relative to the image of the object until a desired coverage is achieved.
- at least four different pixels are necessary if the industrial robot has six degrees of freedom and the distances between the camera and the object at the two positions are not known, which will usually be the case. In any case it is necessary to assign at least two different pixels in at least two images to the model, it being preferred to use an equal number of pixels per image. More than the mathematically necessary minimum number of point assignments can be made. If an operator specifies the required accuracy of the relative position to be determined, then after each additional point assignment the achieved accuracy can be checked and reported until the required accuracy is met.
- the position of the industrial robot relative to the object can be determined, since in addition the positions of the camera and the placement of the camera on the industrial robot are known.
- the position of the object relative to the industrial robot may then be determined e.g. by solving a regular or overdetermined system of equations with full rank, by means of which a so-called 6 DOF ("six degrees of freedom") transformation can be performed. The model can then also be displayed in the images according to the solved transformation.
- the model can be positioned in the images in such a way that it coincides with the overlapping points.
- the transformation can be done, for example, as follows:
- a selected pixel B_i can be represented in the two-dimensional coordinate system of the camera in homogeneous coordinate notation as B_i = (u_i, v_i, 1)^T.
- the corresponding model point P_i in the three-dimensional coordinate system of the model can be represented in homogeneous coordinate notation as P_i = (x_i, y_i, z_i, 1)^T.
- the transformation matrix for a transformation from the coordinate system of the object to the coordinate system of the image at the i-th position of the camera is T_i.
- the projection matrix Proj_i of the perspective projection is, e.g.:

      Proj_i = | 1  0  0      0 |
               | 0  1  0      0 |
               | 0  0  1      0 |
               | 0  0  1/d_z  0 |

- d_z is a camera-position-dependent distance parameter of the position of the camera at the i-th position.
- the distance parameter d_z corresponds to the distance between the focal point of the camera and the projection plane (image plane) of the perspective projection, as described, for example, in James D. Foley et al., "Computer Graphics: Principles and Practice", Addison-Wesley Publishing Company, Reading, Massachusetts, 1992, p.
- k is the vector of the fourth row of the projection matrix Proj_i, in which the distance parameter d_z appears.
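The perspective projection with the distance parameter d_z can be sketched as follows; the matrix is the standard perspective projection in homogeneous coordinates, and the point coordinates are illustrative assumptions:

```python
import numpy as np

def project(point_camera, d_z):
    """Perspective projection of a 3D point (camera coordinates) onto the
    image plane at distance d_z, using homogeneous coordinates."""
    proj = np.array([[1, 0, 0,       0],
                     [0, 1, 0,       0],
                     [0, 0, 1,       0],
                     [0, 0, 1 / d_z, 0]], dtype=float)
    p = np.append(point_camera, 1.0)  # homogeneous model point
    b = proj @ p
    b /= b[3]                         # dehomogenize
    return b[:2]                      # pixel coordinates (u, v)

# a point at depth z = 1.0 seen through an image plane at d_z = 0.5
# is scaled by d_z / z = 0.5
u, v = project(np.array([0.2, -0.1, 1.0]), d_z=0.5)
```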
- the position of the object relative to the industrial robot can finally be determined by means of optimization, in particular non-linear optimization, such as Gauss-Newton or Levenberg-Marquardt.
- for the non-linear optimization, for example, the following objective function f(x) can be set up:
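A plausible concrete form of such an objective function is the stacked reprojection error of all assigned point pairs over a 6-DOF pose x, minimized here with a plain Gauss-Newton iteration; the point data are hypothetical and this is a sketch, not the patent's exact formulation:

```python
import numpy as np

def pose_matrix(params):
    """Rotation (ZYX Euler angles) and translation from a 6-vector
    (rx, ry, rz, tx, ty, tz): the six degrees of freedom of the pose."""
    rx, ry, rz = params[:3]
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx, np.asarray(params[3:], dtype=float)

def reproject(params, model_points, d_z):
    """Perspective reprojection of the model points under the pose."""
    R, t = pose_matrix(params)
    cam = model_points @ R.T + t
    return d_z * cam[:, :2] / cam[:, 2:3]

def residuals(params, model_points, pixels, d_z):
    # objective f(x): stacked reprojection errors over all assigned pairs
    return (reproject(params, model_points, d_z) - pixels).ravel()

def gauss_newton(params, model_points, pixels, d_z, iters=50, eps=1e-6):
    """Plain Gauss-Newton with a numeric Jacobian."""
    x = np.asarray(params, dtype=float)
    for _ in range(iters):
        r = residuals(x, model_points, pixels, d_z)
        J = np.empty((r.size, x.size))
        for j in range(x.size):
            dx = np.zeros_like(x)
            dx[j] = eps
            J[:, j] = (residuals(x + dx, model_points, pixels, d_z) - r) / eps
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + step
        if np.linalg.norm(step) < 1e-10:
            break
    return x

# Hypothetical data: four model points and their pixels under a known pose.
model_points = np.array([[0.0, 0.0, 0.0], [0.3, 0.0, 0.0],
                         [0.0, 0.2, 0.0], [0.1, 0.1, 0.15]])
true_pose = np.array([0.05, -0.02, 0.1, 0.01, -0.03, 1.2])
pixels = reproject(true_pose, model_points, d_z=0.5)
est = gauss_newton([0, 0, 0, 0, 0, 1.0], model_points, pixels, d_z=0.5)
```

A Levenberg-Marquardt variant would add a damping term to the normal equations; the fixed point of either iteration is a pose whose reprojection errors vanish.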
- one embodiment additionally has the following method steps:
- the first two points are assigned, and subsequently the model displayed in the image is, in particular automatically, shifted such that the two first points coincide.
- a displacement is in particular a translatory displacement, a tilting or a rotation of the superimposed model.
- the two overlapping points are locked.
- the locking ensures that the superimposed model in the image can at most be rotated or tilted around the locked points.
- the next pair of points, i.e. a model point of the displayed model and the corresponding pixel in one of the images, is then assigned, and the displayed model is shifted in such a way that this pair of points also coincides.
- This pair of points is in turn locked. Again, it may be provided to move the displayed model in all images.
- the assignment of pairs of points is then continued until the position of the industrial robot relative to the object can be determined. This is possible, for example, when the displayed model coincides with the imaged object in all images.
- an automatic scale adaptation of the displayed model is carried out on the basis of a manual assignment of one of the image points. This is necessary when the size of the displayed model differs from the size of the imaged object, which will usually be the case.
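The automatic shifting and scale adaptation can be sketched as a least-squares similarity fit (scale, rotation and translation in the image plane) of the already assigned model points onto their pixels; this Umeyama-style alignment is one plausible implementation, not necessarily the patent's:

```python
import numpy as np

def fit_similarity_2d(src, dst):
    """Least-squares scale s, rotation R, translation t with dst ≈ s*R@src + t
    (aligning the displayed model points onto the assigned image points)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    S, D = src - mu_s, dst - mu_d
    # cross-covariance of the centered point sets
    U, sig, Vt = np.linalg.svd(D.T @ S / len(src))
    sign = np.sign(np.linalg.det(U @ Vt))      # guard against reflections
    R = U @ np.diag([1.0, sign]) @ Vt
    scale = (sig[0] + sign * sig[1]) / (S ** 2).sum() * len(src)
    t = mu_d - scale * R @ mu_s
    return scale, R, t

# hypothetical locked point pairs: model points in screen coordinates
# and their assigned pixels (related by scale 2, rotation 30 deg, shift)
src = [(0, 0), (1, 0), (0, 1), (1, 1)]
ang = np.pi / 6
R0 = np.array([[np.cos(ang), -np.sin(ang)], [np.sin(ang), np.cos(ang)]])
dst = (2.0 * np.asarray(src) @ R0.T) + np.array([10.0, 20.0])
scale, R, t = fit_similarity_2d(src, dst)
```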
- lines and / or surfaces of the model can also be manually assigned to corresponding lines or surfaces in at least one of the images.
- a line is, for example, an edge.
- a line can, for example, be selected in the model, in particular in the wireframe model or in the partial wireframe model, with the pointing device using the "object snap" method known from the CAD world.
- the selected edge can then be dragged onto the corresponding edge in the image.
- the edges in the image are identified, for example, with the image-data-processing method "edge extraction". If the pointer of the pointing device is brought near such an edge, the "snap-to-line" function known from the CAD world can assist the operator.
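The snap-to-line assistance can be sketched as snapping the pointer to the nearest previously extracted edge pixel within a small radius; the edge pixels below are hypothetical:

```python
import math

def snap_to_edge(pointer, edge_pixels, radius=10.0):
    """Return the detected edge pixel closest to the pointer, or the pointer
    itself if no edge pixel lies within the snap radius."""
    best, best_d = pointer, radius
    for p in edge_pixels:
        d = math.dist(pointer, p)
        if d < best_d:
            best, best_d = p, d
    return best

# hypothetical edge pixels produced by an edge-extraction step
edges = [(100, 50), (101, 50), (102, 51), (300, 200)]
snapped = snap_to_edge((103, 49), edges)
```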
- in a variant of the inventive method, the 2D camera is not attached to the industrial robot; instead, the object is attached to the industrial robot.
- the 2D camera is then immovable relative to a base coordinate system of the industrial robot, and the object is moved by means of the industrial robot into at least two mutually different positions.
- the position of the industrial robot relative to the object can then be determined on the basis of the assigned model points of the model to the corresponding pixels in the images, the positions of the object assigned to the images and the position of the camera relative to the basic coordinate system of the industrial robot.
- the position of the flange of the industrial robot relative to the object can be determined. Since the position of the flange relative to the base coordinate system of the industrial robot is known, the position of the industrial robot relative to the object can be determined via the relative position of the flange to the object.
- in a further variant, the object is arranged on a table top which is movable with respect to a reference point that is immovable relative to the environment of the industrial robot.
- the camera is attached to the industrial robot or set immobile relative to a basic coordinate system of the industrial robot. Moving the industrial robot or the tabletop results in the two positions for which the two two-dimensional image data sets are generated.
- to determine the position of the object relative to the industrial robot, first the position of the object relative to the table top is determined on the basis of the model points assigned to the corresponding pixels in the images, the positions of the table top relative to the industrial robot associated with the images, and the position of the camera relative to the industrial robot.
- the result is an intuitive, interactive, flexible, semi-automatic position-detection system, largely independent of the object's shape, using a simple 2D camera.
- the method according to the invention requires no learning of object features, as is necessary in conventional image-processing solutions; such learning involves a disproportionately large amount of time, especially in the small-series production of many different parts.
- the inventive method makes use of the human operator's spatial imagination of the object to be measured.
- the operator can communicate his knowledge of the 3D geometry of the object to the transformation calculation algorithm.
- the object of the invention is also achieved by an industrial robot, comprising
- drives, a control device for controlling the drives, a 2D camera for generating a two-dimensional image data set, wherein the camera is attached to the industrial robot in such a way that it can be moved by the industrial robot,
- a graphical model stored in the control device, which is at least a partial model of an object and is described in coordinates relative to coordinates of the industrial robot,
- a display device for displaying images associated with the image data sets generated by the camera and for superimposing the model on the displayed images, and
- an input device for manually assigning points of the graphical model to points in the images
- the industrial robot is set up such that the method according to the invention can be carried out with it in order to determine the position of the object relative to the industrial robot when the object is immovable with respect to the environment of the industrial robot, or is arranged on a table top that is movable with respect to a reference point that is immovable relative to the environment of the industrial robot.
- the object of the invention is also achieved by an industrial robot, comprising
- a 2D camera for generating a two-dimensional image data set, the camera being immovable with respect to a base coordinate system of the industrial robot; a graphical model stored in the control device, which is at least a partial model of an object and is described in coordinates relative to coordinates of the industrial robot; a display device for displaying images associated with the image data sets generated by the camera and for superimposing the model on the displayed images; and
- an input device for manually assigning points of the graphic model to points in the images
- the industrial robot is set up such that the method according to the invention can be carried out with it in order to determine the position of the physical object relative to the industrial robot when the physical object is fastened to the industrial robot and can be moved therewith.
- the input device is, for example, a pointing device or a space mouse.
- FIGS. 5-7 show further images of the engine block of FIG. 1, and
- FIGS. 8-10 show further industrial robots.
- Fig. 1 shows a 6-axis industrial robot 1 with kinematics for movements in six degrees of freedom, and an object that is immovable with respect to the environment of the industrial robot 1, which in the case of the present embodiment is an engine block M.
- the industrial robot 1 has joints 2 - 4, levers 5 - 6, six movement axes A1 - A6 and a flange F in a generally known manner. Each of the axes A1 - A6 is moved by a drive.
- the drives are electric drives, each having an electric motor 7-12.
- the motor 7 moves the axis A1, the motor 8 the axis A2, the motor 9 the axis A3, and the motors 10 - 12 the axes A4 - A6, via gears that are not shown in detail in Fig. 1 but are well known to the person skilled in the art.
- the electric drives or the electric motors 7 - 12 are connected in a manner not shown to a control computer 15, on which runs a suitable computer program, known in principle to the person skilled in the art, that controls the movements of the industrial robot 1.
- in this context, the term "control" is also intended to include regulation.
- a CAD (Computer Aided Design) model 16 of the engine block M is stored in the control computer 15, as shown in FIGS. 2 and 3.
- the model 16 was created, in the case of the present exemplary embodiment, in a generally known manner by means of a CAD program, and can be viewed by a person (not shown in the figures) by means of a monitor 14 connected to the control computer 15.
- the model 16 is a partial model of the engine block M, in particular a partial wireframe model.
- a wireframe model, which is referred to in English as a "wireframe" model, models in CAD in particular three-dimensional objects, such as, in the case of the present embodiment, the engine block M.
- surfaces of the object are shown in the wireframe model as lines; in the present exemplary embodiment, only some corners and edges of the engine block M are modeled by means of the model 16.
- for machining the engine block M by means of the industrial robot 1, the industrial robot 1 must be calibrated to the engine block M so that the coordinate system of the model 16 is registered with the coordinate system of the engine block M, i.e. the position of the engine block M relative to the industrial robot 1 is determined.
- the determination of the position of the engine block M with respect to the industrial robot 1 is illustrated by means of a flow chart shown in FIG. 4.
- a 2D camera 17 connected in a manner not shown to the control computer 15 is fastened to the flange F of the industrial robot 1.
- the 2D camera 17 is, for example, a CCD sensor or a well-known digital camera.
- the position of the camera 17 on the industrial robot 1 is known.
- the purpose of the camera 17 is to provide at least two 2D images of the engine block M, for which the industrial robot 1 moves the camera 17 into two different positions.
- if the camera 17 is in the respective position, it generates in each case an image data record; the associated images 20, 30 are shown in FIGS. 2 and 3 (step S2 of the flowchart).
- the images 20, 30 are images of the engine block M, wherein in the case of the present embodiment substantially all of the engine block M is depicted in each of the images 20, 30. However, this is not absolutely necessary; in at least one of the images, only part of the engine block M may be shown.
- once the images 20, 30 have been generated, they are, in the case of the present embodiment, displayed simultaneously on the monitor 14.
- the model 16 is displayed in each of the images 20, 30, step S3 of the flowchart.
- the input device is a pointing device in the form of a computer mouse 13
- the point 21A of the model 16 represents an edge of the engine block M.
- the point 21A is selected using the so-called "object snap" method known from computer graphics. Subsequently, the person selects the point 21B corresponding to the point 21A in the first image 20 of the engine block M.
- a computer program running on the control computer 15 automatically moves the model 16 displayed in the images 20, 30 such that the points 21A, 21B coincide in both images 20, 30, which is indicated by an arrow A.
- subsequently, the overlapping points 21A, 21B are locked, step S5 of the flowchart.
- the person then selects a further point 22A of the model 16 and a point 22B corresponding to the selected point 22A in the first image 20. The computer program running on the control computer 15 then automatically moves the model 16 displayed in the images 20, 30 such that the points 22A, 22B overlap in both images 20, 30, which is indicated by an arrow B.
- the model 16 displayed in the images 20, 30 is now shifted such that the points 21A and 21B and the points 22A and 22B overlap and are also locked, step S6 of the flow chart.
- the person selects in the second image 30 further corresponding points 31A, 31B and 32A, 32B.
- the computer program running on the control computer 15 again moves the model 16 shown in the images 20, 30 in such a way that the pair of points 31A and 31B and the pair of points 32A and 32B each overlap, which is indicated by arrows C and D.
- the computer program running on the control computer 15 is designed such that it automatically adapts the size of the model 16 displayed in the images 20, 30, if this is necessary as a result of an assignment of a pair of points, so that the selected pairs of points can be brought to overlap.
- the industrial robot 1 has six degrees of freedom.
- the respective distances between the camera 17 and the engine block M are unknown at the two positions. Accordingly, at least four different pairs of points in the two images 20, 30 must be assigned for the calculation of the position of the engine block M relative to the industrial robot 1.
- the position can then be determined, e.g., by solving a regular system of equations with full rank if there are exactly four different pairs of points, or an overdetermined system of equations if there are more than four different pairs of points, by means of which a so-called 6 DOF ("six degrees of freedom") transformation can be performed.
- the transformation can be done, for example, as follows:
- a selected pixel B_i can be represented in the two-dimensional coordinate system of the camera 17 in homogeneous coordinate notation as B_i = (u_i, v_i, 1)^T.
- the corresponding model point P_i in the three-dimensional coordinate system of the model 16 can be represented in homogeneous coordinate notation as P_i = (x_i, y_i, z_i, 1)^T.
- the transformation matrix for a transformation from the coordinate system of the engine block M to the coordinate system of the first image 20 at the first position of the camera 17 is T_1.
- the projection matrix Proj_i for the projection of the coordinates of the model 16 onto the coordinates of the i-th image is, e.g.:

      Proj_i = | 1  0  0      0 |
               | 0  1  0      0 |
               | 0  0  1      0 |
               | 0  0  1/d_i  0 |

- d_i is a camera-position-dependent distance parameter of the position of the camera 17 at the i-th position,
- i.e. d_1 is the distance parameter associated with the distance between the camera 17 and the engine block M at the first position, and
- d_2 is the distance parameter associated with the distance between the camera 17 and the engine block M at the second position.
- the distance parameter d_i corresponds, for example, to the distance between the focal point of the camera 17 and the projection plane (image plane) of the perspective projection.
- k is the vector of the fourth row of the projection matrix Proj_i, in which the distance parameter d_i appears.
- the position of the engine block M relative to the industrial robot 1 is finally determined by means of optimization, in particular non-linear optimization, such as Gauss-Newton or Levenberg-Marquardt.
- for the non-linear optimization, e.g., the following objective function f(x) can be set up:
- the input device used was a pointing device in the form of the computer mouse 13.
- other input devices may also be used, such as a space mouse.
- lines or areas of the model 16 can be assigned to corresponding lines or areas in the images of the object.
- FIGS. 5 and 6 show, by way of example, two images 50, 60 taken from the two positions by means of the camera 17, on which, in turn, a model 16a of the engine block M is superimposed.
- the image 50 corresponds to the image 20 and the image 60 corresponds to the image 30.
- the model 16a is also a partial wireframe model of the engine block M and, in the case of the present embodiment, differs slightly from the model 16 of FIGS. 2 and 3.
- in the model 16a, the person does not select individual points 21A, 22A, 31A, 32A with the computer mouse 13, but rather lines 51A and 52A in the first image 50 and a line 61A in the second image 60.
- in the case of the present embodiment, the lines 51A, 52A and 61A correspond to edges 51B, 52B and 61B of the engine block M, which the person selects in the images 50, 60.
- suitable image-processing algorithms are, e.g., edge extraction or the Sobel operator. Detected edges can also be divided into straight sections.
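The Sobel operator mentioned above can be sketched in a few lines of plain code; the synthetic image is an assumption for illustration:

```python
import numpy as np

def sobel_magnitude(img):
    """Gradient magnitude of a grayscale image using the Sobel operator."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            gx = (patch * kx).sum()  # horizontal gradient
            gy = (patch * ky).sum()  # vertical gradient
            out[y, x] = np.hypot(gx, gy)
    return out

# a vertical step edge between columns 3 and 4 of a synthetic image
img = np.zeros((6, 8))
img[:, 4:] = 1.0
mag = sobel_magnitude(img)
```

The response is large only at the step edge; thresholding the magnitude image yields the edge pixels that can then be split into straight sections.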
- a relevant line 51A, 52A or 61A in the model 16a is selected with the computer mouse 13 using the "object snap" method known from the CAD world.
- the selected lines 51A, 52A, 61A, 62A and 63A are then dragged onto the corresponding edges 51B, 52B and 61B in the images 50, 60.
- the edges 51B, 52B and 61B in the images 50, 60 are, e.g., identified with the image-data-processing method "edge extraction". If a pointer moved by means of the computer mouse 13 is brought near such an edge, the "snap-to-line" function known from the CAD world can support the person.
- the model 16b is a partial model of the engine block M and in particular shows surfaces 71A, 72A, which are assigned to the surfaces 71B and 72B of the engine block M.
- the surfaces 71B and 72B of the engine block M are recesses of the engine block M in the case of the present embodiment.
- the camera 17 is attached to the flange F of the industrial robot 1.
- the camera 17 may also be attached to one of the axes A1 - A6, as long as it is moved by the industrial robot 1.
- FIG. 8 shows another industrial robot 81. Unless expressly mentioned otherwise, components of the industrial robot 81 shown in FIG. 8 that are functionally identical to components of the industrial robot 1 shown in FIG. 1 are provided with the same reference numerals.
- the two industrial robots 1 and 81 are essentially identical. Instead of the camera 17, however, an object 82 is fastened to the flange F of the industrial robot 81, the position of which, in particular its orientation relative to the industrial robot 81, is to be determined. To achieve this, in the case of the present exemplary embodiment a camera 83 is placed on the ground, for example on a stand 84, immovably with respect to the surroundings of the industrial robot 81, and is connected in a manner not shown to the control computer 15 of the industrial robot 81.
- the object 82 is brought into at least two different positions by means of the industrial robot 81, and a 2D image of the object 82 is generated for each position.
- the images are then displayed on the monitor 14.
- a model of the object 82 is superimposed on the images, and then, as for the first embodiments, the position of the object 82 relative to the industrial robot 81 is calculated.
- FIG. 9 again shows the industrial robot 1.
- the engine block M rests on a table top P of a table 90.
- the table top P of the table 90 is pivotable about an axis 92 relative to the table foot 91 by means of a motor, not shown.
- the motor of the table 90 is connected in a manner not shown to the control computer 15 and is also controlled by this, so that the position of the table top P relative to the table foot 91 is known. Furthermore, the position of the table 90 or its table foot 91 relative to the industrial robot 1 is known. Information about this position is stored in the control computer 15.
- the position of the engine block M on the table top P is unknown, since the engine block M has been placed essentially arbitrarily on the table top P.
- the position of the engine block M is initially determined relative to the table top P. Once this has been determined, the position of the engine block M relative to the industrial robot 1 can also be determined, since the position of the table top P relative to the table foot 91 and the position of the table foot 91 relative to the industrial robot 1 are known.
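The composition of the known transformations (engine block to table top, table top to table foot, table foot to robot) can be sketched with homogeneous matrices; for brevity the sketch uses planar 3x3 transforms and hypothetical values:

```python
import numpy as np

def planar_transform(angle_deg, tx, ty):
    """3x3 homogeneous transform in the plane (rotation plus translation).
    Illustrative stand-in for the full 3D poses described in the text."""
    a = np.radians(angle_deg)
    return np.array([[np.cos(a), -np.sin(a), tx],
                     [np.sin(a),  np.cos(a), ty],
                     [0,          0,         1]])

# Hypothetical known transforms, mirroring the chain in the text:
T_robot_foot = planar_transform(0, 2.0, 0.0)   # table foot in robot coordinates
T_foot_top = planar_transform(30, 0.0, 0.0)    # current pivot angle of the table top
T_top_block = planar_transform(0, 0.5, 0.1)    # engine block pose found from the images

# Composing the chain yields the engine block pose relative to the robot.
T_robot_block = T_robot_foot @ T_foot_top @ T_top_block
block_position = T_robot_block[:2, 2]
```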
- images are taken with the camera 17 from two different positions.
- the two positions result from a movement of the industrial robot 1 or its flange F and / or by a pivoting of the table top P with respect to the axis 92.
- the model 16 of the engine block M is superimposed on the recorded images, and points of the model 16 are assigned to points of the image of the engine block M. Based on this assignment, which is carried out analogously to the assignment of the pairs of points shown in FIGS. 2 and 3, the position of the engine block M relative to the table top P can subsequently be calculated. This calculation is carried out analogously to the calculation of the position of the engine block M relative to the industrial robot 1 in the scenario shown in FIG. 1.
- the camera may also be fixedly mounted on a stand 84, similar to the scenario illustrated in FIG. 8.
- a scenario is shown in FIG. 10, in which the camera has the reference numeral 83.
- by pivoting the table top P about the axis 92, two different positions can be set, for which the camera 83 takes pictures of the engine block M. Subsequently, the position of the engine block M relative to the table top P can be determined as in the scenario shown in FIG. 9.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102007009851A DE102007009851B3 (de) | 2007-02-28 | 2007-02-28 | Industrieroboter und Verfahren zum Bestimmen der Lage eines Industrieroboters relativ zu einem Objekt |
PCT/EP2008/050849 WO2008104426A2 (de) | 2007-02-28 | 2008-01-25 | Industrieroboter und verfahren zum bestimmen der lage eines industrieroboters relativ zu einem objekt |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2122424A2 true EP2122424A2 (de) | 2009-11-25 |
Family
ID=39326698
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP08708188A Ceased EP2122424A2 (de) | 2007-02-28 | 2008-01-25 | Industrieroboter und verfahren zum bestimmen der lage eines industrieroboters relativ zu einem objekt |
Country Status (4)
Country | Link |
---|---|
US (1) | US20110037839A1 (de) |
EP (1) | EP2122424A2 (de) |
DE (1) | DE102007009851B3 (de) |
WO (1) | WO2008104426A2 (de) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8923602B2 (en) | 2008-07-22 | 2014-12-30 | Comau, Inc. | Automated guidance and recognition system and method of the same |
EP2685403A3 (de) | 2012-07-09 | 2017-03-01 | Technion Research & Development Foundation Limited | Natürliches Maschinenschnittstellensystem |
US9214021B2 (en) * | 2012-10-09 | 2015-12-15 | The Boeing Company | Distributed position identification |
US10078330B2 (en) | 2016-03-25 | 2018-09-18 | International Business Machines Corporation | Coordinating robotic apparatus deliveries |
CN108810425B (zh) * | 2017-05-02 | 2024-06-11 | 北京米文动力科技有限公司 | 一种摄像头配置方法及装置 |
DE102019107417A1 (de) * | 2019-03-22 | 2020-09-24 | Günther Battenberg | Verfahren zur Durchführung von zumindest einem Tätigkeitsprozess mittels eines Roboters |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4233625A (en) * | 1978-11-03 | 1980-11-11 | Teledyne, Inc. | Television monitoring system for automatically aligning semiconductor devices during manufacture |
US4853771A (en) * | 1986-07-09 | 1989-08-01 | The United States Of America As Represented By The Secretary Of The Navy | Robotic vision system |
US6175415B1 (en) * | 1997-02-19 | 2001-01-16 | United Technologies Corporation | Optical profile sensor |
GB9803364D0 (en) | 1998-02-18 | 1998-04-15 | Armstrong Healthcare Ltd | Improvements in or relating to a method of an apparatus for registering a robot |
JP3421608B2 (ja) * | 1999-04-08 | 2003-06-30 | ファナック株式会社 | 教示モデル生成装置 |
DE10159574B9 (de) * | 2001-10-15 | 2009-04-30 | Tropf, Hermann, Dr.-Ing. | Vorrichtung und Verfahren zur Korrektur der Bewegung von Greif- und Bearbeitungswerkzeugen |
US7233841B2 (en) * | 2002-04-19 | 2007-06-19 | Applied Materials, Inc. | Vision system |
DE10249786A1 (de) * | 2002-10-24 | 2004-05-13 | Medical Intelligence Medizintechnik Gmbh | Referenzierung eines Roboters zu einem Werkstück und Vorrichtung hierfür |
SE524818C2 (sv) * | 2003-02-13 | 2004-10-05 | Abb Ab | En metod och ett system för att programmera en industrirobot att förflytta sig relativt definierade positioner på ett objekt |
JP3708083B2 (ja) * | 2003-02-28 | 2005-10-19 | ファナック株式会社 | ロボット教示装置 |
US20050105791A1 (en) * | 2003-10-29 | 2005-05-19 | Lee Ken K. | Surface inspection method |
DE10351669B4 (de) * | 2003-11-05 | 2012-09-13 | Kuka Laboratories Gmbh | Verfahren und Vorrichtung zum Steuern eines Handhabungsgeräts relativ zu einem Objekt |
DE102004006596B4 (de) * | 2004-02-10 | 2007-02-15 | Vision Tools Bildanalyse Systeme Gmbh | Handeingabe von Werkstückposen |
US20080011313A1 (en) * | 2005-10-24 | 2008-01-17 | Philip Gildenberg | System and method for robotic assisted wig construction |
DE102005058867B4 (de) * | 2005-12-09 | 2018-09-27 | Cine-Tv Broadcast Systems Gmbh | Verfahren und Vorrichtung zum Bewegen einer auf einem Schwenk- und Neigekopf angeordneten Kamera entlang einer vorgegebenen Bewegungsbahn |
-
2007
- 2007-02-28 DE DE102007009851A patent/DE102007009851B3/de not_active Expired - Fee Related
-
2008
- 2008-01-25 EP EP08708188A patent/EP2122424A2/de not_active Ceased
- 2008-01-25 US US12/528,549 patent/US20110037839A1/en not_active Abandoned
- 2008-01-25 WO PCT/EP2008/050849 patent/WO2008104426A2/de active Application Filing
Non-Patent Citations (2)
Title |
---|
KIM W S: "Virtual reality calibration for telerobotic servicing", ROBOTICS AND AUTOMATION, 1994. PROCEEDINGS., 1994 IEEE INTERNATIONAL CONFERENCE ON, SAN DIEGO, CA, USA, 8-13 MAY 1994, LOS ALAMITOS, CA, USA, IEEE COMPUT. SOC, 8 May 1994 (1994-05-08), pages 2769 - 2775, XP010097336, ISBN: 978-0-8186-5330-8, DOI: 10.1109/ROBOT.1994.350918 * |
RAINER BISCHOFF ET AL: "Concepts, Tools and Devices for Facilitating Human-Robot Interaction with Industrial Robots through Augmented Reality", Industrial AR, ISMAR 2006, 22 October 2006 (2006-10-22), XP055191766, Retrieved from the Internet <URL:http://ismar06.tinmith.net/data/2a-KUKA.pdf> [retrieved on 20150528] * |
Also Published As
Publication number | Publication date |
---|---|
DE102007009851B3 (de) | 2008-05-29 |
US20110037839A1 (en) | 2011-02-17 |
WO2008104426A2 (de) | 2008-09-04 |
WO2008104426A3 (de) | 2008-11-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE102019006800B4 (de) | Robotersteuerung und Anzeigevorrichtung unter Verwendung von erweiterter Realität und gemischter Realität | |
DE102018109463B3 (de) | Verfahren zur Benutzung einer mehrgliedrigen aktuierten Kinematik, vorzugsweise eines Roboters, besonders vorzugsweise eines Knickarmroboters, durch einen Benutzer mittels einer mobilen Anzeigevorrichtung | |
DE102015002760B4 (de) | Robotersimulationssystem, das den Prozess des Entnehmens von Werkstücken simuliert | |
DE102018009023B4 (de) | Einlernvorrichtung zum Ausführen von Robotereinlernvorgängen und Einlernverfahren | |
EP2879842B1 (de) | Verfahren und programmiermittel zur modifikation einer roboterbahn | |
DE102007059478B4 (de) | Verfahren und System zur Ausrichtung eines virtuellen Modells an einem realen Objekt | |
DE102009012590A1 (de) | Vorrichtung zum Ermitteln der Stellung eines Roboterarms mit Kamera zur Durchführung von Aufnahmen | |
DE102007009851B3 (de) | Industrieroboter und Verfahren zum Bestimmen der Lage eines Industrieroboters relativ zu einem Objekt | |
DE102011083876A1 (de) | Verfahren zur Bewegungssteuerung einer Röntgenvorrichtung und Röntgensystem | |
EP3443908B1 (de) | Verfahren zum betreiben eines röntgengeräts mit einem gelenkarm und röntgengerät mit einem gelenkarm | |
DE102015000587A1 (de) | Roboterprogrammiervorrichtung zum Erstellen eines Roboterprogramms zum Aufnehmen eines Bilds eines Werkstücks | |
DE102006007623A1 (de) | Roboter mit einer Steuereinheit zum Steuern einer Bewegung zwischen einer Anfangspose und einer Endpose | |
DE102018113336A1 (de) | Verfahren zum Verwenden mit einer Maschine zum Einstellen einer Erweiterte-Realität-Anzeigeumgebung | |
DE102009020307A1 (de) | Simulator für eine Sichtprüfungsvorrichtung | |
DE102015104582A1 (de) | Verfahren zum Kalibrieren eines Roboters an einem Arbeitsbereich und System zum Durchführen des Verfahrens | |
DE112019007663T5 (de) | Bildbasierte steuerungskorrektur | |
DE102015104587A1 (de) | Verfahren zum Kalibrieren eines Roboters an einem Arbeitsbereich und System zum Durchführen des Verfahrens | |
EP3809094B1 (de) | Verfahren und anordnung zum visualisieren von sensorsignalen eines optischen sensors eines koordinatenmessgeräts sowie verfahren und anordnung zum visualisieren eines sensors eines koordinatenmessgeräts | |
DE102010036904A1 (de) | Haptische Messvorrichtung und Messverfahren | |
DE102020204677A1 (de) | Trackingsystem und Verfahren zur Kompensation von Sichtschatten bei der Nachverfolgung von Messobjekten | |
DE102004032996A1 (de) | Einfache Roboterprogrammierung | |
DE10149795B4 (de) | Semiautomatische Registrierung zur Überlagerung zweier medizinischer Bilddatensätze | |
DE102021133633B3 (de) | Verfahren zum Bereitstellen einer Greifvorrichtung | |
WO2002065888A2 (de) | Anordnung und verfahren zur sensorpositionierung | |
DE102004006596B4 (de) | Handeingabe von Werkstückposen |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20090904 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: SEDLMAYR, ANDREAS Inventor name: KURTH, JOHANNES |
|
17Q | First examination report despatched |
Effective date: 20091130 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: KUKA LABORATORIES GMBH |
|
DAX | Request for extension of the european patent (deleted) | ||
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: KUKA ROBOTER GMBH |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: KUKA DEUTSCHLAND GMBH |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R003 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
|
18R | Application refused |
Effective date: 20190210 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230528 |