EP3612355A1 - Method for creating a database with gripper poses, method for controlling a robot, computer-readable storage medium and handling system - Google Patents
Method for creating a database with gripper poses, method for controlling a robot, computer-readable storage medium and handling system
- Publication number
- EP3612355A1 (application EP18719839.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- gripper
- robot
- pose
- database
- poses
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 title claims abstract description 55
- 238000003860 storage Methods 0.000 title claims description 8
- 238000005007 materials handling Methods 0.000 title 1
- 238000012549 training Methods 0.000 claims abstract description 49
- 230000008447 perception Effects 0.000 claims description 13
- 239000011159 matrix material Substances 0.000 claims description 3
- 230000009466 transformation Effects 0.000 claims description 3
- 239000012636 effector Substances 0.000 claims description 2
- 238000013459 approach Methods 0.000 description 3
- 238000001514 detection method Methods 0.000 description 3
- 239000007787 solid Substances 0.000 description 2
- 238000010276 construction Methods 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 238000009826 distribution Methods 0.000 description 1
- 238000002474 experimental method Methods 0.000 description 1
- 230000005484 gravity Effects 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 238000005457 optimization Methods 0.000 description 1
- 238000012545 processing Methods 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
- 230000016776 visual perception Effects 0.000 description 1
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1612—Programme controls characterised by the hand, wrist, grip control
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39484—Locate, reach and grasp, visual guided grasping
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39543—Recognize object and plan hand shapes in grasping movements
Definitions
- The invention relates to a method for creating a database with gripper poses, a method for controlling a robot, a corresponding computer-readable storage medium, and a handling system.
- Modern handling systems, such as multi-axis robots, serve to transport objects from a first location to a second location. For this purpose, the robot must grip the object by means of a gripper and hold it securely so that it does not fall during transport.
- In the simplest case, the objects to be gripped are always in the same place with the same orientation, so that a gripping operation needs to be programmed only once and can then be executed repeatedly.
- the objects may be on
- Robotic systems may have perception systems, e.g. stereo camera systems, which enable the detection of the objects to be gripped.
- The dimensions and orientation of the object to be gripped can be determined by means of these perception systems, so that the points of application for the gripper can be derived.
- A pose is understood here as a representation comprising three spatial coordinates and three angles, which describes the position and orientation of a second coordinate system with respect to a reference coordinate system.
- The mass distribution of the object to be gripped usually cannot be recognized by means of a visual perception system.
- A capability map indicates from how many directions a point in the working space of the robot can be reached.
- The publication further describes that a database can be used which contains information on possible gripping actions for known objects.
- A disadvantage of the procedure described in the cited publication is that the creation of the database is very laborious. It is not sufficient to store only a single gripper pose for each object; rather, a large number of gripper poses must be manually recorded and stored in the database so that an object can be gripped reliably. Based on the described prior art, it is the object of the invention to provide a method for creating a database with gripper poses which addresses the above-mentioned disadvantages.
- The efficient filling of such a database is to be made possible.
- Furthermore, a method for controlling a robot is to be specified which allows efficient control of the robot.
- A corresponding computer-readable storage medium is to be specified.
- Finally, a handling system is to be specified which allows efficient gripping of an object.
- the object is achieved by a method for creating a database with gripper poses according to claim 1.
- The object is achieved by a method for creating a database with gripper poses, comprising: a. storing, in particular by recording, at least one training gripper pose for gripping a training object, in particular with respect to a training object coordinate system associated with the training object.
- One core of the invention is therefore that one or a plurality of further gripper poses are generated for a single training gripper pose. So only the training gripper pose has to be set manually. It is particularly advantageous that the further gripper poses are generated using object prototypes. Thus, the properties of the object prototypes can be used to determine the further gripper poses. The described method therefore allows the efficient storage of at least one further gripper pose in addition to the training gripper pose.
- the object is further achieved by a method for controlling a robot according to claim 2.
- the object is achieved by a method for controlling a robot, comprising:
- A pose, in particular the training gripper pose, may be represented as a six-tuple having three spatial coordinates and three angles relative to a training object coordinate system associated with the training object.
- A pose can be stored particularly efficiently in a data structure that indicates the position and orientation with respect to the training object.
- Such a gripper pose can then be used regardless of where the object to be gripped is located in space or in a robot coordinate system.
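- To make the six-tuple representation concrete, the following is a minimal sketch of such a pose data structure in Python (not taken from the patent; the field names and the conversion to a homogeneous transformation matrix are illustrative assumptions):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Pose:
    """Six-tuple pose relative to an object coordinate system:
    three spatial coordinates and three angles (roll, pitch, yaw)."""
    x: float
    y: float
    z: float
    roll: float   # rotation about the x axis, in radians
    pitch: float  # rotation about the y axis, in radians
    yaw: float    # rotation about the z axis, in radians

    def to_matrix(self) -> np.ndarray:
        """Return the 4x4 homogeneous transform of this pose (Rz * Ry * Rx convention)."""
        cr, sr = np.cos(self.roll), np.sin(self.roll)
        cp, sp = np.cos(self.pitch), np.sin(self.pitch)
        cy, sy = np.cos(self.yaw), np.sin(self.yaw)
        rot = np.array([
            [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
            [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
            [-sp,     cp * sr,                cp * cr],
        ])
        t = np.eye(4)
        t[:3, :3] = rot
        t[:3, 3] = [self.x, self.y, self.z]
        return t
```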
- the method may include selecting the at least one object prototype from a set of candidate prototypes.
- the set of candidate prototypes may include a cylinder, a sphere, a box, a cone, a truncated cone, and / or a pyramid.
- the at least one object prototype can thus be a geometric basic form, whereby a very flexible handling of the described method is possible.
- the object prototypes may be described by geometric parameters such as length, depth, width, height, and / or radius.
- the known geometry of the object prototype can then be utilized for automatic generation of the at least one further gripper pose.
- At least one symmetry axis may be associated with the at least one object prototype, wherein a plurality of gripper poses may be generated taking into account the at least one symmetry axis, in particular by rotation about the at least one symmetry axis, preferably using a transformation matrix, wherein the gripper poses can each be stored.
- The symmetry of the object prototype is thus used to generate the plurality of gripper poses.
- a rotationally symmetrical body can be gripped equally well using a plurality of poses.
- a large number of possible gripper poses can be generated, which deliver similarly good gripping results.
- An object prototype may be given in a CAD data format, such as IGES or STL.
- the CAD data format can directly or indirectly specify an axis of symmetry and / or a plane of symmetry that can be used to generate the further gripper poses.
- The training gripper pose and the at least one further gripper pose may each be stored with an object identification number that may be associated with the training object.
- The assignment of the training gripper pose and the at least one further gripper pose to an object identification number makes it possible to find a plurality of possible poses for gripping an object by querying the object database for a variety of possible gripper poses for a single object at once.
- Gripping parameters can be stored together with the poses in an object database, in particular gripper information, a gripping mode, a quality indication, a force, a gripping strategy and/or a grip description.
- The gripping parameters are stored in the object database and can preferably be assigned to individual poses.
- This is data that specifies the grip in more detail. For example, a force value can indicate with which force an object should be grasped.
- a gripping type can be stored.
- For a multi-finger gripper, for example, a gripping type can indicate that an object is to be grasped only with the fingertips of the multi-finger gripper.
- Gripper information may include information indicating a gripper type or even a particular gripper product.
- Gripper information can be stored in a gripper table or gripper database. For the poses, a reference to the corresponding entry in the gripper table or gripper database can then be stored.
- the corresponding entry may also include a CAD model of the gripper or performance parameters, such as a maximum application force.
- at least one rule may be provided which indicates that for each pose
- The rule can, for example, be implemented as part of the object database.
- a gripping strategy can specify how an object should be gripped.
- A quality indication may include indications as to whether reliable gripping is possible, i.e. whether experiments have shown that gripping with the corresponding pose is reliable.
- Storing the gripping parameters has the advantage that gripping an object is made easier.
- The method may include identifying the training object, in particular by means of a perception system, e.g. by means of a stereo camera.
- The training object is thus identified automatically, so that information about the training object can automatically be written into the object database. This makes the process even more efficient.
- The method may include storing object features for recognizing the training object, in particular geometric information such as corners and/or edges of the training object, preferably such that poses can be assigned to the object features.
- Object features may also include visual information. These include image descriptors, such as color and color intensity gradients. Texture information can also be stored as an object feature.
- a description of the training object can be stored in the database.
- This description may be implemented by object features.
- the position and orientation of corners and / or edges can be stored so that a perception system can find and / or identify an object on the basis of the stored object features.
- the object features may be stored so as to be associated with the stored gripper poses.
- This assignment can be implemented directly or indirectly. For example, all poses may be stored in a first database table. All objects can be stored in a second database table and all object features can be stored in a third database table. An assignment can then be executed using corresponding primary keys and references in the tables.
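- As an illustration of such a direct or indirect assignment, a relational layout with three tables linked by primary and foreign keys might look as follows (a sketch only; the patent does not prescribe a concrete schema, and all table and column names are assumptions):

```python
import sqlite3

conn = sqlite3.connect("object_database.db")
conn.executescript("""
CREATE TABLE objects (
    object_id   INTEGER PRIMARY KEY,                   -- object identification number
    name        TEXT
);
CREATE TABLE object_features (
    feature_id  INTEGER PRIMARY KEY,
    object_id   INTEGER REFERENCES objects(object_id),
    kind        TEXT,                                   -- e.g. 'corner', 'edge', 'texture'
    data        BLOB                                    -- serialized geometric or visual descriptor
);
CREATE TABLE gripper_poses (
    pose_id     INTEGER PRIMARY KEY,
    object_id   INTEGER REFERENCES objects(object_id),
    gripper     TEXT,                                   -- gripper type or reference into a gripper table
    approach    TEXT,                                   -- approach pose as a six-tuple, e.g. JSON
    grasp       TEXT,                                   -- gripper pose as a six-tuple, e.g. JSON
    force_n     REAL                                    -- optional gripping force in newtons
);
""")
conn.commit()
```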
- Suitable object features can be selected for this purpose.
- Likewise, a method for identifying the training object can be selected.
- For example, a perception algorithm can be selected which uses image sections for the identification of objects.
- An object can be identified in such a way that first a hypothesis is formed as to which object is in the view of a perception system.
- the method may include: an identification of a target object to be gripped, in particular of an object identification number assigned to the target object;
- An end effector is, for example, a parallel gripper of a robot.
- The described embodiment has the advantage that the gripping of an object by the robot is simplified. After identifying the object to be gripped, the stored gripper poses can be retrieved using an associated object identification number.
- The determining may comprise reading out a plurality of candidate poses, in particular from an object database, wherein the reading may be performed in consideration of a reachability map and/or a capability map of a robot.
- The described embodiment has the advantage that a target gripper pose can be selected from the plurality of candidate poses which leads to particularly good gripping results for a particular robot. This can be ensured via the reachability map and/or the capability map.
- A reachability map indicates which points in space can be reached by the robot, taking into account its kinematics.
- A capability map indicates from how many directions the robot can reach a point within its working space.
- This embodiment thus makes the gripping of a target object safer and easier.
- The determining may include sorting the one or more candidate poses, wherein the sorting may be performed considering the capability map.
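- A minimal sketch of such filtering and sorting, assuming a capability map that returns, for each position in the workspace, the number of directions from which it can be reached (function and parameter names are illustrative, and the Pose sketch from above is reused):

```python
from typing import Callable, Sequence, Tuple
import numpy as np

Position = Tuple[float, float, float]

def rank_candidate_poses(
    candidates: Sequence["Pose"],
    object_in_robot_frame: np.ndarray,            # 4x4 pose of the object in the robot frame
    capability_score: Callable[[Position], int],  # directions from which a point is reachable
) -> list:
    """Drop unreachable candidate poses and sort the rest, best first."""
    scored = []
    for pose in candidates:
        # Express the gripper pose (stored relative to the object) in the robot frame.
        gripper_in_robot = object_in_robot_frame @ pose.to_matrix()
        position = tuple(gripper_in_robot[:3, 3])
        score = capability_score(position)
        if score > 0:                              # 0 means the point cannot be reached at all
            scored.append((score, pose))
    scored.sort(key=lambda entry: entry[0], reverse=True)  # most reachable directions first
    return [pose for _, pose in scored]
```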
- The candidate poses are tried one after the other. If, for example, it turns out that a first candidate pose is not suitable, the method proceeds to a second candidate pose.
- The method may include motion planning to move a robot from an initial configuration into a target configuration, wherein the target configuration may be associated with a target gripper pose.
- The target gripper pose preferably corresponds to the candidate pose that has been identified as the most suitable pose during sorting.
- It may turn out, for example, that the most suitable pose cannot be reached, or only insufficiently well, given the current robot configuration, for example due to obstacles. Therefore, a loop of selecting a target gripper pose and motion planning can be executed. This ensures that exactly the target gripper pose is used which is most suitable for the robot at the current time.
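- This loop of selecting a target gripper pose and motion planning could be sketched as follows; plan_path stands in for an arbitrary motion planner and is an assumed interface, not something specified by the patent:

```python
def grasp_with_fallback(sorted_candidates, plan_path, execute_path):
    """Try candidate poses best-first until the motion planner finds a path."""
    for pose in sorted_candidates:
        path = plan_path(target_pose=pose)   # assumed to return None if no collision-free path exists
        if path is not None:
            execute_path(path)
            return pose                      # the target gripper pose actually used
    raise RuntimeError("No reachable candidate pose found for this object")
```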
- the object is further achieved by a computer-readable storage medium according to claim 15.
- a storage medium containing instructions that cause at least one processor to implement a method as claimed in any one of the preceding claims when the instructions are executed by the at least one processor.
- the object is further achieved by a handling system according to claim 16.
- A handling system comprising: a robot; and a perception system for detecting an object, which is arranged in particular on the robot; wherein the robot is adapted to carry out a method as described above.
- Fig. 1 shows a robot 10 in an initial configuration before grasping an object 1;
- Fig. 2a shows a gripper 20 in an approach pose;
- Fig. 2b shows the object 1 of Fig. 2a and the gripper 20 of Fig. 2a in a gripper pose;
- Fig. 3 shows a schematic illustration of the mapping of an object 1 onto an object prototype 30;
- Fig. 4a shows an object prototype 30 with symmetry axis S; Fig. 4b shows the generation of a plurality of gripper poses P, P′, P″ exploiting the symmetry axis S;
- Fig. 5 shows a schematic representation of a database table 41;
- Fig. 6 is a schematic representation of a handling system;
- Fig. 7 is a flow chart illustrating a method for controlling a robot 10.
- Fig. 1 shows a robot 10 in an initial configuration.
- The robot 10 has a gripper 20, which in the embodiment shown is designed as a parallel gripper 20. That is, the parallel gripper 20 has two jaws which are parallel to each other and can be moved toward and away from each other.
- A force sensor is provided on the jaws of the parallel gripper, which is designed to measure how much force is applied to an object to be gripped.
- A stereo camera 11 is attached to the robot 10.
- The stereo camera 11 is aligned so that it also perceives the gripper 20. Thus, gripped objects can also be observed continuously, and the grip on the object can be monitored. For example, slipping of the object can be detected (so-called "slip detection").
- By means of the stereo camera 11, it is possible to produce a three-dimensional map of the surroundings by moving the robot 10.
- In a first step, the three-dimensional map consists of a point cloud which indicates the position of individual discrete points in a coordinate system of the robot 10.
- A world model can be created from the three-dimensional map. Using image processing, it can be determined where, for example, surfaces are located. Individual objects in the environment of the robot 10 can also be recognized and their pose stored within the world model. When the pose of an object is stored in the world model, it can be taken into account later, e.g. during path planning.
- a bottle 1 is arranged in the immediate vicinity of the robot 10.
- the bottle 1 is located within the working space of the robot 10, ie in the area in which the robot 10 can theoretically grasp objects.
- the stereo camera 11 now detects the contours of the bottle 1.
- The robot 10 can be moved so that the bottle 1 is captured by the stereo camera 11 from as many angles as possible.
- the robot 10 can be moved such that the bottle 1 can be gripped by the gripper 20.
- The robot 10 may, for example, be guided by an operator for this purpose.
- With the robot 10, a training gripper pose is now recorded.
- The training gripper pose is recorded manually by an operator.
- With reference to Figs. 2a and 2b, it is briefly explained how the gripping of an object 1 is carried out.
- Fig. 2a shows the gripper 20 in an approach pose.
- the approach pose is usually stored in relation to an object coordinate system 2.
- The object coordinate system 2 is associated with the object 1 to be gripped.
- The approach pose comprises a three-dimensional vector V1 as well as three angles, which describe rotations about the axes of the object coordinate system 2.
- The vector V1 points from the origin of the object coordinate system 2 to the so-called "tool center point" (TCP) of the gripper 20.
- Fig. 2b shows the gripper 20 of Fig. 2a in a gripper pose, i.e. in a pose in which the object 1 can be grasped.
- the gripper pose of the gripper 20 is also stored with respect to the object coordinate system 2.
- The gripper pose includes the vector V2 and three angles.
- the vector V2 also points from the origin of the object coordinate system 2 to the TCP of the gripper 20.
- In the embodiment of Fig. 2b, the gripper jaws of the gripper 20 then need to be closed.
- The gripper jaws may be closed until a threshold force value, e.g. 20 or 50 newtons, is exceeded.
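- Closing the jaws until such a force threshold is exceeded might look like the following sketch; gripper.close_step() and force_sensor.read() are assumed interfaces, not APIs defined by the patent:

```python
def close_until_force(gripper, force_sensor, threshold_n: float = 20.0) -> None:
    """Close the parallel gripper jaws step by step until the measured
    contact force exceeds the threshold (e.g. 20 or 50 newtons)."""
    while force_sensor.read() < threshold_n:
        gripper.close_step()   # move both jaws a small increment toward each other
    gripper.hold()             # keep the current jaw position to hold the object
```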
- To grasp the object 1, the robot 10 must be moved into a suitable robot configuration.
- FIG. 3 shows how a plurality of gripper poses P, P′, P″ can be generated for an object 1.
- FIG. 3 shows how an object prototype is determined for an object 1.
- The exact configuration of the object prototypes 30, 30′, 30″ can be determined via a number of parameters; for example, the height of the cylinder 30 can be determined by means of the parameter L1.
- The radius of the cylinder can be defined by means of the parameter R1. The box 30′ can be defined in its dimensions by means of the parameters L2, L3 and L4.
- The sphere 30″ can be defined via the radius R2.
- The parameters L1 to L4 and R1, R2 are varied so that a maximum volume of the object prototypes 30, 30′, 30″ is filled by the bottle 1.
- The bottle 1 is thus fitted into the object prototypes 30, 30′, 30″. This can be carried out by known optimization methods.
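- A rough sketch of such a fit for the cylinder prototype, assuming the object is available as a point cloud and using a generic optimizer (the objective, the axis-aligned simplification and all names are illustrative, not the patent's prescribed method):

```python
import numpy as np
from scipy.optimize import minimize

def fit_cylinder(points: np.ndarray) -> tuple:
    """Fit cylinder parameters (height L1, radius R1) around an object point cloud.

    The cylinder is assumed to be aligned with the z axis and centered on the
    centroid of the points; its volume is minimized while a penalty keeps all
    points inside, so the object fills a maximal fraction of the prototype.
    """
    local = points - points.mean(axis=0)

    def cost(params):
        height, radius = params
        volume = np.pi * radius ** 2 * height
        radial = np.linalg.norm(local[:, :2], axis=1)
        outside = (np.maximum(radial - radius, 0.0).sum()
                   + np.maximum(np.abs(local[:, 2]) - height / 2.0, 0.0).sum())
        return volume + 1e3 * outside       # penalize points outside the cylinder

    start = [np.ptp(local[:, 2]), np.linalg.norm(local[:, :2], axis=1).max()]
    result = minimize(cost, start, method="Nelder-Mead")
    return tuple(result.x)                  # (L1, R1)
```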
- The object prototypes 30, 30′, 30″ of Fig. 3 have symmetries which can be described by means of symmetry axes S.
- Fig. 4a shows the object prototype 30, a cylinder, with its symmetry axis S.
- A cylinder is rotationally symmetric about its symmetry axis S.
- The symmetry is used in the following to generate a multiplicity of possible gripper poses P, P′, P″ for gripping the cylinder 30.
- Fig. 4b shows a gripper pose P of a gripper which has been created manually by an operator by means of the robot 10.
- Using the axis of symmetry S, further gripper poses P′, P″ can now be generated.
- For this purpose, the gripper pose P is rotated by means of a transformation matrix about the center of the cylinder 30 (as seen in a plan view). In this manner, gripper poses P′, P″ can be generated at arbitrary angular intervals, e.g. every 1 degree, every 5 degrees or every 10 degrees.
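- A minimal sketch of this generation step, reusing the Pose matrix sketched above and assuming the symmetry axis S coincides with the z axis of the object coordinate system (an illustrative simplification):

```python
import numpy as np

def generate_poses_about_symmetry_axis(training_pose: np.ndarray,
                                       step_deg: float = 10.0) -> list:
    """Rotate a recorded training gripper pose about the object's z axis
    (the symmetry axis S) to obtain further gripper poses P', P'', ...

    training_pose: 4x4 homogeneous transform of the gripper pose,
    expressed in the object coordinate system.
    """
    poses = []
    for angle_deg in np.arange(step_deg, 360.0, step_deg):
        a = np.deg2rad(angle_deg)
        rot_z = np.array([
            [np.cos(a), -np.sin(a), 0.0, 0.0],
            [np.sin(a),  np.cos(a), 0.0, 0.0],
            [0.0,        0.0,       1.0, 0.0],
            [0.0,        0.0,       0.0, 1.0],
        ])
        # Rotating in the object frame sweeps the gripper around the symmetry axis.
        poses.append(rot_z @ training_pose)
    return poses
```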
- Fig. 5 shows a schematic representation of a table 41 of the object database 40.
- The database table 41 includes the columns “Object”, “Gripper”, “PoseStart”, “PoseGr.” and “F”.
- The column “Object” contains identification numbers associated with objects 1. These may be, for example, the identification numbers under which the object descriptions are stored in another database table.
- Each row of the database table 41 corresponds to a gripper pose.
- the gripper used is stored for each gripper pose P2, P4.
- a parallel gripper is stored as a gripper type in both rows of the table.
- For each gripper pose, an associated approach pose P1, P3 is also stored.
- The associated approach poses P1, P3 can, like the gripper poses of Fig. 4b, be generated by exploiting symmetry axes.
- The gripper poses P2, P4 correspond either to recorded training gripper poses or to gripper poses generated from them.
- For the gripper poses P2, P4, a force value F can also optionally be stored, indicating the force with which the jaws of the parallel gripper press against the object 1 to be lifted.
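- Continuing the illustrative schema sketched earlier, querying all candidate poses stored for a detected object could look like this (again only a sketch; table and column names are assumptions):

```python
def candidate_poses_for(conn, object_id: int) -> list:
    """Fetch all stored gripper poses for one object identification number."""
    rows = conn.execute(
        "SELECT gripper, approach, grasp, force_n "
        "FROM gripper_poses WHERE object_id = ?",
        (object_id,),
    ).fetchall()
    return rows  # one row per candidate pose, e.g. P2 and P4 for object 1
```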
- Fig. 6 shows an embodiment in which the object database 40 is used to grasp an object.
- Fig. 6 shows a robot 10 with a stereo camera 11 and a gripper 20.
- the robot 10 further comprises a control device 12, which executes the control of the robot.
- Arranged next to the robot 10 is a bottle.
- the stereo camera 11 recognizes the object to be gripped as a bottle.
- the recognized bottle can be assigned to an object type.
- the controller 12 communicates via a network or locally with the object database 40.
- The controller 12 sends a query to the object database 40. From the object database 40, candidate poses P2, P4 are selected that are compatible with the detected object. After the candidate poses P2, P4 have been received by the controller 12, they are filtered using a capability map. In the process, those candidate poses that cannot be reached are sorted out. In addition, a sorting takes place, which is also performed using the capability map. The order of the gripper poses is determined by the number of directions from which the coordinates associated with the gripper poses can be reached.
- A path planner is also implemented, which plans the path of the robot 10 from the current configuration into the target configuration corresponding to a first candidate pose. If the path planner can compute a path from the current robot configuration to the target configuration, this path is used to grasp the object. If the path planner cannot compute a path because, for example, another object obstructs the way, the next candidate pose in the sorted list of candidate poses is selected. This process is carried out until a candidate pose P2, P4 is found for which the path planner can compute a path.
- In step S1, video data 51 is analyzed to identify objects 1 in the environment of the robot 10.
- Known objects 1 are stored together with object features in the object database 40.
- A perception algorithm is selected which is used to identify the object 1. Starting with the first known object of the object database 40, it is determined whether that object is recognizable in the video data 51. If the object is not recognizable, the method continues with the second object from the object database 40. Possibly, a different perception algorithm is used for the second object than for the first object. If it is determined that the second object of the object database 40 is recognizable in the video data 51, step S1 is completed.
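- A sketch of this object-by-object identification loop; recognizers maps each known object to the perception algorithm selected for it, and all names are illustrative:

```python
def identify_object(video_data, known_object_ids, recognizers):
    """Return the first known object that a perception algorithm finds in the video data.

    known_object_ids: object identification numbers from the object database.
    recognizers:      dict mapping object id -> callable(video_data) -> bool,
                      since different objects may use different perception algorithms.
    """
    for object_id in known_object_ids:
        recognize = recognizers[object_id]
        if recognize(video_data):      # object is recognizable in the video data
            return object_id           # step S1 is completed
    return None                        # no known object was detected
```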
- The resulting object data 52 is used in step S2 to query a list of candidate poses P2, P4 for the detected object from the object database 40.
- The list of candidate poses 53 is filtered in step S3 using a capability map 54 of the robot 10. In the process, those candidate poses which cannot be reached with the robot 10 or the gripper 20 used are removed. In addition, the list of candidate poses 53 is sorted using the capability map 54. The list of candidate poses 53 is further sorted using user-defined criteria. For example, the gripping type stored for the candidate poses is taken into account in the sorting. If a user has indicated, for example, that only fingertip grips are to be used with a multi-finger gripper, the list of candidate poses 53 is sorted accordingly.
- The sorted list of candidate poses 55 is combined with a world model 56, which includes the poses of objects in the environment of the robot 10, to generate a path for the robot 10 in step S4.
- During path planning, a collision check is performed, in which objects that are stored in the world model 56 are taken into account. If it is determined in step S5 that no path can be found for the first candidate pose of the list, a new path planning is started with the next candidate pose of the list. The process repeats until it is determined in step S5 that a robot path 57 can be calculated. In step S6, the robot 10 is then moved using the robot path 57.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Manipulator (AREA)
Abstract
Description
Claims
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102017108517 | 2017-04-21 | ||
DE102017108727.3A DE102017108727B4 (de) | 2017-04-24 | 2017-04-24 | Verfahren zur Erstellung einer Datenbank mit Greiferposen, Verfahren zum Steuern eines Roboters, computerlesbares Speichermedium und Handhabungssystem |
PCT/EP2018/060264 WO2018193130A1 (de) | 2017-04-21 | 2018-04-23 | Verfahren zur erstellung einer datenbank mit greiferposen, verfahren zum steuern eines roboters, computerlesbares speichermedium und handhabungssystem |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3612355A1 true EP3612355A1 (de) | 2020-02-26 |
Family
ID=62046918
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP18719839.5A Pending EP3612355A1 (de) | 2017-04-21 | 2018-04-23 | Verfahren zur erstellung einer datenbank mit greiferposen, verfahren zum steuern eines roboters, computerlesbares speichermedium und handhabungssystem |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP3612355A1 (de) |
WO (1) | WO2018193130A1 (de) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3831546A4 (de) * | 2019-03-15 | 2022-03-23 | OMRON Corporation | Greifpositions- und ausrichtungsregistrierungsvorrichtung, greifpositions- und ausrichtungsregistrierungsverfahren und programm |
CN111504328B (zh) | 2020-05-22 | 2022-07-08 | 梅卡曼德(北京)机器人科技有限公司 | 机器人运动规划方法、路径规划方法、抓取方法及其装置 |
CN112633187B (zh) * | 2020-12-28 | 2023-05-05 | 山东电子职业技术学院 | 基于图像分析的机器人自动搬运方法、系统和存储介质 |
CN113062697B (zh) * | 2021-04-29 | 2023-10-31 | 北京三一智造科技有限公司 | 一种钻杆装卸控制方法、装置及钻杆装卸设备 |
CN113538459B (zh) * | 2021-07-07 | 2023-08-11 | 重庆大学 | 一种基于落点区域检测的多模式抓取避障检测优化方法 |
CN114683251A (zh) * | 2022-03-31 | 2022-07-01 | 上海节卡机器人科技有限公司 | 机器人抓取方法、装置、电子设备及可读取存储介质 |
CN115570564B (zh) * | 2022-09-26 | 2024-06-14 | 北京航空航天大学 | 一种家用服务机器人对目标位姿的识别与规范化方法 |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AT11337U1 (de) * | 2009-02-26 | 2010-08-15 | Ih Tech Sondermaschb U Instand | Verfahren und vorrichtung zum robotergesteuerten greifen und bewegen von objekten |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4835616B2 (ja) * | 2008-03-10 | 2011-12-14 | トヨタ自動車株式会社 | 動作教示システム及び動作教示方法 |
US8923602B2 (en) * | 2008-07-22 | 2014-12-30 | Comau, Inc. | Automated guidance and recognition system and method of the same |
DE102014223167A1 (de) * | 2014-11-13 | 2016-05-19 | Kuka Roboter Gmbh | Bestimmen von objektbezogenen Greifräumen mittels eines Roboters |
-
2018
- 2018-04-23 EP EP18719839.5A patent/EP3612355A1/de active Pending
- 2018-04-23 WO PCT/EP2018/060264 patent/WO2018193130A1/de active Application Filing
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AT11337U1 (de) * | 2009-02-26 | 2010-08-15 | Ih Tech Sondermaschb U Instand | Verfahren und vorrichtung zum robotergesteuerten greifen und bewegen von objekten |
Also Published As
Publication number | Publication date |
---|---|
WO2018193130A1 (de) | 2018-10-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE102017108727B4 (de) | Verfahren zur Erstellung einer Datenbank mit Greiferposen, Verfahren zum Steuern eines Roboters, computerlesbares Speichermedium und Handhabungssystem | |
EP3612355A1 (de) | Verfahren zur erstellung einer datenbank mit greiferposen, verfahren zum steuern eines roboters, computerlesbares speichermedium und handhabungssystem | |
DE102015208584B4 (de) | Greifvorrichtung und Greifverfahren | |
DE102018116053B4 (de) | Robotersystem und Roboterlernverfahren | |
DE102014102943B4 (de) | Robotersystem mit Funktionalität zur Ortsbestimmung einer 3D- Kiste | |
DE102015001527B4 (de) | Robotersystem, das visuelle Rückmeldung verwendet | |
DE112017002498B4 (de) | Robotervorgang-auswertungseinrichtung, robotervorgang-auswertungsverfahren und robotersystem | |
DE102021121063A1 (de) | Effiziente datengenerierung für das greifenlernen mit allgemeinen greifern | |
DE102016122678B4 (de) | Werkstückpositions-/-Stellungsberechnungssystem und Handhabungssystem | |
DE102015002658B4 (de) | Robotersimulationssystem, das einen Entnahmevorgang eines Werkstücks simuliert | |
EP3020516B1 (de) | Bestimmen von objektbezogenen greifräumen mittels eines roboters | |
DE112019000097B4 (de) | Steuervorrichtung, Arbeitsroboter, Programm und Steuerverfahren | |
DE112018007727B4 (de) | Robotersystem | |
DE102015000587A1 (de) | Roboterprogrammiervorrichtung zum Erstellen eines Roboterprogramms zum Aufnehmen eines Bilds eines Werkstücks | |
DE102021107568A1 (de) | Adaptive planung von zugriffen zum aufheben von behältern | |
DE102016001174A1 (de) | Robotersystem zum Entnehmen von Werkstücken mit Umrechenfunktion von Position und Orientierung und Verfahren zum Entnehmen von Werkstücken | |
EP3966731A1 (de) | Maschinelles lernen einer objekterkennung mithilfe einer robotergeführten kamera | |
DE102021205324A1 (de) | Greifverfahren und Greifvorrichtung eines Industrieroboters, Computerspeichermedium und Industrieroboter | |
DE112018007729B4 (de) | Maschinelle Lernvorrichtung und mit dieser ausgestattetes Robotersystem | |
EP3323565B1 (de) | Verfahren und vorrichtung zur inbetriebnahme eines mehrachssystems | |
DE112017007903B4 (de) | Haltepositions- und Orientierungslehreinrichtung, Haltepositions- und Orientierungslehrverfahren und Robotersystem | |
EP3702108A1 (de) | Verfahren zum ermitteln einer greifposition zum ergreifen eines werkstücks | |
DE102015214857A1 (de) | Verfahren und System zum Erstellen eines dreidimensionalen Modells einer Produktionsumgebung | |
WO2019029878A1 (de) | Handhabungsanordnung mit einer handhabungseinrichtung zur durchführung mindestens eines arbeitsschritts sowie verfahren und computerprogramm | |
WO2023237323A1 (de) | Erhöhung der greifrate |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20191118 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: GAMBARO, ELENA FRANCESCA Inventor name: STELZER, ANNETT Inventor name: ROA GARZON, MAXIMO ALEJANDRO Inventor name: SUPPA, MICHAEL Inventor name: FLOREK-JASINSKA, MONIKA Inventor name: EMMERICH, CHRISTIAN |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20210929 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230621 |