EP1569776A1 - Method and arrangement for avoiding collision between a robot and its surroundings when picking up components with a sensor system - Google Patents

Method and arrangement for avoiding collision between a robot and its surroundings when picking up components with a sensor system

Info

Publication number
EP1569776A1
Authority
EP
European Patent Office
Prior art keywords
robot
components
prohibited areas
component
gripper
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP03812746A
Other languages
English (en)
French (fr)
Inventor
Ari Kesti
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Svensk Industriautomation AB
Original Assignee
Svensk Industriautomation AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Svensk Industriautomation AB
Publication of EP1569776A1 (de)
Legal status: Ceased

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1674 Programme controls characterised by safety, monitoring, diagnostic
    • B25J9/1676 Avoiding collision or forbidden zones
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40053 Pick 3-D object from pile of objects
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40478 Graphic display of work area of robot, forbidden, permitted zone

Definitions

  • the present invention relates to a method and an arrangement for a robot and gripper when picking up components with a guided robot
  • robots are nowadays guided automatically by various forms of sensor system, such as camera or laser sensors.
  • the technology is primarily used in materials handling to replace manual assembly line production. Its function is to guide the robot in order to grip components having an unknown orientation.
  • frequently there are multiple components present in the area within which the robot must pick up the component, hereinafter referred to as the pick-up arena.
  • Problems often arise with the robot gripper and/or the robot itself colliding with adjacent components.
  • Current solutions are aimed at preventing the components getting close to one another in the arrangement that supplies the pick-up arena with components.
  • One way of doing this is to mechanically arrange the components on a conveyor belt, for example, in order to prevent them lying against or too close to one another. This is not particularly reliable, however.
  • the object of the present invention is to demonstrate a means in the form of a method and arrangement for picking up components with a guided robot.
  • the present invention also affords the advantage that when picking up components the robot with associated gripper does not collide with adjacent components.
  • the present invention furthermore affords the advantage that the pick-up arena can be supplied with components with a simpler prior separation or without the use of any prior sorting or separation.
  • the present invention provides for improved pick-up of components by means of a guided robot and having the characterising features specified in claim 1.
  • the arrangement according to the present invention comprises a sensor system, such as a camera system (vision system), for example, an optical 3D measuring system or laser equipment, which creates an image or a multi-dimensional range of the pick-up arena.
  • the components and their orientation are identified in the sensor system.
  • the information obtained on the orientation is used to guide the robot to grip the component.
  • the sensor system defines the gripper and/or robot and the area of the pick-up arena.
  • the result from this or another sensor system is then used to prevent the robot or the gripper colliding with adjacent components or with the surroundings.
  • Prohibited areas for the aforementioned gripper and the robot are monitored in order to prevent any collision occurring in operation before the component has been collected or gripped by the robot.
  • This is done by programming into the sensor system the components that are to be searched for and where on the component it is to be gripped.
  • the so-called programming is done by defining the component in one or a number of different ways, for example by selecting an area on an image of the component where sensing is to be performed using some form of pattern recognition, or by defining blob parameters for the component.
  • blob parameters refer, for example, to the area, circumference, maximum length, minimum length, and compactness (area per circumference) of a defined area (see the illustrative blob-parameter sketch after this list).
  • the system searches for the programmed component, which is most commonly done with some form of image processing.
  • Fig. 1 shows the programming of a component with the gripping position drawn in.
  • the centre of the gripping position is in the thin circle 1 with a rotational orientation according to the thin vertical section 2.
  • the system searches for components resembling the component 4.
  • the rectangles 3 describe the gripper fingers.
  • Fig. 2 shows parts from an example of a graphic interface for programming in the appearance and location of the gripper in relation to the gripper reference position, TCP.
  • the designation A describes the distance between TCP and the gripper fingers 3.
  • the designations B and C describe the size of the gripper fingers 3.
  • Fig. 3 shows an image from programming in components, where thresholding is used to define the component within the pick-up area. In this example this method works well for the three smaller components 6 but on the large component 5 significant parts 7 are missing, see Fig. 4.
  • Fig. 4a shows a component 5, which is to be programmed in.
  • Fig. 4b shows two components 5, 6, where parts of the larger component 5 (cf. Fig. 4a) have been rendered invisible by thresholding, for example due to background or light setting.
  • Fig. 4c shows two components 5, 6 where during programming parts of the component 5 are rendered visible by manually defining the lighter grey area 7 in relation to the position of the component 5, thereby creating the required definition of the entire component 5.
  • Fig. 4d shows two components which are correctly defined within the pick-up arena.
  • Fig. 5 shows three components 9, 10, 11 which cannot be picked up due to collision and a component 8 which can be picked up.
  • once the first component 9 has been picked up, it becomes possible to pick up at least one further component 10 (to the right in the figure).
  • the arrangement according to the present invention comprises a sensor system, such as a camera system (vision system), for example, an optical 3D measuring system or laser equipment, which creates an image or multi-dimensional range of the pick-up arena.
  • the components and their orientation are defined in the sensor system.
  • the information obtained on the orientation is used to guide the robot to grip the component.
  • the sensor system defines the gripper, the robot and the area of the pick-up arena prior to operation.
  • gripper or gripping fingers relate to all forms of tool used to carry the component, such as a pair of fingers that grip around the component, three fingers that grip around a cylinder or in an aperture such as a lathe chuck, a suction cup or suction cups, a magnet or magnets and so forth. Not all of these parts are necessarily defined prior to operation; very often only the gripping fingers are defined.
  • Grippers may be defined, for example, simply by determining whether the TCPs of the components are situated closer together than a certain number of millimetres, in which case the components must not be picked up.
  • the term robot relates, for example, to simpler moving systems comprising a few linear units which can be brought to a specific position, four-axis pick-up robots, six-axis industrial robots etc.
  • the robots may be floor, wall or ceiling-mounted.
  • the result from the sensor system is then used in order to prevent the robot or the gripper colliding with adjacent components or with the surroundings.
  • the gripper, the robot and prohibited areas are normally defined in 2 or 3 dimensions, that is to say in two dimensions represented as the plane or in three dimensions represented as the space.
  • the surroundings, the gripper and the robot are defined either together or individually in relation to the centre of the gripper (see the example of a gripping finger definition in Fig. 2).
  • the definition may be done manually, for example, through a graphical interface or from a virtual model which is imported into the system or by inputting with the sensor system.
  • Prohibited areas for the aforementioned gripper or the robot are monitored in order to prevent any collision occurring in operation before the component has been collected or gripped by the robot.
  • a collision occurs if the gripper or the robot encroaches on areas in which there are components or on prohibited adjacent areas. If the result of monitoring shows that a collision would occur, the component is not picked up (see the illustrative 2D collision-check sketch after this list).
  • the components are only monitored to ensure that their centres do not lie too close to one another, there being no need in this case to define gripping fingers and grippers. If one component is situated too close to another, it will not be picked up. Components are not picked up if the distance between them is less than a certain predetermined measurement (see the TCP-distance sketch after this list).
  • the components that are not picked up for the aforementioned reason may sometimes be picked up later, once the adjacent components that could be picked up have been picked up, or once the components have been reoriented by the arrangement which supplies the pick-up arena with components. This can be achieved by transporting the components via an …
  • As shown in Fig. 1, the so-called programming is done by defining the component in one or a number of different ways, for example by selecting an area on an image of the component where sensing is to be performed using some form of pattern recognition, or by defining blob parameters for the component.
  • the term thresholding is used here to mean the creation of a binary digital image.
  • the technique for thresholding the image is used to create an image in which the components are white and the surroundings black, or vice versa.
  • the arrangement does not depend on any particular method of thresholding; it is possible to use a single fixed threshold value or more sophisticated solutions with automatic adjustment of the threshold value in proportion to variations between images over time, with a threshold value varying over the image and adapted to local variations in the light setting, or different variants of colour-coding of component and background etc. (see the thresholding sketch after this list).
  • the defining of prohibited areas is done by programming in relation to the programmed gripping positions.
  • the system searches for multiple components in the sensor result and prohibited areas are marked in relation to the search results obtained, see Fig.4.
  • This method is used, for example, when parts of the component cannot be reliably distinguished from the surroundings by thresholding.
  • There are many different methods of searching for the component which can in principle all be used for this arrangement for collision protection.
  • the method and arrangement according to the present invention afford the facility for undertaking said control of robot and gripper in operation and for supplementing the virtual world with components having an unknown position.
  • the unknown position can be handled by the sensor system, which among other things is used to program in the components and to read off the pick-up arena so that the robot is allowed to pick up components without the risk of collision with the immediate surroundings.
  • the steps involved in the method using the arrangement for the robot and gripper when picking up components with a guided robot with associated sensor system can be performed in any feasible order.
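
The blob-parameter matching described above can be illustrated with a short sketch. This is not the implementation from the patent; it is a minimal illustration assuming a binary (thresholded) image and the OpenCV library, and the helper names and tolerance value are invented for the example.

```python
import cv2
import numpy as np

def blob_parameters(binary_image):
    """Compute the blob parameters named in the text (area, circumference,
    maximum length, minimum length, compactness) for each blob in a binary image."""
    contours, _ = cv2.findContours(binary_image, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    blobs = []
    for contour in contours:
        area = cv2.contourArea(contour)
        circumference = cv2.arcLength(contour, True)
        (cx, cy), (w, h), angle = cv2.minAreaRect(contour)
        blobs.append({
            "centre": (cx, cy),        # candidate gripping position (TCP)
            "angle": angle,            # rotational orientation
            "area": area,
            "circumference": circumference,
            "max_length": max(w, h),
            "min_length": min(w, h),
            # compactness = area per circumference, as defined in the text
            "compactness": area / circumference if circumference > 0 else 0.0,
        })
    return blobs

def matches_programmed_component(blob, reference, tolerance=0.15):
    """Accept a blob whose parameters lie within a relative tolerance of the
    programmed (taught) component."""
    keys = ("area", "circumference", "max_length", "min_length", "compactness")
    return all(abs(blob[k] - reference[k]) <= tolerance * reference[k] for k in keys)
```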
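
The central collision monitoring, in which the gripper fingers are defined as rectangles relative to the TCP (cf. Fig. 2) and checked against adjacent components and manually defined prohibited areas before a pick is released, could look roughly as follows in two dimensions. This is a hedged sketch: the occupancy mask, the finger parameters A, B and C expressed in pixels, and the function names are assumptions made for illustration, not the patent's actual implementation.

```python
import cv2
import numpy as np

def finger_rectangles(tcp_xy, angle_rad, a, b, c):
    """Corner points of the two gripper-finger rectangles in arena coordinates.
    A = distance from the TCP to each finger, B x C = finger size (cf. Fig. 2)."""
    cos_t, sin_t = np.cos(angle_rad), np.sin(angle_rad)
    rot = np.array([[cos_t, -sin_t], [sin_t, cos_t]])
    rects = []
    for side in (-1.0, +1.0):                        # one finger on each side of the TCP
        centre = np.array([side * (a + b / 2.0), 0.0])
        corners = centre + np.array([[-b / 2, -c / 2], [b / 2, -c / 2],
                                     [b / 2, c / 2], [-b / 2, c / 2]])
        rects.append(corners @ rot.T + np.asarray(tcp_xy, dtype=float))
    return rects

def grip_collides(occupied_mask, tcp_xy, angle_rad, a, b, c, own_component_mask=None):
    """True if either finger would encroach on an occupied or prohibited pixel.
    `occupied_mask` is a binary image of adjacent components plus any manually
    defined prohibited areas; the component being gripped can be excluded."""
    blocked = occupied_mask > 0
    if own_component_mask is not None:
        blocked &= ~(own_component_mask > 0)         # touching the target itself is allowed
    fingers = np.zeros(occupied_mask.shape, dtype=np.uint8)
    for corners in finger_rectangles(tcp_xy, angle_rad, a, b, c):
        cv2.fillPoly(fingers, [np.round(corners).astype(np.int32)], 255)
    return bool(np.any(blocked & (fingers > 0)))
```

In such a scheme a component would only be released for picking when `grip_collides(...)` returns False; components that are skipped may become pickable later, once their neighbours have been removed (cf. Fig. 5).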
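
The simpler variant, in which no gripper or finger geometry is defined and components are excluded only when their gripping positions (TCPs) lie closer together than a predetermined distance, reduces to a pairwise distance check. A minimal sketch, assuming TCP coordinates in millimetres in the arena plane; the names are illustrative:

```python
import numpy as np

def pickable_by_tcp_distance(tcp_positions, min_distance_mm):
    """Indices of components whose TCPs are at least `min_distance_mm` away
    from the TCP of every other component in the pick-up arena."""
    tcp = np.asarray(tcp_positions, dtype=float)
    pickable = []
    for i in range(len(tcp)):
        distances = np.linalg.norm(tcp - tcp[i], axis=1)
        distances[i] = np.inf                 # ignore the component's own TCP
        if np.all(distances >= min_distance_mm):
            pickable.append(i)
    return pickable
```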
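
The thresholding variants mentioned, namely a single fixed threshold value, automatic per-image adjustment, and a threshold varying over the image, are standard image-processing operations. As a sketch, assuming the OpenCV library and a placeholder image file name:

```python
import cv2

# Hypothetical grey-scale image of the pick-up arena.
gray = cv2.imread("pickup_arena.png", cv2.IMREAD_GRAYSCALE)

# Single fixed threshold value: components white, surroundings black (or vice versa).
_, fixed = cv2.threshold(gray, 128, 255, cv2.THRESH_BINARY)

# Automatic adjustment of the threshold value per image (Otsu's method), one way
# of coping with variations between images over time.
_, automatic = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Threshold value varying over the image, adapted to local variations in the light setting.
adaptive = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                 cv2.THRESH_BINARY, 51, 5)
```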

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
EP03812746A 2002-12-10 2003-12-10 Method and arrangement for avoiding collision between a robot and its surroundings when picking up components with a sensor system Ceased EP1569776A1 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
SE0203655A SE524796C2 (sv) 2002-12-10 2002-12-10 Kollisionsskydd
SE0203655 2002-12-10
PCT/SE2003/001933 WO2004052596A1 (en) 2002-12-10 2003-12-10 Method and arrangement to avoid collision between a robot and its surroundings while picking details including a sensorsystem

Publications (1)

Publication Number Publication Date
EP1569776A1 (de) 2005-09-07

Family

ID=20289818

Family Applications (1)

Application Number Title Priority Date Filing Date
EP03812746A Ceased EP1569776A1 (de) 2002-12-10 2003-12-10 Method and arrangement for avoiding collision between a robot and its surroundings when picking up components with a sensor system

Country Status (4)

Country Link
EP (1) EP1569776A1 (de)
AU (1) AU2003302921A1 (de)
SE (1) SE524796C2 (de)
WO (1) WO2004052596A1 (de)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10414043B2 (en) 2017-01-31 2019-09-17 Fanuc America Corporation Skew and circular boundary for line tracking and circular tracking

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7313464B1 (en) * 2006-09-05 2007-12-25 Adept Technology Inc. Bin-picking system for randomly positioned objects
WO2012066819A1 (ja) * 2010-11-17 2012-05-24 三菱電機株式会社 Workpiece picking device
FI20106387A (fi) * 2010-12-30 2012-07-01 Zenrobotics Oy Method, computer program and apparatus for determining a gripping location
DE102014019209A1 (de) * 2014-12-19 2016-06-23 Daimler Ag Method for operating a robot
US11452248B2 (en) * 2017-02-08 2022-09-20 Fuji Corporation Work machine
US20240199349A1 (en) * 2022-12-16 2024-06-20 Berkshire Grey Operating Company, Inc. Systems and methods for automated packaging and processing with static and dynamic payload guards

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4613269A (en) * 1984-02-28 1986-09-23 Object Recognition Systems, Inc. Robotic acquisition of objects by means including histogram techniques
US5041907A (en) * 1990-01-29 1991-08-20 Technistar Corporation Automated assembly and packaging system
GB2261069B (en) * 1991-10-30 1995-11-01 Nippon Denso Co High speed picking system for stacked parts
US5495410A (en) * 1994-08-12 1996-02-27 Minnesota Mining And Manufacturing Company Lead-through robot programming system
JPH11300670A (ja) * 1998-04-21 1999-11-02 Fanuc Ltd Article pick-up device
JP3300682B2 (ja) * 1999-04-08 2002-07-08 ファナック株式会社 Robot device with image processing function
JP3537362B2 (ja) * 1999-10-12 2004-06-14 ファナック株式会社 Graphic display device for a robot system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2004052596A1 *

Also Published As

Publication number Publication date
AU2003302921A1 (en) 2004-06-30
WO2004052596A1 (en) 2004-06-24
SE0203655D0 (sv) 2002-12-10
SE524796C2 (sv) 2004-10-05
SE0203655L (sv) 2004-06-11

Similar Documents

Publication Publication Date Title
CN108399639B (zh) 基于深度学习的快速自动抓取与摆放方法
EP3383593B1 (de) Anlernen eines industrieroboters zum greifen von teilen
EP1945416B1 (de) Verfahren und anordnung zur suche und entnahme von objekten auf einem träger
US9233469B2 (en) Robotic system with 3D box location functionality
EP2045772B1 (de) Vorrichtung zur Aufnahme von Objekten
US11701777B2 (en) Adaptive grasp planning for bin picking
EP1905548B1 (de) Vorrichtung zur Werkstückaufnahme
Nerakae et al. Using machine vision for flexible automatic assembly system
US20130211593A1 (en) Workpiece pick-up apparatus
CN111745640B (zh) 物体检测方法、物体检测装置以及机器人系统
CN114758236A (zh) 一种非特定形状物体识别、定位与机械手抓取系统及方法
CN113538459B (zh) 一种基于落点区域检测的多模式抓取避障检测优化方法
CN108038861A (zh) 一种多机器人协作分拣方法、系统及装置
US20230286140A1 (en) Systems and methods for robotic system with object handling
EP1569776A1 (de) 2005-09-07 Method and arrangement for avoiding collision between a robot and its surroundings when picking up components with a sensor system
US20230173660A1 (en) Robot teaching by demonstration with visual servoing
CN110914021A (zh) 带有用于执行至少一个工作步骤的操纵设备的操纵装置以及方法和计算机程序
CN116175542B (zh) 确定夹具抓取顺序的方法、装置、电子设备和存储介质
CN116197885B (zh) 基于压叠检测的图像数据过滤方法、装置、设备和介质
CN115393696A (zh) 具有旋转补偿的对象料箱拾取
Abegg et al. Manipulating deformable linear objects-Vision-based recognition of contact state transitions
Weisenboehler et al. Automated item picking for fashion articles using deep learning
Tudorie Different approaches in feeding of a flexible manufacturing cell
Liu et al. Research on Accurate Grasping Method of Steel Shaft Parts Based on Depth Camera
Pop et al. Robot vision application for bearings identification and sorting

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20050601

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL LT LV MK

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20070718

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20100416