WO2004052596A1 - Method and arrangement to avoid collision between a robot and its surroundings while picking details including a sensorsystem - Google Patents

Method and arrangement to avoid collision between a robot and its surroundings while picking details including a sensorsystem

Info

Publication number
WO2004052596A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
components
prohibited areas
component
gripper
Prior art date
Application number
PCT/SE2003/001933
Other languages
French (fr)
Inventor
Ari Kesti
Original Assignee
Svensk Industriautomation Ab
Priority date
Filing date
Publication date
Application filed by Svensk Industriautomation Ab filed Critical Svensk Industriautomation Ab
Priority to EP03812746A priority Critical patent/EP1569776A1/en
Priority to AU2003302921A priority patent/AU2003302921A1/en
Publication of WO2004052596A1 publication Critical patent/WO2004052596A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1674Programme controls characterised by safety, monitoring, diagnostic
    • B25J9/1676Avoiding collision or forbidden zones
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40053Pick 3-D object from pile of objects
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40478Graphic display of work area of robot, forbidden, permitted zone


Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

Method and arrangement for a robot having an associated sensor system for preventing collisions between the robot or a gripper arranged on the robot and its surroundings when picking up components, involving the following steps: creation of an image or multi-dimensional representation of a pick-up arena which is supplied with components; identification of the components and their orientation; and definition of one or more of the following: the gripper, the robot and the area of the pick-up arena, the sensor system monitoring prohibited areas for the robot or the gripper arranged on the robot.

Description

Method and arrangement to avoid collision between a robot and its surroundings while picking details including a sensorsystem
The present invention relates to a method and an arrangement for a robot and gripper for picking up components with a guided robot.
Background of the invention
In the field of automation, robots are nowadays guided automatically by various forms of sensor system, such as camera or laser sensors. The technology is primarily used in materials handling to replace manual assembly line production. Its function is to guide the robot in order to grip components having an unknown orientation. In many applications there are multiple components present in the area within which the robot must pick up the component, hereinafter referred to as the pick-up arena. Problems often arise with the robot gripper and/or the robot itself colliding with adjacent components. Current solutions are aimed at preventing the components from getting close to one another in the arrangement that supplies the pick-up arena with components. One way of doing this is to mechanically arrange the components on a conveyor belt, for example, in order to prevent them lying against or too close to one another. This is not particularly reliable, however. Another alternative is to make the grippers so small that the risk of collision is reduced. For some time now, methods have existed for preventing collisions between a robot and known surroundings in a virtual world. However, these methods are only used "off-line" in conjunction with the creation or simulation of robot programs.
The problem nevertheless persists and collisions occur which cause costly production stoppages.
The object of the present invention is to demonstrate a means in the form of a method and arrangement for picking up components with a guided robot.
The present invention also affords the advantage that when picking up components the robot with associated gripper does not collide with adjacent components.
The present invention furthermore affords the advantage that the pick-up arena can be supplied with components with a simpler prior separation or without the use of any prior sorting or separation. The present invention provides for improved pick-up of components by means of a guided robot and having the characterising features specified in claim 1.
Summary of the invention
The arrangement according to the present invention comprises a sensor system, such as a camera system (vision system), for example, an optical 3D measuring system or laser equipment, which creates an image or a multi-dimensional range of the pick-up arena.
The components and their orientation are identified in the sensor system.
The information obtained on the orientation is used to guide the robot to grip the component.
The sensor system defines the gripper and/or robot and the area for the pick-up arena.
The result from this or another sensor system is then used to prevent the robot or the gripper colliding with adjacent components or with the surroundings. Prohibited areas for the aforementioned gripper and the robot are monitored in order to prevent any collision occurring in operation before the component has been collected or gripped by the robot. This is done by programming into the sensor system the components that are to be searched for and where on the component it is to be gripped. The so-called programming is done by defining the component in one or a number of different ways, for example by selecting an area on an image of the component where sensing is to be performed using some form of pattern recognition, or by defining blob parameters for the component. The term blob parameters refers, for example, to the area, circumference, maximum length, minimum length, and compactness (area per circumference) of a defined area. In operation, the system searches for the programmed component, which is most commonly done with some form of image processing.
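As an illustration of the blob parameters named above, the following sketch computes area, circumference and compactness (area per circumference) for a single blob in a binary image. The function name and the pixel-grid representation are hypothetical, not taken from the patent.

```python
# Hypothetical sketch of the blob-parameter idea: given a binary image
# (1 = component pixel, 0 = background), compute area, circumference and
# compactness (area per circumference) for the foreground blob.

def blob_parameters(image):
    """Return (area, perimeter, compactness) for the foreground blob."""
    rows, cols = len(image), len(image[0])
    area = 0
    perimeter = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c]:
                area += 1
                # Count pixel edges bordering background or the image edge.
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    nr, nc = r + dr, c + dc
                    if not (0 <= nr < rows and 0 <= nc < cols) or not image[nr][nc]:
                        perimeter += 1
    compactness = area / perimeter if perimeter else 0.0
    return area, perimeter, compactness

# A solid 3x3 square: area 9, circumference 12 edge segments, compactness 0.75.
square = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
print(blob_parameters(square))
```

A real system would typically obtain the same quantities from a connected-components pass of an image-processing library rather than this per-pixel loop.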
Brief description of the drawings
Fig. 1 shows the programming of a component with the gripping position drawn in. The centre of the gripping position is in the thin circle 1 with a rotational orientation according to the thin vertical section 2. The system searches for components resembling the component 4. The rectangles 3 describe the gripper fingers. Fig. 2 shows parts from an example of a graphic interface for programming in the appearance and location of the gripper in relation to the gripper reference position, TCP. The designation A describes the distance between TCP and the gripper fingers 3. The designations B and C describe the size of the gripper fingers 3.
Fig. 3 shows an image from programming in components, where thresholding is used to define the component within the pick-up area. In this example this method works well for the three smaller components 6 but on the large component 5 significant parts 7 are missing, see Fig. 4.
Fig. 4a shows a component 5, which is to be programmed in.
Fig. 4b shows two components 5, 6, where parts of the larger component 5 (cf. Fig. 4a) have been rendered invisible by thresholding, for example due to background or light setting.
Fig. 4c shows two components 5, 6 where during programming parts of the component 5 are rendered visible by manually defining the lighter grey area 7 in relation to the position of the component 5, thereby creating the required definition of the entire component 5.
Fig. 4d shows two components which are correctly defined within the pick-up arena.
Fig. 5 shows three components 9, 10, 11 which cannot be picked up due to collision and a component 8 which can be picked up. When the first component 9 has been picked up, it becomes possible to pick up at least one further component 10 (to the right of the figure).
Detailed description of the invention
The arrangement according to the present invention comprises a sensor system, such as a camera system (vision system), for example, an optical 3D measuring system or laser equipment, which creates an image or multi-dimensional range of the pick-up arena. The components and their orientation are defined in the sensor system.
The information obtained on the orientation is used to guide the robot to grip the component. The sensor system defines the gripper, the robot and the area of the pick-up arena prior to operation. The terms gripper or gripping fingers relate to any form of tool for carrying the component, such as a pair of fingers that grip around the component, three fingers that grip around a cylinder or in an aperture like a lathe chuck, a suction cup or suction cups, a magnet or magnets and so forth. Not all of these parts are necessarily defined prior to operation; very often only the gripping fingers are defined. Grippers may be defined, for example, simply by determining whether the TCPs of the components are situated closer together than a certain number of millimetres, in which case the components must not be picked up.
The term robot relates, for example, to simpler moving systems comprising a few linear units which can be brought to a specific position, four-axis pick-up robots, six-axis industrial robots etc. The robots may be floor, wall or ceiling-mounted. The result from the sensor system is then used in order to prevent the robot or the gripper colliding with adjacent components or with the surroundings. The gripper, the robot and prohibited areas are normally defined in 2 or 3 dimensions, that is to say in two dimensions represented as the plane or in three dimensions represented as the space. The surroundings, the gripper and the robot are defined either together or individually in relation to the centre of the gripper (see the example of a gripping finger definition in Fig.2). The definition may be done manually, for example, through a graphical interface or from a virtual model which is imported into the system or by inputting with the sensor system.
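The TCP-relative gripper definition of Fig. 2 (distance A to each finger, finger sizes B and C) could, under the simplifying assumption of a planar gripper with two rectangular fingers, be sketched as follows. All names and the coordinate conventions are illustrative, not from the patent.

```python
import math

def finger_rectangles(tcp, angle, a, b, c):
    """Return corner points of two finger rectangles, each of size b x c,
    centred at distance a on either side of the TCP and rotated by angle
    (radians) -- a 2D sketch of the Fig. 2 parameters A, B, C."""
    cos_t, sin_t = math.cos(angle), math.sin(angle)
    rects = []
    for side in (-1.0, 1.0):
        cx, cy = side * a, 0.0  # finger centre in gripper coordinates
        corners = []
        for dx, dy in ((-b / 2, -c / 2), (b / 2, -c / 2),
                       (b / 2, c / 2), (-b / 2, c / 2)):
            x, y = cx + dx, cy + dy
            # Rotate about the TCP, then translate into arena coordinates.
            corners.append((tcp[0] + x * cos_t - y * sin_t,
                            tcp[1] + x * sin_t + y * cos_t))
        rects.append(corners)
    return rects

# Gripper at the origin, unrotated, with A=10, B=4, C=2.
print(finger_rectangles((0.0, 0.0), 0.0, 10.0, 4.0, 2.0))
```

Placing these rectangles at each candidate gripping position, rotated to the component's detected orientation, gives the areas that must be checked against the prohibited areas.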
Prohibited areas for the aforementioned gripper or the robot are monitored in order to prevent any collision occurring in operation before the component has been collected or gripped by the robot. A collision occurs if the gripper or the robot encroaches on areas in which there are components or on prohibited adjacent areas. If the result of monitoring shows that a collision will occur, the component is not picked up. In a variant, the components are only monitored to ensure that their centres do not lie too close to one another, there being no need in this case to define gripping fingers and grippers. If one component is situated too close to another, it will not be picked up. Components are not picked up if the distance between them is less than a certain predetermined measurement. The components that are not picked up for the aforementioned reason may sometimes be picked up later, once the adjacent components that could be picked up have been picked up or the components have been reoriented by the arrangement which supplies the pick-up arena with components. This can be achieved by transporting the components via an arrangement that regroups them and allowing them to pass through the pick-up arena again, or by the robot shifting the components.
This is done by programming into the sensor system the components to search for and where the component is to be gripped, see Fig. 1. The so-called programming is done by defining the component in one or a number of different ways, for example by selecting an area on an image of the component where sensing is to be performed using some form of pattern recognition, or by defining blob parameters for the component.
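The centre-distance variant described above, in which a component is pickable only when no neighbouring centre lies closer than a predetermined measurement, can be sketched as follows (function name and representation are hypothetical):

```python
def pickable_by_centre_distance(centres, min_dist):
    """For each component centre (x, y), return True if no other centre
    lies closer than min_dist -- the variant needing no gripper definition."""
    pickable = []
    for i, (xi, yi) in enumerate(centres):
        ok = all((xi - xj) ** 2 + (yi - yj) ** 2 >= min_dist ** 2
                 for j, (xj, yj) in enumerate(centres) if j != i)
        pickable.append(ok)
    return pickable

# Three components in a row; the last two are only 5 units apart, so at a
# required spacing of 8 neither of them may be picked yet.
print(pickable_by_centre_distance([(0, 0), (20, 0), (25, 0)], 8))
```

After the isolated component is removed from the arena, a fresh sensor reading would show the remaining pair unchanged, which is why the text relies on regrouping or robot shifting to free such components.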
The description of the prohibited areas for the gripper and the robot is defined separately or together by one or more of the following methods:
• In the sensor result the components can be distinguished from the surroundings by thresholding, see Fig. 3. The term thresholding is used here to mean the creation of a binary digital image. There are several different variants of the technique for thresholding the image. In this instance it is used to create an image in which the components are white and the surroundings black, or vice versa. The arrangement does not depend on any particular method of thresholding, it being possible to use a single fixed threshold value or more sophisticated solutions with automatic adjustment of the threshold value in proportion to variations between images over time, with a threshold value varying over the image and adapted to local variations in light setting, or different variants for colour-coding of component and background etc.
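A minimal sketch of the single fixed-threshold variant named above, turning a greyscale image into a binary one in which component pixels are white (1) and the surroundings black (0); the list-of-lists grid is an illustrative assumption, not the patent's representation:

```python
def threshold_image(gray, t):
    """Fixed-threshold binarisation: pixels >= t become 1 (component/white),
    the rest 0 (surroundings/black)."""
    return [[1 if px >= t else 0 for px in row] for row in gray]

gray = [[10, 200, 30],
        [220, 240, 15],
        [12, 180, 25]]
print(threshold_image(gray, 128))
```

The adaptive variants the text mentions would replace the constant `t` with a value computed per image, or per neighbourhood, from the local light setting.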
• The defining of prohibited areas is done by programming in relation to the programmed gripping positions. The system searches for multiple components in the sensor result and prohibited areas are marked in relation to the search results obtained, see Fig.4. This method is used, for example, when parts of the component cannot be reliably distinguished from the surroundings by thresholding. There are many different methods of searching for the component, which can in principle all be used for this arrangement for collision protection.
• There may also be parts of a component which pose no risk of collision. Such parts can be removed from the component definition in the same way that invisible parts are rendered visible. Several ways of obtaining multi-dimensional descriptions or areas for the ranges of the robot and the gripper within the arena, together with prohibited areas, have been described above, all in the same space/system of coordinates for each gripping position. If any of these descriptions or areas overlap one another, there is a risk of collision. With the arrangement according to the present invention, the system in operation continuously reads off the pick-up arena and only allows the robot to pick up components whose descriptions or areas do not overlap one another, thereby preventing collisions, see Fig. 5. It is also possible to define an area which follows the movement of a conveyor belt, for example. The sensor is then very often located over a different geometric area from the pick-up arena. The gripper, the robot and prohibited areas may also be defined in a space that moves together with a conveyor belt.
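The overlap test described above, which permits a pick only when the gripper's description does not overlap the prohibited areas, might be sketched on binary masks in a common coordinate frame; representing both as equally sized grids is an assumed simplification of the patent's "descriptions or areas":

```python
def collision_free(prohibited, gripper_mask):
    """Return True if the gripper footprint and the prohibited areas share
    no cell; both masks are binary grids of equal size in the same
    coordinate frame."""
    return all(not (p and g)
               for prow, grow in zip(prohibited, gripper_mask)
               for p, g in zip(prow, grow))

prohibited = [[0, 1],
              [0, 0]]
# A footprint avoiding the prohibited cell is allowed; one touching it is not.
print(collision_free(prohibited, [[0, 0], [1, 0]]))
print(collision_free(prohibited, [[0, 1], [0, 0]]))
```

In operation this check would be repeated for every candidate gripping position against the continuously refreshed sensor reading of the arena.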
The method and arrangement according to the present invention afford the facility for undertaking said control of robot and gripper in operation and for supplementing the virtual world with components having an unknown position. The unknown position can be handled by the sensor system, which among other things programs in the components and reads off the pickup arena so that the robot is allowed to pick up components without the risk of collision with the immediate surroundings.
The steps involved in the method using the arrangement for the robot and gripper when picking up components with a guided robot with associated sensor system can be performed in any feasible order.

Claims

1. Method for a robot having an associated sensor system for preventing collisions between robot or a gripper arranged on the robot and its surroundings when picking up components, the method comprising the following steps: creation of an image or multi-dimensional representation of a pick-up arena which is supplied with components; identification of the components and their orientation; and definition of one or more of the following: the gripper, the robot and the area of the pick-up arena, characterised in that the sensor system monitors prohibited areas for the robot or the gripper arranged on the robot.
2. Method according to Claim 1, characterised in that the prohibited areas are the areas for the robot and gripper arranged on the robot in which there is a risk of collision with components or its surroundings.
3. Method according to Claim 1 or 2, characterised in that the prohibited areas are defined by differentiating components from the background, in two or more dimensions.
4. Method according to Claim 3, characterised in that the prohibited areas are defined by thresholding, in two or more dimensions.
5. Method according to any one of the preceding Claims, characterised in that the prohibited areas are defined for various positions of a component in relation to gripping points of the component, in two or more dimensions, in order to thereby obtain the prohibited areas in the imaging area of the sensor on the basis of the components identified.
6. Method according to any one of the preceding Claims, characterised in that the prohibited areas are revised on the basis of information on which components are to be picked up.
7. Method according to any one of the preceding Claims, characterised in that the prohibited areas are also defined in relation to gripping points of the component, in two or more dimensions.
8. Method according to any one of the preceding Claims, characterised in that the risk of collision is monitored by calculating the overlap of prohibited areas and the robot or gripper arranged on the robot.
9. Arrangement for a robot having an associated sensor system for preventing collisions between robot or a gripper arranged on the robot and its surroundings when picking up components, the arrangement comprising means of performing the following steps: creation of an image or multi-dimensional representation of a pick-up arena which is supplied with components; identification of the components and their orientation; and definition of one or more of the following: the gripper, the robot and the area of the pick-up arena, characterised in that the sensor system comprises means of monitoring prohibited areas for the robot or the gripper arranged on the robot.
10. Arrangement according to Claim 9, characterised in that the prohibited areas are the areas for the robot and gripper arranged on the robot in which there is a risk of collision with components or its surroundings.
11. Arrangement according to Claim 9 or 10, characterised in that the prohibited areas are defined by differentiating components from the background, in two or more dimensions.
12. Arrangement according to Claim 11, characterised in that the prohibited areas are defined by thresholding, in two or more dimensions.
13. Arrangement according to any one of the preceding Claims 9 to 12, characterised in that the prohibited areas are defined for various positions of a component in relation to gripping points of the component, in two or more dimensions, in order to thereby obtain the prohibited areas in the imaging area of the sensor on the basis of the components identified.
14. Arrangement according to any one of the preceding Claims 9 to 13, characterised in that the prohibited areas are revised on the basis of information on which components are to be picked up.
15. Arrangement according to any one of the preceding Claims 9 to 14, characterised in that the prohibited areas are also defined in relation to gripping points of the component, in two or more dimensions.
16. Arrangement according to any one of the preceding Claims 9 to 15, characterised in that the risk of collision is monitored by calculating the overlap of prohibited areas and the robot or gripper arranged on the robot.
PCT/SE2003/001933 2002-12-10 2003-12-10 Method and arrangement to avoid collision between a robot and its surroundings while picking details including a sensorsystem WO2004052596A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP03812746A EP1569776A1 (en) 2002-12-10 2003-12-10 Method and arrangement to avoid collision between a robot and its surroundings while picking details including a sensorsystem
AU2003302921A AU2003302921A1 (en) 2002-12-10 2003-12-10 Method and arrangement to avoid collision between a robot and its surroundings while picking details including a sensorsystem

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE0203655-6 2002-12-10
SE0203655A SE524796C2 (en) 2002-12-10 2002-12-10 collision Protection

Publications (1)

Publication Number Publication Date
WO2004052596A1 true WO2004052596A1 (en) 2004-06-24

Family

ID=20289818

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2003/001933 WO2004052596A1 (en) 2002-12-10 2003-12-10 Method and arrangement to avoid collision between a robot and its surroundings while picking details including a sensorsystem

Country Status (4)

Country Link
EP (1) EP1569776A1 (en)
AU (1) AU2003302921A1 (en)
SE (1) SE524796C2 (en)
WO (1) WO2004052596A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4613269A (en) * 1984-02-28 1986-09-23 Object Recognition Systems, Inc. Robotic acquisition of objects by means including histogram techniques
US5041907A (en) * 1990-01-29 1991-08-20 Technistar Corporation Automated assembly and packaging system
US5446835A (en) * 1991-10-30 1995-08-29 Nippondenso Co., Ltd. High-speed picking system for stacked parts
US5495410A (en) * 1994-08-12 1996-02-27 Minnesota Mining And Manufacturing Company Lead-through robot programming system
EP0951968A2 (en) * 1998-04-21 1999-10-27 Fanuc Ltd Article picking-up apparatus
EP1043642A2 (en) * 1999-04-08 2000-10-11 Fanuc Ltd Robot system having image processing function
EP1092513A2 (en) * 1999-10-12 2001-04-18 Fanuc Ltd Graphic display apparatus for robot system

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1897663A2 (en) * 2006-09-05 2008-03-12 Adept Technology Inc. Bin-picking system for randomly positioned objects
EP1897663A3 (en) * 2006-09-05 2010-04-07 Adept Technology Inc. Bin-picking system for randomly positioned objects
US9089966B2 (en) 2010-11-17 2015-07-28 Mitsubishi Electric Corporation Workpiece pick-up apparatus
DE112011103794B4 (en) * 2010-11-17 2019-01-24 Mitsubishi Electric Corporation Pick-up device for workpieces
WO2012089928A1 (en) 2010-12-30 2012-07-05 Zenrobotics Oy Method, computer program and apparatus for determining a gripping location
EP2658691A4 (en) * 2010-12-30 2016-08-24 Zenrobotics Oy Method, computer program and apparatus for determining a gripping location
WO2016094925A1 (en) * 2014-12-19 2016-06-23 Keba Ag Method for predetermining the working area of a robot
US10414043B2 (en) 2017-01-31 2019-09-17 Fanuc America Corporation Skew and circular boundary for line tracking and circular tracking
US11452248B2 (en) * 2017-02-08 2022-09-20 Fuji Corporation Work machine

Also Published As

Publication number Publication date
AU2003302921A1 (en) 2004-06-30
EP1569776A1 (en) 2005-09-07
SE0203655L (en) 2004-06-11
SE524796C2 (en) 2004-10-05
SE0203655D0 (en) 2002-12-10

Similar Documents

Publication Publication Date Title
CN111776759B (en) Robotic system with automated package registration mechanism and method of operation thereof
CN109955222B (en) Article transfer device, robot system, and article transfer method
EP1945416B1 (en) A method and an arrangement for locating and picking up objects from a carrier
EP3383593B1 (en) Teaching an industrial robot to pick parts
EP2045772B1 (en) Apparatus for picking up objects
EP1905548B1 (en) Workpiece picking apparatus
US11701777B2 (en) Adaptive grasp planning for bin picking
Nerakae et al. Using machine vision for flexible automatic assembly system
US20150224648A1 (en) Robotic system with 3d box location functionality
US20130211593A1 (en) Workpiece pick-up apparatus
US20120165986A1 (en) Robotic picking of parts from a parts holding bin
CN111745640B (en) Object detection method, object detection device, and robot system
CN113538459B (en) Multimode grabbing obstacle avoidance detection optimization method based on drop point area detection
CN114758236A (en) Non-specific shape object identification, positioning and manipulator grabbing system and method
CN108038861A (en) A kind of multi-robot Cooperation method for sorting, system and device
US20230173660A1 (en) Robot teaching by demonstration with visual servoing
WO2004052596A1 (en) Method and arrangement to avoid collision between a robot and its surroundings while picking details including a sensorsystem
CN110914021A (en) Operating device with an operating device for carrying out at least one work step, and method and computer program
US20230286140A1 (en) Systems and methods for robotic system with object handling
CN116175542B (en) Method, device, electronic equipment and storage medium for determining clamp grabbing sequence
CN115393696A (en) Object bin picking with rotation compensation
Abegg et al. Manipulating deformable linear objects-Vision-based recognition of contact state transitions
CN116197885B (en) Image data filtering method, device, equipment and medium based on press-fit detection
Weisenboehler et al. Automated item picking for fashion articles using deep learning
CN115837985B (en) Disordered grabbing method based on machine vision

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2003812746

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2003812746

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP