WO2004029864A1 - Detection and gripping of objects
- Publication number
- WO2004029864A1 (PCT/EP2003/010411)
- Authority
- WO
- WIPO (PCT)
Classifications
- G06V20/64—Three-dimensional objects (G—PHYSICS; G06—COMPUTING; G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING; G06V20/00—Scenes; scene-specific elements; G06V20/60—Type of objects)
- G06V10/145—Illumination specially adapted for pattern recognition, e.g. using gratings (G06V10/00—Arrangements for image or video recognition or understanding; G06V10/10—Image acquisition; G06V10/14—Optical characteristics of the device performing the acquisition or of the illumination arrangements)
Description
- The invention relates to the detection, gripping or processing of disordered, poorly ordered or inaccurately positioned parts, in particular bulk-goods parts, preferably by means of robots or other handling devices.
- A crucial problem in the practical implementation of such a system is that, with previously known camera and lighting arrangements, only a portion of the workpiece contours is imaged with sufficient reliability.
- In DE 3545960 an attempt is made to take several images with different illuminations using a single camera; although this increases the probability that an edge is represented in one of these images, it is not guaranteed, for example not when the background of the edge has the same surface properties and the same spatial orientation as the workpiece above it.
- Which contours are reliable and which are not depends, in known camera and lighting arrangements, on the random spatial orientation of the workpieces. This makes the implementation of reliable image evaluation extremely difficult, especially if the task requires true three-dimensional position determination of the workpieces (generally three position parameters and three orientation parameters).
- One way out is the transition from contour-oriented processing to area-based 3D evaluation using structured light. However, this method is technically very complex and requires "exotic" lighting components.
- The object of the invention is the reliable imaging of all, or as many as possible, uncovered contours of a workpiece, regardless of the random spatial orientation of the workpiece, using simple standard lighting components, in particular without structured light.
- A further object is to enable teaching by simply presenting workpieces in set-up mode and comparing data in automatic mode, without having to enter model data beforehand.
- A method for setting up a data collection with the aid of at least one recording device and at least one lighting device: an object is recorded from at least three different recording directions and illuminated from at least three different lighting directions, in each case in incident light, each recording direction being essentially opposite one lighting direction, so that from each of the three recording directions at least one contour of the object appears with a light side and a shadow side of the object, and essentially the entire object is recorded from the at least three recording directions by the at least one recording device. The recording and lighting directions on the one hand and the object on the other hand can be moved relative to one another in a defined manner with a plurality of degrees of freedom, and the image recordings and/or data derived therefrom are stored in the data collection.
- Image recordings can be made in various relative positions of the recording and lighting devices on the one hand and the object on the other.
- Changes in the relative positions between the image recordings can be measured and assigned to the image recordings; position information associated with the image recordings, or with data derived therefrom, can be stored in the data collection.
- The positions of the recording and lighting devices can be determined from a robot position.
- The position of the object can be changed by a robot.
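The set-up procedure described above (defined relative positions, one incident-light recording per direction, pose information stored alongside the images) can be sketched as a simple acquisition loop. This is a minimal illustration under assumed interfaces, not the patented implementation: `capture_image` stands in for the real camera driver, poses are opaque tuples, and the data collection is a plain Python list.

```python
# Sketch of the set-up-mode acquisition loop: for each defined relative
# pose, one frame is grabbed per recording direction (each lit by its
# opposed incident light) and stored together with the pose.

def capture_image(camera_id, pose):
    """Hypothetical stand-in for grabbing a frame from one camera."""
    return f"frame:cam{camera_id}@{pose}"

def build_data_collection(poses, n_cameras=3):
    """Record the object in several defined relative positions.

    Each entry holds the pose information and one image per recording
    direction, as required for later comparison in automatic mode.
    """
    collection = []
    for pose in poses:
        entry = {
            "pose": pose,                       # defined relative position
            "images": [capture_image(c, pose)   # one frame per direction
                       for c in range(n_cameras)],
        }
        collection.append(entry)
    return collection

collection = build_data_collection(poses=[(0, 0, 0), (0, 0, 15)])
```

In practice the pose would come from the robot position (as the bullet above notes), and the stored values would be image arrays or features derived from them.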
- A further embodiment of the invention provides a data collection with image recordings and/or data derived therefrom, and a computer-readable storage medium containing data from such a data collection.
- A method for gripping an object from a plurality of objects with the aid of at least one recording device and at least one lighting device: the object is recorded from at least three different recording directions and illuminated from at least three different lighting directions, in each case in incident light, each recording direction being essentially opposite one lighting direction, so that from each of the three recording directions at least one contour of the object appears with a light side and a shadow side of the object, and essentially the entire object is recorded from the at least three recording directions by the at least one recording device.
- Reference image recordings and/or data derived therefrom that are contained in a data collection according to claim 10 can be used.
- Image recordings of the object can be compared with image recordings, or derived data, in the data collection.
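The comparison step in automatic mode can be illustrated with a nearest-reference lookup: the current recording is matched against the stored reference entries, and the pose of the best match gives the part's position. The distance metric here (sum of absolute pixel differences over flattened images) is an assumption for illustration; the patent does not prescribe a specific comparison measure.

```python
def image_distance(a, b):
    """Sum of absolute pixel differences between two same-size images
    (flattened to 1-D lists for simplicity)."""
    return sum(abs(x - y) for x, y in zip(a, b))

def best_reference(query, data_collection):
    """Return the reference entry whose stored image best matches the
    current recording; its stored pose then estimates the part's
    position, as in the teaching/comparison scheme described above."""
    return min(data_collection,
               key=lambda entry: image_distance(query, entry["image"]))

# two taught reference views at known relative poses (toy 2x2 images)
refs = [{"pose": 0,  "image": [9, 9, 1, 1]},
        {"pose": 90, "image": [1, 1, 9, 9]}]
match = best_reference([8, 9, 2, 1], refs)   # closest to the pose-0 view
```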
- Seen from one direction, the illumination directions can pairwise enclose an angle between 75 and 145 degrees, preferably 120 degrees, and likewise, seen from one direction, the recording directions can pairwise enclose an angle between 75 and 145 degrees, preferably 120 degrees.
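The pairwise-angle condition can be checked directly. A short sketch, assuming the directions are given as azimuth angles in the plane of projection ("seen from one direction"):

```python
def pairwise_angles_deg(azimuths_deg):
    """Smallest pairwise angular separations of the given directions,
    projected into one viewing plane."""
    angles = []
    for i in range(len(azimuths_deg)):
        for j in range(i + 1, len(azimuths_deg)):
            d = abs(azimuths_deg[i] - azimuths_deg[j]) % 360
            angles.append(min(d, 360 - d))
    return angles

def is_valid_star_arrangement(azimuths_deg, lo=75, hi=145):
    """True if every pair of directions encloses an angle in the
    claimed 75-145 degree window."""
    return all(lo <= a <= hi for a in pairwise_angles_deg(azimuths_deg))

# the preferred symmetric arrangement: three directions 120 degrees apart
print(is_valid_star_arrangement([0, 120, 240]))  # True
```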
- The image recording from the recording directions can in each case be carried out essentially using light from the essentially opposite illumination direction, preferably by switching and/or polarization and/or spectral filtering and/or by using at least one color-capable recording device.
- A further embodiment of the invention provides a computer-readable storage medium with a program for the method for setting up a data collection.
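One of the channel-separation variants named above, spectral filtering with a color-capable camera, can be sketched as follows: if each of the three opposed lights is given its own color (here assumed red, green, blue), a single color frame splits into three per-illumination images. The color assignment and the nested-list image format are assumptions for illustration.

```python
# Spectral channel separation with a single color-capable camera:
# each pixel is an (R, G, B) tuple, each channel corresponds to one
# illumination direction.

def split_by_illumination(color_image):
    """Return one grayscale image per illumination direction."""
    n_channels = 3
    return [
        [[pixel[c] for pixel in row] for row in color_image]
        for c in range(n_channels)
    ]

frame = [[(200, 10, 5), (190, 12, 8)],
         [(30, 180, 7), (25, 175, 9)]]
red_view, green_view, blue_view = split_by_illumination(frame)
```

The alternative named in the text, switching the lights and taking one exposure per direction, trades simultaneity for the same separation without colored lighting.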
- Yet another embodiment of the invention provides a device for gripping an object from a plurality of objects, with at least three recording devices and at least three incident-light illumination devices, one recording device and one illumination device essentially facing each other, so that each of the three recording devices can image at least one contour of the object with a light side and a shadow side of the object, and essentially the entire object can be imaged by the at least three recording devices.
- Yet another embodiment of the invention provides a device which is designed to use reference image recordings and / or data derived therefrom which are contained in a data collection.
- The device can furthermore have an essentially star-shaped arrangement of the lighting and recording devices, wherein, seen from one direction, the illumination directions pairwise enclose an angle between 75 and 145 degrees, preferably 120 degrees, and, seen from one direction, the recording directions pairwise enclose an angle between 75 and 145 degrees, preferably 120 degrees.
- The device can be designed to record images in each case essentially via light from the essentially opposite illumination direction, preferably by means of switching and/or polarization and/or spectral filters and/or at least one color-capable recording device.
- FIG. 1 shows a side view of the camera and lighting arrangement according to the present invention;
- FIG. 2 shows the arrangement of FIG. 1 seen vertically from below;
- FIG. 3 shows a detail of one camera and its associated lighting according to the present invention;
- FIG. 4 shows a workpiece scene and three cameras, as well as schematic representations of the images taken by the cameras.
- FIG. 1 shows a side view of the camera and lighting arrangement, with the cameras 1, 2, 3 and the lighting units 11 (for camera 1), 12 (for camera 2) and 13 (for camera 3) assigned to them (drawn symbolically), as well as the bulk material with the workpieces 4.
- The solid base drawn can of course be replaced, for example, by a conveyor belt or a container.
- Fig. 2 shows the arrangement seen vertically from below.
- The opposite workpiece edge 7 can be seen from the viewpoint of camera 1 only under favorable circumstances, for example with a dark surface.
- The edges 6, on the other hand, are reliably visible from camera 1, irrespective of the material brightness and the three-dimensional position of the workpiece; if the size of the lighting is suitably adapted, this also applies to shiny workpiece surfaces. The edges 6 create a contour in the image, and both sides of this contour belong to the workpiece as seen from the camera: on one side the part surface 9, illuminated by the opposite illumination, and on the other side the part surface 8, not illuminated by it.
- The workpiece surface 9 reflects brightly in any case, since it is at least approximately at the gloss angle (a small light source suffices for this; matt surfaces require a larger light source).
- In addition, a drop shadow 5 into which the camera looks is visible (a drop shadow that might fall on the opposite side of the workpiece, by contrast, would not be visible).
- Area 8 always appears darker than area 9, regardless of the nature of the workpiece material. The workpiece contour is therefore always reliably represented in the camera image, regardless of the background, since the contour is bordered on both sides by surfaces of the same workpiece.
- The end of the drop shadow also forms a contour.
- This contour, however, has the opposite polarity in the image: seen from top to bottom in the camera image, the brightness changes from light to dark at the workpiece contour and from dark to light at the shadow edge. Workpiece contours are therefore very easily distinguished from shadow contours. (A drop shadow that may fall on the workpiece edge 7 would have inverted polarity, but it cannot be seen from the camera.)
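The polarity rule above lends itself to a very simple classifier. A minimal sketch on a 1-D top-to-bottom intensity profile, with an assumed noise threshold: a sufficiently large light-to-dark step is taken as a workpiece contour (lit face 9 above unlit face 8), a dark-to-light step as the end of a drop shadow.

```python
def classify_edges(profile, threshold=50):
    """Classify brightness steps along a top-to-bottom scanline.

    A light-to-dark transition marks a workpiece contour; a
    dark-to-light transition marks the far end of a drop shadow.
    The threshold is an assumed noise margin, not from the patent.
    """
    edges = []
    for i in range(1, len(profile)):
        step = profile[i] - profile[i - 1]
        if step <= -threshold:
            edges.append((i, "workpiece_contour"))   # light -> dark
        elif step >= threshold:
            edges.append((i, "shadow_end"))          # dark -> light
    return edges

# lit surface 9, unlit surface 8, drop shadow 5, then background
scan = [220, 215, 60, 55, 20, 18, 150, 148]
print(classify_edges(scan))  # [(2, 'workpiece_contour'), (6, 'shadow_end')]
```

This is exactly the "distinction made in advance and without prior knowledge" that the text credits with simplifying the subsequent analysis: no workpiece model is consulted, only the sign of the brightness step.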
- A particular advantage of the invention is that, with the arrangement of at least three cameras and correspondingly configured image fields, each contour section is, for at least one camera, in the situation shown in FIG. 3 (edge 6), and the contour therefore reliably appears in that camera's image.
- A subsequent image-evaluation method thus has, apart from any fundamental geometric ambiguities that may be present, all the information required to implement type recognition or position recognition: all uncovered contours of the workpiece top are reliably represented in at least one camera.
- A particular advantage of the described arrangement of three cameras set at an angle to one another is that even small rotations of the workpiece in space change at least one contour image significantly (generally several contour images change simultaneously), so that 3D position determination can be carried out with good numerical accuracy, and simple teaching by merely presenting workpieces in different positions is sufficient, without having to use model information.
- It is particularly advantageous that the geometry of the workpiece edges need not be known if detection is realized by comparison with the images recorded in set-up mode in defined relative positions, or with data derived therefrom, in a reference data collection or reference database, since image analysis can very easily distinguish workpiece contours from the environment-dependent shadow contours (see above) without having to use prior knowledge of the workpiece geometry.
- Normally, a reliable distinction between workpiece and shadow edges can be made only within the framework of the (generally model-based) image-analysis method; here this distinction is made in advance and without prior knowledge, which in turn makes the subsequent analysis itself much easier and more robust.
- Cameras and lights can also be movable relative to one another, but are always in the same, or at least approximately the same, relative position to one another when an image is recorded.
- Multi-camera arrangements with transmitted light instead of incident light also produce reliable contour images, but can be realized only in simple special cases. This applies in particular to multi-dimensional position detection: with transmitted-light arrangements no illuminated part surfaces can be seen, and for the detection of disordered parts lying on top of one another, as shown in FIG. 1, transmitted-light arrangements can hardly be realized in practice.
- Figure 2 shows a symmetrical arrangement with three cameras.
- The arrangement and the method described here naturally also cover arrangements of the same type that deviate from a symmetrical geometry. Cameras and lights do not have to be at the same height as one another, and the division of the 360-degree circumference need not use equal angles. In principle, the image fields need not completely cover the 360-degree circumference (whether the evaluation remains unambiguous depends on the workpiece geometry, for example in the case of symmetries or, conversely, of very distinctive local contour shapes).
- Figures 1 and 2 show arrangements with cameras directed from the outside inwards. Cameras do not necessarily have to be directed from the outside in, e.g. for workpieces with inner edges such as circular rings.
- Structured lighting is not required; it can of course still be used, e.g. to achieve better separation from ambient light via downstream image-processing software filters.
- The arrangement described can of course also be implemented with analogous mirror arrangements and fewer than three cameras, even with the aid of rigid or flexible image guides.
- The illuminations located opposite the cameras can each consist of a single light, or of ambient lighting that happens to lie opposite the cameras.
- The invention naturally also covers any controllable movement and handling devices.
- Another exemplary embodiment of the invention is an arrangement for type and/or position detection of disordered, poorly ordered or inaccurately positioned parts, in particular bulk-material parts, which are recorded in incident light using cameras, in particular with the aim of gripping or processing them, in which
- at least three cameras with different viewing directions are aimed at the parts or at a potential part.
- The word "potential" is to be understood as meaning that the fields of view of the cameras are set up so that a common workpiece, with one contour in each, can lie in the fields of view (if this is not the case, a robot carrying the cameras, for example, takes a new starting position to search for a workpiece; with permanently installed cameras, for example, a conveyor belt is advanced until a common workpiece is in the image fields).
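The search behavior described in the parenthesis, reposition until some workpiece lies in all image fields, can be sketched as a retry loop. Workpiece detections per camera are modeled here as sets of part IDs, and `next_views` is a hypothetical stand-in for "move to a new starting position (or advance the conveyor) and re-detect".

```python
def common_workpiece(views):
    """IDs of workpieces visible in every camera's image field."""
    common = set(views[0])
    for v in views[1:]:
        common &= set(v)
    return common

def search_for_part(next_views, max_tries=5):
    """Take new starting positions until some workpiece appears in
    all image fields, as described for the robot / conveyor cases."""
    for _ in range(max_tries):
        views = next_views()
        hit = common_workpiece(views)
        if hit:
            return hit
    return set()

# first position sees no common part; the second one does
positions = iter([[{1}, {2}, {3}], [{4, 5}, {5}, {5, 6}]])
found = search_for_part(lambda: next(positions))  # {5}
```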
- In an arrangement for position detection, cameras and lights can be mounted on a robot and moved together in a defined manner relative to the part or parts.
- In an arrangement for position detection whose cameras and lights are not mounted on a robot, a part can, in set-up mode, be presented, preferably placed, in different defined positions in the image field of the cameras, particularly preferably by a robot.
- By means of switching devices and/or polarization filters and/or color filters and/or color cameras, the lighting and image-recording channels can be separated in an arrangement.
- A method for gripping and/or processing disordered, poorly ordered or imprecisely positioned parts, in particular bulk-goods parts, is provided with an arrangement for position detection, the image recording of each of the at least three cameras taking place by means of the lighting opposite it, preferably
- with a robot and with an arrangement for position detection in set-up mode, in which images of a sample part are recorded in different relative positions of cameras and lighting on the one hand and the sample part on the other.
- In a method for gripping with an arrangement for position detection, a sample part can, in set-up mode for teaching-in the parts, be presented, preferably placed, in different defined positions in the image field of the cameras.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP03757868A EP1543471B1 (de) | 2002-09-23 | 2003-09-18 | Erfassen und greifen von gegenst nden |
US10/529,174 US7844104B2 (en) | 2002-09-23 | 2003-09-18 | Method for establishing a data collection and method and device for gripping an object |
AU2003273903A AU2003273903A1 (en) | 2002-09-23 | 2003-09-18 | Detection and gripping of objects |
DE50303239T DE50303239D1 (de) | 2002-09-23 | 2003-09-18 | Erfassen und greifen von gegenst nden |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE10244275.4 | 2002-09-23 | ||
DE10244275 | 2002-09-23 | ||
DE10338323.9 | 2003-08-21 | ||
DE10338323A DE10338323B4 (de) | 2002-09-23 | 2003-08-21 | Verfahren und Anordnung zur Bildaufnahme für die Lageerkennung von ungeordneten Teilen |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004029864A1 true WO2004029864A1 (de) | 2004-04-08 |
Family
ID=32043955
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2003/010411 WO2004029864A1 (de) | 2002-09-23 | 2003-09-18 | Erfassen und greifen von gegenständen |
Country Status (6)
Country | Link |
---|---|
US (1) | US7844104B2 (de) |
EP (1) | EP1543471B1 (de) |
AT (1) | ATE325394T1 (de) |
AU (1) | AU2003273903A1 (de) |
DE (1) | DE50303239D1 (de) |
WO (1) | WO2004029864A1 (de) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4199264B2 (ja) * | 2006-05-29 | 2008-12-17 | ファナック株式会社 | ワーク取り出し装置及び方法 |
JP4316630B2 (ja) * | 2007-03-29 | 2009-08-19 | 本田技研工業株式会社 | ロボット、ロボットの制御方法およびロボットの制御プログラム |
WO2011035078A1 (en) * | 2009-09-18 | 2011-03-24 | Foba Technology + Services Gmbh | Method of measuring the outline of a feature |
JP4938115B2 (ja) * | 2010-07-27 | 2012-05-23 | ファナック株式会社 | ワーク取出し装置およびワーク取出し方法 |
JP5767464B2 (ja) * | 2010-12-15 | 2015-08-19 | キヤノン株式会社 | 情報処理装置、情報処理装置の制御方法、およびプログラム |
JP2012223839A (ja) * | 2011-04-15 | 2012-11-15 | Yaskawa Electric Corp | ロボットシステムおよびロボットシステムの駆動方法 |
US9978036B1 (en) | 2011-09-14 | 2018-05-22 | Express Scripts Strategic Development, Inc. | Methods and systems for unit of use product inventory |
KR101490921B1 (ko) * | 2013-07-11 | 2015-02-06 | 현대자동차 주식회사 | 자동차 부품의 품질 검사 장치 및 그 방법 |
JP6408259B2 (ja) * | 2014-06-09 | 2018-10-17 | 株式会社キーエンス | 画像検査装置、画像検査方法、画像検査プログラム及びコンピュータで読み取り可能な記録媒体並びに記録した機器 |
JP6457587B2 (ja) * | 2017-06-07 | 2019-01-23 | ファナック株式会社 | ワークの動画に基づいて教示点を設定するロボットの教示装置 |
KR102558937B1 (ko) * | 2020-01-27 | 2023-07-21 | 코그넥스코오포레이션 | 다수의 유형들의 광을 이용한 비전 검사를 위한 시스템들 및 방법들 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS58213382A (ja) * | 1982-06-04 | 1983-12-12 | Fujitsu Ltd | 認識装置 |
EP0226938A2 (de) * | 1985-12-23 | 1987-07-01 | Messerschmitt-Bölkow-Blohm Gesellschaft mit beschränkter Haftung | Verfahren und Anordnung zur Erkennung von Teilen |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1080886A (ja) * | 1996-09-10 | 1998-03-31 | Rekoode Onkyo:Kk | 視覚制御ロボット |
JP2001317916A (ja) * | 2000-05-10 | 2001-11-16 | Fuji Mach Mfg Co Ltd | エッジ検出方法および装置 |
2003
- 2003-09-18 EP EP03757868A patent/EP1543471B1/de not_active Expired - Lifetime
- 2003-09-18 US US10/529,174 patent/US7844104B2/en not_active Expired - Fee Related
- 2003-09-18 DE DE50303239T patent/DE50303239D1/de not_active Expired - Lifetime
- 2003-09-18 WO PCT/EP2003/010411 patent/WO2004029864A1/de not_active Application Discontinuation
- 2003-09-18 AU AU2003273903A patent/AU2003273903A1/en not_active Abandoned
- 2003-09-18 AT AT03757868T patent/ATE325394T1/de active
Non-Patent Citations (2)
Title |
---|
PATENT ABSTRACTS OF JAPAN vol. 008, no. 068 (P - 264) 30 March 1984 (1984-03-30) * |
SHIRAI Y ET AL: "EXTRACTION OF THE LINE DRAWING OF 3-DIMENSIONAL OBJECTS BY SEQUENTIAL ILLUMINATION FROM SEVERAL DIRECTIONS", PATTERN RECOGNITION, PERGAMON PRESS INC. ELMSFORD, N.Y, US, vol. 4, no. 4, 1972, pages 343 - 351, XP001108917, ISSN: 0031-3203 * |
Also Published As
Publication number | Publication date |
---|---|
US20070055406A1 (en) | 2007-03-08 |
DE50303239D1 (de) | 2006-06-08 |
EP1543471A1 (de) | 2005-06-22 |
EP1543471B1 (de) | 2006-05-03 |
US7844104B2 (en) | 2010-11-30 |
ATE325394T1 (de) | 2006-06-15 |
AU2003273903A1 (en) | 2004-04-19 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| AK | Designated states | Kind code of ref document: A1; Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW
| AL | Designated countries for regional patents | Kind code of ref document: A1; Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG
| 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application |
| WWE | WIPO information: entry into national phase | Ref document number: 2003757868; Country of ref document: EP
| WWP | WIPO information: published in national office | Ref document number: 2003757868; Country of ref document: EP
| WWG | WIPO information: grant in national office | Ref document number: 2003757868; Country of ref document: EP
| WWE | WIPO information: entry into national phase | Ref document numbers: 2007055406, 10529174; Country of ref document: US
| NENP | Non-entry into the national phase | Ref country code: JP
| WWW | WIPO information: withdrawn in national office | Country of ref document: JP
| WWP | WIPO information: published in national office | Ref document number: 10529174; Country of ref document: US