WO2003017744A1 - Procede de positionnement d'un objet sur une cible - Google Patents


Info

Publication number
WO2003017744A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
positioning
handling device
image
image capture
Prior art date
Application number
PCT/DE2002/002123
Other languages
German (de)
English (en)
Inventor
Gunther Bohn
Peter Schlaich
Stefan Dittrich
Original Assignee
Robert Bosch Gmbh
Priority date
Filing date
Publication date
Application filed by Robert Bosch Gmbh filed Critical Robert Bosch Gmbh
Publication of WO2003017744A1 publication Critical patent/WO2003017744A1/fr

Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08Monitoring manufacture of assemblages
    • H05K13/081Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines
    • H05K13/0812Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines the monitoring devices being integrated in the mounting machine, e.g. for monitoring components, leads, component placement
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L2224/00Indexing scheme for arrangements for connecting or disconnecting semiconductor or solid-state bodies and methods related thereto as covered by H01L24/00
    • H01L2224/80Methods for connecting semiconductor or other solid state bodies using means for bonding being attached to, or being formed on, the surface to be connected
    • H01L2224/83Methods for connecting semiconductor or other solid state bodies using means for bonding being attached to, or being formed on, the surface to be connected using a layer connector

Definitions

  • The invention relates to a method for the precision positioning of an object to be mounted or positioned on a target, in particular for mounting electronic components, in which the positions of the object and of the target are determined and processed using an image capture device and an image processing device, and the object is positioned at the target by means of a handling device.
  • In known methods, an image capture device first takes a picture of the target, i.e. records the position and the image of the target.
  • Marks provided on the target are detected in a known manner, from which the position of the target can be determined by means of an image processing device.
  • An object to be positioned is then picked up by a gripper, the position of the object in the gripper either being known or being determined in various ways.
  • A displacement vector is computed from the data on the positions of the target and the gripper, or of the component in the gripper; the gripper is then moved by a robot in accordance with this previously determined displacement vector to carry out the positioning.
  • The path of the robot runs three-dimensionally in the X, Y and Z directions.
  • The accuracy of this procedure is determined by the mechanical inaccuracy of the robot, which is approximately proportional to the length of the path, i.e. of the displacement vector. With a path length of 100 mm, the achievable positioning accuracy is in the range of a few micrometers.
  • In another known method, the object to be assembled is prepositioned above the target.
  • A camera with split-field optics is then brought between the component and the target; it captures the position of the component and the position of the target in the X and Y directions, the target being viewed from above and the component from below.
  • The positioning of the object relative to the target in the X and Y directions is then corrected by means of a handling device.
  • The camera is then swiveled out and the positioning is carried out in the Z direction. Since, after this correction, object and target are moved only in the Z direction, that is, over a considerably shorter distance than in the method described above, a higher positioning accuracy is achieved.
  • The present invention is based on the object of improving a method of the type described at the outset for the precision positioning of an object on a target, such that the positioning can be carried out quickly and, independently of the travel of the components and of mechanical inaccuracies of the robot, achieves a higher positioning accuracy than known methods.
  • This object is achieved according to the invention in a generic method in that, during the positioning of the object, both the object and the target are detected simultaneously by means of the image capture device through a transparent area of the handling device, and the object is guided to the target by carrying out a target/actual comparison of the positions of object and target.
  • Position detection of both the component and the target is thus carried out simultaneously during the positioning process.
  • The position of the object is recorded together with the image of the target, that is to say directly relative to the target position.
  • During the positioning, a closed-loop control process is carried out on the basis of the target/actual comparison of object and target position data.
  • The influence of the positioning inaccuracy of the robot can thereby be eliminated by the controlled guidance process. It proves advantageous if the image, and thus the position, of the component and of the target are determined by the same optics of the image capture device, since deviations or errors between different optical systems then cannot falsify the result: any error of the single optical system used applies equally to target and object and has no negative impact on positioning accuracy.
  • The optics of the image capture device are arranged on the side of the object facing away from the target. The transparent area of the handling device, which is usually a gripping device for the object, is to be understood in the broadest sense. It would be possible, for example, for the area which grips the object, such as the gripper fingers of a gripping device, to be optically transparent, for instance made of glass or of another material transparent to light, such as Plexiglas or other plastics.
  • It would also be conceivable for the object to be held or gripped by components of the handling device which only insignificantly restrict the field of view of the image capture device, for example because the gripper fingers acting from two sides are small compared to the dimensions of the object, so that the target can be captured together with the object through the same optics and both are visible at the same time. In this case, too, an area of the handling device is effectively transparent.
  • In a further embodiment, the object is placed at a distance from the target (in the Z direction) in a separate preceding method step at the start of the positioning process, and as the object approaches the target (movement in the Z direction only), the position of the object in the X and Y directions relative to the target is controlled via the handling device according to the target/actual comparison.
  • The depth of field of the image capture device is selected such that the object is imaged sharply from the very beginning of the positioning process, and both object and target are imaged sharply during the last phase of the approach. If the target is still far away at the beginning of the positioning process, it appears blurred because it lies outside the depth of field of the optics of the image capture device. Nevertheless, the position data can be determined even from a blurred image of the target (as well as of the component), since the position is advantageously determined via an image centroid. Therefore, in a further embodiment of the invention, the target and/or the object is detected via marks provided on it, for example in the form of simple geometric structures such as circles, crosses or lines. Even if such marks are recorded out of focus, their centroid, and thus the position of the target or of the object, can still be determined correctly.
  • The image capture device advantageously comprises a CMOS camera or a CCD high-speed camera; a recording time of 30 to 100 milliseconds, in particular 30 to 60 milliseconds, is advantageously used.
  • The positioning accuracy when performing the positioning or assembly method according to the invention can be improved to significantly less than 1 µm. Measurement inaccuracies of only 0.2 µm (6σ range) have been achieved. The remaining positioning inaccuracy is due to the measurement inaccuracy of the image acquisition, which is in the range of approximately 0.2 µm.
  • The present invention is further based on the object of providing an apparatus for carrying out the method according to the invention. This device comprises a handling device for positioning the object at the target, an image capture device and an image processing device for capturing and processing position or image data of object and target, and a control device for the handling device. The device is designed in such a way that the handling device is transparent in the image recording area of the image capture device, so that the current positions of object and target can be detected simultaneously through this transparent area.
  • The handling device for the object, for example a position-adjustable gripping device, and the image capture device, or at least one optical device (detection optics) of the image capture device, are connected to one another in such a way that their positions are changed jointly in the course of the positioning.
  • The image capture device and the handling device for the object can, for example, be mounted on a common robot flange.
  • Figure 1 is a schematic representation of the device according to the invention.
  • Figure 2 shows an image of object and target taken in the course of positioning.
  • Figure 1 schematically shows the structure of a device according to the invention for carrying out a (micro-)precision assembly of an object, in particular an electronic component, on a target, for example on a printed circuit board.
  • The device is designated overall by reference number 2; the object or component bears reference number 4 and the target reference number 6.
  • The target 6, for example a printed circuit board, is placed on a flat surface 8.
  • The device 2 comprises a schematically illustrated image capture device 10, which is arranged on the side of the object 4 facing away from the target 6. Furthermore, the device 2 comprises a likewise schematically indicated handling device 12 with a gripping device 14.
  • The gripping device 14 is a glass gripper, i.e. a gripping device with a transparent area 16 and with glass gripper fingers (not shown) which grasp the object 4.
  • The image capture device 10 is mounted on the handling device 12 in such a way that it can simultaneously capture both the object 4 and the target 6 through the transparent area 16 of the handling device 12. It is also moved together with the handling device 12.
  • The depth of field 18 of the image capture device 10 is indicated.
  • The component 4 is gripped by means of the handling device 12 and its glass gripper 14 and prepositioned above the target 6 using a robot (not shown).
  • Both an image of the object 4 and an image of the target 6 are now captured by means of the image capture device.
  • Such an image can be seen in Figure 2, the object 4 being recorded within the depth of field 18 and therefore appearing in focus, while the image of the target 6, recorded outside the depth of field, is blurred.
  • The current positions of the object 4 and of the target 6 in the X and Y directions are now determined by means of an image processing device.
  • A target/actual comparison of the object and target data is then carried out under computer control, and the position of the handling device 12, together with the image capture device 10, is corrected accordingly in the X and Y directions under robot control.
  • The handling device 12 is then moved towards the target 6 in the Z direction.
  • The image or position data of object 4 and target 6 are preferably recorded and evaluated quasi-continuously, or at predetermined or predeterminable time intervals, and subjected to the target/actual comparison mentioned above, so that a closed-loop guidance process results.
  • As the approach proceeds, the target 6 eventually enters the indicated depth of field 18 of the image capture device 10. Via the image processing device, this can be used as a signal that the object 4 is within a certain proximity (in the Z direction) of the target 6.
  • The Z position of the gripper can also be determined by optical methods such as triangulation, autofocus, photogrammetry or interferometry.
  • A CMOS camera or a CCD high-speed camera can advantageously be used for image recording, which results in greatly reduced image recording times.
  • This further reduces the measuring time and, in consequence, the cycle time and the time for executing the positioning.
  • A pulsed light source, for example a flash lamp, can be used.
  • Advantageously, the component can also be examined for damage using the image capture device.
  • The invention can also be used when the object to be positioned is picked up by the handling device. When the object is picked up, in particular gripped, by the handling device, inaccuracies also occur; in the method according to the invention, however, these do not adversely affect the positioning, since object and target are detected jointly by the image capture device and their actual positions are recorded directly. Nevertheless, the invention can also be applied to the pick-up of the object itself. In that case, the transparent area of the handling device would carry a marking, which can then be brought, by way of a target/actual comparison, into register with a marking on the object to be picked up, which may be held in a magazine, for example, in order to achieve an optimal gripping position.
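The closed-loop guidance described above (simultaneous capture of object and target through the transparent gripper area, target/actual comparison, X/Y correction during the Z approach) can be sketched as follows. `measure`, `move_xy` and `step_z` are hypothetical interfaces standing in for the image processing device and the robot-driven handling device; they are not part of the patent.

```python
# Sketch of the closed-loop positioning method; the measurement and
# actuation interfaces below are illustrative assumptions.

def servo_to_target(measure, move_xy, step_z, z_steps=10, gain=0.8):
    """Guide the object onto the target by repeated target/actual comparison.

    measure()      -> ((ox, oy), (tx, ty)): simultaneously captured object
                      and target positions, seen through the transparent
                      area of the gripper.
    move_xy(dx, dy): corrective X/Y move of the handling device.
    step_z():        one increment of the approach in the Z direction.
    """
    for _ in range(z_steps):
        (ox, oy), (tx, ty) = measure()     # simultaneous capture of both
        ex, ey = tx - ox, ty - oy          # target/actual deviation
        move_xy(gain * ex, gain * ey)      # closed-loop X/Y correction
        step_z()                           # approach the target in Z only
    return measure()
```

Because the residual X/Y error shrinks by a factor of (1 - gain) per cycle, the robot's absolute path inaccuracy drops out of the result; as the text notes, only the measurement inaccuracy of the image acquisition remains.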
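The centroid-based mark detection described above, which tolerates defocus, can be illustrated with a minimal intensity-weighted center-of-gravity computation. This is a sketch under the assumption of a grayscale intensity grid; the patent does not specify the algorithm.

```python
def mark_centroid(image):
    """Intensity-weighted centroid of a mark in a grayscale image.

    Defocus blur spreads a mark's intensity symmetrically around its true
    location, so the centroid of a blurred mark still coincides with the
    mark position; this is why blurred marks can be located correctly.
    """
    total = sx = sy = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            total += v          # total intensity
            sx += x * v         # first moment in x
            sy += y * v         # first moment in y
    return (sx / total, sy / total)
```

A sharp point mark and a symmetrically blurred version of it yield the same centroid, mirroring the observation in the text that out-of-focus marks still allow a correct position determination.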
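The use of the depth of field as a Z-proximity signal can be mimicked in image processing with a simple focus measure; the variance of a Laplacian is one common choice. This is an illustrative assumption about how "the target entering the depth of field" could be detected, not a technique taken from the patent.

```python
def sharpness(image):
    """Variance of a 4-neighbour Laplacian over the image interior.

    The value rises sharply once the target enters the depth of field of
    the optics, which can serve as a signal that object and target are
    within a certain proximity in the Z direction.
    """
    lap = []
    for y in range(1, len(image) - 1):
        for x in range(1, len(image[0]) - 1):
            lap.append(4 * image[y][x]
                       - image[y - 1][x] - image[y + 1][x]
                       - image[y][x - 1] - image[y][x + 1])
    mean = sum(lap) / len(lap)
    return sum((v - mean) ** 2 for v in lap) / len(lap)
```

A crisp intensity edge scores much higher than the same edge smeared by defocus, so thresholding this measure distinguishes "target still blurred" from "target in focus".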

Landscapes

  • Engineering & Computer Science (AREA)
  • Operations Research (AREA)
  • Manufacturing & Machinery (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Manipulator (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a method for precisely positioning an object (4), to be mounted or positioned on a target (6), in particular for mounting electronic components. According to this method, the positions of the object (4) and of the target (6) are determined and processed using an image capture device (10) and an image processing device, and the object (4) is then positioned on the target (6) by means of a handling device (12). The method is characterized in that, during the positioning of the object (4), both the object (4) and the target (6) are detected simultaneously by the image capture device (10) through a transparent area (16) of the handling device (12), and the object (4) is guided to the target (6) by the handling device (12) on the basis of a comparison between the target and actual positions of the object (4) and the target (6).
PCT/DE2002/002123 2001-08-11 2002-06-11 Procede de positionnement d'un objet sur une cible WO2003017744A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE10139596A DE10139596C2 (de) 2001-08-11 2001-08-11 Verfahren und Vorrichtung zum Postitionieren eines Objekts an einem Ziel
DE10139596.5 2001-08-11

Publications (1)

Publication Number Publication Date
WO2003017744A1 true WO2003017744A1 (fr) 2003-02-27

Family

ID=7695213

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/DE2002/002123 WO2003017744A1 (fr) 2001-08-11 2002-06-11 Procede de positionnement d'un objet sur une cible

Country Status (2)

Country Link
DE (1) DE10139596C2 (fr)
WO (1) WO2003017744A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107498286A (zh) * 2017-10-09 2017-12-22 上海玖锶自动化技术有限公司 一种适用于agv的装配方法及系统及流水线

Families Citing this family (2)

Publication number Priority date Publication date Assignee Title
DE102010028496A1 (de) * 2010-05-03 2011-11-03 Carl Zeiss Smt Gmbh Verfahren zur Positionierung und/oder Justage von Bauteilen und entsprechendes optisches Justagesystem
DE102014212104A1 (de) 2014-06-24 2015-12-24 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Vorrichtung und verfahren zur relativen positionierung einer multiaperturoptik mit mehreren optischen kanälen relativ zu einem bildsensor

Citations (2)

Publication number Priority date Publication date Assignee Title
DE19536005A1 (de) * 1995-09-28 1997-04-03 Inst Mikrotechnik Mainz Gmbh Verfahren und Vorrichtung zum hochgenauen Erfassen und Positionieren von Mikrobauelementen
US5680698A (en) * 1994-12-07 1997-10-28 Lucent Technologies Inc. Method for precise alignment and placement of optoelectric components

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
DD260142A1 (de) * 1987-04-28 1988-09-14 Werk Fernsehelektronik Veb Verfahren zur optoelektronischen erkennung von halbleiterchips zur steuerung einer positionseinrichtung bei der vollautomatischen chipmontage
DE4141226A1 (de) * 1991-12-15 1993-06-17 Wolf Henning Montageverfahren und vorrichtung fuer die praezise positionierung von bauelementen und baugruppen
JP3402681B2 (ja) * 1993-06-02 2003-05-06 サンエー技研株式会社 露光における位置合わせ方法

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
US5680698A (en) * 1994-12-07 1997-10-28 Lucent Technologies Inc. Method for precise alignment and placement of optoelectric components
DE19536005A1 (de) * 1995-09-28 1997-04-03 Inst Mikrotechnik Mainz Gmbh Verfahren und Vorrichtung zum hochgenauen Erfassen und Positionieren von Mikrobauelementen

Non-Patent Citations (2)

Title
"ASSEMBLY TECHNIQUE FOR PLACING ELECTRONIC COMPONENTS ON PRINTED CIRCUIT WIRING PATTERNS", IBM TECHNICAL DISCLOSURE BULLETIN, IBM CORP. NEW YORK, US, vol. 31, no. 10, 1 March 1989 (1989-03-01), pages 222 - 228, XP000024644, ISSN: 0018-8689 *
BAARTMAN J P ET AL: "PLACING SURFACE MOUNT COMPONENTS USING COARSE/FINE POSITIONING AND VISION", IEEE TRANSACTIONS ON COMPONENTS,HYBRIDS,AND MANUFACTURING TECHNOLOGY, IEEE INC. NEW YORK, US, vol. 13, no. 3, 1 September 1990 (1990-09-01), pages 559 - 564, XP000149636, ISSN: 0148-6411 *

Cited By (1)

Publication number Priority date Publication date Assignee Title
CN107498286A (zh) * 2017-10-09 2017-12-22 上海玖锶自动化技术有限公司 一种适用于agv的装配方法及系统及流水线

Also Published As

Publication number Publication date
DE10139596A1 (de) 2003-03-06
DE10139596C2 (de) 2003-07-31

Similar Documents

Publication Publication Date Title
EP2227356B1 (fr) Procédé et système destinés au positionnement très précis d'au moins un objet dans une position finale dans la pièce
EP2126645B1 (fr) Procédé de calibrage du positionnement x-y d'un outil de positionnement et dispositif équipé d'un tel outil de positionnement
EP2255930A1 (fr) Procédé et système destinés au positionnement très précis d'au moins un objet dans une position finale dans l' espace
DE102007060653A1 (de) Positionsermittlung eines Objektes
EP3166312A1 (fr) Dispositif et procédé d'ajustage et/ou d'étalonnage d'un module multi-caméra et utilisation d'un tel dispositif
EP3160219B1 (fr) Procede et dispositif de mise en place de composants electroniques
AT518895B1 (de) Biegemaschine mit einer Arbeitsbereich-Bilderfassungsvorrichtung
CH698334A1 (de) Verfahren für die Entnahme von Halbleiterchips von einem Wafertisch und Verfahren für die Montage von Halbleiterchips auf einem Substrat.
DE102004007830B4 (de) Verfahren zur Lokalisierung von Fehlstellen und Markiersystem
EP3587044A1 (fr) Procédé de préhension d'objets dans une zone de recherche, unité de commande et système de positionnement
DE102005020149A1 (de) Verfahren zur automatischen Fehlererkennung in Prüfteilen mittels einer Röntgenprüfanlage
DE102015109960B4 (de) Vorrichtung und Verfahren zum optischen Bestimmen einer Position und/oder Orientierung eines Manipulators
DE102009023123A1 (de) Oberflächenmontagevorrichtung
DE10139596C2 (de) Verfahren und Vorrichtung zum Postitionieren eines Objekts an einem Ziel
WO2003078924A2 (fr) Procede et dispositif permettant d'acquerir au moins une section d'un outil ou un outil
DE102011116734A1 (de) Verfahren zum Ermitteln eines fokussierten Bildabstands eines optischen Sensors eines Koordinatenmessgeräts
DE102010042821A1 (de) Verfahren und Vorrichtung zur Bestimmung einer Basisbreite eines Stereo-Erfassungssystems
WO2009018894A1 (fr) Procédé et dispositif de détermination de données géométriques d'un objet mesuré
EP1471401A2 (fr) Procédé pour mesurer le système de coordonnées d'une camera robot par rapport au système de coordonnées du robot ou vice versa
EP1428060B1 (fr) Dispositif et procede d'orientation parallelepipedique d'une surface plane d'un objet a controler relativement a un plan de focalisation d'un objectif
DE10361920A1 (de) Vorrichtung und Verfahren zur Kontrolle von Werkzeugen
DE102013221850A1 (de) Verfahren und Vorrichtung zur Laserbeschriftung eines Bauteils
DE102014209342A1 (de) Verfahren zur Ermittlung von Geometriedaten eines Objektes mit einem Messmikroskop und Messmikroskop
DE102019205042A1 (de) Vorrichtung und Verfahren zur Positionierung eines Sensors oder Sensorteils
DE102004046752A1 (de) Messsystem

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU BR CA CN CZ HR HU ID IL IN JP MX NO NZ PL RO SG SI SK US

Kind code of ref document: A1

Designated state(s): AU BR CA CN CZ HR HU ID IL IN JP KR MX NO NZ PL RO SG SI SK US ZA

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB IE IT LU MC NL PT SE TR

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP