EP2047403A1 - Method for automated 3D object recognition and position determination - Google Patents
Method for automated 3D object recognition and position determination
- Publication number
- EP2047403A1 (application EP07786429A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- geometric elements
- sub
- regular geometric
- arrangement
- rule
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
- G06V20/653—Three-dimensional objects by matching three-dimensional models, e.g. conformal mapping of Riemann surfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/24—Aligning, centring, orientation detection or correction of the image
- G06V10/245—Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning
Definitions
- the invention relates to a method for the automatic detection of three-dimensional objects and a device for the treatment of individual objects which are present in a complex environment.
- the task is to determine the spatial position of arbitrarily oriented objects in space, such as workpieces or components.
- the position of the objects is unknown and their position and orientation in space must be determined, for example, to be able to determine suitable gripping positions for a robot.
- Included in the problem is always the demand for an extensive automation of object recognition and orientation.
- An aggravation of the problem arises from other objects in the vicinity of the objects to be gripped.
- a common case in practice are, for example, workpieces stored in boxes which are present as completely disordered bulk material or in any packing order. Each individual object is surrounded by a multitude of other objects.
- the object to be recognized must be segmented from the background, i.e. all points of the point cloud that do not belong to the object must be separated from the points that belong to the object.
- photogrammetric methods such as "3D Robot Vision" from ISRA VISION SYSTEMS AG are used; however, these require that features which make the use of photogrammetric methods possible be present on the objects.
- alternatively, CAD models can be used to generate a large number of images from different views that represent the model, but 4,000 to 12,000 images must be taught in, which requires a great deal of computing power, time and storage space.
- the object of the present invention is therefore to provide a method which allows the detection and orientation determination of objects as well as a treatment of the objects, without special features having to be attached to the objects and without a detailed CAD model of the object to be recognized having to exist.
- the invention relates to objects which are characterized by regular geometric elements, e.g. planes, cylinders or cones, or which can at least partially be described by regular geometric elements and can be identified by means of them.
- the scene in which the objects must be recognized is digitized by means of an appropriate detection system.
- a suitable detection system is, for example, a 2.5D or 3D data acquisition unit based on light sectioning, fringe projection or time of flight (ToF).
- These systems provide depth images or point clouds.
- in the following, such data will always be referred to as 3D data.
- the corresponding evaluation can be done with a computer system.
- regular geometric elements are understood to mean those elements whose shape or geometry can be written in closed form, i.e. for example by parametric or implicit formulas.
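To make the notion of a closed-form description concrete, the following sketch (illustrative Python, not part of the patent text; the function names and the residual convention are assumptions chosen for the example) shows implicit formulas for two such regular geometric elements, a plane and a cylinder.

```python
import numpy as np

def plane_residual(points, normal, d):
    """Implicit plane n·x + d = 0: signed distance of each 3D point to the plane."""
    normal = normal / np.linalg.norm(normal)
    return points @ normal + d

def cylinder_residual(points, axis_point, axis_dir, radius):
    """Implicit cylinder: distance of each point to the axis, minus the radius."""
    axis_dir = axis_dir / np.linalg.norm(axis_dir)
    diff = points - axis_point
    # remove the component along the axis, keeping only the radial part
    radial = diff - np.outer(diff @ axis_dir, axis_dir)
    return np.linalg.norm(radial, axis=1) - radius
```

Points lying exactly on the element give a residual of zero, which is the property the best-fit methods described below exploit.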
- 3D object recognition and position determination in space are realized on the basis of an automated recognition of individual regular geometric elements while observing certain geometric conditions concerning the shape and position parameters of the individual objects.
- the method according to the invention serves to manipulate and/or measure, more generally to treat, at least one object present in a limited area in an arrangement that can be described by one or more regular geometric elements which stand in a relationship to one another, preferably a specific and/or known relationship, and whose position and orientation can be described by arrangement parameters.
- the arrangement parameters may include or be, for example, coordinates, rotation angles, tangency, orthogonality, parallelism, and / or angles to objects.
- the dimensions of the objects can also be described therein.
- the treatment may, for example, consist of gripping, moving, and even changing the object itself.
- the object can also be reshaped.
- the object may also merely be measured.
- the limited area is essentially given by the area of influence of the treatment device. However, it can also consist, for example, of a box in which the objects to be treated, selected or measured are accommodated.
- the object to be treated must first be selected.
- selection here first means the determination of one object among the several; data describing the shape and the position of this object are then subsequently determined.
- the recording of data describing the object and its location is also counted here as part of the selection of the object.
- in the first step it is necessary to record data that maps the arrangement of the objects.
- data may be taken from the entire limited area or from a part of this area. They are preferably three-dimensional data describing the objects for each point of the space.
- the 3D measurement data are recorded using a corresponding 3D sensor. Any sensor that generates dense 3D point clouds can be used here.
- regular geometric elements can now be fitted into the three-dimensional data, by means of which the object to be selected can be described.
- in an optional first sub-step, individual regular geometric elements are automatically fitted using a best-fit method, whereby a first segmentation is carried out.
- in this first sub-step one also obtains the shape, position, rotation and/or arrangement parameters of the individual regular geometric elements and/or the respective 3D points associated with each element.
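One common way to realize such a best fit for the simplest regular geometric element, a plane, is an SVD-based least-squares fit. The following sketch (illustrative Python, not from the patent; the function names and the distance tolerance are assumptions) returns the shape and position parameters and the 3D points associated with the element; in practice a robust variant such as RANSAC would typically be wrapped around this step.

```python
import numpy as np

def best_fit_plane(points):
    """Least-squares plane through an (N, 3) point cloud.

    Returns the centroid, the unit normal and the RMS fitting error."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    # the right singular vector belonging to the smallest singular value is the normal
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    rms = np.sqrt(np.mean((centered @ normal) ** 2))
    return centroid, normal, rms

def associated_points(points, centroid, normal, tol=0.5):
    """Indices of all points lying within `tol` of the fitted plane (first segmentation)."""
    dist = np.abs((points - centroid) @ normal)
    return np.where(dist < tol)[0]
```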
- in an optional second sub-step, a regular geometric element can now be selected as the starting element from among the regular geometric elements fitted in the first sub-step.
- the arrangement parameters of this regular geometric element can be determined automatically or are known. It is therefore possible to select a starting element within the regions of the individual geometric elements which have already been pre-segmented by the first sub-step.
- the object to be selected is then combined as a whole from all or some of those regular geometric elements which describe the object to be selected and/or which were determined in the optional first to third sub-steps.
- the arrangement parameters of the object to be selected are determined in that the regular geometric elements describing the object, or the regular geometric elements selected in the third sub-step, are fitted into the three-dimensional data, taking into account the relationships of the corresponding regular geometric elements in the object to be selected.
- from the object to be found it may be known, or it may be ascertained, which regular geometric elements belong to the object and in which spatial relationship they stand to one another. Based on this information, the appropriate conditions for the combined fit can be generated.
- the fitting of the entire object to be selected can thus be carried out taking into account the geometric constraints which result from the information about the geometric elements belonging to the object and their position relative to one another. In doing so, the object "combined" from the individual geometric elements is fitted as a whole, which yields a fitting result of better quality. As a result, a sufficiently accurate position and orientation determination of the workpiece can be achieved.
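A minimal sketch of such a combined fit under geometric constraints (illustrative Python, not from the patent): two planes belonging to the object are fitted jointly, with the condition that they be perpendicular entering the least-squares problem as an additional, strongly weighted residual. The parameterization, the weight and the use of scipy are assumptions made for the example; the initial guess x0 would come from the individual fits of the first sub-step.

```python
import numpy as np
from scipy.optimize import least_squares

def _unit(v):
    return v / np.linalg.norm(v)

def residuals(params, pts1, pts2, weight=100.0):
    """Stacked residuals for jointly fitting two planes that must be perpendicular.

    params = [nx1, ny1, nz1, d1, nx2, ny2, nz2, d2]."""
    n1, d1 = _unit(params[0:3]), params[3]
    n2, d2 = _unit(params[4:7]), params[7]
    r1 = pts1 @ n1 + d1                      # point-to-plane distances, plane 1
    r2 = pts2 @ n2 + d2                      # point-to-plane distances, plane 2
    r_orth = weight * np.dot(n1, n2)         # geometric condition: n1 perpendicular to n2
    return np.concatenate([r1, r2, [r_orth]])

def fit_perpendicular_planes(pts1, pts2, x0):
    """x0 is the stacked parameter vector taken from the individual best fits."""
    return least_squares(residuals, x0, args=(pts1, pts2)).x
```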
- This treatment device may be a gripper, a suction device for aspirating an object or else another device for changing or detecting the object.
- a device for the examination of objects is also possible.
- the method can be varied or extended in various ways.
- first, the 3D measurement data can be divided into regions and a start segment determined. This determination can be made, for example, using height information or, if present, a pre-segmentation in the depth image. Height information and a depth image can, for example, be derived from the three-dimensional data or recorded with a suitable sensor.
- the optional first sub-step of the method can be further accelerated by performing a curvature analysis in the 3D data before the best fit. At each point the curvature is approximated, and the data are then subdivided into regions of different curvature as a pre-segmentation. A best fit of individual regular geometric elements is then performed within the parts pre-segmented in this way.
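A sketch of such a curvature-based pre-segmentation (illustrative Python, not from the patent): the "surface variation" from a local PCA of each point's k-neighbourhood serves as an approximate curvature measure, and a threshold separates flat from curved regions. The neighbourhood size k and the threshold are assumed values for the example.

```python
import numpy as np
from scipy.spatial import cKDTree

def surface_variation(points, k=20):
    """Approximate curvature per point: smallest eigenvalue of the local
    covariance divided by the sum of eigenvalues (0 = perfectly flat)."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    curv = np.empty(len(points))
    for i, neighbours in enumerate(idx):
        cov = np.cov(points[neighbours].T)
        eigvals = np.sort(np.linalg.eigvalsh(cov))
        curv[i] = eigvals[0] / eigvals.sum()
    return curv

def pre_segment(points, threshold=0.01, k=20):
    """Split the cloud into a flat part (plane candidates) and a curved part."""
    curv = surface_variation(points, k=k)
    return points[curv < threshold], points[curv >= threshold]
```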
- individual regular geometric elements are assigned to the object to be selected by using information about which regular geometric elements belong to the object and in which spatial relationship they stand to one another.
- these geometric conditions, that is to say the arrangement parameters (for example with respect to the shape and position of the geometric elements present in the object to be found), can be determined in advance.
- the selection of the starting element in the optional second sub-step can be supported by the use of additional information, which can result, for example, from preprocessing using depth images.
- the method according to the invention can be carried out on a system which has a 2.5D or 3D data acquisition unit, such as light sectioning, fringe projection or time of flight (ToF), which records the parts from one or more views and makes them digitally available.
- the system may include a computer on which the evaluation method is implemented.
- a system is described with which disordered parts consisting of several regular geometric elements can be identified in their position and for which gripping points can then be determined.
- Such parts may be, for example, castings, such as scaffold supports.
- the following evaluation procedure is implemented.
- the 3D data is divided into different regions and a start region selected.
- the individual regular geometric elements are automatically fitted, for example cylinders and planes in the case of a scaffold support.
- a found regular geometric element is selected as the starting element, for example the topmost, i.e. the most accessible one.
- the surroundings of the starting element are then searched for further regular geometric elements belonging to the object, that is to say, in the case of a scaffold support, for surrounding cylinders and planes.
- the type and number of the individual regular geometric elements near the starting element are initially used as selection criteria.
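A minimal sketch of these two steps, selecting the topmost fitted element as the starting element and collecting the nearby elements that may belong to the same object (illustrative Python, not from the patent; the element data structure, the z-up convention and the distance threshold are assumptions):

```python
import numpy as np

def select_starting_element(elements):
    """Pick the topmost (i.e. most accessible) fitted element as the starting element.

    `elements` is assumed to be a list of dicts, each holding the 3D points
    assigned to one fitted regular geometric element under the key 'points'."""
    return max(elements, key=lambda e: e["points"][:, 2].mean())

def nearby_elements(start, elements, max_dist=50.0):
    """Collect the other fitted elements whose centroids lie within `max_dist`
    of the starting element's centroid (candidates belonging to the same object)."""
    c0 = start["points"].mean(axis=0)
    return [e for e in elements
            if e is not start
            and np.linalg.norm(e["points"].mean(axis=0) - c0) < max_dist]
```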
- geometric constraints are determined in advance based on the geometry of the object to be detected so that the position of the object can be determined from different views.
- these geometric conditions may refer, for example, to three planes that must be perpendicular to each other or to cylinders of the same radius and parallel axes that are perpendicular to the ground plane.
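Such conditions can be checked with simple angular and radius tolerances. The following sketch (illustrative Python, not from the patent; the tolerances and the cylinder data structure are assumptions) tests the two examples mentioned, perpendicular planes and parallel cylinders of equal radius.

```python
import numpy as np

def planes_perpendicular(n1, n2, tol_deg=2.0):
    """True if two plane normals are perpendicular within an angular tolerance."""
    c = abs(np.dot(n1, n2)) / (np.linalg.norm(n1) * np.linalg.norm(n2))
    return c < np.sin(np.radians(tol_deg))

def cylinders_parallel_equal(c1, c2, tol_deg=2.0, tol_radius=0.5):
    """True if two cylinders have parallel axes and (nearly) the same radius."""
    a1 = c1["axis"] / np.linalg.norm(c1["axis"])
    a2 = c2["axis"] / np.linalg.norm(c2["axis"])
    parallel = abs(np.dot(a1, a2)) > np.cos(np.radians(tol_deg))
    return parallel and abs(c1["radius"] - c2["radius"]) < tol_radius
```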
- the determination of the geometric constraints is carried out automatically on the basis of an existing CAD model, or an object is scanned in advance and the geometric conditions between individual regular geometric elements are determined automatically from this scan data. It is successively determined which and how many elements are necessary in order to be able to determine the position of the object to be recognized unambiguously.
- the starting element is selected using depth image information.
- here, for example, a pre-segmentation with a region-growing method in combination with height information can be helpful.
- in a further variant, a curvature analysis is performed on the 3D measurement data. For each point, the curvature is calculated approximately; areas with similar curvature are then grouped together and segmented off. The automatic best fit of individual regular geometric elements is only carried out in the data pre-segmented in this way. This speeds up the process and prevents excessive expansion of the individual regular geometric elements.
- the further procedure corresponds to that described in the first example.
- the application possibilities of the method according to the invention range from gripping disordered objects, through 3D scene analysis, to metrological applications (for example, shape and position testing), which are summarized here, together with other applications, under the term treatment.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Processing Or Creating Images (AREA)
- Image Processing (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102006036346A DE102006036346B4 (de) | 2006-08-03 | 2006-08-03 | Verfahren zur automatisierten 3-D-Objekterkennung und Lagebestimmung |
PCT/EP2007/006728 WO2008014960A1 (fr) | 2006-08-03 | 2007-07-30 | Procédé de reconnaissance automatisée des objets 3d et détermination de position |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2047403A1 (fr) | 2009-04-15 |
Family
ID=38606438
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP07786429A Ceased EP2047403A1 (fr) | 2006-08-03 | 2007-07-30 | Procédé de reconnaissance automatisée des objets 3d et détermination de position |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP2047403A1 (fr) |
DE (1) | DE102006036346B4 (fr) |
WO (1) | WO2008014960A1 (fr) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102008020579B4 (de) | 2008-04-24 | 2014-07-31 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Verfahren zur automatischen Objektlageerkennung und Bewegung einer Vorrichtung relativ zu einem Objekt |
DE102009009569B4 (de) * | 2009-02-19 | 2019-12-19 | Daimler Ag | Verfahren zum Ermitteln einer Teilfläche eines Bauteils |
DE102011100919A1 (de) * | 2011-05-09 | 2012-11-15 | Lufthansa Technik Ag | Verfahren zur automatisierten Detektion von Einzelteilen einer komplexen differenziellen Struktur |
US9233469B2 (en) | 2014-02-13 | 2016-01-12 | GM Global Technology Operations LLC | Robotic system with 3D box location functionality |
DE102014005181A1 (de) | 2014-04-03 | 2015-10-08 | Astrium Gmbh | Positions- und Lagebestimmung von Objekten |
DE102015100983A1 (de) * | 2015-01-23 | 2016-07-28 | Sick Ag | Verfahren zur Lokalisierung von Greifpunkten von Objekten |
DE102021210903A1 (de) | 2021-09-29 | 2023-03-30 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zum Aufnehmen eines Objekts mittels einer Robotervorrichtung |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5988862A (en) * | 1996-04-24 | 1999-11-23 | Cyra Technologies, Inc. | Integrated system for quickly and accurately imaging and modeling three dimensional objects |
-
2006
- 2006-08-03 DE DE102006036346A patent/DE102006036346B4/de not_active Expired - Fee Related
-
2007
- 2007-07-30 EP EP07786429A patent/EP2047403A1/fr not_active Ceased
- 2007-07-30 WO PCT/EP2007/006728 patent/WO2008014960A1/fr active Application Filing
Non-Patent Citations (1)
Title |
---|
FAUGERAS O D ET AL: "THE REPRESENTATION, RECOGNITION, AND LOCATING OF 3-D OBJECTS", INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, SAGE SCIENCE PRESS, THOUSAND OAKS, US, vol. 5, no. 3, 1 September 1986 (1986-09-01), pages 27 - 52, XP008018683, ISSN: 0278-3649 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2385483A1 (fr) | 2010-05-07 | 2011-11-09 | MVTec Software GmbH | Reconnaissance et détermination de la pose d'objets en 3D dans des scènes en 3D en utilisant des descripteurs de paires des points et de la transformée généralisée de Hough |
EP2720171A1 (fr) | 2012-10-12 | 2014-04-16 | MVTec Software GmbH | Reconnaissance et détermination de la pose d'objets en 3D dans des scènes multimodales |
Also Published As
Publication number | Publication date |
---|---|
DE102006036346B4 (de) | 2010-12-30 |
WO2008014960A1 (fr) | 2008-02-07 |
DE102006036346A1 (de) | 2008-02-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2047403A1 (fr) | Procédé de reconnaissance automatisée des objets 3d et détermination de position | |
DE102014102943B4 (de) | Robotersystem mit Funktionalität zur Ortsbestimmung einer 3D- Kiste | |
DE60308413T2 (de) | Verfahren und vorrichtung zum bearbeiten und analysieren von digitalen gelände daten. | |
DE112011103794B4 (de) | Aufnehmervorrichtung für Werkstücke | |
DE19936364A1 (de) | Verfahren zur Identifizierung und Lokalisierung von Marken in einem 3D-Volumendatensatz | |
EP2019283A2 (fr) | Procédé et dispositif de mesure des données de mesure réelles d'un composant | |
DE19633693C1 (de) | Verfahren und Vorrichtung zur Erfassung von Targetmustern in einer Textur | |
EP4089363B1 (fr) | Procédé et dispositif de génération d'un plan d'essai pour l'essai d'un objet de mesure, procédé et dispositif d'essai d'un objet de mesure, ainsi que produit de programme informatique | |
EP2058765A1 (fr) | Procédé et dispositif destinés à texturer un objet d'un modèle géométrique tridimensionnel virtuel | |
DE102021103726A1 (de) | Messparameter-Optimierungsverfahren und -Vorrichtung sowie Computersteuerprogramm | |
DE102016201741A1 (de) | Verfahren zur Höhenerkennung | |
WO2009046781A1 (fr) | Procédé et dispositif pour enregistrer des informations d'un outil | |
DE102015220031A1 (de) | Verfahren zur Konfidenzabschätzung für optisch-visuelle Posenbestimmung | |
EP1098268A2 (fr) | Méthode pour la mésure optique tridimensionelle de surfaces d'objets | |
DE102011103510A1 (de) | Verfahren zum Erstellen einer dreidimensionalen Repräsentation einer Objektanordnung | |
DE102006036345B4 (de) | Verfahren und Vorrichtung zur Lagebestimmung von Objekten im dreidimensionalen Raum | |
EP0364614B1 (fr) | Méthode de reconnaissance de la position et de l'orientation spatiale d'objets déjà connus | |
DE102008020579B4 (de) | Verfahren zur automatischen Objektlageerkennung und Bewegung einer Vorrichtung relativ zu einem Objekt | |
DE102006005990A1 (de) | Werkstückvermessung für 3-D Lageerkennung in mehreren Multi-Roboter-Stationen | |
DE112015006181T5 | Ladungsträgerstrahlvorrichtung, Ausrichtungsverfahren für die Ladungsträgerstrahlvorrichtung, Ausrichtungsprogramm und Speichermedium |
DE102009007024A1 (de) | Verfahren und Vorrichtung zum Vereinzeln von Bauteilen | |
DE102017122627B4 (de) | Optisches Messsystem und Messverfahren | |
DE102008049859B4 (de) | Verfahren und Prüfsystem zur optischen Prüfung einer Kontur eines Prüfobjekts | |
DE102019220364A1 (de) | Kalibriereinrichtung und Verfahren zum Kalibrieren einer Vorrichtung | |
DE102019110185A1 (de) | Verfahren und System zum Registrieren eines Konstruktionsdatenmodells in einem Raum |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20090102 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR |
|
AX | Request for extension of the european patent |
Extension state: AL BA HR MK RS |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: DUNKER, THOMAS Inventor name: HUETTEL, MARKUS Inventor name: EFFENBERGER, IRA Inventor name: STOTZ, MARTIN |
|
17Q | First examination report despatched |
Effective date: 20090921 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
|
18R | Application refused |
Effective date: 20110111 |