EP1718926A1 - Device and method for determining the spatial coordinates of an object - Google Patents

Device and method for determining the spatial coordinates of an object

Info

Publication number
EP1718926A1
Authority
EP
European Patent Office
Prior art keywords
pattern
spatial coordinates
camera
image
projection data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP05716706A
Other languages
German (de)
English (en)
Inventor
Frank Forster
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens AG
Original Assignee
Siemens AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens AG filed Critical Siemens AG
Publication of EP1718926A1 publication Critical patent/EP1718926A1/fr
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2545 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with one projection direction and several detection directions, e.g. stereo
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2509 Color coding

Definitions

  • the invention relates to a device for determining spatial coordinates of an object, comprising: a projector which projects a pattern with known projection data onto the object; a camera that generates an object image of the pattern projected onto the object; a data processing unit connected downstream of the camera, which determines spatial coordinates of the object from the object image and the known projection data.
  • the invention further relates to a method for determining spatial coordinates of an object with the method steps: projecting a pattern with known projection data onto an object, generating an object image with the aid of a camera and determining the spatial coordinates from the known projection data in a data processing unit.
  • Such a device and such a method are known from DE 199 63 333 A1.
  • a two-dimensional color pattern is projected onto the surface of the object to be examined by a projector.
  • a camera, the position of which relative to the projector is known, captures the color pattern projected onto the object.
  • the spatial coordinates of a point on the surface of the object can then be calculated using a triangulation method.
  • the known device and the known method are particularly suitable for measuring large-area monochrome objects. However, if the surface of the object to be measured is finely structured, either spatially or with respect to its coloring, it is often difficult to analyze the object image: either the projected pattern is only incompletely contained in the object image due to shadowing or edges, or the projected color pattern is falsified by the coloring of the surface of the object to be measured. In addition, the spatial resolution of the known method is limited, since color areas with a certain spatial extent must be used to encode the projection data in the color pattern.
  • the object of the invention is to create a method and a device with which even finely structured surfaces of an object to be measured can be detected with great accuracy.
  • to this end, the measuring device comprises, in addition to the projector, at least two cameras that record different object images of the object; the data processing unit determines additional spatial coordinates of the object from these object images by means of a triangulation method.
  • the spatial coordinates can be determined in two ways. On the one hand, it is possible to evaluate the pattern images independently of one another on the basis of the known projection data of the projected pattern.
  • the spatial coordinates are preferably determined from the pattern images on the basis of the projection data of the projected pattern. Only if no spatial coordinates could be assigned to a pixel in one of the two pattern images are corresponding pixels searched for in the two pattern images, and an attempt is made to determine the missing spatial coordinates from their image coordinates by triangulation.
  • the corresponding image points are searched for along so-called epipolar lines.
  • the epipolar lines are the projections of the line of sight assigned to an image point of one pattern image into the other pattern image.
  • the pattern projected onto the object to be measured is preferably designed in such a way that the epipolar lines traverse a large number of pattern areas, so that the location information encoded in the projected pattern can be used when searching along the epipolar lines.
  • the pattern projected onto the object contains redundantly coded location information. This allows errors in decoding the pattern to be eliminated.
  • FIG. 1 shows a device for determining the spatial structure of an object
  • FIG. 2 shows the device from FIG. 1 with lines of sight and image coordinate systems drawn in.
  • FIG. 1 shows a measuring device 1 for determining the spatial structure of an object 2.
  • the measuring device 1 comprises a projector 3 which projects a pattern 4 onto a surface 5 of the object 2.
  • in addition to the projector 3, cameras 6 are arranged which capture the pattern 4 projected onto the object 2.
  • the cameras 6 are each connected to a computer 7.
  • the cameras 6 generate the pattern images 8 and 9 shown in FIG. 2.
  • the positions of the image points S_l and S_r in the pattern images 8 and 9 are described with the aid of image coordinate systems 10 and 11.
  • FIG. 2 shows lens coordinate systems 12 and 13 which illustrate the position of lenses of the cameras 6.
  • the pattern images 8 and 9 are located behind the lenses of the cameras 6 in the beam direction.
  • the pattern images 8 and 9 are drawn in front of the lens coordinate systems 12 and 13 in the beam direction in FIG. 2. However, this does not change the geometric relationships.
  • lines of sight 14 and 15 are drawn in, each of which extends from an object point S on the surface 5 of the object 2 to the origin O_l of the lens coordinate system 12 or to the origin O_r of the lens coordinate system 13.
  • the object point S is imaged onto the image point S_l in the pattern image 8 and onto the image point S_r in the pattern image 9.
  • the image points S_l and S_r are also referred to as corresponding image points.
  • the mutually corresponding image points S_l and S_r lie on epipolar lines 16 and 17, which are in each case the projection of the line of sight 14 or 15 into the respective other pattern image 8 or 9.
  • the surface coordinates of the surface 5 of the object 2 can be determined in the measuring device 1 on the one hand according to the structured-light approach.
  • the object to be measured is illuminated with a stripe pattern.
  • for each pixel in the pattern images 8 and 9, the plane must now be identified in which the object point S lies that corresponds to the image point S_l or S_r.
  • this task is also known as the identification problem. Since the angles at which the stripes of the pattern 4 are projected onto the object 2 are known, the angle of the line of sight 14 or 15 can be determined once the respective plane or stripe has been identified in the pattern image 8 or 9. Furthermore, since the distance between the projector 3 and the respective camera 6 is known, the distance of the object point S can be determined from one of the pattern images 8 or 9 by triangulation.
  • the identification problem can be solved by successively projecting different patterns 4 composed of stripes onto the object 2, the stripe widths of the patterns 4 varying.
  • a pattern image 8 or 9 is recorded for each of these projections, and the respective color is determined for each pixel in the pattern image 8 or 9.
  • the determination of the color is limited to determining whether the respective object point appears light or dark.
  • the colors determined for the successive projections then yield a multi-digit code by means of which the plane in which the associated object point S lies can be identified (an illustrative decoding sketch is given at the end of this section).
  • alternatively, the respective planes can be coded spatially in one- or two-dimensional patterns, in which the projection data or location information is encoded by groups of adjacent stripes or rectangles of different colors or by different symbols.
  • the groups of adjacent stripes or rectangles of different colors that contain location information are referred to below as marks.
  • Such a mark consists, for example, of the horizontal sequence of four adjacent colored stripes, the individual marks also being able to overlap.
  • the spatial marks contained in the pattern images 8 and 9 are decoded in the computer 7, and the location information is thereby recovered. If the marks are completely visible in the pattern images 8 and 9, the coordinates of the surface 5 of the object can in principle also be obtained with this method when the object 2 is moving. The reliability of the decoding of the marks can be increased even further by using redundant codes for coding the marks, which allow errors to be detected (see the mark-decoding sketch at the end of this section).
  • such codes can be decoded in real time using a commercially available workstation computer 7, since only a limited neighborhood has to be analyzed for each pixel of the pattern image 8 or 9.
  • if the surface 5 to be measured has spatial structures that are smaller than the projected marks, difficulties can arise in the decoding, since marks may then not be completely visible.
  • the reflection on the surface 5 can also be disturbed.
  • the surface 5 itself can show a stripe pattern which strongly disturbs the pattern 4 projected onto the surface 5.
  • Such a pattern which is highly disruptive to the projected pattern 4 is, for example, the stripe pattern of a bar code.
  • inaccuracies frequently occur at the edges of the object 2 when determining the spatial coordinates, since the marks abruptly break off along the edges of the object.
  • to solve these problems, several cameras 6 are provided in the measuring device 1. If necessary, more than two cameras 6 can also be used in a measuring device of the type of measuring device 1.
  • the structured-light evaluation thus yields n depth maps, one per camera 6. In general, however, areas occur in these depth maps in which no depth value could be determined for the reasons mentioned above. In most cases, the proportion of such problem areas, in which no depth values can be determined, is relatively small in relation to the total area.
  • in these problem areas, a processing step according to the principle of stereo vision is now carried out.
  • the coordinates of the surface 5 of the object 2 can be obtained by recording the surface 5 with the cameras 6, the positions of the cameras 6 being precisely known. If, as shown in FIG. 2, the image points S_l and S_r assigned to an object point S can be identified in the pattern images 8 and 9, the spatial position of the object point S follows from the intersection of the at least two lines of sight 14 and 15. The two positions of the cameras 6 and the object point S form a triangle with a base 18 of known length and known base angles α_l and α_r. The coordinates of the object point S on the surface 5 can thus be determined by means of triangulation (see the triangulation sketch at the end of this section).
  • the solution to the correspondence problem is simplified in that the known pattern 4 is projected onto the object 2.
  • in the measuring device 1, it is therefore only necessary to search for corresponding parts of marks along the epipolar lines 16 and 17 (see the epipolar-search sketch at the end of this section). This is a major advantage, especially for single-colored surfaces.
  • the stereo processing step is only carried out in the problem areas in which the structured-light approach could not provide spatial coordinates of the object 2 (see the combined-evaluation sketch at the end of this section).
  • the problem areas are generally areas with a pronounced optical structure, which is further reinforced by the projection of the pattern 4.
  • the problem areas are therefore generally well suited for processing according to the principle of stereo vision.
  • the stereo processing step can also be used to increase the spatial resolution, since corresponding points can be determined within the marks as well. With the combined method it is thus possible to assign an exact depth value not only to the mark boundaries or other mark features, but to every image point of the cameras 6.
  • shadowing can be avoided by using the measuring device 1, because the depth values can already be calculated when an area of the surface 5 lies in the common field of view of at least two cameras 6 or one camera 6 and the projector 3.
  • in contrast to conventional measuring devices, the measuring device 1 makes it possible to obtain precise three-dimensional data of very high resolution from a single pair of pattern images 8 and 9, even for very small or very colorful objects with many depth discontinuities and under uncontrolled recording conditions, for example in strong ambient light.
  • three-dimensional data of moving objects 2, such as people passing by or objects on a conveyor belt, can also be determined.
  • the data supplied by the cameras 6 can be evaluated in real time on a commercially available workstation computer.
  • compared to a device that works solely according to the principle of stereo vision, the measuring device 1 is significantly more efficient and, owing to the redundant coding of the pattern 4, considerably more reliable. In addition, the measuring device 1 delivers reliable data even on optically unstructured surfaces and contributes to the reduction of shadowing.
  • the measuring device 1 also delivers more precise data at object edges and for small surfaces 5. Furthermore, accurate data is generated even when the reflection of the marks is disturbed. Finally, a higher spatial resolution can be achieved, and disturbances are better suppressed than in the prior art.
  • the measuring device 1 described here is suitable for the robust detection of finely structured surfaces.
  • the measuring device 1 can also be used in quality assurance.
  • the measuring device 1 is also suitable for the identification and authentication of people on the basis of biometric features, for example for face recognition or three-dimensional verification by checking the hand geometry.
  • the measuring device 1 can also be used for tasks such as the quality control of food or the three-dimensional detection of objects, for example for modeling objects for virtual realities in the multimedia and gaming sector.
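
The following sketches are illustrative only; they are not part of the patent and merely restate the techniques described above in executable form. This first sketch shows the temporal stripe-coding idea: for each pixel, the sequence of light/dark observations over several successive stripe projections yields a multi-digit code that identifies the projection plane. The number of projections, the brightness threshold and the plain binary interpretation of the code are assumptions made for illustration.

```python
# Illustrative sketch (not from the patent): identify the projector stripe/plane
# for each camera pixel from a sequence of binary stripe projections. Pattern
# generation, camera I/O and calibration are assumed to exist elsewhere; only
# the per-pixel decoding of the multi-digit code is shown.
import numpy as np

def decode_stripe_index(images, threshold=128):
    """images: list of N grayscale pattern images (H x W arrays), one per projected
    stripe pattern, coarsest stripes first. Returns an H x W array of stripe
    indices, i.e. the multi-digit code identifying the projection plane."""
    code = np.zeros(np.asarray(images[0]).shape, dtype=np.uint32)
    for img in images:
        bit = (np.asarray(img) > threshold).astype(np.uint32)  # light = 1, dark = 0
        code = (code << 1) | bit                                # append one code digit
    return code

# usage with hypothetical data: 8 projections distinguish 2**8 = 256 stripe planes
# planes = decode_stripe_index(captured_images)
```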
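
This sketch illustrates the decoding of a spatial mark, i.e. a group of adjacent colored stripes that carries location information, together with a redundancy check that allows decoding errors to be detected. The four-color alphabet, the mark length of four data stripes plus one check stripe, and the modular checksum are hypothetical choices for illustration, not the coding actually used in the patent.

```python
# Illustrative sketch of decoding a spatial mark (group of adjacent colored stripes).
# Alphabet, mark length and checksum are assumptions, not the patent's coding.
COLORS = {"R": 0, "G": 1, "B": 2, "Y": 3}   # hypothetical 4-color alphabet

def decode_mark(stripe_colors):
    """stripe_colors: sequence of 5 color symbols, 4 data stripes plus one
    redundant check stripe. Returns the encoded location index, or None if the
    redundancy check fails (e.g. the mark was disturbed by the object surface)."""
    if len(stripe_colors) != 5 or any(c not in COLORS for c in stripe_colors):
        return None
    digits = [COLORS[c] for c in stripe_colors]
    data, check = digits[:4], digits[4]
    if sum(data) % len(COLORS) != check:     # simple modular checksum
        return None                          # decoding error detected
    index = 0
    for d in data:                           # interpret the 4 stripes as a base-4 number
        index = index * len(COLORS) + d
    return index                             # location (projector column) of the mark

# e.g. decode_mark(["R", "G", "B", "Y", "B"]) -> 27; decode_mark(["R", "G", "B", "Y", "R"]) -> None
```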
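
This sketch restricts the correspondence search to an epipolar line, as described for the pattern images 8 and 9. It assumes that the fundamental matrix F between the two calibrated cameras is known, that the images are grayscale arrays, and that the query pixel lies away from the image border; the sum-of-absolute-differences patch comparison is likewise an assumption made for illustration.

```python
# Illustrative sketch (not from the patent): search for a corresponding image
# point along the epipolar line in the second pattern image.
import numpy as np

def epipolar_line(F, x_left):
    """Epipolar line l = F @ x in the right image (homogeneous coefficients a, b, c)."""
    return F @ np.array([x_left[0], x_left[1], 1.0])

def find_correspondence(F, x_left, img_left, img_right, patch=3):
    """Return the pixel on the epipolar line in img_right whose neighborhood is
    most similar to the neighborhood of x_left in img_left, or None."""
    a, b, c = epipolar_line(F, x_left)
    if abs(b) < 1e-9:                              # (near-)vertical epipolar line: not handled here
        return None
    h, w = img_right.shape
    r = patch // 2
    xl, yl = int(x_left[0]), int(x_left[1])
    ref = img_left[yl - r:yl + r + 1, xl - r:xl + r + 1].astype(np.float32)
    best, best_cost = None, np.inf
    for x in range(r, w - r):
        y = int(round(-(a * x + c) / b))           # y-coordinate on the epipolar line
        if r <= y < h - r:
            cand = img_right[y - r:y + r + 1, x - r:x + r + 1].astype(np.float32)
            cost = float(np.abs(ref - cand).sum()) # sum of absolute differences
            if cost < best_cost:
                best, best_cost = (x, y), cost
    return best
```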
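
This sketch shows the triangulation step described above: the two camera positions and the object point S form a triangle with a base of known length and known base angles, from which the position of S follows by the law of sines. A two-dimensional section through the triangulation plane is used for simplicity.

```python
# Minimal triangulation sketch: base 18 of known length, base angles of the
# lines of sight 14 and 15, restricted to the 2D triangulation plane.
import math

def triangulate(base_length, alpha_l, alpha_r):
    """base_length: length of the base between the two cameras.
    alpha_l, alpha_r: base angles (radians) of the two lines of sight.
    Returns (x, z): position of S in a frame with the left camera at the origin
    and the x-axis along the base."""
    gamma = math.pi - alpha_l - alpha_r                        # angle at the object point S
    d_l = base_length * math.sin(alpha_r) / math.sin(gamma)    # law of sines: distance left camera -> S
    x = d_l * math.cos(alpha_l)                                # offset along the base
    z = d_l * math.sin(alpha_l)                                # distance from the base line
    return x, z

# e.g. triangulate(0.5, math.radians(60), math.radians(70)) ≈ (0.31, 0.53)
```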
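
Finally, this sketch outlines the combined evaluation: the structured-light step yields a depth map with holes in the problem areas, and stereo processing is carried out only for those remaining pixels. Marking holes with NaN and the two callback functions are assumptions; they stand in for the structured-light and stereo routines sketched above and are not an API defined by the patent.

```python
# Sketch of the combined evaluation: fill the problem areas of the
# structured-light depth map by stereo vision (assumptions as noted above).
import numpy as np

def combined_depth_map(depth_sl, find_match, stereo_depth):
    """depth_sl: H x W structured-light depth map with np.nan in the problem areas.
    find_match((x, y)) -> corresponding pixel in the second pattern image, or None.
    stereo_depth(p, q) -> depth triangulated from a pair of corresponding pixels."""
    depth = depth_sl.copy()
    for y, x in np.argwhere(np.isnan(depth_sl)):       # visit only the problem areas
        match = find_match((int(x), int(y)))
        if match is not None:                          # correspondence found on the epipolar line
            depth[y, x] = stereo_depth((int(x), int(y)), match)
    return depth
```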

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a measuring device (1) for determining three-dimensional object data which comprises, in addition to a projector (3), at least two cameras (6) that record different object images of the object (2). These object images can be processed in a data processing unit (7) according to the structured-light approach and according to the principle of stereo vision. The reliability of the data obtained can thus be increased considerably.
EP05716706A 2004-02-24 2005-02-16 Dispositif et procede pour determiner les coordonnees spatiales d'un objet Withdrawn EP1718926A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102004008904A DE102004008904A1 (de) 2004-02-24 2004-02-24 Vorrichtung und Verfahren zur Bestimmung von Raumkoordinaten eines Objekts
PCT/EP2005/050669 WO2005080916A1 (fr) 2004-02-24 2005-02-16 Dispositif et procede pour determiner les coordonnees spatiales d'un objet

Publications (1)

Publication Number Publication Date
EP1718926A1 true EP1718926A1 (fr) 2006-11-08

Family

ID=34833002

Family Applications (1)

Application Number Title Priority Date Filing Date
EP05716706A Withdrawn EP1718926A1 (fr) 2004-02-24 2005-02-16 Dispositif et procede pour determiner les coordonnees spatiales d'un objet

Country Status (4)

Country Link
US (1) US20080319704A1 (fr)
EP (1) EP1718926A1 (fr)
DE (1) DE102004008904A1 (fr)
WO (1) WO2005080916A1 (fr)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7689003B2 (en) 2006-03-20 2010-03-30 Siemens Energy, Inc. Combined 2D and 3D nondestructive examination
US8244025B2 (en) 2006-03-20 2012-08-14 Siemens Energy, Inc. Method of coalescing information about inspected objects
US8477154B2 (en) 2006-03-20 2013-07-02 Siemens Energy, Inc. Method and system for interactive virtual inspection of modeled objects
US20080156619A1 (en) 2006-12-01 2008-07-03 Mehul Patel Range finder
DE102006061712A1 (de) * 2006-12-28 2008-07-03 Tropf, Hermann Erstellung eines Abstandsbildes
ES2338259T3 (es) * 2007-02-09 2010-05-05 Siemens Aktiengesellschaft Procedimiento para la elaboracion de un objeto, de un sistema o de una instalacion y dispositivo de tratamiento correspondiente.
DE102011121696A1 (de) * 2011-12-16 2013-06-20 Friedrich-Schiller-Universität Jena Verfahren zur 3D-Messung von tiefenlimitierten Objekten
JP2013210254A (ja) * 2012-03-30 2013-10-10 Canon Inc 三次元計測装置、三次元計測方法及び三次元計測プログラム
DE102012013079B4 (de) 2012-06-25 2023-06-22 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Verfahren und Vorrichtung zum berührungslosen Erfassen einer dreidimensionalen Kontur
DE102012222505B4 (de) * 2012-12-07 2017-11-09 Michael Gilge Verfahren zum Erfassen dreidimensionaler Daten eines zu vermessenden Objekts, Verwendung eines derartigen Verfahrens zur Gesichtserkennung und Vorrichtung zur Durchführung eines derartigen Verfahrens
DE102013111761B4 (de) * 2013-10-25 2018-02-15 Gerhard Schubert Gmbh Verfahren und Scanner zum berührungslosen Ermitteln der Position und dreidimensionalen Form von Produkten auf einer laufenden Fläche
US10349037B2 (en) 2014-04-03 2019-07-09 Ams Sensors Singapore Pte. Ltd. Structured-stereo imaging assembly including separate imagers for different wavelengths
CN105783770A (zh) * 2016-01-22 2016-07-20 西南科技大学 一种基于线结构光的冰形轮廓测量的方法
WO2024026155A2 (fr) * 2022-04-11 2024-02-01 Virginia Tech Intellectual Properties, Inc. Scanner à lumière structurée à résolution spatiale ultra-élevée et ses applications

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4357108A (en) * 1980-06-06 1982-11-02 Robotic Vision Systems, Inc. Method for reproducton of object surfaces
US4842411A (en) * 1986-02-06 1989-06-27 Vectron, Inc. Method of automatically measuring the shape of a continuous surface
US5589942A (en) * 1990-04-05 1996-12-31 Intelligent Automation Systems Real time three dimensional sensing system
NO302055B1 (no) * 1993-05-24 1998-01-12 Metronor As Fremgangsmåte og system for geometrimåling
US6028672A (en) * 1996-09-30 2000-02-22 Zheng J. Geng High speed three dimensional imaging method
DE19623172C1 (de) * 1996-06-10 1997-10-23 Univ Magdeburg Tech Verfahren zur dreidimensionalen optischen Vermessung von Objektoberflächen
US7068825B2 (en) * 1999-03-08 2006-06-27 Orametrix, Inc. Scanning system and calibration method for capturing precise three-dimensional information of objects
DE19963333A1 (de) * 1999-12-27 2001-07-12 Siemens Ag Verfahren zur Ermittlung von dreidimensionalen Oberflächenkoordinaten
CA2306515A1 (fr) * 2000-04-25 2001-10-25 Inspeck Inc. Vision stereo internet, numerisation 3d et camera de saisie de mouvement
JP2002157576A (ja) * 2000-11-22 2002-05-31 Nec Corp ステレオ画像処理装置及びステレオ画像処理方法並びにステレオ画像処理用プログラムを記録した記録媒体
EP1750426A1 (fr) * 2000-12-07 2007-02-07 Sony United Kingdom Limited Procédé et appareil d'incorporation de données et de détection et récupération de données incorporées
US6492651B2 (en) * 2001-02-08 2002-12-10 3D Systems, Inc. Surface scanning system for selective deposition modeling
US6868194B2 (en) * 2001-12-19 2005-03-15 General Electric Company Method for the extraction of image features caused by structure light using image reconstruction
DE10232690A1 (de) * 2002-07-18 2004-02-12 Siemens Ag Verfahren und Vorrichtung zur dreidimensionalen Erfassung von Objekten sowie Verwendung der Vorrichtung und des Verfahrens
US20050068544A1 (en) * 2003-09-25 2005-03-31 Gunter Doemens Panoramic scanner
US20070115484A1 (en) * 2005-10-24 2007-05-24 Peisen Huang 3d shape measurement system and method including fast three-step phase shifting, error compensation and calibration

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
None *
See also references of WO2005080916A1 *

Also Published As

Publication number Publication date
US20080319704A1 (en) 2008-12-25
DE102004008904A1 (de) 2005-09-08
WO2005080916A1 (fr) 2005-09-01

Similar Documents

Publication Publication Date Title
EP1718926A1 (fr) Dispositif et procede pour determiner les coordonnees spatiales d'un objet
EP1523654B1 (fr) Procede et dispositif de detection tridimensionnelle d'objets et utilisation dudit dispositif et dudit procede
DE69726025T2 (de) Verfahren und vorrichtung zum erhalten einer dreidimensionalen form
EP0970435B1 (fr) Procede d'identification tridimensionnelle d'objets
DE10020893B4 (de) Verfahren zur optischen Formerfassung von Gegenständen
DE102017116952A1 (de) System und verfahren für verbessertes scoring von 3d-lagen und entfernen von störpunkten in 3d-bilddaten
DE10137241A1 (de) Registrierung von Tiefenbildern mittels optisch projizierter Marken
DE102007054906A1 (de) Verfahren zur optischen Vermessung der dreidimensionalen Geometrie von Objekten
DE112010004767T5 (de) Punktwolkedaten-Verarbeitungsvorrichtung, Punktwolkedaten-Verarbeitungsverfahren und Punktwolkedaten-Verarbeitungsprogramm
DE102006055758A1 (de) Verfahren zur Kalibrierung von Kameras und Projektoren
EP3775767B1 (fr) Procédé et système de mesure d'un objet par stéréoscopie
DE19634254B4 (de) Optisch-numerisches Verfahren zur Ermittlung der gesamten Oberfläche eines dreidimensionalen Objektes
DE112016006262T5 (de) Dreidimensionaler Scanner und Verarbeitungsverfahren zur Messunterstützung für diesen
DE102016100132B4 (de) Verfahren und Vorrichtung zum Untersuchen eines Objekts unter Verwendung von maschinellem Sehen
EP1821064B1 (fr) Procédé et dispositif destinés à la détection d'un contour d'une surface réfléchissante
DE10317078B4 (de) Verfahren und Vorrichtung zur Analyse reflektierender Oberflächen
EP3628995A1 (fr) Modèle d'étalonnage et procédé d'étalonnage destinés à l'étalonnage géométrique d'une pluralité de caméras d'un réseau de caméra
DE102017010683B4 (de) Verfahren zur automatischen Wiederherstellung eines eingemessenen Zustands eines Projektionssystems
EP3274652B1 (fr) Procede de projection de franges, dispositif de projection de franges et programme informatique correspondant
DE19953063A1 (de) Verfahren zur dreidimensionalen optischen Vermessung von Objektoberflächen
DE102006061712A1 (de) Erstellung eines Abstandsbildes
DE102019208474A1 (de) Verfahren und System zum optischen Vermessen eines Objekts mit spiegelnder und/oder teilspiegelnder Oberfläche sowie entsprechende Messanordnung
DE4301546A1 (de) Einrichtung zum Prüfen von Oberflächen von Werkstücken
DE102009006089B4 (de) Verfahren zur Zuordnung von Bildpunkten
DE102007033835A1 (de) Bildaufnahmetechnik zur direkten Objektsegmentierung in Bildern

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20060717

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): DE FR GB IT

17Q First examination report despatched

Effective date: 20070112

DAX Request for extension of the european patent (deleted)
RBV Designated contracting states (corrected)

Designated state(s): DE FR GB IT

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20120901