WO2008098759A1 - Three-dimensional image acquisition method - Google Patents
Info
- Publication number
- WO2008098759A1 (PCT/EP2008/001121)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- space
- corresponding pixels
- camera
- depicted
- areas
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S11/00—Systems for determining distance or velocity not using reflection or reradiation
- G01S11/12—Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
Abstract
The acquisition of three-dimensional images (depth images) from two-dimensional digital image data is done by arranging the optical systems in such a way that their optical axes are located so close together that the areas of the measured space depicted by corresponding pixels partly overlap in the entire measuring space. The depth information is gained by analysing the differences of the intensities of corresponding pixels. Hence it is possible to produce depth information in time linear in the number of pixels, which enables real-time applications. Solving the correspondence problem, which is typical for stereoscopic methods, is not necessary, and occlusion problems do not occur.
Description
Three-dimensional image acquisition method
Description of the invention
Field of the Invention
The invention relates to a method of three-dimensional image acquisition (depth image acquisition) of two-dimensional image data, as it is produced by digital cameras.
Description of the Prior Art
The generation of three-dimensional image data by analysing the time of flight which the light needs to cover the distance from a light source to the sample and back to the camera is known. There are pulse-based methods, which switch a light source on and off, and phase-based methods, which modulate the amplitude of a light source. Both methods measure the time difference between the signal at the source and the signal arriving at the camera; this time, divided by two and multiplied by the speed of light, gives the distance between the point of the object and the camera.
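As an illustrative sketch of the time-of-flight relation described above (the function and variable names are assumptions for illustration, not from the patent):

```python
# Speed of light in vacuum, in metres per second.
SPEED_OF_LIGHT = 299_792_458.0

def tof_distance(round_trip_time_s: float) -> float:
    """Object distance from a measured round-trip time: the time the
    light needs from source to sample and back, divided by two and
    multiplied by the speed of light."""
    return round_trip_time_s / 2.0 * SPEED_OF_LIGHT
```

A round trip of 20 ns, for example, corresponds to a distance of roughly 3 m.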
Furthermore, triangulation-based methods are known, wherein active and passive methods can be distinguished. The active methods project patterns, e.g. fringes, onto the sample. A camera images the reflected light and a computer analyses the deformation of the pattern. Ambiguities in the interpretation of the images can be avoided by using different patterns and analysing the corresponding images.
Passive triangulation-based methods use images taken from different perspectives. Software searches for corresponding regions and analyses their locations. This search for corresponding pixels is very time-consuming and in some cases extremely hard or even impossible, e.g. when the measured object contains periodic structures that cause ambiguities. Furthermore, an occlusion problem may occur when parts of the object are hidden by others: the depth of points seen by only one camera cannot be calculated.
Intention
Based on the known state of the art, the intention of the invention is to propose a method for triangulation-based acquisition of depth images in which no correspondence problem occurs. Dropping the search for corresponding pixels allows three-dimensional data to be generated in time proportional to the number of pixels, which makes real-time depth image acquisition possible with manageable effort. Furthermore, the two-dimensional images used for generating the three-dimensional images remain available and can be used, e.g., as a texture for the generated 3D object. The occlusion problems of known triangulation-based methods essentially do not occur, because the recorded images are almost identical.
Execution examples
Execution examples are explained using the following figures.
Fig. 1 shows a setup according to claim 2, where a camera (1) is shifted orthogonally to its optical axis (4) in direction (7). Before the camera is moved, the area between the straight lines (3) and (4) is imaged onto a considered pixel. After the motion, the area between (5) and (6) corresponds to that fixed pixel. For objects located between (8) and the camera, a computation of their distance is not possible. An object located farther away causes, provided its surface has sufficient contrast, different intensities to be measured at the fixed pixel. The relative change of the measured intensity is inversely proportional to the width of the area that is imaged onto a pixel at a given distance. Because this width is proportional to the distance between the imaged object and the camera, the relative change of the measured intensity is inversely proportional to that distance.
If one considers a pixel and its adjacent pixels (in the direction of the camera shift) in the first image, and the pixel with the same coordinates in the second image, with intensities p_left, p_mid, p_right and p_mid2, the quantity (p_mid - p_mid2) / (p_left - p_right) is inversely proportional to the distance between the object imaged by the pixel p_mid and the camera.
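The relation above can be written as a small helper; the function name and the special-casing of zero contrast are illustrative assumptions, not part of the patent:

```python
def depth_proxy(p_left: float, p_mid: float, p_right: float,
                p_mid2: float):
    """Quantity inversely proportional to the object distance, formed
    from a pixel (p_mid) and its neighbours along the shift direction
    in the first image, and the same-coordinate pixel (p_mid2) in the
    second image."""
    contrast = p_left - p_right
    if contrast == 0:
        # Without local intensity contrast no depth can be derived.
        return None
    return (p_mid - p_mid2) / contrast
```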
With the setup shown in Fig. 2, which combines the fields of view of two cameras (1) and (2) by using beam splitters (7) and (8), the same measurement can be done in less time, because no parts need to be moved and both images can be acquired at the same time.
The precision of the shown method may be further improved by using more efficient mathematical methods for estimating the per-pixel distance, e.g. a quadratic fit of the intensities of adjacent pixels instead of the linear fit mentioned above.
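One common form of such a quadratic refinement is a closed-form parabola fit through three equally spaced samples; this is a sketch of the general technique under that assumption, not the patent's specific formula:

```python
def parabola_vertex(p_left: float, p_mid: float, p_right: float) -> float:
    """Fit a parabola through three equally spaced intensity samples at
    x = -1, 0, +1 and return the sub-pixel x-offset of its vertex.
    A quadratic model captures curvature that a linear fit ignores."""
    curvature = p_left - 2.0 * p_mid + p_right
    if curvature == 0:
        # The three samples are collinear; a linear model suffices.
        return 0.0
    return 0.5 * (p_left - p_right) / curvature
```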
The attainable depth resolution is of the same order as the intensity resolution of the camera or cameras used, while the computational effort scales only linearly with the number of pixels.
Claims
1. Method of depth image acquisition based on digital image data of two or more perspectives,
characterized in that the optical axes of the optical systems are located so close together that the areas of the measured space depicted by corresponding pixels partly overlap in the entire measuring space.
2. A method as claimed in claim 1,
characterized in that one camera is used which, after acquiring a first image, is shifted orthogonally and/or parallel to its optical axis and acquires at least one further image, in such a way that the areas of the measured space depicted by corresponding pixels partly overlap in the entire measuring space.
3. A method as claimed in claim 1,
characterized in that a beam splitter is used to arrange the optical axes of at least two cameras in space so that the areas of the measured space depicted by corresponding pixels partly overlap in the entire measuring space.
4. A method as claimed in claim 1,
characterized in that an appropriate optical element, e.g. a pane of glass that can be rotated about an axis perpendicular to the optical axis/axes of the camera(s), is positioned between at least one camera and the sample and deflects the extension of the optical axis/axes into the measuring volume in such a way that the areas of the measured space depicted by corresponding pixels partly overlap in the entire measuring space.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102007007775.2 | 2007-02-12 | ||
DE102007007775A DE102007007775A1 (en) | 2007-02-12 | 2007-02-12 | Method for three-dimensional image data acquisition |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2008098759A1 (en) | 2008-08-21 |
Family
ID=39597686
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2008/001121 WO2008098759A1 (en) | 2007-02-12 | 2008-02-09 | Three-dimensional image acquisition method |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE102007007775A1 (en) |
WO (1) | WO2008098759A1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5907434A (en) * | 1995-03-20 | 1999-05-25 | Canon Kabushiki Kaisha | Image pickup apparatus |
US20020008757A1 (en) * | 2000-05-17 | 2002-01-24 | Seiji Sato | Stereoscopic picture image forming apparatus |
-
2007
- 2007-02-12 DE DE102007007775A patent/DE102007007775A1/en not_active Withdrawn
-
2008
- 2008-02-09 WO PCT/EP2008/001121 patent/WO2008098759A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
DE102007007775A1 (en) | 2008-08-14 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 08707717 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 08707717 Country of ref document: EP Kind code of ref document: A1 |