CA2373284A1 - Color structured light 3d-imaging system


Info

Publication number: CA2373284A1
Application number: CA002373284A
Authority: CA (Canada)
Prior art keywords: image, light, data, color, image data
Legal status: Abandoned (the listed legal status is an assumption, not a legal conclusion; no legal analysis has been performed)
Other languages: French (fr)
Inventors: Taiwei Lu, Jianzhong Zhang
Current assignee: 3D METRICS Inc (the listed assignee may be inaccurate)
Original assignee: Individual
Application filed by Individual

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes, on the object
    • G01B 11/2509: Color coding

Abstract

A method and apparatus for the imaging of three-dimensional objects is described that includes a structured light source projecting a focused image onto an object by passing either continuous or flashes of collimated source light through an optical grating and projection lens. Apertures in the grating, optionally transmitting a plurality of distinct colors, impose a known pattern on the projected light, separated by opaque areas which reduce color crosstalk to enhance accuracy. A camera responsive to the projected image captures an image of the projected light reflected from an object. Through short-duration image capture, particularly using a short-duration structured light flash synchronized to a camera using a fast shutter speed, high accuracy static 3D
images and measurements of moving or living objects, including humans, can be obtained. The data of the captured image is analyzed to establish and refine the apparent location of points of the reflected pattern.

Description

THREE DIMENSIONAL IMAGING SYSTEM
BACKGROUND OF THE INVENTION
The present application is a continuation-in-part of U.S. Patent Application 09/080,135, filed May 15, 1998, and incorporates by this reference the subject matter of that application.
1. Field of the Invention:
The present invention relates to a method and apparatus for three-dimensional surface profile imaging and measurement and, more particularly, to distance profile measurement of objects based upon two-dimensional imaging of the objects reflecting structured illumination.
Three-dimensional (hereinafter also referred to as either "3D" or "3-D") imaging and measurement systems are known. In general, the purpose is to determine the shape of an object in three dimensions, ideally with actual dimensions. Such imaging and measurement systems fall into two basic categories: 1) Surface Contact Systems and 2) Optical Systems.
Optical Systems are further categorized as using Laser Triangulation, Structured Illumination, Optical Moiré Interferometry, Stereoscopic Imaging, and Time-of-Flight Measurement.
Optical Moiré Interferometry is accurate, but expensive and time-consuming.
Stereoscopic Imaging requires comparison of images from two cameras, or two different image captures, to map the 3D surface of an object. Time-of-Flight Measurement calculates the time for a laser beam to reflect from an object at each point of interest, and requires an expensive scanning laser transmitter and receiver.
The present invention is an Optical System based upon Structured Illumination, which means that it determines the three dimensional profile of an object which is illuminated with light having a known structure, or pattern. The structured illumination is projected onto an object from a point laterally separated from a camera. The camera captures an image of the structured light pattern, as reflected by the object. The object can be profiled in three dimensions where the structured light pattern reflected by the object can be discerned clearly. The shift in the reflected pattern, as compared to that which would be expected from projection of the same pattern onto a reference plane, may be triangulated to calculate the "z" distance, or depth.
2. Description of Prior Art:
Three dimensional imaging and measurement systems and methods are known. For example, the following patents describe various types of these devices:

U.S. Patent No. 3,589,815 to Hostetman; U.S. Patent No. 3,625,618 to Bickel;
U.S. Patent No. 4,247,177 to Marks et al; U.S. Patent No. 4,299,491 to Thornton et al;
U.S. Patent No. 4,375,921 to Morander; U.S. Patent No. 4,473,750 to Isoda et al;
U.S. Patent No. 4,494,874 to DiMatteo et al; U.S. Patent No. 4,532,723 to Kellie et al;
U.S. Patent No. 4,594,001 to DiMatteo et al; U.S. Patent No. 4,764,016 to Johansson;
U.S. Patent No. 4,935,635 to O'Hatra; U.S. Patent No. 4,979,815 to Tsikos;
U.S. Patent No. 4,983,043 to Harding; U.S. Patent No. 5,189,493 to Harding;
U.S. Patent No. 5,367,378 to Boehnlein et al; U.S. Patent No. 5,500,737 to Donaldson et al;
U.S. Patent No. 5,568,263 to Hanna; U.S. Patent No. 5,646,733 to Bieman;
U.S. Patent No. 5,661,667 to Bordignon et al; U.S. Patent No. 5,675,407 to Geng.

A variety of technical papers address this subject as well. Color-encoded structured light has been proposed to achieve fast active 3D imaging, as for example by K. L. Boyer and A. C. Kak, "Color-encoded structured light for rapid active ranging," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. PAMI-9, pp. 14-28, 1987. The color of the projected structured light is used to identify the locations of stripes and thereby reduce ambiguity when interpreting the data.
Since then, several groups have worked on different color-encoding methods for imaging, such as J. Tajima and M. Iwakawa, "3-D data acquisition by rainbow range finder," Proc. of the 10th International Conference on Pattern Recognition, pp. 309-313, 1990. A similar color-encoding method, utilizing a color CCD camera and a Linear Variable Wavelength Filter, and needing only one image capture for each measurement, was proposed by Z. J. Geng, "Rainbow three-dimensional camera: new concept of high-speed three-dimensional vision systems," Opt. Eng. 35, pp. 376-383, 1996. The measurement accuracy of the Geng system depends on the color distinguishing abilities of the camera, and is impaired by cross-talk between colors.
A different system, using a color fringe comprising three overlapping sinusoids, having the speed and single-image advantages of the Geng system but also having limited accuracy, was proposed by C. Wust and D. W. Capson, "Surface profile measurement using color fringe projection," Machine Vision and Applications, Vol. 4, pp. 193-203, 1991.
Results of other experiments with color-encoded structured light sources have been reported by T. P. Monks, J. N. Carter, and C. H. Shadie, "Color-encoded structured light for real-time 3D digitization," in IEE 4th International Conference on Image Processing, Maastricht, The Netherlands, April 7-9, 1992, and by T. P. Monks and J. N. Carter, "Improved stripe matching for color encoded structured light," in Proceedings of International Conference on Computer Analysis of Images and Patterns, pp. 476-485, 1993.
Calibration schemes have been used to improve the accuracy of structured light based 3D imaging, for example as discussed by E. Trucco, R. B. Fisher, A. W. Fitzgibbon, and D. K. Naidu, "Calibration, data consistency and model acquisition with laser stripers," Int. J. Computer Integrated Manufacturing, Vol. 11, pp. 293-310, 1998.

There are reports of using combinations of different color-encoding techniques, as well as combining them with other techniques, such as by E. Schubert, H. Rath, and J. Klicker, "Fast 3D object recognition using a combination of color-coded phase-shift principle and color-coded triangulation," Proc. SPIE, pp. 202-213, 1994, and by C. Chen, Y. Hung, C. Chiang, and J. Wu, "Range data acquisition using color structured lighting and stereo vision," Image and Vision Computing, Vol. 15, pp. 445-456, 1997. These combinations offer improved transverse spatial resolution, but the relative noise level appears to be a rather high 5% of range.
The prior art systems are often inaccurate, or require multiple exposures or expensive equipment to obtain satisfactory accuracy. A need therefore exists to provide a 3D profiling system which is accurate, yet easy to use and inexpensive.
SUMMARY OF THE INVENTION
In one aspect, the present invention provides a three dimensional (3D) imaging system requiring only a single image capture by substantially any presently-manufactured single camera using a single structured light source. In another aspect it provides a 3D
imaging system which is inexpensive to manufacture and easy to use. In yet another aspect, it permits 3D imaging and measurement using a structured source of visible, infrared, or ultraviolet light. In a further aspect the invention provides a 3D imaging and measurement system which reduces crosstalk between reflections of a color-encoded structured light source. In another aspect it provides 3D imaging using any combination of enhancing a color-encoded structured light source, algorithmically enhancing the accuracy of raw data from an image reflecting a structured light source, comparing the image data to a measured reference image, and algorithmically enhancing the calculated depth profile. In a further aspect the invention permits 3D imaging and measurement using either a pulsed light source, such as a photographic flash, or a continuous light source. In yet a further aspect, the invention provides a light structuring optical device accepting light from a standard commercial flash unit synchronized to a standard commercial digital camera to provide data from which a 3D image may be obtained. In another aspect the invention permits 3D
imaging and measurement using a structured light source which modulates the intensity and/or the spectrum of light to provide a black and white or multi-color structured lighting pattern.
In one aspect the invention provides a method to project a structured illumination pattern onto an object. Another aspect provides 3D imaging using an improved color grating. In a further aspect, the invention provides a 3D imaging system that can be used to project an image onto an object. In one aspect the invention provides a 3D imaging system that can be used with a moving object, and with a living object. In another aspect the invention provides a 3D imaging and measurement system having a camera and light source which can be integrated into a single body.
In yet another aspect, the invention permits accurate 3D imaging and measurement of objects having a surface color texture, by using two images reflecting different lighting.
By employing a combination of improvements to structured light sources and image data interpretive algorithms, the present invention enables accurate three dimensional imaging using any off-the-shelf digital camera, or indeed substantially any decent camera, taking a single image of an object under structured lighting. The structured lighting may be provided by adding a fairly simple pattern-projecting device in front of an off-the-shelf photographic flash unit. The improvements encompassed by the present invention work together to make full realization of the advantages of the invention possible; however, one may in some cases omit individual improvements and yet still obtain good quality 3D profile information. The structured light source is improved by separating color images to reduce color crosstalk, and by adaptation for use with off-the-shelf flash units.
The image data interpretive algorithms reduce the effects of color cross-talk, improve detection of light intensity peaks, enhance system calibration, and improve the precision of identifying the location of adjacent lines.
A three dimensional imaging system for use in obtaining 3D information about an object, constructed in accordance with the principles of the present invention, has a structured light source including a source of illumination which is transmitted through a black and white or color grating and then projected onto the object. The grating includes a predetermined pattern of light transmitting areas or apertures, which are typically parallel transmissive bars disposed a predetermined distance apart from each other. In some embodiments the grating will include an opaque area intermediate each of a plurality of differently colored bars. The imaging system also includes a camera, or other image capturing device, for capturing an image of an object reflecting light from the structured light source. The camera may take short duration exposures, and/or the light source may be a short duration flash synchronized to the exposure, to enable capture of clear 3D
images of even moving and living objects. The system may include a means for digitizing the captured image into computer-manipulable data, if the camera does not provide digital data directly.
A bias adjusted centroid light peak detection algorithm aspect of the present invention may be employed to enhance accuracy of the detected image, and system calibration methods are an aspect of the invention which can reduce errors by comparing the detected object image to an actual reference image taken using the same system setup. For plural color gratings, color cross-talk compensation aspects of the invention include employing opaque areas between different colors in the grating, a color compensation algorithm which is the inverse of a determined color cross-talk matrix, or both. A center weighted line average algorithm aspect of the present invention is also particularly useful for plural color gratings. A
three-dimensional imaging system constructed and operated in accordance with the principles of the present invention would employ a combination of these mechanical and algorithmic aspects, and, in conjunction with well known calculations performed upon the derived image data, would determine information about an imaged object along three dimensional planes x,y, and z.

BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 shows structured lighting for a 3D imaging system using a CCD video camera.
Fig. 2 shows some details of another 3D imaging system.
Fig. 3 is an improved grating for use with a 3D imaging system.
Fig. 4 shows image contours of an object reflecting structured light.
Fig. 5 is a perspective view of a three-dimensional profile obtained from Fig. 4 data.
Fig. 6 is a graph showing determination, by threshold, of regions as being of a color or not.
Fig. 7a is a portion of an image reflecting a three-color structured light source.
Fig. 7b is a graph of color intensities measured from Fig. 7a image.
Fig. 8 is a flowchart of a color cross-talk compensation image processing procedure.
Fig. 9 is a graph of color-compensated color intensities from Fig. 7a image.
Fig. 10 graphically shows bias-adjusted centroid peak detection.
Fig. 11 is a flowchart of a system calibration procedure.
Fig. 12 shows details of a 3D imaging system using system calibration.
Figs. 13a-e show measured profiles after progressively enhancing accuracy.
Fig. 14a shows a human face.
Fig. 14b shows the human face illuminated by structured light.
Fig. 14c shows a reconstruction of the 3D profile of the human face.
Fig. 14d shows a cross-section of the 3D profile of the human face.
DETAILED DESCRIPTION OF THE INVENTION
A three dimensional imaging system is shown in Fig. 1, identified in general by the reference numeral 10. A modified 3D imaging system, identified in general by the reference numeral 12, is shown with somewhat more detail in Fig. 2, and a grating, identified in general by the reference numeral 14, is shown in Fig. 3.
The 3-D imaging system of Fig. 1 shows structured illumination source 16 projecting patterned light (structured illumination) toward object 18. The pattern of light may be color encoded or not, and may be patterned in any way which is known and can be readily recognized in an image. A simple and preferred pattern consists of parallel bars of light.
Light pattern 20 displays a preferred pattern of light projected from structured light source 16, seen where the light passes through plane O-X (perpendicular to the plane of the paper) before reaching object 18. In practice, light pattern 20 would continue on to object 18, from whence it would be reflected according to the contours of object 18. An image so reflected, indicated generally by reference numeral 32, will be captured by camera 30. Object 18 is only a cross-section, but the entire object is the face of a statue of the goddess Venus. A
representation of image 32, as seen by camera 30, is shown in Fig. 4. In the image of Fig. 4, it can be seen that the parallel bars of light from the structured light source are shifted according to the contours of the statue. A representation of the statue as determined using the present invention is shown in Fig. 5.

In Fig. 1, each light area of light pattern 20 is a bar of light which is oriented perpendicular to the page and thus is seen in cross-section. Dark areas 22 are disposed between each light area 24a, 26a, 28a, 24, 26 and 28. Distance 21 is the spacing P between the centers of like colors in a three-color embodiment, and distance 23 is the spacing P' between adjacent light bars. In the preferred embodiment, the same distance 23 is the distance between the centers of adjacent dark areas 22. Bars of light 24, 26 and 28 may all be white, or all of the same color or mixture of colors, or they may be arranged as a pattern of different colors, preferably repeating.
Dark areas 22 are preferred intermediate each of light bars 24, 26, 28, etc.
The preferred proportion of dark area to light area depends upon whether or not a plurality of distinct colors are used. In embodiments not using a plurality of distinct colors, it is preferred that dark areas 22 are about equal to light areas. In embodiments using distinct colors, dark areas 22 are preferably as small as possible without permitting actual intermixing of adjacent colors (which will occur as a result of inevitable defocusing and other blurring of the reflected structured illumination). Dark areas 22 greatly reduce crosstalk which would otherwise corrupt reflections from object 18 of projected light pattern 20.
Without dark areas 22, the light from adjacent bars 24, 26, 28 would tend to interfere with each other, causing loss of accuracy in locating the projected light pattern in image 32 of object 18.
This inaccuracy would in turn impair the accuracy of the 3-D profile calculated for the recorded image.
For many applications, a plurality of distinct different colored bars are preferred. Any number of distinct colors can be used, but three is the most preferred number of different colors, and red, green and blue are a preferred choice for three particular colors.
Thus, in the light bars shown in cross-section in Fig. 1, bar 24 might be red, bar 26 green, and bar 28 blue. It can be seen that red colored bars 24 repeat at predetermined interval distance 21 such that the centers of like colors are separated by distance 21 (or P). Similarly to red, green colored bars 26, 26a repeat at period P (21) and blue colored bars 28, 28a also repeat at period P.
Note that light color used with the present invention need not be in the visible spectrum, as long as a structured light source can accurately provide a pattern in that color, and the image-capturing device can detect the color. Thus, the use of light from infrared at least through ultraviolet is well within the scope of the present invention.
Structured Light Source Grating
System 10 includes structured light source 16. Grating 14, although not shown in Fig. 1, is contained within the optical system of structured light source 16, and determines the pattern to be projected onto object 18. Fig. 2 shows details of a light source 16. The light from light source 34 is collimated by collimating lens 32. The collimated light passes through grating 14, which imposes structure (or patterning) on the light. The structured light from grating 14 is focussed by projecting lens 38, so that it may be accurately detected by camera 44 when the structured light reflects from object 40 to form image 42. Data 48 representing image 42 as captured by camera 44 is conveyed to processor 46 so that calculations may be performed to extract the 3D information from image data 48.
It should be understood that any predetermined pattern may be used for grating 14. The requirements for the pattern are that it be distinct enough and recognizable enough that it can be identified after reflection from object 40. Parallel bars are preferred for the structured light pattern, and are primarily described herein.
Referring now to Fig. 3, grating 14 includes a repetitive pattern of parallel apertures 4, 6, 8 etc. disposed a predetermined center to center distance 5 apart from each other. "Aperture" as used herein means a portion of the grating which transmits light, in contrast to opaque areas which block light transmission. The apertures of grating 14 may transmit a plurality of distinct colors of light.
A structured light pattern using a plurality of distinct colors is considered to be color-encoded.
The grating modulates the color of the light.
Alternatively to transmitting a plurality of distinct colors, the apertures of grating 14 may each transmit the same color of light. (Any particular mix of light frequencies may be understood to be a "color" as used here, so that if all apertures transmit white light they are considered not to transmit a plurality of distinct colors.) If the grating does not vary (or "modulate") the color of the transmitted light, then it must at least modulate the intensity of the single color of light transmitted so that a recognizable pattern of light is projected.
Turning now to details of preferred grating 14 as shown in Fig. 3, opaque areas 2 having width 7 are disposed between adjacent apertures 4 and 6, 6 and 8, 4a and 6a, etc. Each aperture preferably has width 9. Although a plurality of colors is not necessary, a preferred embodiment employs three different colors 4 (e.g. red), 6 (e.g. green), and 8 (e.g.
blue), repeating as 4a, 6a, 8a and again as 4b, 6b, 8b, etc. When using three colors, interval distance 1 is the spacing between centers of like-colored apertures. Opaque area interval distance 3 between the centers of opaque areas 2 is equal to adjacent aperture center spacing distance 5. These spacings, in conjunction with the projection focussing and the distance to the target, will control the spacing of projected light bars as shown in Fig. 1.
The size of the periods P (21) and P' (23) on projected image 20 of Fig. 1, and of interval distances 3 and 1 in grating 14 in FIG. 3, has been exaggerated to provide improved clarity. The size of these periods is a design variable, as are the use of different colors, the number of colors used, the actual colors used, the dimensions of the colored bars, and the size of grating 14. The actual spacings of grating 14 may be readily determined when the focussing characteristics of projector lens 38 are known, in order to produce a desired pattern of structured light, as explained below.
A preferred aperture interval distance 5 for a plural-color grating is such as to cause light bar interval distance 23 to be about 1.5 mm - 2 mm at L distance 25 = 1000 mm. For gratings not primarily employing a plurality of colors, aperture interval distance 5 is preferably set to yield light bar distance interval 23 of about 4 mm at L distance 25 = 1000 mm. For a plural-color grating, width 7 of opaque areas 2 is preferably about 2/3 of aperture interval distance 5, while for gratings not using a plurality of distinct colors it is preferred that width 7 be about 4/5 of aperture interval distance 5.
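As a rough illustration of how grating spacing follows from the desired projected bar spacing, the following Python sketch assumes simple thin-lens projection geometry, in which the projected pattern is magnified by approximately the ratio of the lens-to-object distance to the grating-to-lens distance; the 50 mm grating stand-off in the example is an assumed value, not taken from the patent.

```python
def grating_pitch_mm(desired_bar_spacing_mm: float,
                     grating_to_lens_mm: float,
                     lens_to_object_mm: float) -> float:
    """Estimate the aperture center-to-center pitch (distance 5) needed
    to produce a desired projected bar spacing (distance 23), assuming
    thin-lens projection with magnification approximately equal to
    lens_to_object_mm / grating_to_lens_mm."""
    magnification = lens_to_object_mm / grating_to_lens_mm
    return desired_bar_spacing_mm / magnification

# Example: a 2 mm bar spacing at L = 1000 mm, with the grating assumed
# to sit 50 mm from the projection lens, implies a 0.1 mm pitch.
print(grating_pitch_mm(2.0, 50.0, 1000.0))  # 0.1
```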
In order to distinguish a feature, something must change or modulate. In a system not employing a plurality of colors, usually the intensity of the light is modulated to create a recognizable pattern. Such patterns can only be so close to each other before the light simply becomes continuous, and no longer has adequate modulation. However, features can be distinguished on the basis of a modulation of the color of the image. Even if the light is constant in intensity, and thus has no significant intensity modulation, the modulation in color may create a recognizable border of a feature.
For example, if a camera can accurately distinguish different colors, then the total image can be separated into a plurality of different color images. The different color images are for practical purposes like three different images taken at the same time; each can be analyzed for feature location. These distinct images might be distinguished even if the overall light intensity is constant, indeed even if the light of a pattern in one color actually overlaps a.
different-colored pattern.
Accordingly, since the light can be continuous (though of different colors), or even overlap, it is possible to have different-color image features more closely spaced than would be distinguishable in same-color light images. Same-color image features must be separated by dark areas to provide a recognizable border of a feature. By distinguishing closely spaced images in different colors, a camera can recognize more closely-spaced, and accordingly higher resolution, patterns.
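The channel-separation idea described above can be stated compactly in code. The following is a minimal sketch, assuming an 8-bit RGB image already loaded as an H x W x 3 NumPy array; all function and variable names are illustrative.

```python
import numpy as np

def split_color_channels(rgb_image: np.ndarray) -> dict:
    """Separate an H x W x 3 RGB image into three single-channel
    images; each channel can then be analyzed independently for
    structured-light line locations."""
    return {
        "red":   rgb_image[:, :, 0].astype(np.float64),
        "green": rgb_image[:, :, 1].astype(np.float64),
        "blue":  rgb_image[:, :, 2].astype(np.float64),
    }
```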
For typical CCD motion picture camera 30, and for typical digital camera 44, the receptor characteristics are such as to best distinguish red, green and blue. The ability to distinguish the colors used to encode a plural-color structured light pattern determines how closely spaced the pattern may be and still be clearly resolved. Accordingly, red, green and blue are generally preferred for the encoding colors of plural color structured light sources.
However, it should be understood that more or less distinct colors could be used, and could well be adjusted to correspond to the best color selectivity of a particular camera.
If using film, the film image must be "digitized" to provide data which can be processed by a computer - that is, processable data. Digitizers are also typically best able to distinguish red, green and blue.
Registration
It is important to know which part of a captured image corresponds to which part of the projected structured light pattern, in order to extract the information as to how far the pattern has shifted, which ultimately conveys the depth information. Information relating to identifying which part of the pattern is which is generally referred to as "registration."
Accurate registration is necessary to prevent spatial ambiguity due to not knowing what part of the pattern is being reflected from the object under study.
To register the preferred pattern of parallel bars, a preferred method is to make one central line identifiable, and then to count lines from there. In color-encoded systems a central line can be made identifiable by making it white, as opposed to the other distinct colors.
In non-plural-color systems another marking should be used, such as periodically adding "ladder steps," perpendicular apertures across what would be an opaque area in the rest of the pattern.
Alternatively, a single different-color line could be used for registration.
From certain identified lines, the remainder of the lines are registered by counting. When the 3D profile is steep, the projected lines reflected by the object may become very close together, and become lost or indistinguishable. This can interfere with registration.
Color-encoded parallel-bar patterns have an advantage in registration, because it is easier to count lines. Given three colors, two adjacent colored lines can be indistinguishable and still not interfere with the ability to count lines (and thus keep track of registration). In general, for n colors, n-1 adjacent lines can be missing without impairing registration.
Thus, color-encoded systems generally have more robust registration.
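A minimal sketch of the line-counting idea follows. It assumes three encoding colors in a fixed repeating order, and that the first line in the list has already been registered as index 0 (for example via the white central line) and carries the first color of the cycle; under those assumptions, the cyclic distance between the expected and the observed color reveals how many adjacent lines (up to n-1) were lost.

```python
CYCLE = ["red", "green", "blue"]  # repeating projected color order

def register_lines(detected_colors):
    """Assign a pattern index to each detected line, starting from a
    line already registered as index 0. A gap of up to len(CYCLE) - 1
    consecutive missing lines can be bridged, because the cyclic color
    distance between the expected next color and the observed color
    reveals how many lines were skipped."""
    n = len(CYCLE)
    indices = [0]                       # first line assumed registered
    for color in detected_colors[1:]:
        expected = (indices[-1] + 1) % n
        observed = CYCLE.index(color)
        skipped = (observed - expected) % n   # 0 .. n-1 missing lines
        indices.append(indices[-1] + 1 + skipped)
    return indices

# Example: the green line after index 0 is lost, so the jump from red
# straight to blue implies exactly one missing line.
print(register_lines(["red", "blue", "red", "green"]))  # [0, 2, 3, 4]
```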
The use of a structured light source constructed according to the present invention, with opaque areas 2 intermediate each of color bars 4, 6 and 8 in grating 14 to create projected dark areas 22, enables a single exposure of object 18, recorded by essentially any commercially available image recorder (e.g. camera), to provide the necessary information from reflected image 32, sufficiently free from unwanted crosstalk, to permit proper detection and registration of the light bar lines, and hence to enable practical 3-D profilometry.
The principles of the present invention are intended for use with a variety of structured light sources and cameras, each of which has varying properties. In particular, structured light sources will vary in the accuracy of the projected image, in the difference in intensity between intended light and dark areas, and in the colors projected and their spectral purity. Cameras may be either film-type photographic units, requiring scanning of the film image to obtain digital
data, or digital cameras which provide digital information more directly. Many variations exist between different digital cameras. As noted elsewhere, some cameras have a high ability to distinguish different colors with minimal interference; this is most often true of cameras employing three separate monochromatic CCD receptors for each color pixel. Other cameras employ broad-band CCD
receptors, and deduce the received colors through internal data manipulation which may not be accessible externally. The present 3-D imaging system invention may be practiced in a variety of aspects, with various features of the invention employed as appropriate for a particular camera and a particular structured light source.
The following sections describe algorithmic procedures which can enhance the accuracy of 3-D profiles determined in accordance with the present invention.
Region detection and color-cross-talk compensation
Whether single or multiple colors are used, the resulting image data must be examined to assign regions as either light or dark. Ultimately, since in the preferred embodiment a light bar (or "line") location is the feature which must be determined to calculate the 3D
profile, the center of these lines must be located as accurately as possible. To accomplish this, noise and interference should be eliminated as far as possible. One of the major noise sources for color-encoding based 3D imaging is color crosstalk noise, which comes from the color grating of plural-color embodiments of grating 14, from object colors, and from color detectors used by camera 30 or 44 (or by the digitizer of film image information). Accordingly, a special procedure may be employed to assign regions by color. Color crosstalk compensation removes a great deal of undesirable interference from this region determination.
Color crosstalk may be understood by examining the intensities of color-encoded light from a structured light source as detected by a camera. First, the grating which produces the color pattern should be understood.
Figure 7a shows a color grating including red, green, and blue color lines, which is a preferred color version of grating 14 in Fig. 3. A real color grating is created therefrom by writing the designed color pattern on a high resolution film (e.g. Fujichrome Velvia) using a slide maker (e.g. Lasergraphics Mark III, ultrahigh resolution, 8000 x 16000). To detect the color spectrum of this fabricated color grating, a uniform white light (i.e. a light having a uniform distribution of intensity across at least the visible spectrum) is used to illuminate it, and a digital camera (e.g.
Kodak DC260) is employed to take a picture of the grating. The color spectrum of the grating is obtained by analyzing the intensity distribution of different color lines of the recorded digital image, as shown in Fig. 7b.
Fig. 7b shows severe color cross-talk among different color channels. For example, the intensity of typical false blue peak 71 in the location of a green line (indicated by green peak 73) is comparable to the intensity of typical true blue peak 75. Thus, the color cross-talk noise (e.g., an apparent but false color detected where it should not be) is about the same level as the color signal itself. This can lead to false line detection, in which one would detect a line (blue, in this case) that actually did not exist. In this example, typical red peak 77 does not cause a significant false peak in another color.
However, even when the cross-talk noise is sufficiently lower than the signal to avoid false line registration, color cross-talk can substantially shift the apparent peak locations of color lines from their actual locations. Since a shift in the peak location will result in errors in the depth calculation, it is important to compensate this shifting effect.
To effectively compensate color cross-talk, first, the color spectra of the color grating are collected by taking pictures with different objects and different digital cameras. The objects include both neutral color objects such as a white plane, a white ball, and a white cube, and lightly-colored objects such as human faces with different skin colors including white, yellow, and black.
Several popular digital cameras, including the Nikon CoolPix 900, Agfa 1280, Kodak DC260, Fuji 300, and Minolta RD 175, have been tested, and among these cameras the green lines virtually never have false peaks inside other color areas. Peak 73 in Fig. 7b is a typical green peak, properly located in a green area, and false peaks in other color areas are not seen in Fig. 7b. Since false green peaks are the least common, green is the most reliable color.
Accordingly, the color-compensation algorithm starts with the green lines. Starting with other colors would be called for in cameras in which the described testing showed another color being the most reliable. After green, red is typically the most reliable, and blue the least. Other color encoding colors would require selection of an order in which to process out false peaks.
Fig. 6 shows a graph of a portion of intensity data for light. It should be understood that the continuous line is an idealization of data from a multitude of pixels, or points, tied together. In reality there may be as few as three pixels in a region 63 which exceeds threshold level (TL) 61.
As long as this convention is understood, the graphs showing continuous intensity data can be properly understood.
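The convention just described, that a genuine peak region spans at least three pixels above TL, can be expressed as a simple run-length test. The following is a minimal sketch for one data row of one color channel; the threshold is passed in as a parameter rather than derived from peak amplitudes, and the names are illustrative.

```python
import numpy as np

def find_peak_regions(intensity, threshold, min_pixels=3):
    """Return (start, end) inclusive pixel index pairs for runs where
    one color channel stays above the threshold, keeping only runs at
    least min_pixels wide (a real peak spans three or more pixels)."""
    above = np.asarray(intensity) > threshold
    padded = np.concatenate(([False], above, [False]))
    diff = np.diff(padded.astype(np.int8))
    starts = np.flatnonzero(diff == 1)
    ends = np.flatnonzero(diff == -1)       # exclusive end indices
    return [(s, e - 1) for s, e in zip(starts, ends)
            if e - s >= min_pixels]

# Example: one 4-pixel run above a threshold of 10 is accepted, while
# the lone above-threshold pixel at index 8 is rejected as noise.
row = [0, 3, 12, 15, 14, 11, 2, 0, 11, 0]
print(find_peak_regions(row, 10))  # [(2, 5)]
```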
Fig. 8, a flowchart for color crosstalk compensation, will be described with reference also to the reference designators of Fig. 6.
Step 81: The image data is captured of an object reflecting color-encoded structured light. As a practical matter (see Step 83), the color compensation algorithm is generally applied to only a portion of the image at a time, preferably to a square region covering about 10 colored lines; the size of the regions analyzed is a design variable. Thus it must be understood that this procedure may be repeated in toto for many sub-images.
Step 82: For each subject color successively, peak values 69 are determined.
Step 83: Subject color threshold level (TL) 61 is established at a design value which is roughly 75% of the peak amplitudes of the image under analysis. Note that if this algorithm is performed over too large a region having substantial variation in reflected intensity, then TL
will not be appropriate for all peaks in the region, motivating the use of sub-images as indicated in Step 81.
Step 84: Areas are tentatively assigned as peak areas of the subject color where the intensity of the subject color light exceeds TL 61.
Step 85: If areas have already been assigned to other colors, a tentative peak area located in an area previously assigned to another color will be rejected as invalid, and will skip the assignment step of Step 86.
Step 86: For tentative peak areas of the subject color which are not in regions previously assigned to another color, the area will be assigned to the subject color.
Step 87: Repeat Steps 84-86 until assignment of the subject color is complete.
Step 88: Repeat Steps 82-87 until each of the encoding colors has been tested.
Step 89: Calculate a Color Crosstalk Matrix (CCM) for the image (or sub-image). In a 3 x 3 matrix for three different colors, which may readily be adjusted to N x N for N different colors, CCM is defined as

$$\mathrm{CCM} = k \begin{pmatrix} a_{rr} & a_{rg} & a_{rb} \\ a_{gr} & a_{gg} & a_{gb} \\ a_{br} & a_{bg} & a_{bb} \end{pmatrix} \qquad (1)$$

where k is a normalization constant which is adjusted to avoid saturation which may be caused by limited-dynamic-range arithmetic operations on the intensity data, and $a_{ij}$, with $i, j \in \{$encoding colors, e.g. r, g, b$\}$, is defined as

$$a_{ij} = \frac{\sum I_i}{\text{number of pixels in } j\text{-color regions}} \qquad (2)$$

where the sum runs over the pixels of the $j$-color regions and $I_i$ represents the light intensity of color $i$ in the $j$-color regions.

Step 90: The compensation for color cross-talk can be implemented by using the inverse of CCM, as given by

$$\begin{pmatrix} r' \\ g' \\ b' \end{pmatrix} = [\mathrm{CCM}]^{-1} \begin{pmatrix} r \\ g \\ b \end{pmatrix} \qquad (3)$$

where r', g', b' represent the red, green, and blue colors after cross-talk compensation.
As noted in Step 89, the matrix is readily adjusted to accommodate N colors.
Here, the colors are represented as r, g and b by way of example.
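A minimal Python sketch of Eqs. (1)-(3) follows, assuming the per-color line regions have already been assigned by the Fig. 8 procedure. The data layout (one 2-D intensity array per channel and one boolean region mask per color) is an assumption for illustration, not taken from the patent.

```python
import numpy as np

COLORS = ["red", "green", "blue"]

def crosstalk_matrix(channels, regions, k=1.0):
    """Build the 3 x 3 color crosstalk matrix of Eq. (1).

    channels: dict mapping color name -> 2-D intensity array
    regions:  dict mapping color name -> boolean mask of pixels
              assigned to that color's lines (per Fig. 8)
    Entry a_ij is the mean intensity of channel i inside the regions
    assigned to color j, per Eq. (2); k is the normalization constant.
    """
    ccm = np.empty((3, 3))
    for j, cj in enumerate(COLORS):
        mask = regions[cj]
        npix = mask.sum()                     # pixels in j-color regions
        for i, ci in enumerate(COLORS):
            ccm[i, j] = channels[ci][mask].sum() / npix
    return k * ccm

def compensate(ccm, rgb_pixels):
    """Apply Eq. (3): multiply each (r, g, b) row of an N x 3 array of
    pixel values by the inverse of CCM."""
    return rgb_pixels @ np.linalg.inv(ccm).T
```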
Fig. 9 shows the color spectrum after the color cross-talk compensation for the same color grating and same digital camera used to obtain Fig. 7(b). By comparing Figs. 9 and 7b, one can clearly see that the color cross-talk noise is substantially reduced. Typical green, blue and red peaks 93, 95 and 97 of Fig. 9 are now almost equal to each other in intensity, and false peaks such as 71 in Fig. 7b are virtually eliminated.
Bias adjusted centroid peak detection method
To minimize the noise of 3D imaging, it is important to accurately find the peak center location of the structured lines. Due to the color cross-talk and the color of the object itself, the apparent center locations of the structured lines may be shifted, so that the detected peak locations may not be the actual locations. By analyzing the peak locations from the color encoded data of over 500 images, using centroid detection as usually performed in the art, it has been determined that the average error of peak location C is about 0.4 pixel when the distance between adjacent light bars is about 5 pixels.
3D imaging systems according to the present invention may employ bias adjusted centroid peak detection to determine the centers of light patterns, as shown in Fig. 10 and described below.
The description also references Figs. 6 and 8.
Those skilled in the art will understand that, though continuous intensity data is described graphically, in reality the data is taken at discrete points (e.g. pixels).
Thus, summations performed upon the data, as described, are performed at each data point within the region of interest. It is desirable that three or more pixels or data points exist within any region which is recognized as a peak of color intensity in the following algorithm.

Step 1: scanning along data rows which are generally perpendicular to the structured image lines, find start point 102 and end point 104 locations of the intensity profile 100 of each region assigned to a particular color. This step is preferably performed region by region, and is the same as steps 81-88 of Fig. 8. These steps 81-88 are presently preferred to be performed as a separate step, upon data which has already been adjusted by color compensation, as described above. In any event, since start 102 and end 104 points are in reality discrete pixel data, though both are above threshold level TL, one will be higher than the other.
Step 2: find base 106 (i.e., the bias level) of the peak area using the definition

$$\mathrm{base} = \max(\mathrm{Intensity(start)},\ \mathrm{Intensity(end)}) \qquad (4)$$

Step 3: refine the estimated center of the image, interpolating between pixels and calculating the refined center (RC) of each line using a bias adjusted centroid method, given by

$$RC = \frac{\sum_{x=\mathrm{start}}^{\mathrm{end}} \left(\mathrm{Intensity}(x) - \mathrm{base}\right) \cdot x}{\sum_{x=\mathrm{start}}^{\mathrm{end}} \left(\mathrm{Intensity}(x) - \mathrm{base}\right)}, \quad \text{summing only where } \mathrm{Intensity}(x) - \mathrm{base} > 0 \qquad (5)$$

where Intensity(x) represents the intensity at location x.
Step 4: repeat steps 1-3 for each row of data in each color.
According to data from the aforementioned 500 color encoded images, the average error of RC is about 0.2 pixel, about 1/2 of the error of C (without bias adjusted centroid detection). This, in turn, will double the accuracy of 3D imaging based upon RC.
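A minimal sketch of the bias adjusted centroid of Eqs. (4)-(5) for a single peak region follows; the fallback for a degenerate flat peak is an added safeguard, not part of the patent's description.

```python
import numpy as np

def refined_center(intensity, start, end):
    """Bias-adjusted centroid of one peak region (Eqs. (4)-(5)).

    intensity : 1-D array of one color channel along a data row.
    start, end: inclusive pixel indices of a region found by the
                region-detection step (both above threshold TL).
    """
    seg = np.asarray(intensity[start:end + 1], dtype=np.float64)
    base = max(seg[0], seg[-1])          # Eq. (4): the bias level
    w = np.clip(seg - base, 0.0, None)   # keep only points above the bias
    if w.sum() == 0.0:                   # degenerate flat peak: fall back
        return (start + end) / 2.0       # to the geometric midpoint
    x = np.arange(start, end + 1)
    return float((w * x).sum() / w.sum())  # Eq. (5)

# Example: a slightly asymmetric 5-pixel peak spanning pixels 2-6.
row = np.array([0, 0, 11, 14, 20, 17, 12, 0], dtype=float)
print(refined_center(row, 2, 6))  # 4.2, just right of the maximum at 4
```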
Optional smoothing
The location of the centers of lines determined may be smoothed, filtering the location of each point to reduce deviations from other nearby points of the line using any well-known filtering algorithm. Since the lines of the reference image should be smooth, heavy filtering is preferred and should reduce noise without impairing accuracy. With regard to structuring lines reflected from an object, excessive filtering may impair accuracy. It is preferred, therefore, to determine discontinuities along structuring lines in the reflected image, and then perform a modest filtering upon the (continuous) segments of the structuring lines between such discontinuities.
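A minimal sketch of the segment-wise smoothing just described follows, using a simple moving average as a stand-in for the unspecified filter; the jump threshold and window width are assumed design values.

```python
import numpy as np

def smooth_line(centers, jump_threshold=2.0, window=3):
    """Modestly smooth a structuring line's center positions, but only
    within continuous segments: the line is split wherever adjacent
    centers jump by more than jump_threshold, so a real discontinuity
    is never averaged away."""
    centers = np.asarray(centers, dtype=np.float64)
    breaks = np.flatnonzero(np.abs(np.diff(centers)) > jump_threshold) + 1
    out = centers.copy()
    kernel = np.ones(window) / window
    for seg in np.split(np.arange(len(centers)), breaks):
        if len(seg) >= window:           # leave very short segments alone
            padded = np.pad(centers[seg], window // 2, mode="edge")
            out[seg] = np.convolve(padded, kernel, mode="valid")
    return out
```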
System Calibration
A line-by-line calibration procedure may be employed in imaging systems embodying the present invention to compensate system modeling error, the aberration of both projector and camera imaging lenses, and defocusing effects due to object depth variation, as enumerated and detailed below:
Three-dimensional data is often deduced from comparison of (a) object-reflected structured image point locations to (b) the theoretically expected locations of structured light image points reflected from a reference plane. The theoretical expected location generally is calculated upon assumptions, including that the structured light image is perfectly created as intended, perfectly projected, and accurately captured by the camera. Due to manufacturing tolerances of the structured light source and of the camera receptors, and to lens aberrations such as coma and chromatic distortion, each of these assumptions is mistaken to some degree in the real world, introducing a variety of errors which shift the actual location of the structured line from its theoretical location. Other defocusing and color cross-talk effects add further errors to the detected location of the structured lines. Accordingly, accuracy of the 3D
determination is enhanced by comparing the (a) object-reflected structured image point locations to (c) a carefully measured image of the actual structured light reflected from a precise reference plane.
The following description of Fig. 12 is needed to describe such calibration; however, the principles discussed will also help the reader understand the general triangulation method used to deduce 3D information.
In Fig. 12, optical axis 121 is the z axis, and is perpendicular to reference plane 123.
Baseline 124 is parallel to reference plane 123, and is defined by the respective centers 126 and 125 of the principal planes of the lenses of source 16 and camera 44. The x axis direction is defined parallel to baseline 124, and y-axis 122 is defined to be perpendicular to the plane of baseline 124 and optical axis 121. D is the distance between points 125 and 126 of structured light source 16 and digital camera 44. L is the distance between baseline 124 and the reference plane of a white surface. Object point 130 (P(x_object, y_object, z_object)) is a point on the object to be profiled. Object point 130 projects to point 129 (P'(x_object, 0, z_object)) in the y = 0 plane which is perpendicular to y-axis 122 and includes optical axis 121.
Point 130 is seen by camera 44 as a point having an x-value, translated to reference plane 123, of x_c 128. However, the structured light line crossing point 130 is known from a recorded image of reference plane 123 to have an x-value x_p 127. The discrepancy between x_c 128 and x_p 127 permits deduction of z_object by well-known triangulation from the given known distances of the setup.
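Eq. (6) itself does not appear in this excerpt. The following sketch therefore uses a standard similar-triangles expression for the Fig. 12 geometry, assumed here because its derivative with respect to the shift reduces to the L/D factor of Eq. (7) quoted later when the shift is small compared with D.

```python
def depth_from_shift(x_c: float, x_p: float, D: float, L: float) -> float:
    """Depth z of an object point above reference plane 123, deduced
    from the shift between x_c (the line position seen by camera 44,
    translated to the reference plane) and x_p (the position of the
    same line in the reference image). D is the source-camera baseline
    and L the baseline-to-reference-plane distance, as in Fig. 12.

    z = L * (x_c - x_p) / (D + (x_c - x_p)) is a standard
    similar-triangles form assumed here; the patent's own Eq. (6)
    is not reproduced in this excerpt."""
    dx = x_c - x_p
    return L * dx / (D + dx)

# Small-shift check: with D = 230 mm and L = 1000 mm, a 0.11 mm shift
# gives roughly (L / D) * 0.11, i.e. about 0.48 mm of depth.
print(depth_from_shift(0.11, 0.0, 230.0, 1000.0))  # ~0.478
```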
Fig. 11 is a flow chart of general data processing steps using line-by-line calibration and some of the other accuracy enhancements taught herein.
Step 112: using a device arranged in a known configuration such as shown in Fig. 12, capture a reference image using perfect white reference plane 123, then digitize if necessary and output the digital image data to a processing system.
Step 113: capture an object image using the same system configuration as in Step 112. One may capture the images of multiple objects without capturing a new reference image as long as the system set-up is the same as used to capture the reference image.
Step 114: perform cross-talk compensation on both object and reference data.
Step 115: perform bias adjusted centroid peak detection to refine the location of center peaks for lines in the object, and optionally smooth the lines along segments between determined discontinuities.
Step 116: perform bias adjusted centroid peak detection to refine the location of center peaks for lines of the reference image, and preferably heavily filter these lines.
Step 117: determine the system-calibrated height of each point of each structured light line reflected from the object by triangulation of the difference between the expected x-location of a given point on a line of the reference image, and the x-location of that point of the same structured light line as reflected from the object.
Step 118: Use center-weighted averaging of the height determined by each of three adjacent structured light lines at that y-location. Do not perform averaging if the adjacent structured light lines are determined to be discontinuous. Adjacent lines are considered discontinuous if the difference between the heights determined by any 2 of the three lines exceeds a threshold. The threshold is a design choice, but is preferably about 2 mm in the setup shown in Fig. 12.
Center Weighted Line Average
The fluctuation error of 3D profiles may be reduced by averaging the z value determined for points which are near each other. The present invention prefers to perform center weighted averaging of nearby points around the image point being adjusted. In particular, using data scanned roughly perpendicularly to the structured light bars, it is preferred to perform weighted averaging of the z value determined at each of three adjacent structured light lines, using the weighting function (0.5, 1, 0.5). To avoid erroneously smoothing discontinuous or steeply varying locations, weighted averaging is not performed at all points. In particular, if the difference in height, for any two of the three nearby points to be used for averaging, exceeds a threshold value, then averaging is not performed (the threshold is a design variable, but preferably is about 2 mm in the setup of Fig. 12). This weighted technique improves accuracy 0.3/0.1 ≈ 3-fold, substantially better than the ~1.73-fold improvement resulting from conventional 3-point averaging.
This technique is effective for all embodiments of the grating according to the present invention. However, since the errors between different colors tend to be independent, while errors between same colored lines may be partly coherent, it is believed that this weighted averaging technique will be more effective for adjacent different colored lines than for adjacent same-colored lines.
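A minimal sketch of the center weighted line average with its discontinuity guard follows; the argument layout (three height values at one y-location) is illustrative.

```python
import numpy as np

def center_weighted_height(z_left, z_center, z_right, threshold_mm=2.0):
    """Center-weighted average of the heights determined by three
    adjacent structured light lines at the same y-location, using the
    (0.5, 1, 0.5) weighting described above. Averaging is skipped when
    any pair of the three heights differs by more than the threshold
    (about 2 mm in the Fig. 12 setup), to avoid smoothing across a
    discontinuity."""
    z = np.array([z_left, z_center, z_right], dtype=np.float64)
    if np.ptp(z) > threshold_mm:       # largest pairwise difference
        return float(z_center)        # discontinuous: leave unaveraged
    w = np.array([0.5, 1.0, 0.5])
    return float((w * z).sum() / w.sum())
```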
Preferred embodiments and results
The principles of the present invention permit accurate three dimensional imaging and measurement of an object using a single camera in a known spatial relationship to a single structured light source, taking a single image of the object reflecting the structured illumination.
The light may be provided by a flash, which due to its very short duration substantially freezes motion, permitting profile imaging on moving objects. One preferred embodiment utilizes a standard commercial photographic flash unit as an illumination source, the flash unit separated from but synchronized with a standard digital camera. By simply placing in front of the flash unit an optical unit including a grating as described above, and one or more focussing lenses, the flash will cause structured light to be projected upon an object. A relative 3D
image may be obtained by processing only the data obtained from the digital camera image. If the orientation measurements of the structured flash, camera and object are known, then absolute 3D
information and measurements of the image can be obtained. The invention can use film cameras, but the structured-light-illuminated image must then be digitized to complete enhancements and 3D image determination.
Object reconstruction in full color
Another preferred embodiment of the invention permits true-color reconstruction of an object, such as a human face. Using two separate exposures, one under structured light and one under white light, a 3D image and a two dimensional color profile are separately obtained at almost the same moment. A model is constructed from the 3D profile information, and the color profile of the original object is projected onto the model. Alignment of the color profile and model is accomplished by matching features of the object and of the image.
In this embodiment the two images are preferably close in time to each other.
The structured illumination for one image may be color encoded or not, and is directed at the object from a baseline distance away from the camera, as described in detail above.
The data therefrom is processed to obtain a 3D profile of the object.
The unstructured white light image is taken either before or after the other image, preferably capturing a color profile of substantially the same image from substantially the same perspective as the other image. The unstructured light source may be simply a built-in flash of the camera, but it may also include lighting of the object from more than one direction to reduce shadows.
Ideally, the two images are taken at nearly the same time, and are short duration exposures synchronized with flash illumination sources, so as to permit 3D color image reconstruction of even living or moving objects. This can be accomplished by using two cameras. Two of the same type of camera (e.g. Kodak DC 260), which permits initiation of an exposure by means of an external electrical input, and also has a flash trigger output, may be used. The first camera drives one flash, and the flash control signal from the first camera is also input to a circuit which, in response, sends an exposure initiate signal to the second camera after a delay. The second camera in turn controls the second flash. The delay between the flashes is preferably about 20-30 ms, to allow for flash and shutter duration and timing jitter, but this will be a design choice depending on the expected movement of the object and upon the characteristics of the chosen cameras.
In another embodiment, a single camera having "burst" mode operation may be used. In burst mode, a single camera (e.g. Fuji DS 300) takes a plurality of exposures closely spaced in time. Presently, the time spacing may be as little as 100ms. If the camera has a single flash drive output, a means must be added to control the two separate flash units. The preferred means is to connect the flash control signal from the camera to an electronic switching circuit which directs the flash control signal first to one flash unit, and then to the other, by any means, many of which will be apparent to those skilled in the electronic arts. This embodiment presently has two advantages, due to using a single camera: setup is simpler, and the orientation of the camera is identical for the two shots. However, the period between exposures is longer, and the cost of cameras having burst mode is presently much higher than for other types, so this is not the most preferred embodiment at this time.
Principles of the present invention for accurately determining the 3D profile of an image include structured light grating opaque areas, color compensation, bias adjusted centroid detection, calibration, filtering, and weighted averaging of determined heights. All of these principles work together to enable one to produce high accuracy 3D profiles of even a living or moving object with virtually any commercially available image-capturing device, particularly an off-the-shelf digital camera, and any compatible, separate commercially available flash (or other lighting) unit. To these off-the-shelf items, only a light structuring optical device and an image data processing program need be added to implement this embodiment of the invention.
However, some aspects of the present invention may be employed while omitting others to produce still good-quality 3D images under many circumstances. Accordingly, the invention is conceived to encompass use of less than all of the disclosed principles.
A preferred embodiment of the present invention may be practiced and tested as follows. A
Kodak DC 260 digital camera is used. For testing, a well-defined triangular object is imaged. The object is 25 mm in height, 25 mm thick, and 125 mm long. Although the DC 260 camera has 1536 x 1024 pixels, the test object only occupies 600 x 580 pixels, due to limitations of the camera zoom lens. The structured light source employs opaque areas between each of 3 separate colored bars in a repeating pattern, as described above.
The parameters of the system are selected as follows: (1) D is the distance from baseline 124 point 125 to point 126; D = 230 mm; and (2) the object distance L = 1000 mm. Fig. 13a shows a one-dimensional scanned profile derived from the basic test setup. The worst case range error is about 1.5 mm, so the relative error is about 1.5/25 ≈ 6%, an accuracy comparable to that reported by other groups using a single encoded color frame. The theoretical range accuracy Δz_th based upon camera resolution can be estimated by simply differentiating Eq. (6). Since D is much greater than x_p - x_c, Δz_th can be estimated by the simple formula:

$$\Delta z_{th} \approx \frac{L}{D}\,\Delta x \qquad (7)$$

where Δx is the maximum error of transverse coordinate x. 580 pixels are used for the 125 mm long object, or ≈ 0.22 mm/pixel. Since the maximum transverse error due to limitations of camera resolution is approximately one half of a pixel, Δx ≈ 0.11 mm. Substituting D = 230 mm, L = 1000 mm, and Δx = 0.11 mm into Eq. (7), one finds Δz_th ≈ 0.48 mm. (Note that a variety of averaging and filtering aspects of the present invention can reduce the error of the center location to substantially less than 0.5 pixel.) Since the measured error is much greater than the error due to camera resolution, the measured error is not primarily due to the limited resolution of the camera.
Fig. 13b shows the result of processing the image data with cross-talk compensation as described above, which reduces measured error to ~0.8 mm, for a ~ 1.9-fold improvement in range accuracy.
Fig. 13c shows the result of adding line-by-line reference plane calibration, as discussed above, to the color compensated data. The maximum measured error is reduced to about 0.5 mm, a relative improvement of about 1.6-fold, for about 3-fold total improvement in accuracy.
Fig. 13d shows the calculated profile after adding bias adjusted centroid peak detection to the cross-talk compensated and line-by-line calibrated data. The maximum error is reduced to about 0.25 mm, a 2-fold relative improvement and ~6-fold total improvement.
Thus, the measured accuracy now exceeds the error estimate based on camera resolution by about 2-fold (0.25 mm versus 0.48 mm).
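As a rough illustration of the detection step, the sketch below subtracts an estimated background bias (here simply the local minimum, an assumption for illustration; the patent's bias-adjustment rule is given earlier in the specification) before taking the intensity-weighted centroid of a one-dimensional stripe profile.

import numpy as np

def bias_adjusted_centroid(profile: np.ndarray) -> float:
    """Sub-pixel stripe center: subtract an estimated background bias,
    then take the intensity-weighted centroid of what remains."""
    p = np.clip(profile.astype(float) - profile.min(), 0.0, None)
    total = p.sum()
    if total == 0.0:                 # flat profile: fall back to raw peak
        return float(np.argmax(profile))
    x = np.arange(p.size)
    return float((x * p).sum() / total)

For the profile [10, 12, 40, 90, 42, 11, 10] the function returns 3.0, the index of the peak sample; a more asymmetric profile yields a sub-pixel location between samples.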
Fig. 13e shows the calculated profile after weighted averaging. Averaging the data between adjacent lines artificially improves the effective camera resolution, and weighted averaging as taught above improves it more than uniform averaging does. Accordingly, the resolution is not limited by the 0.5-pixel basic camera resolution. As can be seen, the maximum error is now reduced to 0.1 mm, about a 2.5-fold relative improvement and an overall 15-fold (1.5 mm / 0.1 mm) improvement in accuracy.
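As an illustration of this averaging step, the sketch below smooths the per-line height estimates with a center-weighted kernel; the (1, 2, 1) weights are placeholders, not the weighting taught in the specification.

import numpy as np

def center_weighted_average(line_heights: np.ndarray,
                            weights=(1.0, 2.0, 1.0)) -> np.ndarray:
    """Average height estimates across adjacent scan lines, giving the
    center line the most weight (edges are zero-padded by np.convolve)."""
    k = np.asarray(weights, dtype=float)
    k /= k.sum()
    return np.convolve(line_heights, k, mode="same")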
Fig. 14a shows a human face; Fig. 14b shows the face as illuminated by structured light in three colors; and Fig. 14c shows the reconstructed profile of the face after processing as described above. A cross-section of the height profile is shown in Fig. 14d.
The invention has been shown, described, and illustrated in substantial detail with reference to presently preferred embodiments. It will be understood by those skilled in this art that other and further changes and modifications may be made without departing from the spirit and scope of the invention, which is defined by the claims appended hereto.

Claims (19)

What is claimed is:
1. A three dimensional imaging apparatus comprising:
a structured light source to project a known pattern of light upon an object;
a camera to provide a captured two dimensional image of said object under structured illumination, said captured two dimensional image including features of said projected known pattern reflected from said object; and a data processor to receive image data representing said captured two dimensional image and to perform algorithmic procedures to provide:
modified image data, said modified image data having apparent locations of said reflected pattern features shifted to more accurately deduce actual locations of said reflected pattern features; and three dimensional information, calculated from said modified image data, for locations of said reflected pattern features.
2. The apparatus of claim 1 wherein the data processor performs bias adjusted centroid detection on said image data.
3. The apparatus of any of the preceding claims wherein the data processor performs center weighted averaging of said three dimensional information.
4. The apparatus of any of the preceding claims further comprising a light structuring apparatus having a device to provide the pattern of the structured lighting, the device including apertures providing light of different distinct colors separated by opaque areas providing substantially no light.
5. The apparatus of any of the preceding claims wherein said processor provides modified image data by color compensation of said image data.
6. The apparatus of any of the preceding claims wherein the object is moving.
7. The apparatus of any of the preceding claims wherein the object is at least part of a human.
8. The apparatus of any of the preceding claims wherein the processor determines the three dimensional information by comparing said modified image data to stored image data substantially representing an image of said structured light reflected from a reference plane.
9. The apparatus of any of claims 1 to 7 wherein said apparatus includes a single image capturing device and a single light source synchronized to the image capturing device, and said three dimensional image data is determined from a single image captured by said single image capturing device, synchronized with said single light source.
10. The apparatus of any of claims 1 to 7 wherein:
said apparatus includes a single image capturing device and a single light source synchronized to the image capturing device;
said image data is determined by a single image captured by said single image capturing device, synchronized with said single light source; and said processor calculates said three dimensional image data by comparing said modified image data to stored image data substantially representing an image of said structured light reflected from a reference plane.
11. A method of obtaining a three dimensional image of an object, comprising the steps of:
providing a light structuring apparatus to project patterned light;
illuminating the object with said patterned light;
obtaining data substantially representing an image of the object reflecting the patterned light;
algorithmically modifying said data to provide modified image data more accurately representing said image; and determining three dimensional information about said object from the modified data.
12. The method of claim 11 wherein the step of algorithmically modifying said data includes performing bias adjusted centroid detection on said data.
13. The method of either of claims 11 or 12 further including the step of performing center weighted averaging of said three dimensional information.
14. The method of any of claims 11 to 13 wherein said light structuring apparatus includes a device to provide the pattern of the structured lighting, the device including apertures providing light of different distinct colors separated by opaque areas providing substantially no light.
15. The method of any of claims 11 to 14 wherein the step of algorithmically modifying said data includes performing color compensation of said data.
16. The method of any of claims 11 to 15 wherein the object is moving.
17. The method of any of claims 11 to 16 wherein the object is at least part of a human.
18. The method of any of claims 11 to 17 wherein the step of determining the three dimensional information includes comparing said modified image data to stored image data substantially representing a stored image of structured light from said structured light apparatus reflected from a reference plane.
19. The method of any of claims 11 to 18 wherein the step of illuminating the object employs only a single light structuring apparatus; and the step of determining three dimensional information employs data from only a single image of the object.
CA002373284A 1999-05-14 1999-05-14 Color structured light 3d-imaging system Abandoned CA2373284A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US1999/010756 WO2000070303A1 (en) 1999-05-14 1999-05-14 Color structured light 3d-imaging system

Publications (1)

Publication Number Publication Date
CA2373284A1 true CA2373284A1 (en) 2000-11-23

Family

ID=22272767

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002373284A Abandoned CA2373284A1 (en) 1999-05-14 1999-05-14 Color structured light 3d-imaging system

Country Status (6)

Country Link
EP (1) EP1190213A1 (en)
JP (1) JP2002544510A (en)
CN (1) CN1159566C (en)
AU (1) AU3994799A (en)
CA (1) CA2373284A1 (en)
WO (1) WO2000070303A1 (en)


Families Citing this family (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6556706B1 (en) * 2000-01-28 2003-04-29 Z. Jason Geng Three-dimensional surface profile imaging method and apparatus using single spectral light condition
JP2002191058A (en) * 2000-12-20 2002-07-05 Olympus Optical Co Ltd Three-dimensional image acquisition device and three- dimensional image acquisition method
US7176440B2 (en) * 2001-01-19 2007-02-13 Honeywell International Inc. Method and apparatus for detecting objects using structured light patterns
RU2184933C1 (en) * 2001-02-21 2002-07-10 Климов Андрей Владимирович Gear for contactless test of linear dimensions of three- dimensional objects
RU2185599C1 (en) * 2001-03-19 2002-07-20 Зеляев Юрий Ирфатович Procedure of contactless control over linear dimensions of three-dimensional objects
US7257236B2 (en) 2002-05-22 2007-08-14 A4Vision Methods and systems for detecting and recognizing objects in a controlled wide area
US7174033B2 (en) 2002-05-22 2007-02-06 A4Vision Methods and systems for detecting and recognizing an object based on 3D image data
CN100417976C (en) * 2002-09-15 2008-09-10 深圳市泛友科技有限公司 Three-dimensional photographic technology
DE10250954B4 (en) * 2002-10-26 2007-10-18 Carl Zeiss Method and device for carrying out a televisite and televisite receiving device
US7146036B2 (en) 2003-02-03 2006-12-05 Hewlett-Packard Development Company, L.P. Multiframe correspondence estimation
TWI257072B (en) 2003-06-20 2006-06-21 Ind Tech Res Inst 3D color information acquisition method and device thereof
CN100387065C (en) * 2003-07-07 2008-05-07 财团法人工业技术研究院 Three-dimensional color information acquisition method and device therefor
JP3831946B2 (en) * 2003-09-26 2006-10-11 ソニー株式会社 Imaging device
JP2005164434A (en) * 2003-12-03 2005-06-23 Fukuoka Institute Of Technology Noncontact three-dimensional measuring method and apparatus
DE102004007829B4 (en) * 2004-02-18 2007-04-05 Isra Vision Systems Ag Method for determining areas to be inspected
US7319529B2 (en) 2004-06-17 2008-01-15 Cadent Ltd Method and apparatus for colour imaging a three-dimensional structure
CA2615316C (en) 2004-08-12 2013-02-12 A4 Vision S.A. Device for contactlessly controlling the surface profile of objects
AU2005285558C1 (en) * 2004-08-12 2012-05-24 A4 Vision S.A Device for biometrically controlling a face surface
US7646896B2 (en) 2005-08-02 2010-01-12 A4Vision Apparatus and method for performing enrollment of user biometric information
CN101496033B (en) * 2006-03-14 2012-03-21 普莱姆森斯有限公司 Depth-varying light fields for three dimensional sensing
US9052294B2 (en) 2006-05-31 2015-06-09 The Boeing Company Method and system for two-dimensional and three-dimensional inspection of a workpiece
US8050486B2 (en) 2006-05-16 2011-11-01 The Boeing Company System and method for identifying a feature of a workpiece
US7495758B2 (en) 2006-09-06 2009-02-24 The Boeing Company Apparatus and methods for two-dimensional and three-dimensional inspection of a workpiece
JP2008170281A (en) * 2007-01-11 2008-07-24 Nikon Corp Shape measuring device and shape measuring method
US20110316978A1 (en) * 2009-02-25 2011-12-29 Dimensional Photonics International, Inc. Intensity and color display for a three-dimensional metrology system
DE102010029091B4 (en) * 2009-05-21 2015-08-20 Koh Young Technology Inc. Form measuring device and method
JP5633719B2 (en) * 2009-09-18 2014-12-03 学校法人福岡工業大学 3D information measuring apparatus and 3D information measuring method
CN102022981B (en) * 2009-09-22 2013-04-03 重庆工商大学 Peak-valley motion detection method and device for measuring sub-pixel displacement
CN102052900B (en) * 2009-11-02 2013-09-25 重庆工商大学 Peak valley motion detection method and device for quickly measuring sub-pixel displacement
US20110187878A1 (en) * 2010-02-02 2011-08-04 Primesense Ltd. Synchronization of projected illumination with rolling shutter of image sensor
US8279418B2 (en) 2010-03-17 2012-10-02 Microsoft Corporation Raster scanning for depth detection
CN101975994B (en) * 2010-08-27 2012-03-28 中国科学院自动化研究所 Three-dimensional imaging system of multi-stage lens
US20120062725A1 (en) * 2010-09-10 2012-03-15 Gm Global Technology Operations, Inc. System for error-proofing manual assembly operations using machine vision
CN102161291B (en) * 2010-12-08 2013-03-27 合肥中加激光技术有限公司 Three-dimensional imaging crystal internally carving pavilion
TW201315962A (en) * 2011-10-05 2013-04-16 Au Optronics Corp Projection image recognition apparatus and method thereof
CN102628693A (en) * 2012-04-16 2012-08-08 中国航空无线电电子研究所 Method for registering camera spindle and laser beam in parallel
JP6005278B2 (en) * 2012-07-25 2016-10-12 シーメンス アクチエンゲゼルシヤフトSiemens Aktiengesellschaft Color coding for 3D measurements, especially on transmissive and scattering surfaces
CN104903680B (en) 2012-11-07 2019-01-08 阿泰克欧洲公司 The method for controlling the linear dimension of three-dimension object
US11509880B2 (en) * 2012-11-14 2022-11-22 Qualcomm Incorporated Dynamic adjustment of light source power in structured light active depth sensing systems
EP2799810A1 (en) * 2013-04-30 2014-11-05 Aimess Services GmbH Apparatus and method for simultaneous three-dimensional measuring of surfaces with multiple wavelengths
TWI464367B (en) * 2013-07-23 2014-12-11 Univ Nat Chiao Tung Active image acquisition system and method
CN103697815B (en) * 2014-01-15 2017-03-01 西安电子科技大学 Mixing structural light three-dimensional information getting method based on phase code
DE102014210672A1 (en) 2014-06-05 2015-12-17 BSH Hausgeräte GmbH Cooking device with light pattern projector and camera
CN104243843B (en) 2014-09-30 2017-11-03 北京智谷睿拓技术服务有限公司 Pickup light shines compensation method, compensation device and user equipment
JP6309174B1 (en) * 2014-12-18 2018-04-11 フェイスブック,インク. System, apparatus, and method for providing a user interface in a virtual reality environment
WO2016137351A1 (en) * 2015-02-25 2016-09-01 Андрей Владимирович КЛИМОВ Method and device for the 3d registration and recognition of a human face
CN104809940B (en) * 2015-05-14 2018-01-26 广东小天才科技有限公司 Geometry stereographic projection device and projecting method
CN105157613A (en) * 2015-06-03 2015-12-16 五邑大学 Three-dimensional fast measurement method utilizing colored structured light
CN105021138B (en) * 2015-07-15 2017-11-07 沈阳派特模式识别技术有限公司 3-D scanning microscope and fringe projection 3-D scanning method
CN106403838A (en) * 2015-07-31 2017-02-15 北京航天计量测试技术研究所 Field calibration method for hand-held line-structured light optical 3D scanner
TWI550253B (en) * 2015-08-28 2016-09-21 國立中正大學 Three-dimensional image scanning device and scanning method thereof
CN105300319B (en) * 2015-11-20 2017-11-07 华南理工大学 A kind of quick three-dimensional stereo reconstruction method based on chromatic grating
US10884127B2 (en) * 2016-08-02 2021-01-05 Samsung Electronics Co., Ltd. System and method for stereo triangulation
CN108693538A (en) * 2017-04-07 2018-10-23 北京雷动云合智能技术有限公司 Accurate confidence level depth camera range unit based on binocular structure light and method
CN108732066A (en) * 2017-04-24 2018-11-02 河北工业大学 A kind of Contact-angle measurement system
KR101931773B1 (en) 2017-07-18 2018-12-21 한양대학교 산학협력단 Method for shape modeling, device and system using the same
CN109348607B (en) * 2018-10-16 2020-02-21 华为技术有限公司 Luminous module support and terminal equipment
TWI763206B (en) 2020-12-25 2022-05-01 宏碁股份有限公司 Display driving device and operation method thereof

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4349277A (en) * 1980-06-11 1982-09-14 General Electric Company Non-contact measurement of surface profile
EP0076866B1 (en) * 1981-10-09 1985-05-02 Ibm Deutschland Gmbh Interpolating light section process
WO1994016611A1 (en) * 1993-01-21 1994-08-04 TECHNOMED GESELLSCHAFT FüR MED. UND MED.-TECHN. SYSTEME MBH Process and device for determining the topography of a reflecting surface
US5615003A (en) * 1994-11-29 1997-03-25 Hermary; Alexander T. Electromagnetic profile scanner

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109855559A (en) * 2018-12-27 2019-06-07 成都市众智三维科技有限公司 A kind of total space calibration system and method
CN113654487A (en) * 2021-08-17 2021-11-16 西安交通大学 Dynamic three-dimensional measurement method and system for single color fringe pattern
CN113654487B (en) * 2021-08-17 2023-07-18 西安交通大学 Dynamic three-dimensional measurement method and system for single color fringe pattern

Also Published As

Publication number Publication date
AU3994799A (en) 2000-12-05
JP2002544510A (en) 2002-12-24
CN1350633A (en) 2002-05-22
EP1190213A1 (en) 2002-03-27
CN1159566C (en) 2004-07-28
WO2000070303A1 (en) 2000-11-23

Similar Documents

Publication Publication Date Title
CA2373284A1 (en) Color structured light 3d-imaging system
TW385360B (en) 3D imaging system
US6341016B1 (en) Method and apparatus for measuring three-dimensional shape of object
KR100613421B1 (en) Three-dimensional shape measuring method
Tajima et al. 3-D data acquisition by rainbow range finder
US6549288B1 (en) Structured-light, triangulation-based three-dimensional digitizer
JP4040825B2 (en) Image capturing apparatus and distance measuring method
US6084712A (en) Three dimensional imaging using a refractive optic design
US7006132B2 (en) Aperture coded camera for three dimensional imaging
JP3884321B2 (en) 3D information acquisition apparatus, projection pattern in 3D information acquisition, and 3D information acquisition method
JP2010507079A (en) Apparatus and method for non-contact detection of 3D contours
KR20100017236A (en) Single-lens, single-sensor 3-d imaging device with a central aperture for obtaining camera position
CA2017518A1 (en) Colour-range imaging
US6765606B1 (en) Three dimension imaging by dual wavelength triangulation
JPS58122409A (en) Method of sectioning beam
JP2714152B2 (en) Object shape measurement method
JP4090860B2 (en) 3D shape measuring device
JP2005520142A (en) Method and apparatus for measuring absolute coordinates of object
JP4516590B2 (en) Image capturing apparatus and distance measuring method
KR20000053779A (en) Three dimension measuring system using two dimensional linear grid patterns
JP3912666B2 (en) Optical shape measuring device
JP4141627B2 (en) Information acquisition method, image capturing apparatus, and image processing apparatus
JP4204746B2 (en) Information acquisition method, imaging apparatus, and image processing apparatus
JP3932776B2 (en) 3D image generation apparatus and 3D image generation method
JP2500660B2 (en) Distance / color image acquisition method and device

Legal Events

Date Code Title Description
FZDE Discontinued