EP1190213A1 - Color structured light three-dimensional imaging system - Google Patents

Color structured light three-dimensional imaging system

Info

Publication number
EP1190213A1
Authority
EP
European Patent Office
Prior art keywords
image
light
color
data
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP99923100A
Other languages
German (de)
English (en)
Inventor
Taiwei Lu
Jianzhong Zhang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
3DMetrics Inc
Original Assignee
3DMetrics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 3DMetrics Inc filed Critical 3DMetrics Inc
Publication of EP1190213A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes, on the object
    • G01B11/2509: Color coding

Definitions

  • the present invention relates to a method and apparatus for three-dimensional surface profile imaging and measurement and, more particularly, to distance profile measurement of objects based upon two-dimensional imaging of the objects reflecting structured illumination.
  • Three-dimensional (hereinafter also referred to as either “3D” or “3-D”) imaging and measurement systems are known. In general, the purpose is to determine the shape of an object in three dimensions, ideally with actual dimensions.
  • imaging and measurement systems fall into two basic categories: 1) Surface Contact Systems and 2) Optical Systems.
  • Optical Systems are further categorized as using Laser Triangulation, Structured Illumination, Optical Moire Interferometry, Stereoscopic Imaging, and Time-of-Flight Measurement.
  • Optical Moire Interferometry is accurate, but expensive and time-consuming. Stereoscopic Imaging requires two cameras and computationally intensive matching of corresponding points between the two images.
  • Time-of-Flight Measurement calculates the time for a laser beam to reflect from an object at each point of interest, and requires an expensive scanning laser transmitter and receiver.
  • the present invention is an Optical System based upon Structured Illumination, which means that it determines the three dimensional profile of an object which is illuminated with light having a known structure, or pattern.
  • the structured illumination is projected onto an object from a point laterally separated from a camera.
  • the camera captures an image of the structured light pattern, as reflected by the object.
  • the object can be profiled in three dimensions where the structured light pattern reflected by the object can be discerned clearly.
  • the shift in the reflected pattern, as compared to that which would be expected from projection of the same pattern onto a reference plane, may be triangulated to calculate the "z" distance, or depth.
  • Color-encoded structured light has been proposed to achieve fast active 3D imaging, as for example by K. L. Boyer and A. C. Kak, "Color-encoded structured light for rapid active ranging," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. PAMI-9, pp. 14-28, 1987.
  • the color of the projected structured light is used to identify the locations of stripes and thereby reduce ambiguity when interpreting the data.
  • the present invention provides a three dimensional (3D) imaging system requiring only a single image capture by substantially any presently-manufactured single camera using a single structured light source.
  • it provides a 3D imaging system which is inexpensive to manufacture and easy to use.
  • it permits 3D imaging and measurement using a structured source of visible, infrared, or ultraviolet light.
  • the invention provides a 3D imaging and measurement system which reduces crosstalk between reflections of a color-encoded structured light source.
  • the invention provides 3D imaging using any combination of enhancing a color-encoded structured light source, algorithmically enhancing the accuracy of raw data from an image reflecting a structured light source, comparing the image data to a measured reference image, and algorithmically enhancing the calculated depth profile.
  • the invention permits 3D imaging and measurement using either a pulsed light source, such as a photographic flash, or a continuous light source.
  • the invention provides a light structuring optical device accepting light from a standard commercial flash unit synchronized to a standard commercial digital camera to provide data from which a 3D image may be obtained.
  • the invention permits 3D imaging and measurement using a structured light source which modulates the intensity and/or the spectrum of light to provide a black and white or multi-color structured lighting pattern.
  • the invention provides a method to project a structured illumination pattern onto an object.
  • Another aspect provides 3D imaging using an improved color grating.
  • the invention provides a 3D imaging system that can be used to project an image onto an object.
  • the invention provides a 3D imaging system that can be used with a moving object, and with a living object.
  • the invention provides a 3D imaging and measurement system having a camera and light source which can be integrated into a single body.
  • the invention permits accurate 3D imaging and measurement of objects having a surface color texture, by using two images reflecting different lighting.
  • the present invention enables accurate three dimensional imaging using any off-the-shelf digital camera, or indeed substantially any decent camera, taking a single image of an object under structured lighting.
  • the structured lighting may be provided by adding a fairly simple pattern-projecting device in front of an off-the-shelf photographic flash unit.
  • the improvements encompassed by the present invention work together to make full realization of the advantages of the invention possible; however, one may in some cases omit individual improvements and yet still obtain good quality 3D profile information.
  • the structured light source is improved by separating color images to reduce color crosstalk, and by adaptation for use with off-the-shelf flash units.
  • the image data interpretive algorithms reduce the effects of color cross-talk, improve detection of light intensity peaks, enhance system calibration, and improve the precision of identifying the location of adjacent lines.
  • a three dimensional imaging system for use in obtaining 3D information about an object, constructed in accordance with the principles of the present invention, has a structured light source including a source of illumination which is transmitted through a black and white or color grating and then projected onto the object.
  • the grating includes a predetermined pattern of light transmitting areas or apertures, which are typically parallel transmissive bars disposed a predetermined distance apart from each other. In some embodiments the grating will include an opaque area intermediate each of a plurality of differently colored bars.
  • the imaging system also includes a camera, or other image capturing device, for capturing an image of an object reflecting light from the structured light source.
  • the camera may take short duration exposures, and/or the light source may be a short duration flash synchronized to the exposure, to enable capture of clear 3D images of even moving and living objects.
  • the system may include a means for digitizing the captured image into computer-manipulable data, if the camera does not provide digital data directly.
  • a bias adjusted centroid light peak detection algorithm aspect of the present invention may be employed to enhance accuracy of the detected image, and system calibration methods are an aspect of the invention which can reduce errors by comparing the detected object image to an actual reference image taken using the same system setup.
  • color cross-talk compensation aspects of the invention include employing either or both using opaque areas between different colors in the grating, and a color compensation algorithm which is the inverse of a determined color cross-talk matrix.
  • a center weighted line average algorithm aspect of the present invention is also particularly useful for plural color gratings.
  • a three-dimensional imaging system constructed and operated in accordance with the principles of the present invention would employ a combination of these mechanical and algorithmic aspects, and, in conjunction with well known calculations performed upon the derived image data, would determine information about an imaged object in the three dimensions x, y, and z.
  • Fig. 1 shows structured lighting for a 3D imaging system using a CCD video camera.
  • Fig. 2 shows some details of another 3D imaging system.
  • Fig. 3 is an improved grating for use with a 3D imaging system.
  • Fig. 4 shows image contours of an object reflecting structured light.
  • Fig. 5 is a perspective view of a three-dimensional profile obtained from Fig. 4 data.
  • Fig. 6 is a graph showing determination, by threshold, of regions as being of a color or not.
  • Fig. 7a is a portion of an image reflecting a three-color structured light source.
  • Fig. 7b is a graph of color intensities measured from Fig. 7a image.
  • Fig. 8 is a flowchart of a color cross-talk compensation image processing procedure.
  • Fig. 9 is a graph of color-compensated color intensities from Fig. 7a image.
  • Fig. 10 graphically shows bias-adjusted centroid peak detection.
  • Fig. 11 is a flowchart of a system calibration procedure.
  • Fig. 12 shows details of 3D image system using system calibration.
  • Figs. 13a-e show measured profiles after progressively enhancing accuracy.
  • Fig. 14a shows a human face.
  • Fig. 14b shows the human face illuminated by structured light.
  • Fig. 14c shows a reconstruction of the 3D profile of the human face.
  • Fig. 14d shows a cross-section of the 3D profile of the human face.
  • A three dimensional imaging system is shown in Fig. 1, identified in general by the reference numeral 10.
  • a modified 3D imaging system, identified in general by the reference numeral 12, is shown with somewhat more detail in Fig. 2, and a grating, identified in general by the reference numeral 14, is shown in Fig. 3.
  • the 3-D imaging system of Fig. 1 shows structured illumination source 16 projecting patterned light (structured illumination) toward object 18.
  • the pattern of light may be color encoded or not, and may be patterned in any way which is known and can be readily recognized in an image.
  • a simple and preferred pattern consists of parallel bars of light.
  • Light pattern 20 displays a preferred pattern of light projected from structured light source 16, seen where the light passes through plane O-X (perpendicular to the plane of the paper) before reaching object 18.
  • light pattern 20 would continue on to object 18, from whence it would be reflected according to the contours of object 18.
  • An image so reflected, indicated generally by reference numeral 32, will be captured by camera 30.
  • Object 18 is only a cross-section, but the entire object is the face of a statue of the goddess Venus.
  • a representation of image 32, as seen by camera 30, is shown in Fig. 4. In the image of Fig. 4, it can be seen that the parallel bars of light from the structured light source are shifted according to the contours of the statue.
  • a representation of the statue as determined using the present invention is shown in Fig. 5.
  • each light area of light pattern 20 is a bar of light which is oriented perpendicular to the page and thus is seen in cross-section.
  • Dark areas 22 are disposed between each light area 24a, 26a, 28a, 24, 26 and 28.
  • Distance 21 is the spacing P between the centers of like colors in a three-color embodiment, and distance 23 is the spacing P' between adjacent light bars. In the preferred embodiment, the same distance 23 is the distance between the centers of adjacent dark areas 22.
  • Bars of light 24, 26 and 28 may all be white, or all of the same color or mixture of colors, or they may be arranged as a pattern of different colors, preferably repeating.
  • Dark areas 22 are preferred intermediate each of light bars 24, 26, 28, etc.
  • the preferred proportion of dark area to light area depends upon whether or not a plurality of distinct colors are used. In embodiments not using a plurality of distinct colors, it is preferred that dark areas 22 are about equal to light areas. In embodiments using distinct colors, dark areas 22 are preferably as small as possible without permitting actual intermixing of adjacent colors (which will occur as a result of inevitable defocusing and other blurring of the reflected structured illumination). Dark areas 22 greatly reduce crosstalk which would otherwise corrupt reflections from object 18 of projected light pattern 20.
  • light color used with the present invention need not be in the visible spectrum, as long as a structured light source can accurately provide a pattern in that color, and the image-capturing device can detect the color.
  • the use of light from infrared at least through ultraviolet is well within the scope of the present invention.
  • System 10 includes structured light source 16.
  • Grating 14, although not shown in Fig. 1, is contained within the optical system of structured light source 16, and determines the pattern to be projected onto object 18.
  • Fig. 2 shows details of a light source 16.
  • the light from light source 34 is collimated by collimating lens 32.
  • the collimated light passes through grating 14, which imposes structure (or patterning) on the light.
  • the structured light from grating 14 is focussed by projecting lens 38, so that it may be accurately detected by camera 44 when the structured light reflects from object 40 to form image 42.
  • Data 48 representing image 42 as captured by camera 44 is conveyed to processor 46 so that calculations may be performed to extract the 3D information from image data 48.
  • any predetermined pattern may be used for grating 14.
  • the requirements for the pattern are that it be distinct enough and recognizable enough that it can be identified after reflection from object 40.
  • Parallel bars are preferred for the structured light pattern, and are primarily described herein.
  • grating 14 includes a repetitive pattern of parallel apertures 4, 6, 8 etc. disposed a predetermined center to center distance 5 apart from each other.
  • aperture as used herein means a portion of the grating which transmits light, in contrast to opaque areas which block light transmission.
  • the apertures of grating 14 may transmit a plurality of distinct colors of light.
  • a structured light pattern using a plurality of distinct colors is considered to be color-encoded.
  • the grating modulates the color of the light.
  • the apertures of grating 14 may each transmit the same color of light.
  • (Any particular mix of light frequencies may be understood to be a "color" as used here, so that if all apertures transmit white light they are considered not to transmit a plurality of distinct colors.)
  • If the grating does not vary (or "modulate") the color of the transmitted light, then it must at least modulate the intensity of the single color of light transmitted so that a recognizable pattern of light is projected.
  • opaque areas 2 having width 7 are disposed between adjacent apertures 4 and 6, 6 and 8, 4a and 6a, etc.
  • Each aperture preferably has width 9.
  • a preferred embodiment employs three different colors 4 (e.g. red), 6 (e.g. green), and 8 (e.g. blue), repeating as 4a, 6a, 8a and again as 4b, 6b, 8b, etc.
  • interval distance 1 is the spacing between centers of like-colored apertures.
  • Opaque area interval distance 3 between the centers of opaque areas 2 is equal to adjacent aperture center spacing distance 5.
  • the size of the periods P (21) and P' (23) on projected image 20 of Fig. 1, and of interval distances 3 and 1 in grating 14 in FIG. 3, has been exaggerated to provide improved clarity.
  • the size of these periods is a design variable, as are the use of different colors, the number of colors used, the actual colors used, the dimensions of the colored bars, and the size of grating 14.
  • the actual spacings of grating 14 may be readily determined when the focussing characteristics of projector lens 38 are known, in order to produce a desired pattern of structured light, as explained below.
  • width 7 of opaque areas 2 is preferably about 2/3 of aperture interval distance 5, while for gratings not using a plurality of distinct colors it is preferred that width 7 be about 4/5 of aperture interval distance 5.
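  • For illustration, these proportions can be rendered directly; the following minimal Python sketch (the pixel dimensions and function name are illustrative assumptions, not taken from the disclosure) draws a plural-color grating in the style of grating 14:

```python
import numpy as np

def make_color_grating(n_triplets=10, interval_px=36, rows=64):
    """Render a repeating red/green/blue aperture pattern with an opaque
    (black) area between adjacent apertures, as in grating 14.

    Per the preferred plural-color proportion, each opaque area (width 7)
    is about 2/3 of the aperture interval (distance 5), so each lit
    aperture (width 9) occupies the remaining 1/3.
    """
    aperture_px = interval_px // 3                    # ~1/3 of the interval
    colors = [(1.0, 0, 0), (0, 1.0, 0), (0, 0, 1.0)]  # apertures 4, 6, 8
    image = np.zeros((rows, n_triplets * 3 * interval_px, 3))  # opaque background
    for k in range(n_triplets * 3):
        x0 = k * interval_px                          # start of this interval
        image[:, x0:x0 + aperture_px] = colors[k % 3]
    return image
```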
  • the receptor characteristics are such as to best distinguish red, green and blue.
  • the ability to distinguish the colors used to encode a plural-color structured light pattern determines how closely spaced the pattern may be and still be clearly resolved. Accordingly, red, green and blue are generally preferred for the encoding colors of plural color structured light sources. However, it should be understood that more or less distinct colors could be used, and could well be adjusted to correspond to the best color selectivity of a particular camera.
  • the film image must be "digitized" to provide data which can be processed by a computer. Digitizers also are typically best able to distinguish red, green and blue.
  • a preferred method is to make one central line identifiable, and then to count lines from there.
  • a central line can be made identifiable by making it white, as opposed to the other distinct colors.
  • where distinct colors are not used, another marking should be used, such as periodically adding "ladder steps," perpendicular apertures across what would be an opaque area in the rest of the pattern.
  • a single different-color line could be used for registration. From certain identified lines, the remainder of the lines are registered by counting. When the 3D profile is steep, the projected lines reflected by the object may become very close together, and become lost or indistinguishable. This can interfere with registration.
  • Color-encoded parallel-bar patterns have an advantage in registration, because it is easier to count lines. Given three colors, two adjacent colored lines can be indistinguishable and still not interfere with the ability to count lines (and thus keep track of registration). In general, for n colors, n-1 adjacent lines can be missing without impairing registration. Thus, color-encoded systems generally have more robust registration.
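  • As an illustration of this counting property, a small Python sketch (the function and color labels are hypothetical): with n = 3 colors, the color-phase step between two detected lines, taken modulo 3, recovers the index gap even when up to n - 1 = 2 intervening lines are lost.

```python
COLOR_CYCLE = ["R", "G", "B"]  # projected repeating color order

def register_lines(detected, anchor_index=0):
    """Assign an absolute line index to each detected stripe from its color.

    The step between consecutive detected colors, modulo 3 (with 0 mapped
    to 3, since two detected stripes of the same color must be a full
    period apart), gives how many line positions were skipped.
    """
    indices = [anchor_index]
    for prev, cur in zip(detected, detected[1:]):
        step = (COLOR_CYCLE.index(cur) - COLOR_CYCLE.index(prev)) % 3
        indices.append(indices[-1] + (step or 3))
    return indices

# The green line at index 1 is lost, yet blue still registers as index 2:
print(register_lines(["R", "B", "R", "G"]))  # -> [0, 2, 3, 4]
```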
  • a structured light source constructed according to the present invention, with opaque areas 2 intermediate each of color bars 4, 6 and 8 in grating 14 to create projected dark areas 22, enables a single exposure of object 18, recorded by essentially any commercially available image recorder (e.g. camera), to provide the necessary information from reflected image 32, sufficiently free from unwanted crosstalk, to permit proper detection and registration of the light bar lines, and hence to enable practical 3-D profilometry.
  • structured light sources will vary in the accuracy of the projected image, in the difference in intensity between intended light and dark areas, and in the colors projected and their spectral purity.
  • Cameras may be either film-type photographic units, requiring scanning of the film image to obtain digital data, or digital cameras which provide digital information more directly.
  • some cameras have a high ability to distinguish different colors with minimal interference; this is most often true of cameras employing three separate monochromatic CCD receptors for each color pixel.
  • Other cameras employ broad-band CCD receptors, and deduce the received colors through internal data manipulation which may not be accessible externally.
  • the present 3-D imaging system invention may be practiced in a variety of aspects, with various features of the invention employed as appropriate for a particular camera and a particular structured light source.
  • Region detection and color cross-talk compensation: Whether single or multiple colors are used, the resulting image data must be examined to assign regions as either light or dark.
  • Since a light bar (or "line") location is the feature which must be determined to calculate the 3D profile, the center of these lines must be located as accurately as possible. To accomplish this, noise and interference should be eliminated as far as possible.
  • color crosstalk noise comes from the color grating of plural-color embodiments of grating 14, from object colors, and from the color detectors used by camera 30 or 44.
  • Color crosstalk may be understood by examining the intensities of color-encoded light from a structured light source as detected by a camera. First, the grating which produces the color pattern should be understood.
  • Figure 7a shows a color grating including red, green, and blue color lines, which is a preferred color version of grating 14 in Fig. 3.
  • a real color grating is created therefrom by writing the designed color pattern on a high resolution film (e.g. Fujichrome Velvia) using a slide maker (e.g. Lasergraphics Mark LU ultrahigh resolution, 8000 x 16000).
  • to characterize the grating, it is illuminated with a uniform white light (i.e. a light having a uniform distribution of intensity across at least the visible spectrum) and the transmitted pattern is recorded by a digital camera (e.g. Kodak DC260).
  • the color spectrum of the grating is obtained by analyzing the intensity distribution of different color lines of the recorded digital image, as shown in Fig. 7b.
  • Fig. 7b shows severe color cross-talk among different color channels.
  • the intensity of typical false blue peak 71 in the location of a green line is comparable to the intensity of typical true blue peak 75.
  • the color cross-talk noise (e.g., an apparent but false color detected where it should not be) is about the same level as the color signal itself. This can lead to false line detection, in which one would detect a line (blue, in this case) that actually did not exist.
  • typical red peak 77 does not cause a significant false peak in another color.
  • color cross-talk can substantially shift the apparent peak locations of color lines from their actual locations. Since a shift in the peak location will result in errors in the depth calculation, it is important to compensate for this shifting effect.
  • the color spectrums of the color grating are collected by taking pictures with different objects and different digital cameras.
  • the objects include both neutral color objects such as a white plane, a white ball, and a white cube, and lightly- colored objects such as human faces with different skin colors including white, yellow, and black.
  • Fig. 6 shows a graph of a portion of intensity data for light. It should be understood that the continuous line is an idealization of data from a multitude of pixels, or points, tied together. In reality there may be as few as three pixels in a region 63 which exceeds threshold level (TL) 61. As long as this convention is understood, the graphs showing continuous intensity data can be properly understood.
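  • A minimal sketch of this thresholding step in Python (names and the run-length test are illustrative assumptions): scan one row of single-color intensity data and keep each above-TL run as a candidate region.

```python
import numpy as np

def find_color_regions(intensity, tl, min_pixels=3):
    """Return inclusive (start, end) pixel indices of each run in one row
    of single-color intensity data whose values exceed threshold level TL.

    Runs shorter than min_pixels are dropped, since a genuine region such
    as region 63 may contain as few as three above-threshold pixels.
    """
    padded = np.concatenate(([False], np.asarray(intensity) > tl, [False]))
    edges = np.flatnonzero(np.diff(padded.astype(int)))  # rise/fall positions
    starts, ends = edges[0::2], edges[1::2] - 1          # inclusive bounds
    keep = (ends - starts + 1) >= min_pixels
    return list(zip(starts[keep], ends[keep]))
```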
  • Fig. 8, a flowchart for color crosstalk compensation, will be described with reference also to the reference designators of Fig. 6.
  • Step 81: Image data is captured of an object reflecting color-encoded structured light.
  • the color compensation algorithm is generally applied to only a portion of the image at a time, preferably to a square region covering about 10 colored lines; the size of the regions analyzed is a design variable. Thus it must be understood that this procedure may be repeated in toto for many sub-images.
  • Step 84: Areas are tentatively identified as peak areas of the subject color where the intensity of the subject color light exceeds TL 61.
  • Step 85: If areas have already been assigned to other colors, a tentative peak area located in an area previously assigned to another color will be rejected as invalid, and will skip the assignment step of Step 86.
  • Step 86: For tentative peak areas of the subject color which are not in regions previously assigned to another color, the area will be assigned to the subject color.
  • Step 87: Repeat Steps 84-86 until assignment of the subject color is complete.
  • Step 88: Repeat Steps 82-87 until each of the encoding colors has been tested.
  • Step 89: Calculate a Color Crosstalk Matrix (CCM) for the image (or sub-image).
  • Step 90: The compensation for color cross-talk can be implemented by using the inverse of the CCM, as given by

    $$\begin{bmatrix} r' \\ g' \\ b' \end{bmatrix} = \mathrm{CCM}^{-1} \begin{bmatrix} r \\ g \\ b \end{bmatrix}$$

  • r', g', b' represent the red, green, and blue colors after cross-talk compensation.
  • the matrix is readily adjusted to accommodate N colors.
  • the colors are represented as r, g and b by way of example.
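  • A sketch of Steps 89-90 in Python with numpy (the matrix entries shown are illustrative; real values come from the measured peak areas):

```python
import numpy as np

# Illustrative 3x3 Color Crosstalk Matrix: entry [i, j] is the fraction of
# the light projected in channel j that registers in detected channel i.
ccm = np.array([
    [1.00, 0.25, 0.10],   # red response to (r, g, b) lines
    [0.20, 1.00, 0.30],   # green response
    [0.15, 0.55, 1.00],   # blue response (strong green-to-blue leakage)
])
ccm_inv = np.linalg.inv(ccm)

def compensate(image_rgb):
    """Step 90: recover (r', g', b') = CCM^-1 (r, g, b) at every pixel.

    image_rgb: float array of shape (H, W, 3) holding detected intensities.
    """
    h, w, _ = image_rgb.shape
    mixed = image_rgb.reshape(-1, 3).T   # 3 x (H*W) detected channels
    unmixed = ccm_inv @ mixed            # undo the crosstalk mixing
    return np.clip(unmixed.T.reshape(h, w, 3), 0.0, None)
```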
  • Fig. 9 shows the color spectrum after the color cross-talk compensation for the same color grating and same digital camera used to obtain Fig. 7(b).
  • 3D imaging systems may employ bias adjusted centroid peak detection to determine the centers of light patterns, as shown in Fig. 10 and described below.
  • Step 1: Scanning along data rows, which are generally perpendicular to the structured image lines, find the start point 102 and end point 104 locations of the intensity profile 100 of each region assigned to a particular color. This step is preferably performed region by region, and is the same as steps 81-88 of Fig. 8.
  • steps 81-88 are presently preferred to be performed as a separate step, upon data which has already been adjusted by color compensation, as described above.
  • start 102 and end 104 points are in reality discrete pixel data; though both are above threshold level TL, one will generally be higher than the other.
  • Step 3: Refine the estimated center of the image, interpolating between pixels and calculating the refined center (RC) of each line using a bias adjusted centroid method, given by

    $$RC = \frac{\sum_{i=\mathrm{start}}^{\mathrm{end}} x_i\,(I_i - B)}{\sum_{i=\mathrm{start}}^{\mathrm{end}} (I_i - B)}$$

    where $x_i$ and $I_i$ are the pixel locations and intensities from start point 102 to end point 104, and $B$ is the bias level.
  • Step 4: Repeat steps 1-3 for each row of data in each color.
  • the average error of RC is about 0.2 pixel, about 1/2 of the error of C (without bias adjusted centroid detection). This, in turn, will double the accuracy of 3D imaging based upon RC.
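  • A minimal Python sketch of the refinement, under the assumption (consistent with the step above) that the bias B is taken from the lower of the two endpoint intensities:

```python
import numpy as np

def refined_center(x, intensity, start, end):
    """Bias adjusted centroid of one line region within a data row.

    Subtracting a bias derived from the region endpoints before taking
    the centroid keeps an asymmetric shoulder of the peak from dragging
    the computed center sideways.
    """
    xs = np.asarray(x[start:end + 1], dtype=float)
    ys = np.asarray(intensity[start:end + 1], dtype=float)
    bias = min(ys[0], ys[-1])           # endpoint-based bias (assumed form)
    w = np.clip(ys - bias, 0.0, None)   # bias-adjusted weights
    return float(np.sum(xs * w) / np.sum(w))
```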
  • Optional smoothing: The determined line-center locations may be smoothed, filtering the location of each point to reduce deviations from other nearby points of the line using any well-known filtering algorithm. Since the lines of the reference image should be smooth, heavy filtering is preferred there and should reduce noise without impairing accuracy. With regard to structuring lines reflected from an object, excessive filtering may impair accuracy. It is preferred, therefore, to determine discontinuities along structuring lines in the reflected image, and then perform a modest filtering upon the (continuous) segments of the structuring lines between such discontinuities.
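  • One plausible reading of this segment-wise filtering, sketched in Python (the 2-pixel jump test and 5-point window are assumed design values):

```python
import numpy as np

def smooth_line_centers(centers, jump=2.0, window=5):
    """Smooth detected line-center locations only within continuous
    segments: split wherever adjacent centers jump by more than `jump`
    pixels, then apply a modest moving average inside each segment.
    """
    centers = np.asarray(centers, dtype=float)
    breaks = np.flatnonzero(np.abs(np.diff(centers)) > jump) + 1
    smoothed = []
    for seg in np.split(centers, breaks):
        if len(seg) >= window:
            core = np.convolve(seg, np.ones(window) / window, mode="same")
            core[:window // 2] = seg[:window // 2]       # leave ends unfiltered
            core[-(window // 2):] = seg[-(window // 2):]
            seg = core
        smoothed.append(seg)
    return np.concatenate(smoothed)
```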
  • a line-by-line calibration procedure may be employed in imaging systems embodying the present invention to compensate for system modeling error, the aberration of both projector and camera imaging lenses, and defocusing effects due to object depth variation, as enumerated and detailed below:
  • Three-dimensional data is often deduced from comparison of (a) object-reflected structured image point locations to (b) the theoretically expected locations of structured light image points reflected from a reference plane.
  • the theoretically expected location generally is calculated upon assumptions, including that the structured light image is perfectly created as intended, perfectly projected, and accurately captured by the camera. Due to manufacturing tolerances of the structured light source and of the camera receptors, and to lens aberrations such as coma and chromatic distortion, each of these assumptions is mistaken to some degree in the real world, introducing a variety of errors which shift the actual location of the structured line from its theoretical location.
  • optical axis 121 is the z axis, and is perpendicular to reference plane 123.
  • Baseline 124 is parallel to reference plane 123, and is defined by the respective centers 126 and 125 of the principal planes of the lenses of source 16 and camera 44.
  • the x axis direction is defined parallel to baseline 124.
  • y-axis 122 is defined to be perpendicular to the plane of baseline 124 and optical axis 121.
  • D is the distance between centers 126 and 125 of structured light source 16 and digital camera 44.
  • L is the distance between baseline 124 and reference plane 123, a white surface.
  • Point 130 is seen by camera 44 as a point having an x-value, translated to reference plane 123, of x_c 128.
  • the structured light line crossing point 130 is known, from a recorded image of reference plane 123, to have an x-value x_p 127.
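  • From this geometry, the usual similar-triangles relation can be sketched (this is the standard structured-light triangulation consistent with the setup of Fig. 12, not a quotation of the patent's own equations):

```latex
% A point at height z above reference plane 123 shifts the observed
% stripe from x_p to x_c; similar triangles against baseline 124 give
\frac{z}{L - z} = \frac{x_c - x_p}{D}
\qquad\Longrightarrow\qquad
z = \frac{L\,(x_c - x_p)}{D + (x_c - x_p)}
```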
  • Fig. 11 is a flow chart of general data processing steps using line-by-line calibration and some of the other accuracy enhancements taught herein.
  • Step 112: Using a device arranged in a known configuration such as shown in Fig. 12, capture a reference image using perfect white reference plane 123, then digitize if necessary and output the digital image data to a processing system.
  • Step 113: Capture an object image using the same system configuration as in Step 112.
  • Step 114: Perform cross-talk compensation on both object and reference data.
  • Step 115: Perform bias adjusted centroid peak detection to refine the location of center peaks for lines in the object, and optionally smooth the lines along segments between determined discontinuities.
  • Step 118: Use center-weighted averaging of the height determined by each of three adjacent structured light lines at that y-location. Do not perform averaging if the adjacent structured light lines are determined to be discontinuous. Adjacent lines are considered discontinuous if the difference between the heights determined by any 2 of the three lines exceeds a threshold.
  • the threshold is a design choice, but is preferably about 2 mm in the setup shown in Fig. 12.
  • the fluctuation error of 3D profiles may be reduced by averaging the z value determined for points which are near each other.
  • the present invention prefers to perform center weighted averaging of nearby points around the image point being adjusted.
  • weighted averaging is not performed at all points.
  • the threshold is a design variable, but preferably is about 2 mm in the setup of Fig. 12.
  • This weighted technique improves accuracy 0.3/0.1 ≈ 3-fold, substantially better than the √3 ≈ 1.73-fold improvement resulting from conventional 3-point averaging of independent errors.
  • This technique is effective for all embodiments of the grating according to the present invention. However, since the errors between different colors tend to be independent, while errors between same colored lines may be partly coherent, it is believed that this weighted averaging technique will be more effective for adjacent different colored lines than for adjacent same-colored lines.
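  • A small Python sketch of this Step 118 averaging (the 2 mm threshold follows the text; the (1/4, 1/2, 1/4) center weighting is an assumed choice, as the exact weights are not reproduced here):

```python
import numpy as np

def center_weighted_height(z_left, z_center, z_right,
                           threshold_mm=2.0, weights=(0.25, 0.5, 0.25)):
    """Average the heights determined by three adjacent structured-light
    lines at one y-location, skipping averaging across discontinuities.
    """
    zs = (z_left, z_center, z_right)
    # Discontinuous if any pair of the three heights differs by > threshold.
    if max(zs) - min(zs) > threshold_mm:
        return z_center
    return float(np.dot(weights, zs))
```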
  • the principles of the present invention permit accurate three dimensional imaging and measurement of an object using a single camera in a known spatial relationship to a single structured light source, taking a single image of the object reflecting the structured illumination.
  • the light may be provided by a flash, which due to its very short duration substantially freezes motion, permitting profile imaging on moving objects.
  • One preferred embodiment utilizes a standard commercial photographic flash unit as an illumination source, the flash unit separated from but synchronized with a standard digital camera. By simply placing in front of the flash unit an optical unit including a grating as described above, and one or more focussing lenses, the flash will cause structured light to be projected upon an object.
  • a relative 3D image may be obtained by processing only the data obtained from the digital camera image. If the orientation measurements of the structured flash, camera and object are known, then absolute 3D information and measurements of the image can be obtained.
  • the invention can use film cameras, but the structured-light-illuminated image must then be digitized to complete enhancements and 3D image determination.
  • Object reconstruction in full color: Another preferred embodiment of the invention permits true-color reconstruction of an object, such as a human face. Using two separate exposures, one under structured light and one under white light, a 3D image and a two dimensional color profile are separately obtained at almost the same moment. A model is constructed from the 3D profile information, and the color profile of the original object is projected onto the model. Alignment of the color profile and model is accomplished by matching features of the object and of the image.
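  • A minimal sketch of the fusion step in Python, assuming the 3D profile and the white-light exposure have already been aligned pixel-for-pixel by the feature matching just described (array names are illustrative):

```python
import numpy as np

def color_point_cloud(depth_mm, color_rgb):
    """Fuse a structured-light depth map with the color exposure: each
    pixel becomes one (x, y, z, r, g, b) sample of the textured model.

    depth_mm:  float array (H, W) of reconstructed heights.
    color_rgb: float array (H, W, 3) from the white-light image.
    """
    h, w = depth_mm.shape
    ys, xs = np.mgrid[0:h, 0:w]
    return np.column_stack([
        xs.ravel(), ys.ravel(), depth_mm.ravel(),
        color_rgb.reshape(-1, 3),
    ])  # shape (H*W, 6)
```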
  • the two images are preferably close in time to each other.
  • the structured illumination for one image may be color encoded or not, and is directed at the object from a baseline distance away from the camera, as described in detail above.
  • the data therefrom is processed to obtain a 3D profile of the object.
  • the unstructured white light image is taken either before or after the other image, preferably capturing a color profile of substantially the same image from substantially the same perspective as the other image.
  • the unstructured light source may be simply a built-in flash of the camera, but it may also include lighting of the object from more than one direction to reduce shadows.
  • the two images are taken at nearly the same time, and are short duration exposures synchronized with flash illumination sources, so as to permit 3D color image reconstruction of even living or moving objects.
  • This can be accomplished by using two cameras of the same type (e.g. Kodak DC 260).
  • the first camera drives one flash, and the flash control signal from the first camera is also input to a circuit which, in response, sends an exposure initiate signal to the second camera after a delay.
  • the second camera in turn controls the second flash.
  • the delay between the flashes is preferably about 20-30 ms, to allow for flash and shutter duration and timing jitter, but this will be a design choice depending on the expected movement of the object and upon the characteristics of the chosen cameras.
  • a single camera having "burst" mode operation may be used.
  • In burst mode, a single camera (e.g. Fuji DS 300) takes a plurality of exposures closely spaced in time. Presently, the time spacing may be as little as 100 ms.
  • a means must be added to control the two separate flash units.
  • the preferred means is to connect the flash control signal from the camera to an electronic switching circuit which directs the flash control signal first to one flash unit, and then to the other, by any means, many of which will be apparent to those skilled in the electronic arts.
  • This embodiment presently has two advantages, due to using a single camera: setup is simpler, and the orientation of the camera is identical for the two shots. However, the period between exposures is longer, and the cost of cameras having burst mode is presently much higher than for other types, so this is not the most preferred embodiment at this time.
  • Principles of the present invention for accurately determining the 3D profile of an image include structured light grating opaque areas, color compensation, bias adjusted centroid detection, calibration, filtering and weighted averaging of determined heights. All of these principles work together to enable one to produce high accuracy 3D profiles of even a living or moving object with virtually any commercially available image-capturing device, particularly an off-the-shelf digital camera, and any compatible, separate commercially available flash (or other lighting) unit. To these off-the-shelf items, only a light structuring optical device and an image data processing program need be added to implement this embodiment of the invention. However, some aspects of the present invention may be employed while omitting others to produce still good-quality 3D images under many circumstances. Accordingly, the invention is conceived to encompass use of less than all of the disclosed principles.
  • a preferred embodiment of the present invention may be practiced and tested as follows.
  • a Kodak DC 260 digital camera is used.
  • a well-defined triangular object is imaged.
  • the object is 25 mm in height, 25 mm thick, and 125 mm long.
  • although the DC 260 camera has 1536 x 1024 pixels, the test object occupies only 600 x 570 pixels, due to limitations of the camera zoom lens.
  • the structured light source employs opaque areas between each of 3 separate colored bars in a repeating pattern, as described above.
  • Fig. 13a shows a one-dimensional scanned profile derived from the basic test setup. The worst case range error is about 1.5 mm, so the relative error is about 1.5/25 = 6%, an accuracy comparable to that reported by other groups using a single encoded color frame. The theoretical range accuracy Δz_RA based upon camera resolution can be estimated by simply differentiating Eq. (6).
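  • Eq. (6) itself is not reproduced on this page; assuming it has the triangulation form sketched earlier, z = LΔ/(D + Δ) with Δ the measured stripe shift, the differentiation proceeds as:

```latex
\delta z = \frac{\partial z}{\partial \Delta}\,\delta\Delta
         = \frac{L\,D}{(D + \Delta)^{2}}\,\delta\Delta
         \approx \frac{L}{D}\,\delta\Delta \quad (\Delta \ll D)
% A half-pixel uncertainty in stripe location thus maps to a depth
% uncertainty of roughly (L/D) times half a pixel's extent on the plane.
```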
  • Fig. 13b shows the result of processing the image data with cross-talk compensation as described above, which reduces the measured error to about 0.8 mm, for an approximately 1.9-fold improvement in range accuracy.
  • Fig. 13c shows the result of adding line-by-line reference plane calibration, as discussed above, to the color compensated data.
  • the maximum measured error is reduced to about 0.5 mm, a relative improvement of about 1.6-fold, for about 3-fold total improvement in accuracy.
  • Fig. 13d shows the calculated profile after adding bias adjusted centroid peak detection to the cross-talk compensated and line-by-line calibrated data.
  • the maximum error is reduced to about 0.25 mm, a 2-fold relative improvement and an approximately 6-fold total improvement.
  • the measured accuracy now exceeds the error estimate based on camera resolution by 2-fold.
  • Fig. 13e shows the calculated profile after weighted averaging. Averaging the data between adjacent lines will artificially improve the camera resolution, and weighted averaging as taught above improves it more than uniform averaging. Accordingly, the resolution is not limited by the 0.5 pixel basic camera resolution.
  • the maximum error is now reduced to 0.1 mm, about a 2.5-fold relative improvement, and an overall 15-fold (1.5 mm / 0.1 mm) improvement in accuracy.
  • Fig. 14a shows a human face.
  • Fig. 14b shows the face as illuminated by structured light in three colors.
  • Fig. 14c shows the reconstructed profile of the face after processing as described above.
  • a cross-section of the height profile is shown in Fig. 14d.

Abstract

The invention relates to a method and apparatus for imaging three-dimensional objects. The apparatus includes a structured light source which projects a focused image onto an object using continuous light or flashes of light from a collimated source, passed through an optical grating and a projection lens. Apertures in the grating, which may transmit a plurality of distinct colors, impose a known pattern on the projected light, in which opaque areas separating the light areas reduce crosstalk between colors so as to increase accuracy. A camera responsive to the projected image captures an image of the projected light as reflected by an object. Because of the short duration of the image capture, notably when a short-duration flash of structured light is synchronized with a fast-shutter camera, high-accuracy static three-dimensional images and measurements can be obtained even of moving or living objects, such as human beings. The captured image data are then analyzed to establish and refine the apparent locations of the points of the reflected pattern.
EP99923100A 1999-05-14 1999-05-14 Systeme d'imagerie tridimensionnelle couleur a lumiere structuree Withdrawn EP1190213A1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US1999/010756 WO2000070303A1 (fr) 1999-05-14 1999-05-14 Systeme d'imagerie tridimensionnelle couleur a lumiere structuree

Publications (1)

Publication Number Publication Date
EP1190213A1 (fr) 2002-03-27

Family

ID=22272767

Family Applications (1)

Application Number Title Priority Date Filing Date
EP99923100A Withdrawn EP1190213A1 (fr) 1999-05-14 1999-05-14 Systeme d'imagerie tridimensionnelle couleur a lumiere structuree

Country Status (6)

Country Link
EP (1) EP1190213A1 (fr)
JP (1) JP2002544510A (fr)
CN (1) CN1159566C (fr)
AU (1) AU3994799A (fr)
CA (1) CA2373284A1 (fr)
WO (1) WO2000070303A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102161291A (zh) * 2010-12-08 2011-08-24 合肥中加激光技术有限公司 三维成像水晶内雕像亭
US11308830B1 (en) 2020-12-25 2022-04-19 Acer Incorporated Display driving device and operation method thereof for improving display quality of 3D images

Families Citing this family (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6556706B1 (en) 2000-01-28 2003-04-29 Z. Jason Geng Three-dimensional surface profile imaging method and apparatus using single spectral light condition
JP2002191058A (ja) * 2000-12-20 2002-07-05 Olympus Optical Co Ltd 3次元画像取得装置および3次元画像取得方法
US7176440B2 (en) * 2001-01-19 2007-02-13 Honeywell International Inc. Method and apparatus for detecting objects using structured light patterns
RU2184933C1 (ru) * 2001-02-21 2002-07-10 Климов Андрей Владимирович Устройство для бесконтактного контроля линейных размеров трехмерных объектов
RU2185599C1 (ru) * 2001-03-19 2002-07-20 Зеляев Юрий Ирфатович Способ бесконтактного контроля линейных размеров трехмерных объектов
US7174033B2 (en) 2002-05-22 2007-02-06 A4Vision Methods and systems for detecting and recognizing an object based on 3D image data
US7257236B2 (en) 2002-05-22 2007-08-14 A4Vision Methods and systems for detecting and recognizing objects in a controlled wide area
CN100417976C (zh) * 2002-09-15 2008-09-10 深圳市泛友科技有限公司 三维摄影技术方法
DE10250954B4 (de) * 2002-10-26 2007-10-18 Carl Zeiss Verfahren und Vorrichtung zum Durchführen einer Televisite sowie Televisiteempfangsgerät
US7146036B2 (en) 2003-02-03 2006-12-05 Hewlett-Packard Development Company, L.P. Multiframe correspondence estimation
TWI257072B (en) 2003-06-20 2006-06-21 Ind Tech Res Inst 3D color information acquisition method and device thereof
CN100387065C (zh) * 2003-07-07 2008-05-07 财团法人工业技术研究院 三维彩色信息撷取方法及其装置
JP3831946B2 (ja) * 2003-09-26 2006-10-11 ソニー株式会社 撮像装置
JP2005164434A (ja) * 2003-12-03 2005-06-23 Fukuoka Institute Of Technology 非接触三次元計測方法および装置
DE102004007829B4 (de) * 2004-02-18 2007-04-05 Isra Vision Systems Ag Verfahren zur Bestimmung von zu inspizierenden Bereichen
EP1607041B1 (fr) 2004-06-17 2008-01-16 Cadent Ltd. Méthode pour fournir des données concernant la cavité intra-orale
US7646896B2 (en) 2005-08-02 2010-01-12 A4Vision Apparatus and method for performing enrollment of user biometric information
WO2006031143A1 (fr) 2004-08-12 2006-03-23 A4 Vision S.A. Dispositif de contrôle sans contact du profil de la surface d'objets
CA2615335C (fr) 2004-08-12 2014-10-28 A4 Vision S.A. Dispositif de controle biometrique de la surface d'un visage
JP5174684B2 (ja) * 2006-03-14 2013-04-03 プライムセンス リミテッド スペックル・パターンを用いた三次元検出
US8050486B2 (en) 2006-05-16 2011-11-01 The Boeing Company System and method for identifying a feature of a workpiece
US9052294B2 (en) 2006-05-31 2015-06-09 The Boeing Company Method and system for two-dimensional and three-dimensional inspection of a workpiece
US7495758B2 (en) 2006-09-06 2009-02-24 The Boeing Company Apparatus and methods for two-dimensional and three-dimensional inspection of a workpiece
JP2008170281A (ja) * 2007-01-11 2008-07-24 Nikon Corp 形状測定装置及び形状測定方法
CN102334006A (zh) * 2009-02-25 2012-01-25 立体光子国际有限公司 用于三维计量系统的强度和彩色显示
DE102010029091B4 (de) * 2009-05-21 2015-08-20 Koh Young Technology Inc. Formmessgerät und -verfahren
JP5633719B2 (ja) * 2009-09-18 2014-12-03 学校法人福岡工業大学 三次元情報計測装置および三次元情報計測方法
CN102022981B (zh) * 2009-09-22 2013-04-03 重庆工商大学 测量亚像素位移的峰谷运动探测方法及装置
CN102052900B (zh) * 2009-11-02 2013-09-25 重庆工商大学 快速测量亚像素位移的峰谷运动探测方法及装置
US20110187878A1 (en) * 2010-02-02 2011-08-04 Primesense Ltd. Synchronization of projected illumination with rolling shutter of image sensor
US8279418B2 (en) * 2010-03-17 2012-10-02 Microsoft Corporation Raster scanning for depth detection
CN101975994B (zh) * 2010-08-27 2012-03-28 中国科学院自动化研究所 多级透镜的三维成像系统
US20120062725A1 (en) * 2010-09-10 2012-03-15 Gm Global Technology Operations, Inc. System for error-proofing manual assembly operations using machine vision
TW201315962A (zh) * 2011-10-05 2013-04-16 Au Optronics Corp 投影式影像辨識裝置及其影像辨識方法
CN102628693A (zh) * 2012-04-16 2012-08-08 中国航空无线电电子研究所 一种用于摄像机主轴与激光束进行平行配准的方法
EP2852814A1 (fr) * 2012-07-25 2015-04-01 Siemens Aktiengesellschaft Codage de couleur pour mesure en trois dimensions, en particulier sur des surfaces dispersives transparentes
TR201811449T4 (tr) 2012-11-07 2018-09-21 Artec Europe S A R L Üç boyutlu nesnelerin doğrusal boyutlarını gözetlemek için yöntem.
US11509880B2 (en) 2012-11-14 2022-11-22 Qualcomm Incorporated Dynamic adjustment of light source power in structured light active depth sensing systems
EP2799810A1 (fr) * 2013-04-30 2014-11-05 Aimess Services GmbH Dispositif et procédé de mesure tridimensionnelle simultanée de surfaces avec plusieurs longueurs d'onde
TWI464367B (zh) * 2013-07-23 2014-12-11 Univ Nat Chiao Tung 主動式影像擷取之光學系統及光學方法
CN103697815B (zh) * 2014-01-15 2017-03-01 西安电子科技大学 基于相位编码的混频结构光三维信息获取方法
DE102014210672A1 (de) 2014-06-05 2015-12-17 BSH Hausgeräte GmbH Gargerät mit Lichtmusterprojektor und Kamera
CN104243843B (zh) 2014-09-30 2017-11-03 北京智谷睿拓技术服务有限公司 拍摄光照补偿方法、补偿装置及用户设备
CN112530025A (zh) 2014-12-18 2021-03-19 脸谱科技有限责任公司 用于提供虚拟现实环境的用户界面的系统、设备及方法
WO2016137351A1 (fr) * 2015-02-25 2016-09-01 Андрей Владимирович КЛИМОВ Procédé et dispositif d'enregistrement 3d et de reconnaissance du visage d'une personne
CN104809940B (zh) * 2015-05-14 2018-01-26 广东小天才科技有限公司 几何立体图形投影装置及投影方法
CN105157613A (zh) * 2015-06-03 2015-12-16 五邑大学 利用彩色结构光的三维快速测量方法
CN105021138B (zh) * 2015-07-15 2017-11-07 沈阳派特模式识别技术有限公司 三维扫描显微镜及条纹投影三维扫描方法
CN106403838A (zh) * 2015-07-31 2017-02-15 北京航天计量测试技术研究所 一种手持式线结构光视觉三维扫描仪的现场标定方法
TWI550253B (zh) * 2015-08-28 2016-09-21 國立中正大學 三維影像掃描裝置及其掃描方法
CN105300319B (zh) * 2015-11-20 2017-11-07 华南理工大学 一种基于彩色光栅的快速三维立体重建方法
US10884127B2 (en) * 2016-08-02 2021-01-05 Samsung Electronics Co., Ltd. System and method for stereo triangulation
CN108693538A (zh) * 2017-04-07 2018-10-23 北京雷动云合智能技术有限公司 基于双目结构光的准确置信度深度摄像机测距装置及方法
CN108732066A (zh) * 2017-04-24 2018-11-02 河北工业大学 一种接触角测量系统
KR101931773B1 (ko) 2017-07-18 2018-12-21 한양대학교 산학협력단 형상 모델링 방법, 이를 이용하는 장치 및 시스템
CN109348607B (zh) * 2018-10-16 2020-02-21 华为技术有限公司 一种发光模组支架及终端设备
CN109855559B (zh) * 2018-12-27 2020-08-04 成都市众智三维科技有限公司 一种全空间标定系统及方法
CN113654487B (zh) * 2021-08-17 2023-07-18 西安交通大学 一种单幅彩色条纹图动态三维测量方法及系统

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4349277A (en) * 1980-06-11 1982-09-14 General Electric Company Non-contact measurement of surface profile
DE3170315D1 (en) * 1981-10-09 1985-06-05 Ibm Deutschland Interpolating light section process
US5640962A (en) * 1993-01-21 1997-06-24 Technomed Gesellschaft fur Med. und Med. Techn. Systeme mbH Process and device for determining the topography of a reflecting surface
US5615003A (en) * 1994-11-29 1997-03-25 Hermary; Alexander T. Electromagnetic profile scanner

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO0070303A1 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102161291A (zh) * 2010-12-08 2011-08-24 合肥中加激光技术有限公司 三维成像水晶内雕像亭
CN102161291B (zh) * 2010-12-08 2013-03-27 合肥中加激光技术有限公司 三维成像水晶内雕像亭
US11308830B1 (en) 2020-12-25 2022-04-19 Acer Incorporated Display driving device and operation method thereof for improving display quality of 3D images

Also Published As

Publication number Publication date
CN1159566C (zh) 2004-07-28
AU3994799A (en) 2000-12-05
WO2000070303A1 (fr) 2000-11-23
CN1350633A (zh) 2002-05-22
JP2002544510A (ja) 2002-12-24
CA2373284A1 (fr) 2000-11-23

Similar Documents

Publication Publication Date Title
EP1190213A1 (fr) Systeme d'imagerie tridimensionnelle couleur a lumiere structuree
TW385360B (en) 3D imaging system
US6341016B1 (en) Method and apparatus for measuring three-dimensional shape of object
Tajima et al. 3-D data acquisition by rainbow range finder
US6549288B1 (en) Structured-light, triangulation-based three-dimensional digitizer
JP3884321B2 (ja) 3次元情報取得装置、3次元情報取得における投影パターン、及び、3次元情報取得方法
JP3519698B2 (ja) 3次元形状測定方法
JP4040825B2 (ja) 画像撮像装置及び距離測定方法
CA2369710C (fr) Methode et appareil pour le balayage 3d a haute resolution d'objets comprenant des vides
DE69734504T2 (de) Verfahren und vorrichtung zum messen der streifenphase in der abbildung eines objekts
KR20100017234A (ko) 편광 감지 센서와 연결된 편광 코드화 개구 마스크를 사용하는 단일-렌즈, 3-d 영상화 장치
WO1997035439A1 (fr) Systeme et procede permettant de numeriser rapidement une forme et de generer un reseau maille adaptatif
US6765606B1 (en) Three dimension imaging by dual wavelength triangulation
CA2017518A1 (fr) Imagerie a determination du spectre de couleurs
US11212508B2 (en) Imaging unit and system for obtaining a three-dimensional image
JP2714152B2 (ja) 物体形状測定方法
JP4516590B2 (ja) 画像撮像装置及び距離測定方法
CN109242895B (zh) 一种基于多相机系统实时三维测量的自适应深度约束方法
JP2001304821A (ja) 画像撮像装置及び距離測定方法
Sato Range imaging based on moving pattern light and spatio-temporal matched filter
JP3912666B2 (ja) 光学的形状測定装置
JP4204746B2 (ja) 情報獲得方法、撮像装置及び、画像処理装置
JP4141627B2 (ja) 情報獲得方法、画像撮像装置及び、画像処理装置
JPH0723684Y2 (ja) レンジファインダ
JP2006058092A (ja) 3次元形状測定装置および方法

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20020103

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20030424