CA1253251A - Three-dimensional range camera - Google Patents

Three-dimensional range camera

Info

Publication number
CA1253251A
CA1253251A CA000511597A CA511597A
Authority
CA
Canada
Prior art keywords
light
rays
range
light rays
coded
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired
Application number
CA000511597A
Other languages
French (fr)
Inventor
Nelson R. Corby, Jr.
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
General Electric Co
Original Assignee
General Electric Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by General Electric Co filed Critical General Electric Co
Priority to CA000511597A priority Critical patent/CA1253251A/en
Application granted granted Critical
Publication of CA1253251A publication Critical patent/CA1253251A/en
Expired legal-status Critical Current

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

THREE-DIMENSIONAL RANGE CAMERA
Abstract of the Disclosure A non-contact sensor system measures distance from a reference plane to many remote points on the surface of an object. The set of points at which range is measured lie along a straight line (N points) or are distributed over a rectangular plane (M x N points).
The system is comprised of a pattern generator to produce a 1 x N array of time/space coded light rays, optionally a means such as a rotating mirror to sweep the coded light rays orthogonally by steps, a linear array camera to image subsets of the light rays incident on the object surface, and a high speed range processor to determine depth by analyzing one-dimensional scan signals.
The range camera output is a one-dimensional profile or a two-dimensional area range map, typically for inspection and robotic vision applications.

Description


THREE-DIMENSIONAL RANGE CAMERA
This invention relates to an optical system to measure range to an object along a line and over a plane, and more particularly to a non-contact sensor or camera system to generate one-dimensional profiles and two-dimensional range maps.
Range from a reference point or plane to a target point can be measured in many ways. Electromagnetic energy yields RADAR type images, acoustic energy yields SONAR images, and light yields LIDAR images. The range measurement technique in all these approaches uses time-of-flight measurement as a basis for determining range.
In machine vision research and application, an optical effect that has been exploited is structured illumination. In this technique, a ray of light is caused to strike a surface along a direction which is not coaxial with the optical axis of a one or two-dimensional imaging device. The intersection of the ray and surface produces a point of light which is imaged on the sensor plane of the imaging device.
The 3-D position of this point in space may be calculated in many ways. If multiple views, using multiple cameras or one moving camera, can be obtained from sensors whose position in space is known, the point's position can be calculated by triangulation from basic trigonometric relations. One sensor may be used if the angle of the light ray is known. Another way that one sensor may be used is if a zero reference plane has been established beforehand. In the latter case, one measures and stores the position in the sensor plane of the image of the 3-D point of light as it strikes the reference plane. Then an unknown surface is introduced and the impinging ray intersects the surface creating a new 3-D point of light whose position is to be measured.
The displacement of the image of the new point of light, as measured in the image plane, from the stored or reference position is proportional to the distance of the surface above the reference plane.
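By way of illustration, the following short Python sketch converts an image-plane displacement into a height above the reference plane. It assumes the simplest possible geometry (camera viewing along the reference-plane normal, ray inclined at a known angle); the function name, parameter names, and numbers are illustrative only and are not taken from the patent.

    import math

    # Minimal sketch of the single-ray principle, under an assumed simple
    # triangulation geometry: the camera looks straight down at the reference
    # plane and the projected ray is inclined at ray_angle_deg to that view.
    def height_from_shift(x_reference: float, x_measured: float,
                          mm_per_pixel: float, ray_angle_deg: float) -> float:
        """Convert the image-plane shift of one ray's spot into a height.

        x_reference   -- stored spot position when the ray hits the reference plane
        x_measured    -- spot position with the unknown surface in place
        mm_per_pixel  -- lateral scale of the sensor at the reference plane
        ray_angle_deg -- angle between the projected ray and the viewing axis
        """
        shift_pixels = x_reference - x_measured          # displacement in the sensor plane
        lateral_shift_mm = shift_pixels * mm_per_pixel   # displacement on the reference plane
        # The spot slides sideways by h * tan(angle) when the surface rises by h.
        return lateral_shift_mm / math.tan(math.radians(ray_angle_deg))

    # Example: a 12-pixel shift at 0.1 mm/pixel with a 45 degree ray -> 1.2 mm height.
    print(height_from_shift(512.0, 500.0, mm_per_pixel=0.1, ray_angle_deg=45.0))

In any real setup the proportionality constant would come from calibration against the stored reference-plane positions rather than from an assumed angle.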
In all of these techniques, one measures the position of the point of light in the sensor plane. In an effort to speed up the process, more than one ray of light can be used in a given sensor cycle. Experimenters have used sheets of light producing a line of intersection on the object's surface. This intersection curve when imaged onto a two-dimensional sensor plane can be processed to yield a profile under several assumptions about field of view, ray divergences, etc. In addition, one is limited to the frame rate of the sensor - typically 30 frames per second. At one profile per frame, this works out to one 256 x 256 frame every 8.53 seconds. In addition, arrays of sheets (producing alternating light and dark bands) and two-dimensional arrays of light rays (producing grids of points) have been tried. All these methods can suffer from aliasing. Aliasing occurs when the shifted image of a point produced by one ray is confused with that of another ray. In this case the actual range is ambiguous.
The first problem to be solved is how to tag or label individual rays such that when multiple rays are introduced simultaneously into the scene, the intersection points each produces may be unambiguously identified.
The second problem is how to construct a device capable of implementing the tagging scheme. The third problem is to develop a range signal processor which is able to rapidly process the image of the tagged, shifted points into an array of range values.
Rays of light may be tagged based on static physical properties such as brightness, hue, and phase, or they may be labelled by time modulation of the above characteristics. Static property labelling based on brightness or hue can exhibit aliasing problems, since reflectivity variations on the surface to be sensed can cause a ray with one brightness or hue to be imaged with a brightness or hue corresponding to that of another ray. Phase labelling is a good candidate, but requires a coherent source such as a laser. Such systems are used in transits for surveying. They are complicated and tend to take a long time (2 - 5 seconds) per point for each measurement.
Time modulation could be employed by using time modulations (simple chopping, for example) at N different frequencies. One would have to process the image of the N points at a rate 2 to 4 times faster than the highest frequency of variation but would have to accumulate data for a time interval 2 to 4 times longer than the period of the lowest frequency. Unless N parallel systems were employed, each tuned to a particular frequency, the time per point could be unacceptably high.
The published technical papers, such as M.D. and B.R. Altschuler and J. Taboada, Optical Engineering, November/December 1981, show that the idea of time/space coding has been employed a number of times by researchers.
These authors used a two-dimensional N x N array of laser rays produced by complex optical apparatus such as shear plates and electro-optic shutters. The device assumes "on" and "off" coding of the laser beams.
It has been shown that for N rays, it is possible to uniquely identify all rays by processing P = 1 + log2N sequential presentations of rays. A 2-D raster scan camera is utilized to image each of the individual sequential scenes comprising the set of scenes to be analyzed per complete range image. There is the problem of processing this image rapidly (1/4 to 1/8 second) so as to find the centroid of all visible dots. This is difficult to do rapidly on a large raster scanned image; it presents a complex 2-D centroid finding problem.
Summary of the Invention
An object of the invention is to provide an improved non-contact sensor/camera system that is more suitable for industrial vision and measurement tasks and outputs a one-dimensional array of distances for profiling and a matrix of distances for area ranging.
Another object is the provision of such a range camera system having a means of producing time/space coded light rays inexpensively, rapidly, and flexibly, using two or more brightness levels.
Yet another object is the provision of image analysis along a line rather than over an area, and creation of area range arrays by sweeping the line profile in a direction orthogonal to the profile.
The non-contact sensor system to measure range to a three-dimensional object from a reference plane has the following components. A pattern generator and projector produces a 1 x N array of time/space coded light rays whose intensities can be varied with time, and projects P sequential presentations of different subsets of the light rays onto the object, where P=
1 + logbN and b is the number of brightness levels and N is the number of rays. A linear array sensor such as a line scan camera images the points of light where rays are incident on the object surface, and

generates one-dimensional scan signals which have ray peaks at locations corresponding to imaged light.
A high speed range processor analyzes the one-dimensional waveforms to uniquely identify all rays, determine depth from the displacement of the ray peaks from zero height reference plane peaks, and provide output range data. To create two-dimensional range maps, a means such as a rotating mirror is provided for orthogonally sweeping the 1 x N coded light rays by steps over a rectangular plane.
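The following small Python sketch only illustrates the presentation count P = 1 + logbN quoted above; the helper function is ours, written for exact powers of b, and only the formula itself comes from the text.

    # Hedged illustration of P = 1 + log_b(N): the number of sequential
    # presentations needed to label N rays when b brightness levels are used.
    def presentations_needed(num_rays: int, brightness_levels: int = 2) -> int:
        """Smallest P = 1 + p such that brightness_levels**p >= num_rays."""
        p, capacity = 0, 1
        while capacity < num_rays:
            capacity *= brightness_levels
            p += 1
        return 1 + p

    print(presentations_needed(8))        # 4, matching Figs. 4 and 5a-5d
    print(presentations_needed(128))      # 8, matching the code wheel described later
    print(presentations_needed(256, 4))   # 5 with four brightness levels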
The pattern generator includes a sheet light source and a rotating code wheel similar to an optical encoder disk. It has one or more group patterns each comprised of one pattern per presentation of discrete regions whose light transmission coefficients are selected to spatially modulate the light sheet into coded light rays. In a binary system (b = 2) the regions are clear and opaque and the rays are "on" or "off". A lens system projects the coded rays onto the object. A
means of delivering a remotely generated pattern of light rays to the sensing site utilizes a reduction lens to project the coded rays onto the entrance of a coherent optical fiber bundle and a lens system to reimage the ray pattern on the object surface.
The pattern generator and projector in another embodiment of a range camera for profiling has a narrow laser beam source, a modulator to vary beam intensity, and a beam deflector to scan the modulated beam along a straight line and project it onto the object surface.
This range camera system is high speed and capable of generating one or more range maps per second. Since image analysis involves processing one-dimensional signals, the system is self calibrating and immune to surface reflectivity variations to a large extent.


It is useful in inspection and measurement of three-dimensional surfaces and in robotic vision, for object recognition and position determination.
Brief Description of the Drawings
Fig. 1 is a schematic top view of a first embodiment of the range camera system for profiling applications and shows in dashed lines a rotating mirror to implement area ranging.
Fig. 2 shows in greater detail a top view of the pattern generator and projector and the orthogonal sweep mirror.
Fig. 3 shows an alternative to the projection scheme in Fig. 2 and has a coherent fiber optic bundle for remote projection of the coded rays.
Fig. 4 has a table of ray states at four sequential presentations to explain the principle of time/space coding, letting the number of light rays N = 8.
Figs. 5a-5d illustrate the pattern of light rays incident on the object at the four presentations.
Figs. 6-8 depict the rotating code wheel, one of the 32 sectors at an enlarged scale, and a typical group pattern in the sector, where N = 128.
Fig. 9 illustrates an alternative pattern generator and projector and another method to produce coded light rays.
Fig. 10 is a sketch of light rays incident on a reference surface and on an object, to facilitate understanding of the next figures.
Fig. 11 shows idealized output voltage waveforms of the line scan camera, lines a-c for three presentations of time/space coded rays on the reference surface and lines d-f for presentations of the coded rays on the object surface.


Fig. 12 shows actual camera scan signals and modulation of the "envelope" due to surface reflectivity variations.
Detailed Description of the Invention
The components of the preferred embodiment of a three-dimensional range camera system are seen in Fig. 1. A pattern generator and projector 10 generates at high speed and low cost a 1 x N array 11 of time and space coded light rays 12 which are illustrated projected onto a reference surface 13. The object 14 to be profiled, shown in dashed lines, is placed on this surface and depth or height is measured from this reference plane (z=0) to the surface of the object.
The total number of light rays N is typically large, N = 128 or 256, but for purposes of explanation, N
= 4 and 8 in some of the figures. The dots of light where the projected rays 12 strike reference surface 13 (assuming the object is not there) are indicated as points a-h. A linear array sensor such as line scan camera 15 is oriented in the plane of the rays and views the points of light at an angle to the projected rays 12. A lens 16 on the camera focuses the imaged points a'-h' on the light sensitive elements. The line scan camera may be a Reticon Corp. 1 x 2048 pixel camera which can scan at rates up to 2000 scans per second. Care must be taken to use a high resolution lens with sufficient depth of field.
Linear scan signals sequentially read out of line scan camera 15 are sent to a range profile processor 17, which has a special purpose range calculation computer.
Its output is a 1 x N array of distances for a profiling application, i.e., the height or depth of the object at points on its surface relative to the reference plane, above reference points a-h. The system optionally has an orthogonal sweep means 18, such as a rotating

mirror, to orthogonally sweep the coded light rays 12 over a rectangular plane. Area range maps are produced by optically or mechanically stepping the profiler field of view in a direction orthogonal to the profile direction. Range processor 17 receives 1 x N arrays of luminance data read out of line scan camera 15, coordinates timing of the camera 15 and pattern generator 10, and has a control to step the rotating mirror 18.
The function of pattern generator 10 is to produce a one-dimensional array of rays of light each of whose intensities (and/or hues, if desired) can be varied with time. These rays strike the surface to be profiled and are imaged by the line scan camera 15. Referring to Fig. 2, a light source 19 emits a collimated sheet of light 20. This can be an incandescent point or line source, a laser, or other appropriate light source.
The dimensions of a pattern generator using a laser source that was built and operated are given, but the invention is not limited to these details. Collimated light sheet 20 is focussed, if necessary, by a cylindrical lens 21 into a thin line on the order of 0.010-0.020 inches thick. A field stop 22 in the form of a rectangular aperture mask approximately 1 inch by 0.008 inches is placed at the focus of lens 21. This will create a rectangular light source of high intensity. The light sheet apertured by the mask impinges on the face of a rotating code wheel 23 which is mounted on a shaft 24 and turned by steps. The rotating code wheel 23 is similar to an optical encoder disk, in this case a glass disk upon whose surface has been photodeposited a pattern of sectors. Each of the sectors has a pattern comprised of clear and opaque regions to spatially modulate the light sheet 20 into time/space coded light rays 25. The 1 x N sheet of coded rays 25 is projected by a projection lens system 26 onto an angularly positionable mirror 27. The mirror is omitted to profile along a single line. From the mirror surface, the N rays are projected through space and strike the surface of the object 14 to be scanned. An area of N by M
samples can be scanned by stepping the mirror 27 through M small angular steps. Fig. 2 is a top view of the range camera system: the light sheet 20 is horizontal, rotating code wheel 23 is vertical, and the axis of rotating mirror 27 is horizontal. In an industrial setup, the projected rays 28 are bent through 90°, more or less, onto the object which is placed on a horizontal reference surface 13. The object is stationary as it is scanned. An alternative, used in laboratory experiments, is that the entire range camera system is stationary and the object 14 is mounted on a vertically moving robot arm.
A unique means of projecting the time/space coded light ray pattern employs a coherent fiber bundle as shown in Fig. 3. A reduction lens 29 in this case projects the demagnified image 30 of the 1 x N ray array onto the entrance surface of a coherent optical fiber bundle 31. At the exit end the pattern is reproduced exactly, and is then projected with a lens system 32 onto the surface in question where it is imaged by the line array sensor. The projected light pattern 30' is illustrated incident on the surface of a 3-D
object 14. A coherent optical fiber bundle 33 can also be used to convey the image of the surface to be profiled or area mapped to the line array camera. The embodiment in Fig. 3 allows the exit end of coherent fiber bundle 31 and projection lens 32 to be on the robot arm with the image pickup coherent bundle 33 and imaging lens 34; the remainder of the range camera system, including line scan camera 15 and lens 35 at the exit end of bundle 33, is remotely located.
The principle of time/space coding of light rays is now explained. This invention uses the idea of presenting N rays simultaneously in each of P presentations.
If there are more than two brightness levels, it can be shown that for N rays, it is possible to uniquely identify all rays by processing P = 1 + logbN sequential presentations of rays. In a binary system, b = 2, and the number of presentations P = 1 + log2N. In each presentation for the binary case, different subsets of the N rays are "on" with the balance of the rays "off".
Figs. 4 and 5a-5d illustrate the principle for N = 8. In this case, P = 1 + log28 = 4. There are four sequential presentations of the coded light rays, all at the same location of the object to be profiled.
In the first presentation, all eight rays are "on".
In the second presentation, the rays are alternately "on" and "off". In the third, pairs of rays are alternately "on" and "off", and in the fourth, the first four rays are "on" and the second four rays are "off". These four presentations are depicted in Figs. 5a-5d; the incident light rays appear as dots of light on the object surface. All four figures show the identical object surface, but at different points of time. Having received the four images, the light rays are uniquely identified as follows. If "on" = 0 and "off" = 1, at x = x4, the pattern in binary notation (reading from bottom to top, patterns 5d, 5c and 5b) is 011 = 3. The ray is then 1 + 3 = no. 4. At x = x6, the pattern is 101 = 5. The ray is 1 + 5 = no. 6. The above uses a straight binary coding of rays. Other coding, such as a Gray coding, may be more useful.
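The decoding just worked through is simple enough to sketch in a few lines of Python. The data layout and function name are ours; only the coding convention ("on" = 0, "off" = 1, read from the coarsest pattern to the finest) comes from the text.

    # Binary time/space decoding of one spot position, as in Fig. 4:
    # bits are 0 for "on", 1 for "off", ordered like patterns 5d, 5c, 5b.
    def ray_number(bits_coarse_to_fine: list[int]) -> int:
        value = 0
        for bit in bits_coarse_to_fine:
            value = (value << 1) | bit    # build the binary pattern
        return 1 + value                  # rays are numbered starting at 1

    print(ray_number([0, 1, 1]))          # x = x4 in Fig. 4 -> ray 4
    print(ray_number([1, 0, 1]))          # x = x6 in Fig. 4 -> ray 6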


The rotating code wheel 23 and its construction are illustrated in Figs. 6-8. Here the number of light rays N = 128. The rotating code wheel 23 for such a 128 point profiler comprises a glass disk 36 onto whose surface has been photodeposited a pattern of sectors. This disk has 32 groups of 8 patterns for a total of 256 sectors. The 32 sector-shaped group patterns 37 are seen in Fig. 6. Fig. 7 shows that every group pattern 37 has 8 sector-shaped patterns 38, one pattern 38 for each of the eight presentations of the light rays. The sector pattern types (see Fig.
8) that are presented as the disk revolves are 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4 . . . .
Each sector has 256 circular tracks, of which 128 are opaque (guard tracks) and 128 are information tracks. For the case of two light levels (on and off) the information track is either clear or opaque. By using partially transmitting coatings on a given track, more than two levels of intensity coding can be provided.
That is, the light transmission coefficients are selected to take any value between 0 and 1. This disk can be produced by the same technology as is used to make rotating optical encoder disks. Also dielectric films and coatings can be deposited on desired sections of the disk, allowing for hue and phase modulation as well as intensity modulation.
A typical group pattern 37 is drawn enlarged in Fig. 8; this figure doesn't show the 128 guard tracks, only the information tracks. The group pattern is composed of 8 sub-patterned sectors. There is one sub-pattern per presentation having clear and opaque regions, indicated at 39 and 40, to divide the continuous light sheet into coded light rays. Pattern 1 has 128 of the clear regions 39, on alternate tracks, and results in the full complement of 128 light rays. Pattern 2 has alternating clear and opaque regions to pass and block light rays. Patterns 3-8 respectively have 2, 4, 8, 16, 32, and 64 rays "on" followed by a like number "off". The sector/track pattern on rotating code wheel 23 acts as 128 parallel "shutters" that spatially modulate the collimated light sheet into 128 coded rays. Two additional tracks 41 and 42 around the periphery of the disk (Figs. 6 and 7) identify the current pattern being illuminated as well as provide synchronizing information to the scan electronics.
Index track 41 has a clear region 43 at the pattern 1 position of each group, and timing track 42 has eight clear regions 44, one for each of the pattern 1-8 positions in each group. The disk 36 just described has 256 tracks and is 4 inches in diameter. Current plans provide for production of a disk 36 with 512 tracks, 256 guard and 256 information bearing, for a 256 point profiler. It is contemplated that the disk can be made smaller, with a 1 inch diameter, if necessary.
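A hedged sketch of the on/off ray subsets encoded by one group of sector patterns is given below (pattern 1: all rays on; pattern k >= 2: runs of 2**(k-2) rays on, then the same number off). Track geometry, guard tracks, and the index and timing tracks are not modelled; the function name is ours.

    # Generate the per-presentation ray subsets of one group pattern.
    # 1 = clear / ray "on", 0 = opaque / ray "off" (this is the transmission
    # pattern, not the decoding convention "on" = 0 used with Fig. 4).
    def group_patterns(n_rays: int = 128) -> list[list[int]]:
        presentations = 1 + (n_rays - 1).bit_length()   # 1 + log2(N) for N a power of two
        patterns = [[1] * n_rays]                       # pattern 1: every ray "on"
        run = 1
        for _ in range(presentations - 1):
            row = [1 if (i // run) % 2 == 0 else 0 for i in range(n_rays)]
            patterns.append(row)                        # alternating runs of "on"/"off"
            run *= 2
        return patterns

    for row in group_patterns(8):                       # small case matching Figs. 5a-5d
        print(row)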
An alternate method to produce the coded rays, replacing the light sheet source, rotating code wheel, and projection lens, is shown in Fig. 9. Here a laser source 45 emits a narrow collimated beam 46 whose intensity is varied by an acousto-optic or electro-optic modulator 47. The modulated beam 48 then is scanned, for instance by a rotating polygonal mirror, galvanometer driven mirrors, or acousto-optic beam deflector 49. The beam deflection is one-dimensional. The scanned modulated beam 50 is illustrated incident on the reference surface 13. The narrow laser beam is modulated into an on-off pattern or may have several brightness levels as previously discussed. The modulation pattern is supplied by the range profile processor 17: this table can be precalculated or stored in a ROM (read-only memory).
The range profile processor also controls beam deflector 49, and the 1 x N array of luminance data is fed to it by the line scan camera 15 as before. The patterns produced are processed in a similar way to the code wheel produced patterns. This embodiment does not have a separate projection lens because the low divergence of the scanned laser beam makes this unnecessary.
The angular width or extent of the pattern is governed by the beam deflector drive. The resulting pattern projection exhibits near infinite depth of focus of the pattern.
The range processor 17 and processing algorithms are now discussed in detail. To review how the light rays are uniquely identified and how the depth calculations are made, Fig. 10 shows a simple object 14 on the reference surface 13, four light rays incident on the reference plane, and the visible points 51, 52 and 53 where rays 1, 3 and 4 strike the object surface. Two sets of linear scan signals outputted by line scan camera 15 are illustrated, the full-line reference plane set and the dashed-line set with the object in place.
These are idealized one-dimensional waveforms, and have one or more ray peaks 54 at X axis locations corresponding to imaged points of light. Here N = 4 and there are three sequential presentations of the time/space coded light rays. Line a is the received waveform with rays 1-4 "on", line b with rays 1 and 3 "on", and line c with rays 1 and 2 "on". The ray peaks in all three presentations remain in the same place. Let "on" = 0, "off" = 1. At X = X1, reading from bottom to top, the ray no. is 1 + 00 = 1; at X = X2, the ray no. = 1 + 01 = 2; at X = X3, the ray no. = 1 + 10 = 3; at X = X4, the ray no. is 1 + 11 = 4. Thus, all the reference plane light rays are determined unambiguously.
When profiling the object 14, as seen from the viewing direction, ray 1 hits reference surface 13 at point 51, ray 2 is not visible, and rays 3 and 4 strike the object at points 52 and 53. The last two rays are shifted to the left by an amount proportional to the object height relative to the reference points where those rays are incident on the reference surface 13.
Referring to lines d-f, the ray peaks 54 are at the same X axis locations in all three waveforms.
The ray numbers are determined as follows. At X = X1', the ray no. = 1 + 00 = 1; at X = X3', it is 1 + 10 = 3; at X = X4', it is 1 + 11 = 4. There is no ray no. 2.
Depth or height calculation at the ith point involves determining the displacement (measured along the X axis) of the ith peak from a calibration, zero height reference plane peak. The shift of ray 1 = X1 - X1' = 0; ray 2 is unknown; ray 3 = X3 - X3' = ΔX3; ray 4 = X4 - X4' = ΔX4.
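The shift bookkeeping just described can be sketched in Python as follows. The dictionary layout, names, and pixel positions are illustrative only; missing rays (here ray 2) are simply left as None.

    # Reference peak positions X_i versus measured positions X_i'.
    reference_peaks = {1: 100.0, 2: 180.0, 3: 260.0, 4: 340.0}   # calibration table
    object_peaks    = {1: 100.0, 3: 241.0, 4: 315.0}             # ray 2 not visible

    def peak_shifts(reference: dict[int, float],
                    measured: dict[int, float]) -> dict[int, float | None]:
        """Displacement of each identified ray peak from its reference position."""
        return {ray: (reference[ray] - measured[ray]) if ray in measured else None
                for ray in reference}

    print(peak_shifts(reference_peaks, object_peaks))
    # -> {1: 0.0, 2: None, 3: 19.0, 4: 25.0}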
If the optical and electronic systems of the pattern projector 10 and the line scan camera 15 were perfect and the surface being scanned had constant reflectance, the line scan camera output would look like Fig. 11.
Due to array noise and offsets, reflectance variations, and optical limitations and aberrations, the actual scan signal is closer to Fig. 12 (lines a'-c' correspond to lines a-c in Fig. 11). The ray peaks are designated by 54'. Envelopes 55, 56, and 57 are identical and represent variations in the reflectance, etc., which modulate the impressed pattern of rays as a function of position across the part surface. Since the scan signals read out of the line scan camera are one-dimensional waveforms, many of the known signal processing techniques such as matched filtering and correlation detection for analyzing these waveforms may be employed by the range processor 17.
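One possible one-dimensional peak finder of the kind such a processor could apply to a scan line is sketched below. It uses a simple smoothed local-maximum test; matched filtering or correlation detection, as mentioned above, would be alternatives. The kernel, threshold, and function name are our assumptions, not taken from the patent.

    import numpy as np

    def find_ray_peaks(scan: np.ndarray, threshold: float = 0.2) -> list[int]:
        """Return indices of local maxima in one camera scan line.

        The scan is smoothed with a small triangular kernel, normalized to its
        own maximum (to tolerate overall brightness changes), then searched for
        samples exceeding 'threshold' and both neighbours.
        """
        kernel = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
        kernel /= kernel.sum()
        smooth = np.convolve(scan, kernel, mode="same")
        smooth = smooth / (smooth.max() + 1e-12)
        return [i for i in range(1, len(smooth) - 1)
                if smooth[i] > threshold
                and smooth[i] >= smooth[i - 1] and smooth[i] > smooth[i + 1]]

    # Tiny synthetic example: two bright dots on a dark background.
    line = np.zeros(64)
    line[20] = 1.0
    line[45] = 0.8
    print(find_ray_peaks(line))     # -> [20, 45]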
The actual system operation proceeds as detailed below.
The following assumes that first order correction and linearization of the line scan camera have been provided for.
1. Project patterns onto zero reference surface 13.
2. Find location of peaks 54 (N peaks).
3. Store peak positions in reference table (T0); place object below profiler.
4. Project pattern 1 (all dots on).
5. Run peak detector along pattern - note location and strength of each peak in table (T1); there will be L peaks, L ≤ N.
6. Project pattern 2 (dot 1 on, dot 2 off, etc.).
7. Run peak detector along pattern - note location and strength of each potential peak 54 in table (T2).
8. Do remaining patterns and fill tables T3 thru Tp.
9. Correlate pattern 2 against pattern 1 using any of a number of statistical tests. NOTE:
The outcome of the test at each tentative peak location is a "yes" or "no" as to the presence of a peak 54 at a given location in a subpattern.
Each of the subpatterns 2 - P will exhibit the same response envelope as pattern 1. The only difference is that as rays are gated on or off, the corresponding peaks will appear and disappear. Pattern 1 was the only one with all possible visible rays enabled. Now label each of the L entries in table T1 with its identity or ray number.
10. After identification and location of L peaks (L less than or equal to N) from examination of patterns 1 through P, compute the distance (difference) between corresponding entries of table T0 and table T1. There will be L depth values. Due to the shifts of the ray peaks 54 induced by the surface being measured, some of the N rays will not be visible (missing), while in general the spacing will be a function of the surface height gradient.
For a profiling application, the range processor 17 analyzes a single set of camera linear scan signals, one signal per presentation. In an area range mapping situation, each of the M profiles produces a set of P one-dimensional scan signals. The range processor analyzes the M sets of scan signals set by set. It is recognized that some things have to be done only once, such as generating and analyzing calibration, reference plane scan signals. Since the ray peaks are aligned in any given set (such as lines d-f in Fig. 11), the X axis displacement of ray peaks relative to the calibration, reference plane peaks has to be computed only once, say for the signal on line d.
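A compact, hedged sketch of steps 1-10 for one profile follows: given the calibration peak table T0 and the peak tables T1..Tp detected from the P presentations, label each visible peak with its ray number and compute its shift. The statistical presence test is simplified here to a nearest-peak tolerance check, and all names and numbers are ours.

    def label_and_range(t0: list[float], tables: list[list[float]],
                        tol: float = 2.0) -> dict[int, float]:
        """t0:     reference-plane peak positions, one per ray (index = ray - 1).
        tables: tables[0] is T1 (all rays on); tables[k] for k >= 1 hold the
                peak positions seen in pattern k + 1.
        Returns {ray number: shift along the scan axis} for the visible rays."""
        results = {}
        for x in tables[0]:                                   # each peak seen in pattern 1
            bits = 0
            for tk in reversed(tables[1:]):                   # coarsest pattern first (MSB)
                present = any(abs(x - xk) <= tol for xk in tk)
                bits = (bits << 1) | (0 if present else 1)    # "on" = 0, "off" = 1
            ray = 1 + bits
            if 1 <= ray <= len(t0):
                results[ray] = t0[ray - 1] - x                # displacement versus T0
        return results

    # Toy run with N = 4, P = 3 (compare Figs. 10 and 11): ray 2 hidden,
    # rays 3 and 4 shifted left by 19 and 25 units.
    t0 = [100.0, 180.0, 260.0, 340.0]
    t1 = [100.0, 241.0, 315.0]            # all rays on: rays 1, 3, 4 visible
    t2 = [100.0, 241.0]                   # rays 1 and 3 on
    t3 = [100.0]                          # rays 1 and 2 on -> only ray 1 visible
    print(label_and_range(t0, [t1, t2, t3]))   # {1: 0.0, 3: 19.0, 4: 25.0}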
The non-contact sensing system has the benefit of being largely invariant to ambient conditions, since all P scans take place at the same physical location, at nearly the same time. Any variation in ambient illumination, etc., is adapted to and compensated for every P scans. For the system that was built and previously referred to, the rotating code wheel 23 presents 1000 patterns per second. Since there are 8 patterns per profile, this corresponds to 125 profiles per second.
Thus, changes in the environment occurring slower than 1/125th of a second are compensated for. Changes in surface reflectivity, etc., are compensated for by virtue of testing the current pattern against the base pattern of the current group at a given location.
In addition, the range processing computer can interpolate certain types of missing data and can resolve local ambiguities in range. This can be done both on the 1-D profile and the 2-D area range map. It can also map from spherical and cylindrical range data to rectangular or Cartesian range data.
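The coordinate mapping mentioned above amounts to standard conversions; the short sketch below shows the generic cylindrical-to-Cartesian case. The geometry and names are an assumption for illustration; the patent gives no explicit formulas.

    import math

    def cylindrical_to_cartesian(radius: float, theta_deg: float, z: float):
        """Convert a cylindrical range sample (r, theta, z) to Cartesian (x, y, z)."""
        theta = math.radians(theta_deg)
        return radius * math.cos(theta), radius * math.sin(theta), z

    print(cylindrical_to_cartesian(250.0, 30.0, 12.5))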
In conclusion, the non-contact sensor/camera system measures distance from a reference point to each of many remote target points. A set of points at which the range is to be measured may lie along a straight line (N points) or may be distributed over a rectangular plane (M x N points). Two ways to generate the time/space coded light rays have been described, and a means of delivering remotely generated coded rays to the sensing site. The system output is a 1 x N array of distances for a profiling application, and an M x N array of distances for an area ranging application. The 1 x N profiles are computed at a high rate of, say, 1000-3000 per second, so that a square 128 x 128 point range image or 256 x 256 point range image is produced at a rate of several per second.
While the invention has been particularly shown and described with reference to preferred embodiments thereof, it will be understood by those skilled in the art that the foregoing and other changes in form and details may be made without departing from the spirit and scope of the invention.

Claims (14)

The embodiments of the invention in which an exclusive property or privilege is claimed are defined as follows:
1. A non-contact sensor system to measure range to an object from a reference plane comprising:
a pattern generator and projector to produce a 1 x N array of time/space coded light rays whose intensities can be varied with time, and to project P sequential presentations of different light ray patterns onto a three-dimensional object, where P = 1 + logbN, b is the number of brightness levels and N is the number of rays;
a linear array sensor which views, at an angle to projected light rays, points of light where said rays are incident on the surface of said object, and generates one-dimensional scan signals which have ray peaks at locations corresponding to imaged light; and a range profile processor to analyze said scan signals to uniquely identify all rays, determine depth from the displacement of said ray peaks from calibration, reference plane peaks, and provide a range map.
2. The system of claim 1 and means for orthogonally sweeping said 1 x N array of coded light rays by steps over a rectangular plane, whereby at each step one line of light points on the surface of said object are imaged by said linear array sensor and depth is measured to provide an area range map.
3. The system of claim 2 wherein said orthogonal sweep means is a rotating mirror.
4. The system of claim 1 wherein said pattern generator includes a sheet light source, and a rotating code wheel which has at least one group pattern comprised of one pattern per presentation having discrete regions whose light transmission coefficients are selected to spatially modulate the light sheet into said coded light rays.
5. The system of claim 1 wherein said pattern generator and projector includes a sheet light source, a rotating code wheel which has one or more group patterns each comprised of one pattern per presentation having clear and opaque regions to spatially modulate the light sheet into said coded light rays, and a projection lens.
6. The system of claim 1 wherein said pattern generator and projector includes a sheet light source, a rotating code wheel which has at least one group pattern comprised of one pattern per presentation having discrete regions whose light transmission coefficients are selected to spatially modulate the light sheet into said coded light rays, a reduction lens to project said coded rays onto the entrance of a coherent optical fiber bundle, and a projection lens to reimage said coded rays on the object surface.
7. The system of claim 1 wherein said pattern generator and projector includes a narrow laser beam source, a modulator to programmably vary the intensity of said laser beam, and a beam deflector to scan the modulated beam along a straight line and project it onto the object surface.
8. A three-dimensional range camera comprising:
a pattern generator and projector to produce a 1 x N array of time/space coded light rays and to project P sequential presentations of different subsets of said light rays onto a three-dimensional object, where P = 1 + log2N and N is the number of light rays;
means for orthogonally sweeping said 1 x N array of coded light rays by steps over a rectangular plane;
a linear array, line scan camera to image points of light on the surface of said object where said light rays are incident, and generate after each presentation a one-dimensional scan signal which has ray peaks at locations corresponding to imaged light points, said line scan camera generating a set of said scan signals per step;
and a range processor to analyze said scan signals, set by set, to uniquely identify all rays, determine depth from the displacement of said ray peaks from calibration, reference plane peaks, and output area range map data.
9. The range camera of claim 8 wherein said pattern generator includes a collimated light sheet source and a rotating code wheel which has group patterns each comprised of one pattern per presentation of discrete regions whose light transmission coefficients are selected to spatially modulate the light sheet into said coded light rays, said patterns being illuminated one at a time in sequence.
10. The range camera of claim 9 wherein said projector is a projection lens and said orthogonal sweep means is a rotating mirror.
11. The range camera of claim 8 wherein said pattern generator includes a sheet light laser source and a rotating code wheel which has sector-shaped group patterns each comprised of one pattern per presentation of clear and opaque regions to spatially modulate the light sheet into said coded light rays, said patterns being illuminated one at a time in sequence.
12. The range camera of claim 11 wherein said projector is comprised of a reduction lens to project said coded light rays onto the entrance of a coherent optical fiber bundle and a projection lens to reimage said coded rays on the object surface.
13. A range camera for profiling applications comprising:
a narrow laser beam source;
a modulator to vary the intensity of said laser beam;
a beam deflector to scan the modulated beam along a straight line and project it onto the surface of an object;
said modulator and beam deflector producing a 1 x N array of time/space coded light rays and P sequential presentations of different subsets of said light rays, where P = 1 + logbN, b is the number of brightness levels and N is the number of light rays;
a linear array, line scan camera to image points of light on the surface of said object where said light rays are incident, and generate a one-dimensional scan signal which has ray peaks at locations corresponding to imaged light; and a range processor to analyze said scan signals to uniquely identify all rays, determine depth from the displacement of said ray peaks from calibration, reference plane peaks, and output profile data.
14. The range camera of claim 13 wherein said modulator and beam deflector are controlled by said range processor.
CA000511597A 1986-06-13 1986-06-13 Three-dimensional range camera Expired CA1253251A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CA000511597A CA1253251A (en) 1986-06-13 1986-06-13 Three-dimensional range camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CA000511597A CA1253251A (en) 1986-06-13 1986-06-13 Three-dimensional range camera

Publications (1)

Publication Number Publication Date
CA1253251A true CA1253251A (en) 1989-04-25

Family

ID=4133350

Family Applications (1)

Application Number Title Priority Date Filing Date
CA000511597A Expired CA1253251A (en) 1986-06-13 1986-06-13 Three-dimensional range camera

Country Status (1)

Country Link
CA (1) CA1253251A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111983633A (en) * 2020-08-26 2020-11-24 中国科学院半导体研究所 Multi-line three-dimensional radar for railway monitoring and scanning method thereof
CN111983633B (en) * 2020-08-26 2023-12-05 中国科学院半导体研究所 Multi-line three-dimensional radar for railway monitoring and scanning method thereof
CN116424279A (en) * 2023-06-13 2023-07-14 宁德时代新能源科技股份有限公司 Posture adjustment method, posture adjustment device, control equipment and storage medium
CN116424279B (en) * 2023-06-13 2023-11-10 宁德时代新能源科技股份有限公司 Posture adjustment method, posture adjustment device, control equipment and storage medium

Similar Documents

Publication Publication Date Title
US4687325A (en) Three-dimensional range camera
US4687326A (en) Integrated range and luminance camera
CA1233234A (en) Optical three-dimensional digital data acquisition system
CN105143820B (en) Depth scan is carried out using multiple transmitters
US4627734A (en) Three dimensional imaging method and device
CN109557522A (en) Multi-beam laser scanner
FR2590681A1 (en) SYSTEM FOR LOCATING AN OBJECT PROVIDED WITH AT LEAST ONE PASSIVE PATTERN.
CA2327894A1 (en) Method and system for complete 3d object and area digitizing
JP2000270169A (en) Motion identification method
US3809477A (en) Measuring apparatus for spatially modulated reflected beams
GB2166920A (en) Measuring angular deviation
US5018803A (en) Three-dimensional volumetric sensor
JPS60257309A (en) Noncontacting distance measuring device
JPH0348531B2 (en)
CN107430193A (en) Distance measuring instrument
CA1312755C (en) Synchronous optical scanning apparatus
US6369879B1 (en) Method and apparatus for determining the coordinates of an object
US5568258A (en) Method and device for measuring distortion of a transmitting beam or a surface shape of a three-dimensional object
CA1253251A (en) Three-dimensional range camera
US3700907A (en) Code reading system for identification of moving and stationary objects utilizing noncoherent optics
CN104061901B (en) Three-dimensional method for measuring distance and system thereof
US6504605B1 (en) Method and apparatus for determining the coordinates of an object
US4830443A (en) Three-dimensional volumetric sensor
US4939439A (en) Robot vision and optical location systems
JPH10177617A (en) Scanning reader and polygon rotor and light code scanning method

Legal Events

Date Code Title Description
MKEX Expiry