AU3831189A - Range finding device - Google Patents

Range finding device

Info

Publication number
AU3831189A
Authority
AU
Australia
Prior art keywords
scene
light source
image data
image
range finding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU38311/89A
Inventor
Kemal Ajay
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to AU38311/89A priority Critical patent/AU3831189A/en
Publication of AU3831189A publication Critical patent/AU3831189A/en
Abandoned legal-status Critical Current


Description

RANGE FINDING DEVICE
FIELD OF THE INVENTION
This invention relates to optical radar systems, and more particularly to systems which acquire range data points in a parallel or simultaneous fashion, rather than point by point using a scanning mechanism. The invention has particular utility in robotics, where it is necessary for a robot to obtain a "picture" of its surroundings, and consequently the invention is generally applicable over only a short range.
DESCRIPTION OF PRIOR ART
Previous disclosures relating to pulsed illuminator, gated receiver viewing systems are mostly concerned with equalizing the intensity of viewed targets at different ranges while eliminating the effects of backscatter illumination. This is addressed in the disclosures contained in the following.
Chernoch, U.S. Patent No. 3,305,633, discloses a system for enhancing the contrast of images of distant targets.
Bamburg et al., U.S. Patent No. 3,899,250 discloses a system where the delay between the outgoing pulse and receiver activation is controlled on successive pulse cycles, to give a known correction for sensitivity with distance.
French, U.S. Patent No. 4,226,529, presents a viewing system whereby the contrast of an image of a target at a particular distance is enhanced with the time gating adjustable to view different ranges.
Contini et al., U.S. Patent No. 4,603,250, discloses a viewing system in which an image intensifier is used as the receiver, with the photocathode gated and the gain of the receiver adjusted by varying the microchannel plate voltage.
This overcomes problems associated with poor focus control on gain-controlled image intensifiers, where the gain is controlled by the photocathode voltage alone. The detailed operation of a gated image intensifier is also presented.
Previous approaches to the problem of obtaining representations of range data using pulsed illumination include the following.
Kleider, U.S. Patent No. 4,068,124, uses a combination of pulsed illumination techniques and a single-line CCD sensor to detect wire-like obstacles at a fixed range.
Meyerand et al., U.S. Patent No. 3,463,588, describes a pulsed illuminator viewing system whereby a scene is scanned longitudinally (along the direction of light propagation). The time delay between the outgoing pulse and the receiver activation pulse is varied by an operator with the result that the system is sensitive only to objects at a distance corresponding to the time delay. Then, for a given delay, and hence distance, an object is visible only if it exists at that distance.
Endo, U.K. Patent No. GB2,139,036A, discloses an optical radar for vehicles wherein an array of photodiodes, onto which the scene is focused, is used to detect the return echo of a laser light pulse which illuminates the scene. Each diode is activated in turn and the time taken for light to travel from the source to the reflecting object and back to the photodiode is measured. The echo time gives the time of flight, and hence distance, to the object in the scene that the particular diode is focused upon.
DESCRIPTION OF THE INVENTION
Accordingly, it is an object of this invention to provide an improved optical radar range finding system which at least reduces the aforementioned problems of prior art systems.
Thus one broad form of the invention provides an optical radar range finding system comprising a high speed switchable light source for illuminating a scene, a high speed switchable imaging device for receiving a reflected image from said scene and providing an output to control means, a pulsing means connected to said light source and to said imaging device to provide pulses to trigger said light source and said imaging device, respectively, said control means including a store means for storing image information from said imaging device, control logic to sequence operation of said light source and logic circuitry to process data and produce range information relevant to said scene. Preferably, said control logic operates a switching device and the switching device is able to switch said light source to be continuously on, triggered by said pulsing means or in an off condition.
Preferably, said light source is a laser diode array. Preferably, the store means is a video frame store.
Another broad form of the invention provides a method of obtaining range information relating to a scene comprising the steps of:
(1) illuminating the scene with continuous light from a light source and storing first image data reflected therefrom;
(2) illuminating the scene with high speed switchable pulsed light from the light source and storing second image data reflected therefrom; and
(3) dividing the second image data by the first image data to obtain range image data relevant to the range of the scene.
Preferably, said method includes the further step of storing third image data reflected from the scene in the absence of light from the light source and subtracting the third image data from the first and second image data, respectively, prior to said division so as to remove background illumination effects.
In order that the invention may be more readily understood, one particular embodiment will now be described with reference to the accompanying drawings wherein:
Figure 1 is a simplified block diagram of apparatus constituting the system.
Figure 2 shows time profiles of reflected light pulses and sensitivity of the image sensor, of the system of Figure 1.
Figures 3 and 4 show a plan view and side view, respectively, of the experimental apparatus used to demonstrate the system.
Figure 5 shows the range image acquired from the system when viewing the apparatus of Figures 3 and 4.
Figure 6 shows a profile of distances for a horizontal section through Figure 5.
Figure 7 is a more detailed block diagram of the apparatus of Figure 1.
Figure 8 is a timing diagram showing timing details of the system's hardware, and
Figure 9 shows graphs of the approximate output of the laser diode array, and the image intensifier sensitivity, respectively.
To extract range information about a scene, where the only light source is that of the invention, two modes of operation are necessary so that two images may be captured.
OPERATING MODE 1
Referring to Figure 1, the control hardware 1 selects switch 2 so that the high speed light source 3 is pulsed on and off by the pulse generator 4. The pulse generator 4 also drives circuitry (not shown in Figure 1) that controls the sensitivity of image sensor 5. The resulting image is stored in the control hardware 1. The reference 6 represents a video monitor.
OPERATING MODE 2
In this mode, the control hardware 1 selects switch 2 so that the light source is continuously on. The resulting image from the image sensor is again stored in the control hardware 1.
PROCESSING
Let I1 be the sensed image from mode 1.
Let I2 be the sensed image from mode 2.
I1 represents an image that is dependent on the surface reflectivity of the objects in the scene. The intensity of the image is dependent upon the distance to the objects, due to the inverse square law. The image is also dependent upon a gating effect caused by the overlap of the reflected light pulse from the scene with the 'on' time of the image sensor, again dependent on distance.
I2 is taken when the light source is continuously on and is dependent only on the surface reflectivity of objects in the scene. The intensity of the image is dependent upon the distance to the objects, caused by the inverse square law.
The range is found by forming the quotient of the corresponding parts of I1 and I2, so that
R = K . I1 / I2
where R is a range image and K is a calibration factor which is constant.
This processing removes the effect of surface reflectivity and inverse square light loss. The only effect that remains is that due to the overlap or convolution of the pulsed light with the gated image sensor.
Idealised time profiles of the reflected light pulse "a" and the sensitivity of the image sensor "b" are shown in Figure 2. The convolution of the two is shown as "c". Each profile is a plot of relative magnitude versus time.
The convolution value depends on the delay between the onset of the reflected pulse at the receiver, and the onset of the activation of the image sensor. This delay is due to the round trip time for light to travel from the pulsed light source, to the objects in the scene and back to the image sensor 5. Thus, for any pixel in the range, knowing the shape of the convolution function and the overall calibration factor K, the distance from the rangefinder to any point in the image may be found directly from R.
The calibration factor, K, is globally applied to all points in the image and may be found by obtaining a range image of a scene of known dimensions. This need be done only once.
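Purely as an illustration of this processing (not part of the original disclosure), the quotient and a conversion from quotient to distance might be sketched as follows. The array names, the idealised linear overlap model and the numeric constants are assumptions introduced for the example; in the real system the calibration is obtained from a scene of known dimensions as described above.

    import numpy as np

    C = 3.0e8          # speed of light, m/s
    T_PULSE = 30e-9    # assumed pulse/gate width in seconds (cf. the 30 nSec figure given later)

    def range_image(i1_pulsed, i2_continuous, k=1.0):
        # Form R = K . I1 / I2 pixel by pixel (mode 1 image over mode 2 image).
        i2 = np.where(i2_continuous == 0, 1, i2_continuous)   # guard against division by zero
        return k * i1_pulsed.astype(float) / i2

    def distance_from_quotient(r, r_max):
        # Assume idealised rectangular pulses of equal width, so the overlap
        # (and hence R) falls linearly from r_max at zero delay to 0 at a delay of T_PULSE.
        overlap_fraction = np.clip(r / r_max, 0.0, 1.0)
        round_trip = (1.0 - overlap_fraction) * T_PULSE       # round-trip time of flight
        return C * round_trip / 2.0                           # one-way distance in metres

Here r_max plays the role of the single calibration constant obtained, once only, from a range image of a scene of known dimensions.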
A refinement of the above mode of operation, to facilitate operation in non-ideal environments, is the extension of the control hardware to capture a third 'background' image to include in the processing. This image is taken with the light source turned off so that background illumination effects are recorded. The processing is modified so that
R = K . (I1 - I3) / (I2 - I3)
where I3 is the background image. This removes background effects from I1 and I2 which would cause errors in the range values.
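A minimal sketch of this background-corrected form, again with assumed array names and no claim to match the original hardware implementation:

    import numpy as np

    def range_image_with_background(i1, i2, i3, k=1.0):
        # R = K . (I1 - I3) / (I2 - I3), where I3 is the image taken with the light source off.
        numerator = i1.astype(float) - i3
        denominator = i2.astype(float) - i3
        denominator = np.where(denominator == 0, 1, denominator)  # guard against division by zero
        return k * numerator / denominator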
Referring to Figures 3 and 4, it can be seen that the experiment used to demonstrate the system comprises six test cards A separated from each other by 15 cm. The receiver 5 is 115 cm from the closest test card A. The cards are also offset transversely to the direction of the light source. Figure 5 shows the range image acquired using such apparatus and Figure 6 shows a profile of distances to the various cards on a horizontal section through the image of Figure 5.
Reference is now made to the more detailed block diagram shown in Figure 7. A pulse generator 11 provides timed trigger signals to the image intensifier driver 12 which drives the gate of the image intensifier 9. The pulse generator also triggers the laser diode array and associated drive circuitry 17. The laser diodes used are the SHARP LT015 F type. Light from the laser diode array is directed toward the scene (not shown).
Light reflected from the scene enters the imaging system through the filter 7 to reduce the effect of light from extraneous sources. Primary lens 8 focuses the filtered light onto the image intensifier 9, Varo type 5772, which produces an output observed by the video camera 10, NEC TI 22C. The output of video camera 10 is connected to frame store 14 and sync processor 13. The system controller 15 selects the mode of the electronic switch 16 which allows the diode array 17 to be operated in pulsed mode (position a), continuous mode (position b) or turned off completely (position c). The controller 15 also selects one of three frame memory buffers in the frame store 14 corresponding to the images obtained in the three operating modes of the laser diode array.
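The acquisition sequence implied by this arrangement can be sketched roughly as below; the object and method names (switch.select, frame_store.select_buffer, camera.capture_field) are hypothetical stand-ins for the hardware just described, not an implementation taken from the patent.

    # Hypothetical sketch of the three-mode acquisition sequence.
    PULSED, CONTINUOUS, OFF = "a", "b", "c"   # positions of electronic switch 16 as described above

    def acquire_three_images(switch, frame_store, camera):
        images = {}
        for buffer_index, mode in enumerate((PULSED, CONTINUOUS, OFF)):
            switch.select(mode)                      # controller 15 sets the laser array mode
            frame_store.select_buffer(buffer_index)  # one frame memory buffer per mode
            images[mode] = camera.capture_field()    # capture one video field in this mode
            switch.select(OFF)                       # allow the intensifier phosphor to recover
        return images[PULSED], images[CONTINUOUS], images[OFF]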
The three outputs from the frame store 14 are processed by digital logic circuitry 18 to produce data representing range information about the scene. This is converted to an analogue signal by a digital to analog converter 19, the output of which is combined by a circuit 20 with video sync information from the sync processor 13. The resulting signal is displayed on a video monitor 21.
The image intensifier 9 has a driving voltage which ranges from 0 V to -60 V and back to 0 V in 30 nSec. The microchannel plate is provided with -850 V (MCP out to MCP in) and the phosphor screen accelerating voltage is 5 V (Screen to MCP out).
Receiving lens optics 8 consist of a manually adjusted focus lens and a galvanometer-controlled aperture. The filter is a Kodak Wratten filter #87.
Figure 8 shows the relevant timing details of the system's hardware. Timing signals (23), (24) and (25) show the select signals for operating the mode switch and governing the frame buffer memory selection in the frame store 14. These are shown in relation to the odd and even fields of the interlaced video signal 22 from the video camera 10. The time during which a control signal is asserted is the "active" time and is indicated by reference 26. When active, signal 23 selects the pulsed mode of the laser array. When signal 24 is active, it selects the continuous mode of the laser array, while signal 25 being active selects the background mode, during which time the laser array is idle. The laser array is also idle when none of the selections is active (time intervals 27). The active time 26 for a select signal is two video field times in length, or 20 mSec. The images are captured during the odd field of the interlaced video, represented by the low state of signal 22. Reference 27 represents the recovery time between modes. This allows the phosphor of the image intensifier to return to a neutral state prior to the next selection. The recovery time is also two field times. These timing values are adjusted according to the persistence characteristics of the imaging system.
Figure 8 also shows the timing of the waveforms that drive the laser diode during pulsed mode, and drive the image intensifier. The length 28 of the driving pulse is of the order of 30 nSec. The off time 29 may be varied between 30 nSec and greater than 100 nSec. The image intensifier 9 has the same pulse duration and duty cycle as the laser array. A time delay 32 is adjustable over the range + or - 15 nSec. This is to compensate for circuit delays and the longitudinal displacement between the laser array and the image intensifier. This delay is set once, after the system is constructed, and is not thereafter adjusted unless recalibration is required.
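For orientation, the distances implied by these timing figures can be worked out directly (the resulting numbers are inferred from the stated pulse width and delay adjustment, not quoted from the patent):

    C = 3.0e8                 # speed of light, m/s
    pulse_width = 30e-9       # 30 nSec driving pulse
    delay_trim = 15e-9        # +/- 15 nSec adjustable delay
    range_span = C * pulse_width / 2   # about 4.5 m: span over which the gating overlap varies
    trim_offset = C * delay_trim / 2   # about 2.25 m: range shift produced by the delay trim
    print(range_span, trim_offset)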
Figure 9 shows approximate graphs of the laser optical output 33 and the image intensifier sensitivity 34. These vary from ideal square waves because of the limited response times of the driver circuitry. The effects of these imperfections on the final range readings are readily calibrated out of the system.
It will be appreciated that the invention differs from the prior art in that a complete range image is determined, in the preferred embodiment, after three image sampling operations, whereas previous systems have relied on acquiring range information by scanning the scene with repetitive samples longitudinally, along the direction of light propagation (Meyerand et al.).
As a result of the reduced number of samples, the range representation of the scene is acquired much more quickly.
Images obtained from three modes of operation of a pulsed illuminator, gated receiver imaging system are combined through the use of the special hardware to produce a representation of range data for the viewed scene. The system captures a complete range image of the viewed scene using just three images, one from each mode.
In a simplified form of operation which still produces quite acceptable results, the step which measures the background image may be eliminated, particularly when background light levels are low.
Since modifications within the spirit and scope of the invention may be readily effected by persons skilled in the art, it is to be understood that the invention is not limited to the particular embodiment described, by way of example, hereinabove.

Claims (10)

THE CLAIMS DEFINING THE INVENTION ARE AS FOLLOWS:
1. An optical radar range finding system comprising a high speed switchable light source for illuminating a scene, a high speed switchable imaging device for receiving a reflected image from said scene and providing an output to control means, a pulsing means connected to said light source and to said imaging device to provide pulses to trigger said light source and said imaging device, respectively, said control means including a store means for storing image information from said imaging device, control logic to sequence operation of said light source and logic circuitry to process data and produce range information relevant to said scene.
2. An optical radar range finding system according to claim 1 wherein the control logic operates a switching device and the switching device is able to switch the light source to be continuously on, triggered by the pulsing means or in an off condition.
3. An optical radar range finding system according to claim wherein the light source is a laser diode array.
4. An optical radar range finding system according to claim wherein the store means is a video frame store.
5. A method of obtaining range information relating to a scene comprising the steps of: (1) illuminating the scene with continuous light from a light source and storing first image data reflected therefrom;
(2) illuminating the scene with high speed switchable pulsed light from the light source and storing second image data reflected therefrom; and
(3) dividing the second image data by the first image data to obtain range image data relevant to the range of the scene.
6. A method according to claim 5, including the further step of (4) storing third image data reflected from the scene in the absence of light from the light source and subtracting the third image data from the first and second image data, respectively, prior to step (3).
7. A method according to claim 5 using the optical radar range finding system of any one of claims 1 to 3.
8. A method according to claim 6 using the optical radar range finding system of any one of claims 1 to 5.
9. An optical radar range finding system substantially as hereinbefore described with reference to the accompanying drawings.
10. A method of obtaining range information relating to a scene substantially as hereinbefore described with reference to the accompanying drawings.
AU38311/89A 1988-06-20 1989-06-20 Range finding device Abandoned AU3831189A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU38311/89A AU3831189A (en) 1988-06-20 1989-06-20 Range finding device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
AUPI887688 1988-06-20
AUPI8876 1988-06-20
AU38311/89A AU3831189A (en) 1988-06-20 1989-06-20 Range finding device

Publications (1)

Publication Number Publication Date
AU3831189A true AU3831189A (en) 1990-01-12

Family

ID=25624320

Family Applications (1)

Application Number Title Priority Date Filing Date
AU38311/89A Abandoned AU3831189A (en) 1988-06-20 1989-06-20 Range finding device

Country Status (1)

Country Link
AU (1) AU3831189A (en)
