WO1993009523A1 - Video-based object acquisition, identification and velocimetry - Google Patents

Video-based object acquisition, identification and velocimetry

Info

Publication number
WO1993009523A1
Authority
WO
WIPO (PCT)
Prior art keywords
signals
gated
time
signal
output
Application number
PCT/CA1991/000400
Other languages
French (fr)
Inventor
Dieter Wolfgang Blum
Original Assignee
Traffic Vision Systems International Inc.
Application filed by Traffic Vision Systems International Inc. filed Critical Traffic Vision Systems International Inc.
Priority to PCT/CA1991/000400 priority Critical patent/WO1993009523A1/en
Publication of WO1993009523A1 publication Critical patent/WO1993009523A1/en

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled
    • G08G 1/052 Detecting movement of traffic to be counted or controlled with provision for determining speed or overspeed
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01P MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P 3/00 Measuring linear or angular speed; measuring differences of linear or angular speeds
    • G01P 3/64 Devices characterised by the determination of the time taken to traverse a fixed distance
    • G01P 3/68 Devices characterised by the determination of the time taken to traverse a fixed distance using optical means, i.e. using infrared, visible, or ultraviolet light
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled
    • G08G 1/04 Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

Apparatus for determining the time T in which a moving object traverses a distance D between first and second boundaries which transversely intersect the path (60) along which the object moves in a particular direction (50). A scanning means (62, 61) scans the path in a direction parallel to the particular direction and produces a plurality of signals (L1, L2, ...LN) representative of the relative luminance of a corresponding plurality of portions of the scanned path. A gating means (106, 109, 110, 111, 112) gates the signals during first and second time intervals. A comparison means (129, 130, 131, 132) compares selected groups of the gated signals (230, 231, 232, 233) and produces signals (133, 134, 135, 136) representative of detection therein of signal components in one of the groups which exceed signal components in another of the groups. A timing means (78, 137) determines the time T as the time differential between production of first and second signals output by the comparison means, thereby facilitating determination of the object's velocity V=D/T.

Description

VIDEO-BASED OBJECT ACQUISITION, IDENTIFICATION AND VELOCIMETRY
Technical Field
This application pertains to an image acquisition and processing system for acquiring an image of a moving object, determining the object's velocity, and extracting information from the object image. In particular, the system facilitates determination of the velocity of a moving vehicle, location of the vehicle's license plate within an image of the vehicle, and extraction of the vehicle's license identifier from the license plate image.
Background Art
The prior art has evolved a variety of techniques for determining the velocity of moving objects such as vehicles. One very basic technique is human observation of the moving object. It is well known that if a moving object is observed to pass linearly between two points separated by a known distance D within a period of time T then the average velocity V of the moving object may be calculated according to the formula V = D/T. This method is commonly used to enforce traffic speed limits through the use of low-altitude airplane flights over roadways. A human observer on the airplane uses a stopwatch to time selected vehicles as they travel between regularly spaced lines marked on the roadway and calculates each vehicle's velocity as aforesaid. This technique suffers from a number of disadvantages, including high operational costs, the need for communication with ground personnel to apprehend speed limit violators, and restriction to use in only favourable weather conditions. However, the technique is passive, in the sense that persons in the vehicles being monitored cannot determine that they are being monitored.
Improved prior art vehicle velocimetry techniques include the emission of electromagnetic radiation such as microwave energy, which is aimed at an approaching or receding motor vehicle, some of the energy impinging thereupon being reflected. This reflected energy may be received at the emitter source and will be observed to have undergone a frequency shift (i.e. Doppler shift) proportional to the difference in velocity between the emitter source and the target vehicle. By measuring this frequency shift one may calculate the velocity of the target vehicle. Police "radar guns" which utilize this principle are in widespread use. However, such devices require a human operator to select a target vehicle, note and record its velocity, and take appropriate speed limit enforcement action. Although such devices are more practical and cost effective than the manual timing method, they are prone to errors caused by improper operation. They are also active, in the sense that persons in the vehicles being monitored may use devices such as "radar detectors" to determine that they are being monitored and take evasive action to avoid apprehension for speed limit violation.
The prior art "radar camera", of which United States Patent No. 4,866,438 Knisch issued 12 September, 1989 is generally representative, attempts to improve upon and automate the "radar gun" method of traffic speed limit enforcement. A radar camera utilizes a reflected radar beam to determine the velocity of each vehicle passing the device. If a vehicle exceeds a predetermined speed then a photographic camera is triggered to take a frontal photograph of the vehicle. The photograph is subsequently analyzed to identify information on the vehicle's license plate so that appropriate speed limit enforcement action may be taken. The device measures the vehicle's velocity and superimposes that information together with other relevant data such as time, date and location on the photograph to provide hard evidence that the vehicle was exceeding the prescribed speed limit.
Because all objects moving through a radar camera's emitted radar beam which are of sufficient size and shape to reflect the beam are automatically targeted, the radar camera facilitates increased processing and throughput, and consequently increased efficiency in traffic speed limit enforcement. However, there are some drawbacks to the radar camera technique. For example, the camera's photographic film must be developed, enlarged and manually examined in order to extract the license plate information of vehicles whose velocity was measured to be in excess of the speed limit. Further, warning devices such as the aforementioned "radar detectors" may be used by the motoring public to avoid apprehension for speed limit violation.
Further advances in prior art techniques offering possible improvements to traffic speed limit enforcement have been related to video based license plate extraction methods, and various video-based velocity measurement techniques. One example of a prior art video-based velocity measurement system is United States Patent No. 4,866,639 Adrian issued 12 September, 1989. Adrian provides a multiple exposure velocimetry system employing directional compensation, primarily intended for use in determining the velocity of fluid flows by measuring the displacement of fluid-suspended particles between subsequent exposures or images. Another example is United States Patent No. 4,653,109 Lemelson et al issued 24 March, 1987. Lemelson et al provide an image analysis system capable of computing variables such as the velocity of an object by analyzing the changes in successive code signals generated by the pre-processing of video signals representing consecutive exposures of an imaging field.
United States Patent No. 4,893,922 Eichweber issued 16 January, 1990 provides an object location and velocity measurement system which utilizes a strobe-type illuminator synchronized to the imaging exposure rate of a video camera. The video camera receives reflected illuminator light from a pair of retro-reflectors mounted on the object and spaced apart by a known distance. This facilitates calculation of the object's location (and, by subsequent processing of data derived from a displaced location of the object, calculation of the object's velocity) relative to the camera and illuminator position, based on the distance relationship between two image points defined on the object by the retro-reflectors, as imaged by the video camera.
Such prior art video-based velocimetry methods have been unsuited to traffic speed limit enforcement because they require multiple image exposures to calculate the target object's velocity. Such methods are incapable of providing useful results at the velocities encountered in traffic speed limit enforcement applications. This is due to the low frame rate (typically 25-30 frames per second) capabilities of state of the art video cameras, coupled with the short time period within which a motor vehicle travelling at normal highway speed traverses the camera's relatively small field of view. These factors cause large measurement errors which are directly related to the time increment between successive frames. Even specialized high frame rate (50-60 frames per second) video cameras suffer the same time resolution limit, gaining only a small increase in computable object velocity.
The prior art also teaches various means of extracting desired features from video camera-acquired images, including location and extraction of vehicle license plate information. For example, United States Patent No. 4,603,390 Mehdipour et al issued 29 July, 1986 discloses a computerized parking system in which video cameras are positioned at the entrances and exits of a controlled parking area to image the license plates of vehicles entering and leaving the area. The cameras' video signals are digitized and processed to extract license plate information. This information is stored as each vehicle enters the area and is subsequently compared with information similarly derived from exiting vehicles to determine the time spent within the controlled parking area by each exiting vehicle, thus automating parking fee calculation and ticket issuance procedures. United States Patent No. 4,774,571 Mehdipour et al issued 27 September, 1988 discloses a computerized ticket dispenser system for use in such a controlled vehicle parking area. The '571 patent is similar to the '390 patent, but couples the license plate extraction and timing methodologies to print tickets upon entry and exit of a vehicle from the controlled parking area.
United States Patent No. 4,817,166 Gonzalez et al issued 28 March, 1989 discloses another apparatus for reading vehicle license plates. A video camera produces an image of a vehicle license plate. A scanning apparatus is then used to find the license plate number within the image. This is followed by character extraction and state/country logo extraction from the scanned image and then by verification against prestored information representative of the font style used by the particular state/country.
United States Patent No. 4,963,723 Masada issued 16 October, 1990 discloses an automatic toll collector. A mechanism for recognizing vehicle license plate information installed at the entrance to a toll road stores in a memory device data representative of the time each vehicle enters the toll road, together with a vehicle identifier based upon information extracted from the vehicle's license plate. A similar mechanism installed at the toll road exit again extracts information from the exiting vehicle's license plate, compares it with the data stored in the memory device, and calculates the appropriate toll charge. United States Patent No. 4,878,248 Shyu et al issued 31 October, 1989 discloses a system for recognizing license plate characters wherein an image sampling trigger is produced by photosensors arranged along the two opposing sides of the vehicle driving path. Interruption of the photosensor light path by a passing vehicle indicates that a vehicle has reached a predetermined position relative to the image sampling system and whether the view of the vehicle's license plate is obscured by another vehicle. The license plate character extraction means is similar to other prior art as mentioned above.
The foregoing prior art video-based image acquisition methods assume prior knowledge of the approximate position of the targeted object and portions thereof (i.e. a vehicle's license plate). They also assume that the target object is either stationary or is moving relatively slowly through the imaging field of view. Given these factors, prior art methods which utilize a single video camera as the image sensing device cannot facilitate practical traffic speed limit enforcement or traffic supervision, because their resolution is inadequate to allow extraction of meaningful details when imaging at least one entire lane of traffic which is moving at normal highway speed.
United States Patent No. 4,539,590 Gage issued 3 September, 1985 discloses a method of processing video signals in an optical tracking system, wherein a top and bottom scanning line originating from a video camera are digitized (i.e. gated) and stored as a reference for the purpose of background elimination, thereby providing a signal representative of only the tracked target. The present invention also gates scanned signals produced by a video camera, but both the scanning methodology and the subsequent processing of the camera's output signals differ from Gage's method. So far as the applicant is aware, the prior art has not evolved a practical, accurate video-based means for traffic speed limit enforcement capable of determining the velocity of each vehicle in a stream of vehicles moving at normal highway speed, and extracting license plate data for each vehicle, as is accomplished by the present invention.
Disclosure of Invention
In accordance with the preferred embodiment, the invention provides apparatus for determining the time T in which a moving object traverses a distance D between first and second boundaries which transversely intersect the path along which the object moves in a particular direction, thereby facilitating determination of the object's velocity V=D/T. A scanning means such as a video camera scans the path in a direction parallel to the particular direction and produces a plurality of signals representative of the relative luminance of a corresponding plurality of portions of the scanned path. A gating means gates the signals during first and second time intervals. A comparison means compares selected groups of the gated signals and outputs signals representative of detection of signal components in one group which exceed signal components in another group. A timing means determines the time T as the time differential between production of first and second signals output by the comparison means.
A signal grouping means such as an integrator is coupled between the gating means and the comparison means. The signal grouping means groups (averages) the gated signals during selected time intervals for input to the comparison means.
Advantageously, first and second gating means are provided to gate the scanning means output signals during first and second time intervals; first and second comparison means are provided to compare selected groups of the gated signals output by the first and second gating means to produce signals representative of detection of signal components in one group which exceed signal components in another group. First and second signal grouping means (i.e. integrators) are coupled between the first gating means and the first comparison means to group the gated signals output by the first gating means during a third time interval for input to the first comparison means. Third and fourth signal grouping means are coupled between the second gating means and the second comparison means to group selected numbers of the gated signals output by the second gating means during a fourth time interval for input to the second comparison means. The third time interval is much shorter than the fourth time interval. In particular, the third time interval may be the time required to scan several of the path portions; and, the fourth time interval may be the time required to scan substantially all of the path portions.
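By way of illustration only, the effect of grouping over a short interval versus a long interval can be modelled in software. The following minimal sketch is not part of the disclosed circuitry: the window lengths and sample values are assumptions, with a short moving average standing in for the partial-interval grouping means, a long moving average for the full-interval grouping means, and a simple comparison yielding the detection signal:

```python
from collections import deque

def detect(samples, short_n=8, long_n=256):
    """Compare a short-window average (partial-interval grouping)
    against a long-window average (full-interval grouping) of the
    gated luminance samples; return one boolean per sample."""
    short_win, long_win = deque(maxlen=short_n), deque(maxlen=long_n)
    out = []
    for s in samples:
        short_win.append(s)
        long_win.append(s)
        short_avg = sum(short_win) / len(short_win)
        long_avg = sum(long_win) / len(long_win)
        # Comparator: detection asserts while the short-term average
        # (object luminance) exceeds the long-term background average.
        out.append(short_avg > long_avg)
    return out

# Dark background with a bright object entering the gated boundary:
line = [0.1] * 300 + [0.9] * 40 + [0.1] * 300
flags = detect(line)
print(flags.index(True))  # sample index at which detection asserts (300)
```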
If vehicle license plate data is to be acquired then first and second scanning means are preferably provided for scanning first and second segments of the path in a direction parallel to the particular direction and for producing first and second pluralities of signals representative of the relative luminance of corresponding pluralities of portions of the scanned path segments. First and second gating means gate the first plurality of signals during first and second time intervals and produce first and second gated output signals. Third and fourth gating means gate the second plurality of signals during the first and second time intervals and produce third and fourth gated output signals. First, second, third and fourth comparison means respectively compare selected groups of the first, second, third and fourth gated output signals and produce first, second, third and fourth signals respectively representative of detection, in the gated output signals, of signal components in one group which exceed signal components in another group. The timing means determines the time T as the time differential between production of first and second signals output by the first and second comparison means or output by the third and fourth comparison means.
The invention also provides a method of determining the velocity "V" of a moving object. A path is defined along which the object is expected to move in a particular direction. A field of view overlapping a portion of the path is also defined, the field of view comprising a plurality of scan lines oriented parallel to the particular direction. The field of view is scanned along each of the scan lines to produce a corresponding plurality of signals representative of the relative luminance of each image portion intersected by the respective scan lines. Entry and exit boundaries which transversely intersect the scan lines are defined, the boundaries defining a separation distance D. The signals are gated during first and second time intervals corresponding to the respective widths of the entry and exit boundaries. Selected groups of the gated signals are compared to detect signal components in one group which exceed signal components in another group. An entry detection signal is produced upon detection, within one selected group of gated entry boundary signals, of signal components which exceed signal components in another selected group of gated entry boundary signals; and, an exit detection signal is produced upon detection, within one selected group of gated exit boundary signals, of signal components which exceed signal components in another selected group of gated exit boundary signals. The time differential T between the respective occurrences of the entry and exit detection signals is determined, thereby facilitating determination of the velocity as V = D/T. The object may be a motor vehicle, in which case the scanning step is preferably performed at a location displaced to the rear of the vehicle, from which license plate data on the vehicle can be resolved to an acceptable degree.
Before the comparing step the gated signals are grouped (averaged) by integration over third and fourth time intervals. The third time interval is preferably much shorter than the fourth time interval. For example, the third time interval may be the time required to scan several scan lines and the fourth time interval may be the time required to scan substantially all of the scan lines.
The time determining step may comprise determination of the time interval Te between:
(i) initial detection within a selected group of gated entry boundary signals of signal components which exceed signal components in another selected group of gated entry boundary signals; and,
(ii) initial detection within a selected group of gated exit boundary signals of signal components which exceed signal components in another selected group of gated exit boundary signals;
and determination of the time interval Tx between:
(i) the time, during detection within the selected gated entry boundary of signal components in one group exceeding signal components in another group, at which said exceeding relationship ceases; and,
(ii) the time, during detection within the selected gated exit boundary of signal components in one group exceeding signal components in another group, at which said exceeding relationship ceases.
If Db is defined as the distance between the leading edges of the respective entry and exit boundaries, and Dt is defined as the distance between the trailing edges of the respective entry and exit boundaries, the velocity V may then be determined as the average of the velocities Ve and Vx, where Ve = Db/Te is the entry velocity at which the object intersects the leading edges, and Vx = Dt/Tx is the exit velocity at which the object intersects the trailing edges.
By determining the time interval Tt between the time intervals Te and Tx, one may also determine the object's physical length Ot = Tt x V.
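These relationships reduce to straightforward arithmetic. A worked sketch follows; all of the input values are assumed for illustration and are not taken from the specification:

```python
def velocity_and_length(Te, Tx, Tt, Db, Dt):
    """Entry/exit velocities, average velocity and object length
    from the boundary timing intervals defined above."""
    Ve = Db / Te          # entry velocity (leading-edge transit)
    Vx = Dt / Tx          # exit velocity (trailing-edge transit)
    Va = (Ve + Vx) / 2.0  # average velocity
    Ot = Tt * Va          # physical object length
    return Ve, Vx, Va, Ot

# Assumed example: boundaries 5 m apart, a vehicle at roughly 100 km/h
# (27.78 m/s) gives Te = Tx = 0.18 s; a 0.162 s interval Tt implies a
# vehicle about 4.5 m long.
print(velocity_and_length(Te=0.18, Tx=0.18, Tt=0.162, Db=5.0, Dt=5.0))
```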
Advantageously, a second field of view may be defined to overlap a second portion of the path, the second field of view comprising another plurality of scan lines oriented parallel to the particular direction aforesaid. The field scanning and subsequent steps may then be performed with respect to the second field of view, simultaneously as those steps are performed with respect to the first field of view. This duplicates the velocity determining capability mentioned above; and, in the case of traffic speed limit enforcement (i.e. if the object is a motor vehicle and the path is a highway) facilitates determination of vehicle license plate data by combining the scanned first and second fields of view, if the scanning step is performed at a location displaced to the rear of the target vehicle from which license plate data on the vehicle can be resolved to an acceptable degree within the imaged fields of view or some selected combination thereof.
Brief Description of Drawings
Figure 1 diagrammatically illustrates a prior art technique for scanning one field of a video image.
Figure 2 diagrammatically illustrates a prior art technique for scanning one interlaced video frame of a video image.
Figure 3 diagrammatically illustrates a prior art technique for scanning one field of a video image of a moving object.
Figure 4 diagrammatically illustrates a technique for scanning one field of a video image of a moving object in accordance with the present invention.
Figure 5 is a partial diagrammatic and block circuit diagram of the image acquisition methodology and apparatus employed by the present invention.
Figures 6(A) and 6(B) depict signal waveforms representative of the operation of the apparatus of Figure 5.
Figures 7A and 7B together comprise a block circuit diagram of apparatus for detecting a moving object and determining its velocity in accordance with the preferred embodiment of the invention.
Figures 8(A) through 8(G) and 9(A) through 9(F) depict signal waveforms (amplitude versus time) representative of the operation of the apparatus of Figure 7.
Figure 10 is a block circuit diagram of apparatus for manipulating and storing data acquired by the apparatus of Figure 7, for storing and for communicating control information and data.
Best Mode(s) For Carrying Out the Invention
Figure 1 illustrates the image scanning procedure employed by prior art "vidicon" type optical image sensing tubes, as in state of the art video cameras. Such prior art image scanning processes typically compose an image "frame" by combining two "interlaced" fields in an effort to conserve the bandwidth of the resultant electronic video signals. Figure 1 depicts a single field in which scanning beam 5 initially traverses the imaged field of view from left to right along line L1 during a first time period 1 called the "horizontal period". When beam 5 reaches the right edge of the field of view, it rapidly returns along diagonal retracing path 6 to the left side of the field of view. This defines a new starting position 4 for beam 5, leaving an unscanned line position 8 (i.e. L2) between scan lines L1 and L3. That is, after scanning along line L1, beam 5 next scans the field of view from left to right along line L3. This process continues until all odd-numbered lines within the field of view have been scanned.
The scan lines shown in Figure 1 are odd-numbered, starting from L1 and incrementing by two up to LN, where "N" is the total number of scan lines within the imaged field of view. When beam 5 reaches the right side of the field after scanning the last line LN, the beam returns along retrace path 7 to starting position 8 at the top right side of the image, interposed between previously scanned lines L1 and L3.
Starting position 8 defines the beginning of the second field. The lines scanned in the second field are those which were skipped during scanning of the first field. That is, as the second field is scanned, beam 5 traverses even-numbered lines L2, L4, etc. interposed between the previously scanned odd-numbered lines L1, L3, etc. The first and second fields are thus said to be "interlaced" with one another. The first field, containing odd-numbered scan lines L1, L3, etc., is called the "odd field"; and, the second field, containing even-numbered scan lines L2, L4, etc., is called the "even field". Time period 2, called the "vertical period", is the product obtained by multiplying the horizontal period 1 by the total number of scan lines "N". In practice, the vertical period includes retrace time intervals 6 and 7.

Figure 2 depicts a complete image frame made up of two interlaced video fields as yielded by the prior art scanning process aforesaid. The first odd field scan line is followed, one field period later, by the first interlaced even field scan line L2, etc. In the example of Figure 2, the last scan line LN is assumed to conclude the even field. After scanning the last even line LN, the scanning process begins again at the start of a fresh odd field as shown by retrace path 15. The ratio of the horizontal image width 10 to the vertical image height 11 is called the "image aspect ratio", which is typically 4 horizontal units by 3 vertical units. Prior art interlaced video scanning systems provide from 525 to more than 625 vertically-spaced scan lines per frame and commonly utilize field rates on the order of 50 to 60 fields per second (i.e. frame rates of 25 to 30 frames per second). Other prior art video scanning systems provide from 1050 to 1250 vertically-spaced scan lines per frame; and, differing aspect ratios, for example 16H x 9V, which are utilized for high-definition television applications.
Figure 3 depicts an odd field of video scanning a field of view through which an object moves from an initial position 35 to a new position 36 after two fields have been imaged (one frame period). The scan lines start at L1 and are odd-numbered through to LN. Scan lines 48 and 49 intersect the object's image at its initial position 35. For illustrative purposes, the object is shown to have moved, such that its new position 36 is displaced from initial position 35 by the equivalent of one scan line. As a result, scan line 47 now intersects the top of the displaced object's image. Assuming the object to be brighter than the imaged background, the luminance signatures returned by the scan lines within the first odd field (shown in the column to the immediate right of the image frame) are composed of dark lines 38 which do not intercept the object's image; and, where such intercepts do occur, lines 39 containing varying representative luminance values. After a frame period 37 (i.e. two field periods) elapses, and given the aforesaid displacement of the object, a second odd field (shown in the column to the far right of the image frame) will be composed of non-intercepting dark lines 40, and intercepting lines 41 which reveal the object's displacement when the luminance values of lines 41 are compared to the luminance values of the corresponding scan lines of the previous odd field.
It will be evident from the foregoing illustration that, through examination and comparison of luminance signatures, the minimum resolvable time increment for detecting displacement of a moving object within a prior art interlaced scanned video image is equivalent to the frame period, if the comparison is based upon successive corresponding video fields. A prior art interlaced video scanning system having a frame rate of 30 frames per second requires 1/30 second to image one frame. A vehicle moving at 100 kilometres per hour travels 27.78 meters per second; or, .93 meters in the time required to image one frame at a frame rate of 30 frames per second. A vehicle moving at 120 kilometres per hour travels 1.1 meters in the time required to image one frame at a frame rate of 30 frames per second. In other words, a high quality optical system focused (vertically downward) over a distance of about 8 meters could image only 9 frames in the time required for a vehicle moving at 100 kilometres per hour to pass completely through the imaged field of view. If the same optical system imaged only 8 frames then the corresponding vehicle speed increases to 108 kilometres per hour; and, at 7 frames, to 123 kilometres per hour. This resolution is too coarse to serve as a practical aid to traffic speed limit enforcement.

Figure 4 depicts an odd field of video scanning a field of view through which an object is moving, to illustrate the present invention's capability of detecting the object within a single (odd or even) field scan. The scanning process, imaged area aspect ratio, and the use of alternately scanned odd/even fields to create an interlaced frame are similar to the prior art described above. However, the scanning raster is rotated by 90°, as hereinafter explained. A moving object (assumed to be more luminous than the imaged background) is shown to be at an initial position 54 within the imaged field of view. Odd field scan lines L1 through LN are depicted. The object moves in a direction 50 which is substantially parallel to the direction in which lines L1 through LN are scanned.
At regular time intervals equivalent to the horizontal scan period 56 (which, because the scanning raster has been rotated by 90° as compared to the Figure 1-3 rasters, is now the time period required to scan any one of lines L1-LN), scan lines L1 through LN exhibit luminance values representative of the luminosity of corresponding portions of the imaged field of view. More particularly, as shown in the row beneath the image frame, these values consist of dark lines 52 which do not intercept the image of the moving object; and, where such intercepts do occur, lines 57 containing luminance values which vary in proportion to the luminosity of the portion of the object intercepted by lines 57. Displacement of a moving object can thus be ascertained within a minimum time interval equivalent to a single horizontal scan period 56 by comparing the positions of the luminance signatures of adjacent scan lines within the horizontal time period. By averaging across several scan lines one may correct for object shape irregularities. Prior art NTSC type interlaced video scanning systems employ a horizontal time period of 63.5 microseconds (μsec). A vehicle moving at 100 kilometres per hour travels only 1.764 millimetres in 63.5 μsec; and, a vehicle moving at 120 kilometres per hour travels only 2.116 millimetres in 63.5 μsec. It can thus be seen that scanning in a direction parallel to the presumed direction of object motion provides a theoretical capability of detecting objects moving at speeds far in excess of those attainable by motor vehicles. This principle underlies the present invention's capability of providing a practical aid to traffic speed limit enforcement, as is now explained in greater detail.
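Before turning to Figure 5, the resolution figures quoted above can be verified directly. A brief arithmetic sketch, using the frame rate, horizontal period and field-of-view depth stated in the text:

```python
def metres_per_interval(speed_kmh, interval_s):
    """Distance travelled during one imaging interval."""
    return speed_kmh / 3.6 * interval_s

FRAME_PERIOD = 1 / 30   # 30 frames per second
LINE_PERIOD = 63.5e-6   # NTSC horizontal period, seconds

# Frame-based velocimetry: roughly a metre of travel per frame ...
print(metres_per_interval(100, FRAME_PERIOD))        # ~0.926 m
print(metres_per_interval(120, FRAME_PERIOD))        # ~1.111 m
# ... versus line-based scanning: about 2 mm of travel per scan line.
print(metres_per_interval(100, LINE_PERIOD) * 1000)  # ~1.764 mm
print(metres_per_interval(120, LINE_PERIOD) * 1000)  # ~2.117 mm

# An 8 m field of view at 100 km/h yields only about 9 usable frames:
print(8 / metres_per_interval(100, FRAME_PERIOD))    # ~8.6, i.e. 9 frames
```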
Figure 5 depicts two colour video cameras 61, 62 arranged so that their respective fields of view 63, 64 cover approximately equal, slightly overlapping portions of a single traffic lane 60 bounded by shoulder 68 and center line 69. Two cameras are required to obtain vehicle license plate images of adequate resolution using reasonably priced, currently available cameras having about 450 lines of horizontal resolution, positioned to look down at an angle on traffic lane 60.
If vehicle velocity were the only parameter of interest, then image resolution factors would not be of particular concern. For example, a single camera could theoretically be positioned to look vertically down on traffic lane 60 from a height sufficient to ensure that vehicles moving at highway speeds remained within the camera's field of view for a time sufficient to facilitate frame-based velocimetry. This would require an impractically tall camera support tower. However, even if a sufficiently tall camera support were available, the camera would be too far away to yield images from which data appearing on vehicle license plates could be ascertained.
A camera vantage point such as a highway overpass lowers the camera to a location from which the camera's resolution facilitates ascertainment of data appearing on vehicle license plates. However, when one considers the impact on image resolution of: (1) the horizontal focus width required to image across an entire traffic lane from a distance at which license plate data can be resolved to some acceptable degree; (2) foreshortening in the vertical plane caused by the camera's angle of inclination relative to traffic lane 60; (3) the fact that a vehicle's license plate may be anywhere on the rear end of the vehicle (i.e. offset to one side or the other); and, (4) the fact that a vehicle may change lanes, further altering the relative location of the vehicle's license plate within the field of view; it turns out that, given reasonably priced, currently available cameras having about 450 lines of horizontal resolution, two cameras are required to image across an entire traffic lane from a distance at which license plate data can be resolved to an acceptable degree.
Cameras 61, 62 acquire optical images and convert the images into corresponding composite colour video signals output on lines 71, 70 respectively. These signals are respectively amplified by unity gain buffers 72, 73. The signals output by buffers 72, 73 are fed on lines 74, 75 (respectively labelled "Camera A Video" and "Camera B Video") to additional circuitry for further processing as hereinafter described. Video clock/synchronization generator 78 outputs horizontal synchronization signals "HSYNC" on line 77; and, vertical synchronization signals "VSYNC" on line 76 for synchronizing the operation of cameras 61, 62 and other portions of the apparatus, as hereinafter explained. Video clock/synchronization generator 78 also outputs a master clock signal "MST CLK" on line 79 which is utilized by other portions of the apparatus, as hereinafter explained.
Figure 6(A) depicts a single-line portion of an NTSC type composite colour video signal waveform which is typical of the signal waveforms output by cameras 61, 62 on lines 71, 70. The illustrated waveform comprises a luminance portion 80 of the actual scan line, a colour burst 82, horizontal synchronization pulses 83, and a reference black level 84. The chrominance information is amplitude-modulated and contained in two sidebands within luminance signal portion 80. These sidebands may be recovered, in known fashion, with the aid of colour burst 82, as phase and frequency references for generating an appropriate sub-carrier.
Figure 6(B) depicts a composite colour video waveform representative of the composite colour video signals output on lines 70, 71. This waveform is composed of the luminance portions 80 of consecutive scan lines, colour reference bursts 82, horizontal synchronization pulses 83, and reference black level 84. Figure 6(B) also depicts the end and therefore bottom of one video field 81, as well as the vertical blanking interval 85 consisting of a number of equalization pulses 86 and other non-image related signals, which occurs before the start and therefore the top of the next video field 87, as is well known in the art.
Figure 7 is a block diagram of the object detection and velocity calculation circuitry employed in the preferred embodiment of the present invention. The Camera A and Camera B composite colour video signals output on lines 74, 75 (Figure 5) are respectively fed to chroma strip circuits 100, 101 which remove all chroma components (colour burst and chrominance sub-carrier sidebands) from the composite video signals, in essence leaving only the luminance component and synchronization pulses of each scanning line intact. The resultant monochrome composite video signals output on lines 102, 103 are fed to the respective inputs of analog switch pairs 109, 110 and 111, 112. The monochrome composite video signals, together with the original colour composite video signals, are also fed on lines 104, 105 to video digitization and acquisition circuitry for further processing as hereinafter explained.
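For illustration, an analogous chroma strip can be sketched in software as a notch filter centred on the chrominance sub-carrier. The sampling rate, filter Q and test signal below are assumptions; the sketch is not a description of circuits 100, 101 themselves:

```python
import numpy as np
from scipy.signal import iirnotch, lfilter

FSC = 3_579_545.0   # NTSC colour sub-carrier, Hz
FS = 4 * FSC        # assumed sampling rate (4x sub-carrier)

# A notch at the sub-carrier attenuates the chrominance sidebands,
# leaving the luminance component of each scan line.
b, a = iirnotch(w0=FSC, Q=2.0, fs=FS)

t = np.arange(0, 63.5e-6, 1 / FS)                  # one scan line
luma = 0.5 + 0.2 * np.sin(2 * np.pi * 15_000 * t)  # slow luminance detail
chroma = 0.2 * np.sin(2 * np.pi * FSC * t)         # modulated colour
monochrome = lfilter(b, a, luma + chroma)          # chroma stripped
```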
Analog switches 109, 110, 111 and 112 are controlled by "entry boundary" and "exit boundary" control signals respectively output by edge gating circuit 106 on lines 107, 108. Edge gating circuit 106 utilizes the aforesaid MST CLK and HSYNC signals to generate the entry and exit boundary signals. The MST CLK signal is selected to be a multiple of the cameras' horizontal scan frequency; its period is accordingly equivalent to the horizontal scan period divided by the achievable horizontal resolution of cameras 61, 62 and their ancillary circuitry. For example, with an NTSC standard horizontal time period of 63.5 μsec and at 640 pixels of camera resolution, a master clock frequency of 10,078,740 Hz would be used. Edge gating circuit 106 is therefore able to generate entry and exit boundary control signals having pulse widths corresponding to a timing resolution of better than 100 nanoseconds (nsec). The entry and exit boundary control signals control analog switches 109, 110 and 111, 112 to selectively connect the monochrome composite video signals supplied on lines 102, 103 to the inputs of integrators 113, 114, 115, 116, 117, 118, 119 and 120 via lines 230, 231, 232 and 233 respectively.
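The stated master clock figure follows directly from the horizontal period and pixel count; a quick arithmetic check:

```python
H_PERIOD = 63.5e-6   # NTSC horizontal scan period, seconds
H_PIXELS = 640       # achievable horizontal resolution, pixels

mst_clk_hz = H_PIXELS / H_PERIOD        # master clock frequency
resolution_ns = H_PERIOD / H_PIXELS * 1e9

print(round(mst_clk_hz))        # 10078740 Hz, as stated in the text
print(round(resolution_ns, 1))  # ~99.2 ns, i.e. better than 100 nsec
```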
Integrators 113, 117 are "entry partial field integrators" ("EPFINT"); integrators 114, 118 are "entry full field integrators" ("EFFINT"); integrators 115, 119 are "exit partial field integrators" ("XPFINT"); and, integrators 116, 120 are "exit full field integrators" ("XFFINT"). As hereinafter explained, each integrator "groups" (i.e. averages) its respective input signal over a time period defined by the particular integrator's time constant. This determines the sizes of objects and relative motions which may trigger further operation of the circuitry. For example, it is desirable to limit the circuit's sensitivity to avoid false triggering by small objects such as the passage of leaves or birds through the fields of view imaged by cameras 61, 62. Similarly, false triggering by long time duration events such as the movement of shadows as the sun traverses the sky is desirably eliminated.
The outputs of partial field integrators 113, 115, 117 and 119 are respectively fed on lines 121, 123, 125 and 127 to the positive input terminals of comparators 129, 130, 131 and 132. The outputs of full field integrators 114, 116, 118 and 120 are respectively fed on lines 122, 124, 126 and 128 to the negative input terminals of comparators 129, 130, 131 and 132.
Comparator 129 outputs a binary logic "entry detection" ("ENTRYDET") signal on line 133. Comparator 130 outputs a binary logic "exit detection" ("EXITDET") signal on line 134. The logic signals output on lines 133, 134 define the crossings of predefined notional entry and exit boundaries by moving objects imaged by camera 62; signal components representative of such objects being present in the monochrome composite video signal provided on line 102. The notional entry and exit boundaries simulate the presence of a pair of lines arranged transversely across traffic lane 60 and spaced apart by a known distance "D". Accordingly, determination of the times Te and Tx at which an object imaged by camera 62 crosses the two notional boundaries facilitates direct calculation of that object's velocity "V" according to the formula V = D/(Tx - Te).
Comparators 131, 132 respectively output additional binary logic entry and exit detection signals on lines 135, 136. The logic signals output on lines 135, 136 also define the crossings of the notional entry and exit boundaries by moving objects imaged by camera 61; signal components representative of such objects being present in the monochrome composite video signal provided on line 103. The notional entry and exit boundaries facilitate determination, as aforesaid, of the velocities of objects imaged by camera 61.
The logic signals output on lines 133, 134 are fed to a "boundary intercept gating, timers and averagers" logic circuit 137 which outputs data on a digital data bus 139 of appropriate width. Logic circuit 137 also outputs an "object detect flag" binary logic signal on line 140, and receives a "read data enable" binary logic signal on line 141. The logic signals facilitate placement, on bus 139, of data indicative of the velocity of a moving object which traverses the notional entry and exit boundaries defined with respect to the camera "A" (i.e. camera 62) image field.
Similarly, the logic signals output on lines 135, 136 are fed to another "boundary intercept gating, timers and averagers" logic circuit 138 which also outputs data on bus 139. Logic circuit 138 also outputs an "object detect flag" binary logic signal on line 142, and receives a "read data enable" binary logic signal on line 143. The logic signals facilitate placement, on bus 139, of data indicative of the velocity of a moving object which traverses the notional entry and exit boundaries defined with respect to the camera "B" (i.e. camera 61) image field. Data bus 139, output lines 140, 142 and input lines 141, 143 are connected to appropriate inputs and outputs of a digital signal processing means, such as a microprocessor, as hereinafter described.
The method of extracting object velocity from a video signal in accordance with the invention will now be explained for one video channel (camera "A"), with the aid of the waveforms illustrated in Figures 8 and 9, and with further reference to Figure 7. Figure 8(A) depicts a waveform 232 typical of the HSYNC signal provided on line 77. Figure 8(B) depicts a waveform 234 typical of the MST CLK signal provided on line 79. As previously explained, the frequency of the MST CLK signal will be a multiple of the horizontal scan frequency.

Figures 8(C) and 8(D) respectively depict waveforms 235, 236 typical of the entry and exit boundary control signal pulses provided on lines 107, 108. It can be seen that both the starting time and the duration of the entry and exit control signal pulses are determined by counting a number of pulses of the MST CLK signal, relative to the beginning of each new horizontal synchronization interval 239 (Figure 8(A)). In particular, the widths of the pulses depicted in Figures 8(C) and 8(D) determine the relative widths of the respective entry and exit boundaries; and the time displacement between the falling edge of the Figure 8(C) pulse and the rising edge of the Figure 8(D) pulse determines the distance between the notional entry and exit boundaries.
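Because boundary placement is expressed as MST CLK counts within each scan line, pixel coordinates map directly onto gate pulse timing. A small sketch; the boundary positions and widths used here are assumed example values, not taken from the specification:

```python
H_PERIOD = 63.5e-6         # horizontal scan period, seconds
H_PIXELS = 640             # pixels per line = MST CLK counts per line
CLK = H_PERIOD / H_PIXELS  # one master clock period, seconds

def gate_window(start_px, width_px):
    """Gate pulse start time and duration within a scan line, given a
    boundary position and width in pixels (MST CLK counts)."""
    return start_px * CLK, width_px * CLK

# Assumed example: a 32-pixel entry boundary at pixel 80 and a
# 32-pixel exit boundary at pixel 480, i.e. 400 pixels apart.
entry_start, entry_width = gate_window(80, 32)
exit_start, exit_width = gate_window(480, 32)
print(entry_start * 1e6, exit_start * 1e6)  # gate delays in usec
```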
The entry and exit boundary control signal pulses provided on lines 107, 108 are applied to analog switches 109, 110 respectively. The monochrome composite video signal input on line 102 to analog switches 109, 110 is illustrated by waveform 240 shown in Figure 8(E). Waveform 240 comprises a luminance portion 80 disposed between two horizontal synchronization pulses 83, and a reference black level 84 which is devoid of the colour burst information normally present in a composite colour video signal. Analog switch 109 outputs, on line 230, a gated form of the monochrome composite video signal received on line 102. A waveform 237 typical of the gated signal output by analog switch 109 is illustrated in Figure 8(F). Similarly, analog switch 110 outputs, on line 231, another gated form of the same monochrome composite video signal. A waveform 238 typical of the gated signal output by analog switch 110 is illustrated in Figure 8(G).
It can be seen that the gated video signals output by switches 109, 110 on lines 230, 231 are selected luminance portions of the monochrome composite video signal provided on line 102. The starting time and duration of each selected portion lies within a horizontal scan interval, as determined by the respective entry and exit boundaries.
The gated entry boundary video signal output on line 230 is fed to the inputs of entry partial and full field integrators 113, 114. The time constant of entry partial field integrator 113 is chosen to be on the order of a multiple number of horizontal scan line periods and its output to line 121 is clamped at a maximum level. The time constant of entry full field integrator 114 is chosen to be on the order of the duration of a field period and its output to line 122 is also clamped at a maximum level.
Figure 9(A) depicts a waveform typical of the signal appearing at output 121 of integrator 113. When the imaged field contains only signals representative of a low luminance background, output 121 remains at a low level 152. As a luminous object enters the imaged field, output 121 rises along a sloped rise interval 154 to a clamped peak value 169 during a time period proportional to the product of the integration time constant and the luminance value of the gated entry boundary video signal on line 230. Output 121 remains at the maximum clamped value 169 for an interval proportional to the time required for the imaged luminous object to pass through the notional entry boundary.
The time constant of integrator 113 is selected such that a number of consecutive gated entry boundary video signal pulses are averaged to yield integrator 113's output signal. For example, the time constant could be such that 20% of the total number "N" of scan lines are averaged to yield integrator 113's output signal. The gated entry boundary video signal pulses are thus averaged over some selected portion of the imaged field. This averaging offsets aberrations that could otherwise be caused by object shape irregularities. Integrator 113's time constant also determines the fundamental time resolution achievable for purposes of determining moving object velocity.
When the luminous object leaves the notional entry boundary, the output 121 of integrator 113 falls back to the background level 152 along a sloped fall interval 155 during a time period proportional to the product of the integration time constant and the background luminance level.
Figure 9(B) depicts a waveform typical of the signal appearing at output 122 of integrator 114. When the imaged field contains only signals representative of a low luminance background, output 122 remains at a low level 156. As a luminous object enters the imaged field, output 122 rises along a sloped rise interval 157 to a clamped peak value 158 during a time period proportional to the product of the integration time constant and the luminance value of the gated entry boundary video signal on line 230. The time constant of integrator 114 is selected such that the output signal produced by integrator 114 represents the average of substantially all gated entry boundary video signal pulses input to integrator 114 throughout an entire field period. The gated entry boundary video signal pulses are thus averaged over the majority of the imaged field to yield a reference signal representative of the background luminance level within the imaged field of view. When the luminous object leaves the predefined entry boundary, the output 122 of integrator 114 falls back to the background level 156 along a sloped fall interval 159 during a time period proportional to the product of the integration time constant and the background luminance level.
The outputs 121, 122 of integrators 113, 114 are respectively fed to the positive and negative inputs of comparator 129 which detects moving objects within the predefined entry boundary. In particular, comparator 129 outputs, on line 133, a binary logic signal having a waveform as illustrated in Figure 9(C). If no object is present within the predefined entry boundary the signal output on line 133 remains at a low level 174. But, if integrator 113's output 121 exceeds integrator 114's output 122 (as illustrated at time 160 in Figures 9(A) and 9(B)) then the logic signal output by comparator 129 on line 133 switches to a high level 163, exhibiting a positive-going transition edge 162. The logic signal output by comparator 129 on line 133 remains at high level 163 as long as integrator 113's output 121 exceeds integrator 114's output 122, but falls back to low level 174, exhibiting a negative-going transition edge 164, at the point in time 161 when integrator 114's output 122 exceeds integrator 113's output 121.
It can thus be seen that the logic signal output by comparator 129 on line 133 exhibits a high level pulse during a time interval which is proportional to the sum of the image width of the predefined entry boundary and the length of a luminous object traversing that boundary; and, which is inversely proportional to the velocity of the luminous moving object.
The gated exit boundary video signal output on line 231 is fed to the inputs of the exit partial and full field integrators 115, 116. The time constant of integrator 115 is chosen to be on the order of a multiple number of horizontal scan line periods and its output to line 123 is clamped at a maximum level. The time constant of integrator 116 is chosen to be on the order of the duration of a field period and its output to line 124 is also clamped at a maximum level.
Figure 9(D) depicts a waveform typical of the signal appearing at output 123 of integrator 115. When the imaged field contains only signals representative of a low luminance background, output 123 remains at a low level 165. As a luminous object enters the imaged field, output 123 rises along a sloped rise interval 166 to a clamped peak value 168 during a time period proportional to the product of the integration time constant and the luminance value of the gated exit boundary video signal on line 231. Output 123 remains at the maximum clamped value 168 for an interval proportional to the time required for the imaged luminous object to pass through the notional exit boundary.
The time constant of integrator 115 is selected such that a number of consecutive gated exit boundary video signal pulses are averaged to yield integrator 115's output signal. As explained above in relation to entry partial field integrator 113, this results in averaging of the gated exit boundary video signal pulses over a portion of the imaged field, thus overcoming object shape irregularities. Integrator 115's time constant also determines the fundamental time resolution achievable for the purposes of determining moving object velocity.
When the luminous object leaves the notional exit boundary, the output 123 of integrator 115 falls back to the background level 165 along a sloped fall interval 167 during a time period proportional to the product of the integration time constant and the background luminance level.
Figure 9(E) depicts a waveform typical of the signal appearing at output 124 of integrator 116. When the imaged field contains only signals representative of a low luminance background, output 124 remains at a low level 170. As a luminous object enters the imaged field, output 124 rises along a sloped rise interval 171 to a clamped peak value 172 during a time period proportional to the product of the integration time constant and the luminance value of the gated exit boundary video signal on line 231. The time constant of integrator 116 is selected such that the output signal produced by integrator 116 represents the average of substantially all gated exit boundary video signal pulses input to integrator 116 throughout an entire field period. The gated exit boundary video signal pulses are thus averaged over the majority of the imaged field to yield a reference signal representative of the background luminance level within the imaged field of view.
In practice, partial field integrators 113, 115 will typically have identical time constants of suitably short duration to average some selected portion of the imaged field (i.e. about 20%). The time constants need not be identical if suitable signal processing techniques are employed to offset differences in the time constants. Full field integrators 114, 116 will also typically have identical time constants which are preferably of suitably long duration to average substantially all gated entry boundary video signal pulses input to the respective full field integrators throughout an entire field period. Again, the time constants need not be identical if suitable signal processing techniques are employed to offset differences in the time constants. When the luminous object leaves the notional exit boundary, the output 124 of integrator 116 falls back to the background level 170 along a sloped fall interval 173 during a time period proportional to the product of the integration time constant and the background luminance level.
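As a rough numeric check of these choices (assuming an NTSC field of about 262 lines; the 20% figure is the example given above):

```python
H_PERIOD = 63.5e-6   # horizontal scan period, seconds
FIELD_LINES = 262    # approximate scan lines per NTSC field

partial_tc = 0.20 * FIELD_LINES * H_PERIOD  # partial field integrator
full_tc = FIELD_LINES * H_PERIOD            # full field integrator

print(partial_tc * 1e3)  # ~3.3 ms: several scan lines' worth
print(full_tc * 1e3)     # ~16.6 ms: on the order of one field period
```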
The outputs 123, 124 of integrators 115, 116 are respectively fed to the positive and negative inputs of comparator 130 which detects moving objects within the predefined exit boundary. In particular, comparator 130 outputs, on line 134, a binary logic signal having a waveform as illustrated in Figure 9(F). If no object is present within the predefined exit boundary the signal output on line 134 remains at a low level 185. But, if integrator 115's output 123 exceeds integrator 116's output 124 (as illustrated at time 177 in Figures 9(D) and 9(E)) then the logic signal output by comparator 130 on line 134 switches to a high level 153, exhibiting a positive-going transition edge 175. The logic signal output by comparator 130 on line 134 remains at high level 153 as long as integrator 115's output 123 exceeds integrator 116's output 124, but falls back to low level 185, exhibiting a negative-going transition edge 176, at the point in time 178 when integrator 116's output 124 exceeds integrator 115's output 123.
It can thus be seen that the logic signal output by comparator 130 on line 134 exhibits a high level pulse during a time interval which is proportional to the sum of the image width of the predefined exit boundary and the length of a luminous object traversing that boundary, and is inversely proportional to the velocity of the moving luminous object.
The "entry time interval" Te, between the posi¬ tive-going transitions 162, 175 of the Figure 9(C) and 9(F) wave orms is proportional to the image distance Db between the leading edges of the entry and exit boundaries, and is inversely proportional to the velocity of the moving object. With prior knowledge of this notionally defined image distance, one can determine the velocity of the moving object by measuring time interval Te and applying the formula Ve=Db/Te to determine the entry velocity Ve at which the moving object enters the imaging field of view.
The "exit time interval" Tχ, between the nega¬ tive-going transitions 164, 176 of the Figure 9(C) and 9(F) waveforms is proportional to the image distance Dt between the trailing edges of the entry and exit boundaries, and is inversely proportional to the velocity of the moving object. Again, with prior knowledge of this notionally defined image distance, one can determine the velocity of the moving object by measuring time interval Tx and applying the formula Vχ=Dt/Tχ to determine the exit velocity Vχ at which the moving obj ct exits the imaging field of view.
The formula Va = (Ve + Vx)/2 may then be applied to determine the moving object's average velocity Va as it traverses the imaging field of view and to compensate for illumination distortion and/or possible changes in the object's velocity as it traverses the field of view. The formula Va = 2D/(Te + Tx), where D=Db=Dt, may alternatively be used; this yields the harmonic rather than the arithmetic mean of Ve and Vx, and the two formulas agree when Te = Tx.
The physical length Ot of the moving object can be determined by measuring the time interval Tt between the positive-going transition 175 and the negative-going transition 164, and utilizing the previously calculated average velocity Va in applying the formula Ot = Tt x Va.
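The foregoing formulas may be collected as in the following sketch; the transition-time arguments and the sample figures are hypothetical measurements, not values taken from the drawings.

```python
def velocities_and_length(t162, t175, t164, t176, db, dt):
    # Transition times (in seconds) are named after the waveform edges
    # discussed above; db and dt are the predefined leading and trailing
    # edge image distances (in metres).
    te = t175 - t162            # entry time interval Te
    tx = t176 - t164            # exit time interval Tx
    ve = db / te                # Ve = Db/Te
    vx = dt / tx                # Vx = Dt/Tx
    va = (ve + vx) / 2.0        # Va = (Ve + Vx)/2
    ot = (t164 - t175) * va     # Ot = Tt x Va, where Tt = t164 - t175
    return ve, vx, va, ot

# Hypothetical measurements with Db = Dt = 5 m and edges 0.10 s apart:
print(velocities_and_length(0.00, 0.10, 0.30, 0.40, 5.0, 5.0))
# -> approximately (50.0, 50.0, 50.0, 10.0): 50 m/s and a 10 m object
```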
As shown in Figure 7, the binary logic signals output by comparators 129, 130 on lines 133, 134 are fed to boundary intercept gating, timing and averaging logic circuit 137, which measures entry and exit time intervals Te, Tx to a desired degree of resolution, and outputs on data bus 139 a digital representation of their average (i.e. (Te + Tx)/2). Circuit 137 also outputs, on line 140, a binary logic "object detect flag" to indicate detection of a moving object within the imaged field. This flag is preferably produced as the moving object leaves the notional entry boundary region. Circuit 137 outputs the data aforesaid onto data bus 139, which is of sufficient width to convey the data to an ancillary processing means such as a microprocessor. The microprocessor may read the data supplied on bus 139 by applying an appropriate strobe signal to the "read data enable" line 141 of circuit 137.
The method and apparatus described above for the camera 62 (i.e. camera "A") video channel is identical in function and operation to the circuitry associated with the camera 61 (i.e. camera "B") video channel, as shown in Figure 7. The waveforms illustrated in Figures 8(A) to 8(G) and Figures 9(A) to 9(F) apply to the operation of the camera "B" video channel in the same manner described above in relation to the camera "A" channel.
Figure 10 is a block circuit diagram of various means for manipulating, processing, storing and communicating the determined velocity and acquired image data produced by the present invention. Video feeds 104, 105 shown in Figure 10 convey the camera "A" and camera "B" monochrome and colour composite video signals from the corresponding output feeds 104, 105 shown in Figure 7. These four video signals are fed to four channel frame grabber/two channel video output buffer 190. Grabber/buffer 190 thus receives the monochrome camera "A" video signal on line 102 (Figures 7 and 10), the colour camera "A" video signal on line 74, the monochrome camera "B" video signal on line 103, and the colour camera "B" video signal on line 75. Grabber/buffer 190 also receives the VSYNC and HSYNC signals on lines 76, 77 respectively. By appropriate digital sampling and quantization of its four input channels, grabber/buffer 190 acquires two consecutive odd/even fields of video for each input channel.
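A sketch of how two consecutive fields might be woven into one frame after digitization follows; the array shapes and the odd-lines-first convention are assumptions, as the internal organization of the grabber/buffer is not specified here.

```python
import numpy as np

def weave_fields(odd_field, even_field):
    # Interleave two consecutive video fields into one full frame, as the
    # grabber/buffer might after sampling and quantizing a channel.
    h, w = odd_field.shape
    frame = np.empty((2 * h, w), dtype=odd_field.dtype)
    frame[0::2] = odd_field      # odd scan lines
    frame[1::2] = even_field     # even scan lines
    return frame
```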
The digital image data produced by grabber/buffer 190 is fed over video data bus 191 to a suitable digital processing means such as microprocessor 193, which may be suitably programmed to perform object extraction, comparison, manipulation and similar operations on the image data in accordance with known techniques. For example, microprocessor 193 may be programmed in known fashion to extract from a video image of a motor vehicle the image portion containing the vehicle's license plate and to further process the license plate image to identify the vehicle's license information.
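A minimal sketch of the cropping step only is given below; the fixed coordinates are hypothetical, and locating the plate in the first place would use the known object-extraction techniques referred to above.

```python
import numpy as np

def extract_plate_region(frame, rows, cols):
    # Crop the sub-image expected to contain the licence plate.
    (r0, r1), (c0, c1) = rows, cols
    return frame[r0:r1, c0:c1]

frame = np.zeros((480, 640), dtype=np.uint8)   # placeholder video frame
plate = extract_plate_region(frame, (300, 340), (250, 420))
```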
Microprocessor 193 may also be programmed to return processed and/or manipulated video data to grabber/buffer 190 on video data bus 191. Data exchange between microprocessor 193 and grabber/buffer 190 is controlled by programming microprocessor 193 to apply suitable control signals to grabber/buffer 190 on control line 192. For example, microprocessor 193 may return to grabber/buffer 190, for subsequent storage by video cassette recorder (VCR) 199, a scaled, rotated, cropped and/or magnified portion or portions of the image originally supplied to microprocessor 193 by grabber/buffer 190. Similarly, microprocessor 193 may return to grabber/buffer 190 image data such as characters or graphics generated by microprocessor 193 and superimposed or inserted into a portion or portions of the original image data.
Grabber/buffer 190 outputs video data received from microprocessor 193 as two separate buffered composite video signals, namely a monochrome composite video signal output on line 195, and a colour composite video signal output on line 194. The monochrome composite video output signal may for example be a magnified view of the license plate portion of a motor vehicle image; and, the colour composite video output signal may be a view of the traffic lane being imaged. The two views are composed by microprocessor 193 from video data acquired as aforesaid by grabber/buffer 190 from the camera "A" and camera "B" video signals. Microprocessor 193 is capable of splicing together portions of the digital video data representative of the images obtained from camera "A" and camera "B" to yield a full traffic lane width image. Microprocessor 193 may also insert or superimpose textual or graphics data onto either of the two views, such as data representative of time, date, location, velocity, etc.
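The splicing and annotation operations might be sketched as follows; equal image heights, a seamless join and a pre-rendered text block are all assumptions.

```python
import numpy as np

def splice_lane(image_a, image_b):
    # Join the camera "A" and camera "B" views side by side to yield a
    # full traffic lane width image.
    return np.hstack((image_a, image_b))

def annotate(frame, row, col, block):
    # Crude stand-in for character/graphics insertion: superimpose a
    # pre-rendered pixel block (e.g. time, date, velocity text).
    h, w = block.shape
    frame[row:row + h, col:col + w] = block
    return frame
```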
The video signals output on lines 194, 195 are fed to video selector/mixer 197, which is capable of selecting an input video signal for output on line 198, which is in turn fed to the composite video input of VCR 199. The operation of video selector/mixer 197 is programmably controlled by microprocessor 193 via control line 196. VCR 199 is also programmably controlled by microprocessor 193, via VCR control interface 206, through suitable signals applied on lines 202, 207 (line 207 may be an infrared link). VCR control interface 206 receives function data from microprocessor 193 on data bus 203.
A digital memory means 205 is coupled to microprocessor 193 in conventional fashion via data bus 203, address line 201 and read/write control line 204. Memory 205 stores program data, as well as any variable data utilized by microprocessor 193 in controlling the operation of the preferred embodiment.
Microprocessor 193 is suitably programmed to read the camera "A" and camera "B" object detect flag signals input on lines 140, 142. It will be recalled that these signals indicate detection of moving objects within the imaging fields of view of cameras "A" and "B" respectively. Microprocessor 193 is also programmed to read the timing data applied to data bus 139 by logic circuits 137, 138 as described above in relation to Figure 7. Microprocessor 193 reads the timing data by applying suitable enable binary logic signals to lines 141, 143 respectively. By reading the timing data, and by retrieving from memory 205 predefined parametric data defining the entry/exit boundary image distances Db, Dt, microprocessor 193 can perform the previously described calculations in order to determine object velocity.
Microprocessor 193 may also be programmed to compensate for image field distortion (due to angular offsets in the camera imaging views) by applying a suitable compensation algorithm or by applying data from a table of predefined compensation factors stored in memory 205. Microprocessor 193 may also be programmed to compare data extracted from vehicle license plate images with data defining sought after, suspect or stolen motor vehicle license plate identifiers stored in memory 205. If a match is detected then microprocessor 193 may output a suitable alarm signal.
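The comparison against stored identifiers might look like the following sketch; exact string matching is an assumption, and a practical matcher would tolerate OCR errors in the extracted plate text.

```python
def plate_alarm(plate_text, watch_list):
    # Normalize an extracted plate string and test it against stored
    # suspect/stolen identifiers.
    normalized = "".join(ch for ch in plate_text.upper() if ch.isalnum())
    return normalized in watch_list

watch_list = {"ABC123", "XYZ789"}              # hypothetical memory 205 contents
if plate_alarm("abc 123", watch_list):
    print("ALARM: suspect vehicle detected")   # e.g. trigger an RF alert
```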
Microprocessor 193 may be provided with means for transmitting and/or receiving digital data over radio frequencies (RF). To this end there is provided an RF transceiver 213, which is capable of sending (or receiving) RF signals to (or from) antenna 215 on line 214. Transceiver 213 is programmably controlled by microprocessor 193 via suitable signals applied on radio control line 217. Transceiver 213 is coupled to modem 210 via receive and transmit signal lines 212, 211 respectively. Modem 210 is programmably controlled by microprocessor 193 via suitable signals applied on modem control line 208, and sends (or receives) digital data to (or from) microprocessor 193 over data bus 209. This RF data interface may be used to transmit data and/or alarm messages to a remote monitoring point. For example, police vehicles may be equipped with receivers tuned to receive data and/or alarm messages transmitted by transceiver 213, thereby alerting the authorities to the presence of a vehicle bearing a license plate identifier matching data in the stored suspect/stolen vehicle data mentioned previously. The RF data interface could also be used to remotely update or change the operational and/or program parameters which control the operation of microprocessor 193. For example, the suspect/stolen vehicle and other data stored in memory 205 could be remotely updated by RF transmission to transceiver 213.
It can thus be seen that microprocessor 193 is capable of controlling the acquisition, processing and manipulation of video data; generating video signals representative of desired images; and, storing such images along with relevant data on suitable media such as magnetic tape. Such stored images and data may then be physically retrieved for non-real time analysis and processing purposes, used for archival or investigative purposes, etc. Further, radio frequency based communications may be used to convey data to and from the apparatus.
It will be understood that the described polarity or magnitude of analog signals, the polarity of the various logic signals and the use of analog or digital processing means are for descriptive purposes only. The described use of means for detecting objects more luminous than the static background image is not meant to exclude detection of objects which are less luminous than the static background image, nor is it intended to exclude the use of bi-directional means for sensing moving objects which are either more or less luminous than the static background image. Such bi-directional sensing means could be employed to sense rapid luminance changes caused by movement of objects in either direction through the viewed image field.
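Such a bi-directional test could be as simple as comparing the magnitude of the difference between the fast and slow integrator outputs against a threshold, as in this sketch (the threshold value is hypothetical):

```python
def bidirectional_pulse(partial_out, full_out, threshold=0.05):
    # High when the partial field average departs from the full field
    # average in either direction, i.e. for objects either brighter or
    # darker than the static background.
    return abs(partial_out - full_out) > threshold
```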
Although specific components and arrangements of elements have been set out in the above description of the preferred embodiment of the invention, other suitable equivalent components and arrangements may be used with satisfactory results and varying degrees of quality, and persons skilled in the art may make other obvious modifications to enhance the construction of the invention and thereby increase its utility.
For example, although the invention has been described in the context of a system capable of determining both vehicle velocity and license plate information, those skilled in the art will realize that license plate information may be acquired by alternate means not involving video image processing. If the video processing system need not acquire license plate data then there will be greater freedom in selection of the vantage point from which the video camera(s) image the passing stream of vehicles. The invention could for example be employed in the style of the conventional "radar gun" to image vehicles passing a police car or other monitoring point at the side of a highway. In such case the camera's field of view overlaps a portion of the highway immediately adjacent the monitoring point (i.e. vehicles travelling along the highway pass through the field of view from left to right; or, from right to left). The camera would be positioned so that its scan lines were oriented parallel to the direction in which vehicles pass through the field of view, as described above.
As another example, those skilled in the art will understand that the analog circuitry described above in relation to the preferred embodiment could be replaced by equivalent digital circuitry, or by a suitably programmed combination of digital signal sampling and processing circuitry. Given present day capabilities and costs of analog versus digital circuit components, the analog embodiment is presently the cheaper of the two. One difference between the analog and digital embodiments is that the sequence in which the signals are processed need not be the same in both embodiments. For example, the above-described analog embodiment contemplates sequential performance of signal gating, averaging and comparison steps. An equivalent digital embodiment could digitally quantize and sample the signals output by the video camera(s) and perform direct comparisons of selected signal components without any averaging step, per se.
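Such a direct digital comparison might be sketched as follows; the column ranges, stored background estimate and margin are all hypothetical, and no averaging stage appears.

```python
import numpy as np

def digital_detect(frame, entry_cols, exit_cols, background, margin=20):
    # Compare gated pixel columns of a quantized frame against a stored
    # background estimate, flagging each boundary independently.
    bg = background.astype(int)            # avoid unsigned overflow
    entry_hit = np.any(frame[:, entry_cols] > bg[:, entry_cols] + margin)
    exit_hit = np.any(frame[:, exit_cols] > bg[:, exit_cols] + margin)
    return entry_hit, exit_hit
```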
It will be understood that such changes of details, materials, arrangements of parts, and uses of the invention described and illustrated herein are intended to be included within the principles and scope of the claimed invention, as defined by the following claims.

Claims

WHAT IS CLAIMED IS:
1. A method of determining the velocity "V" of a moving object, comprising:
(a) defining a path along which said object is expected to move in a particular direction;
(b) defining a field of view overlapping a portion of said path, said field of view comprising a plurality of scan lines oriented parallel to said direction;
(c) scanning said field of view along each of said scan lines to produce a corresponding plurality of signals representative of the relative luminance of each image portion intersected by said respective scan lines;
(d) defining entry and exit boundaries which transversely intersect said scan lines, said boundaries defining a separation distance D;
(e) gating said signals during first and second time intervals corresponding to the respective widths of said entry and exit boundaries;
(f) comparing selected groups of said gated signals to detect signal components in one of said groups which exceed signal components in another of said groups;
(g) producing an entry detection signal upon detection, within one selected group of said gated entry boundary signals, of signal components which exceed signal components in another selected group of said gated entry boundary signals;
(h) producing an exit detection signal upon detection, within one selected group of said gated exit boundary signals, of signal components which exceed signal components in another selected group of said gated exit boundary signals;
(i) determining the time differential T between the respective occurrences of said entry and exit detection signals; and,
(j) determining said velocity as V = D/T.
2. A method as defined in claim 1, further comprising, before said comparing step (f), grouping said gated signals by integration over third and fourth time intervals.
3. A method as defined in claim 2, wherein said third time interval is much shorter than said fourth time interval.
4. A method as defined in claim 3, wherein:
(a) said third time interval is the time required to scan several of said scan lines; and,
(b) said fourth time interval is the time required to scan substantially all of said scan lines.
5. A method as defined in claim 4, wherein said determining step (i) further comprises:
(a) determining the time interval Te between:
(i) initial detection within said gated entry boundary signals of signal components in one of said groups of gated entry boundary signals which exceed signal components in another one of said groups of gated entry boundary signals; and,
(ii) initial detection within said gated exit boundary signals of signal components in one of said groups of gated exit boundary signals which exceed signal components in another one of said groups of gated exit boundary signals;
(b) determining the time interval Tx between:
(i) the time, during said detection within said selected gated entry boundary of signal components in one group exceeding signal components in another group, at which said exceeding relationship ceases; and,
(ii) the time, during said detection within said selected gated exit boundary of signal components in one group exceeding signal components in another group, at which said exceeding relationship ceases.
6. A method as defined in claim 5, wherein said velocity determining step (j) further comprises:
(a) defining Db as the distance between the leading edges of said respective entry and exit boundaries;
(b) defining Dt as the distance between the trailing edges of said respective entry and exit boundaries; and,
(c) determining the entry velocity Ve at which said object intersects said leading edges as Ve=Db/Te;
(d) determining the exit velocity Vx at which said object intersects said trailing edges as Vx=Dt/Tx; and,
(e) determining said velocity V as the average of said velocities Ve and Vx.
7. A method as defined in claim 5, further comprising:
(a) determining the time interval Tt between said time intervals Te and Tx; and,
(b) determining said object's physical length as Ot = Tt x V.
8. A method as defined in claim 1, further comprising:
(a) defining a second field of view overlapping a second portion of said path, said second field of view comprising another plurality of scan lines oriented parallel to said direction; and,
(b) performing said steps (c) through (j) with respect to said second field of view simultaneously with the performance of said steps (c) through (j) with respect to said first field of view.
9. A method as defined in claim 1, 2, 3, 4, 5 or 6, wherein said object is a motor vehicle and wherein said scanning step is performed at a location displaced to the rear of said vehicle from which license plate data on said vehicle can be resolved to an acceptable degree within said field of view.
10. A method as defined in claim 8 wherein said object is a motor vehicle and wherein said scanning step is performed at a location displaced to the rear of said vehicle from which license plate data on said vehicle can be resolved to an acceptable degree within said first or second fields of view or within a selected combination thereof.
11. Apparatus for determining the time T in which a moving object traverses a distance D between first and second boundaries which transversely intersect the path (60) along which said object moves in a particular direction (50), said apparatus comprising:
(a) scanning means (62, 61) for scanning said path in a direction parallel to said particular direction and for producing a plurality of signals (L1, L2, ... LN) representative of the relative luminance of a corresponding plurality of portions of said scanned path;
(b) gating means (106, 109, 110, 111, 112) for gating said signals during first and second time intervals;
(c) comparison means (129, 130, 131, 132) for comparing selected groups of said gated signals (230, 231, 232, 233) and for producing signals (133, 134, 135, 136) representative of detection of signal components in one of said groups which exceed signal components in another of said groups; and,
(d) timing means (78, 137) for determining said time T as the time differential between production of first and second signals output by said comparison means.
12. Apparatus as defined in claim 11, further comprising signal grouping means (113, 114, 115, 116, 117, 118, 119, 120) coupled between said gating means and said comparison means, said signal grouping means for grouping said gated signals (230, 231, 232, 233) during selected time intervals for input to said comparison means.
13. Apparatus as defined in claim 12, wherein said signal grouping means averages said gated signals during said selected time intervals.
14. Apparatus for determining the time T in which a moving object traverses a distance D between first and second boundaries which transversely intersect the path (60) along which said object moves in a particular direction (50), said apparatus comprising:
(a) scanning means (62) for scanning said path in a direction parallel to said particular direction and for producing a plurality of signals (L1, L2, ... LN) representative of the relative luminance of a corresponding plurality of portions of said scanned path;
(b) first and second gating means (109, 110) for gating said signals during first and second time intervals;
(c) first and second comparison means (129, 130) for comparing selected groups of said gated signals and for producing signals (133, 134) representative of detection of signal components in one of said groups which exceed signal components in another of said groups; and,
(d) timing means (78, 137) for determining said time T as the time differential between production of first and second signals output by said comparison means.
15. Apparatus as defined in claim 14, further comprising:
(a) first and second signal grouping means (113, 114) coupled between said first gating means (109) and said first comparison means (129) , said first and second signal grouping means for grouping said gated signals (230) output by said first gating means during a third time interval for input to said first comparison means; and,
(b) third and fourth signal grouping means (115, 116) coupled between said second gating means (110) and said second comparison means (130) , said third and fourth signal grouping means for grouping selected numbers of said gated signals
(231) output by said second gating means during a fourth time interval for input to said second comparison means.
16. Apparatus as defined in claim 15, wherein:
(a) said first and second signal grouping means respectively average said gated signals output by said first gating means during said third time interval; and,
(b) said third and fourth signal grouping means respectively average said gated signals output by said second gating means during said fourth time interval.
17. Apparatus as defined in claim 15, wherein said third time interval is much shorter than said fourth time interval.
18. Apparatus as defined in claim 15, wherein:
(a) said third time interval is the time required to scan several of said path portions; and,
(b) said fourth time interval is the time required to scan substantially all of said path portions.
19. Apparatus for determining the time T in which a moving object traverses a distance D between first and second boundaries which transversely intersect the path (60) along which said object moves in a particular direction (50), said apparatus comprising:
(a) first and second scanning means (62, 61) for scanning first and second segments of said path in a direction parallel to said particular direction and for producing first and second pluralities of signals (L1, L2, ... LN) representative of the relative luminance of corresponding pluralities of portions of said scanned path segments;
(b) first and second gating means (109, 110) for gating said first plurality of signals during first and second time intervals and for producing first and second gated output signals (230, 231);
(c) third and fourth gating means (111, 112) for gating said second plurality of signals during said first and second time intervals respectively and for producing third and fourth gated output signals (232, 233);
(d) first, second, third and fourth comparison means (129, 130, 131, 132) for respectively comparing selected groups of said first, second, third and fourth gated output signals and for producing first, second, third and fourth signals (133, 134, 135, 136) respectively representative of detection, in said gated output signals, of signal components in one of said groups which exceed signal components in another of said groups; and,
(e) timing means (78, 137, 138) for determining said time T as the time differential between production of first and second signals output by said first and second comparison means or output by said third and fourth comparison means.
20. Apparatus as defined in claim 19, further comprising:
(a) first and second signal grouping means (113, 114) coupled between said first gating means (109) and said first comparison means (129), said first and second signal grouping means for grouping said gated signals (230) output by said first gating means during third and fourth time intervals for input (121, 122) to said first comparison means;
(b) third and fourth signal grouping means (115, 116) coupled between said second gating means (110) and said second comparison means (130), said third and fourth signal grouping means for grouping selected numbers of said gated signals (231) output by said second gating means during said third and fourth time intervals for input (123, 124) to said second comparison means;
(c) fifth and sixth signal grouping means (117, 118) coupled between said third gating means (111) and said third comparison means (131), said fifth and sixth signal grouping means for grouping selected numbers of said gated signals (232) output by said third gating means during said third and fourth time intervals for input (125, 126) to said third comparison means; and,
(d) seventh and eighth signal grouping means (119, 120) coupled between said fourth gating means (112) and said fourth comparison means (132), said seventh and eighth signal grouping means for grouping selected numbers of said gated signals (233) output by said fourth gating means during said third and fourth time intervals for input (127, 128) to said fourth comparison means.
21. Apparatus as defined in claim 20, wherein said first, second, third, fourth, fifth, sixth, seventh and eighth signal grouping means respectively average said gated signals output by said first, second, third and fourth gating means during said third and fourth time intervals.
PCT/CA1991/000400 1991-11-07 1991-11-07 Video-based object acquisition, identification and velocimetry WO1993009523A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CA1991/000400 WO1993009523A1 (en) 1991-11-07 1991-11-07 Video-based object acquisition, identification and velocimetry

Publications (1)

Publication Number Publication Date
WO1993009523A1 true WO1993009523A1 (en) 1993-05-13

Family

ID=4172898

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA1991/000400 WO1993009523A1 (en) 1991-11-07 1991-11-07 Video-based object acquisition, identification and velocimetry

Country Status (1)

Country Link
WO (1) WO1993009523A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2204841A1 (en) * 1972-10-30 1974-05-24 France Etat
US4214265A (en) * 1975-10-16 1980-07-22 Lykke Olesen Method and device for supervising the speed of an object
WO1988006326A1 (en) * 1987-02-17 1988-08-25 Regents Of The University Of Minnesota Vehicle detection through image processing for traffic surveillance and control
EP0347090A2 (en) * 1988-06-15 1989-12-20 Eev Limited Vehicle monitoring system

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0695569A1 (en) * 1994-08-01 1996-02-07 Konami Co., Ltd. A system for detecting a position of a movable object without contact
US5698861A (en) * 1994-08-01 1997-12-16 Konami Co., Ltd. System for detecting a position of a movable object without contact
EP0700017A3 (en) * 1994-08-31 1997-01-29 Nippon Telegraph & Telephone Method and apparatus for directional counting of moving objects
EP0728501A2 (en) * 1995-02-21 1996-08-28 Konami Co., Ltd. A game machine
EP0728501A3 (en) * 1995-02-21 1996-10-16 Konami Co Ltd A game machine
US5800263A (en) * 1995-02-21 1998-09-01 Konami Co., Ltd. Game machine
AU702260B2 (en) * 1995-02-21 1999-02-18 Konami Co., Ltd. A game machine
WO1998015934A1 (en) * 1996-10-04 1998-04-16 Robert Bosch Gmbh Device and process for monitoring traffic zones
WO2020261838A1 (en) * 2019-06-25 2020-12-30 Sony Corporation Image processing device, image processing method, and program

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AT AU BB BG BR CA CH CS DE DK ES FI GB HU JP KP KR LK LU MC MG MN MW NL NO PL RO SD SE SU US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IT LU NL SE BF BJ CF CG CI CM GA GN ML MR SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase