
US20040125210A1 - Method and apparatus for estimating a camera reference horizon - Google Patents

Method and apparatus for estimating a camera reference horizon

Info

Publication number
US20040125210A1
US20040125210A1
Authority
US
Grant status
Application
Prior art keywords
track
kj
tracks
horizon
reference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US10329320
Other versions
US7253835B2 (en)
Inventor
Yang Chen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HRL Laboratories LLC
Original Assignee
HRL Laboratories LLC

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 3/00 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S 3/78 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S 3/782 Systems for determining direction or deviation from predetermined direction
    • G01S 3/785 Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
    • G01S 3/786 Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system, the desired condition being maintained automatically, i.e. tracking systems
    • G01S 3/7864 T.V. type tracking systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

A method, apparatus, and computer program product are presented for estimating a camera reference horizon Ŷ0. Operations include receiving images including projections of objects, with the projections including horizontal edges. Vertical motion is estimated for the images and is stored. The projection of the horizontal edges in the images is computed. The projected edges are segmented out and their motion is compensated. Projections of the horizontal edges from image to image are used to generate tracks. An active track storage is maintained by adding data from each subsequent image. The tracks are screened using a criteria set to produce a subset of tracks, enhancing track reliability. Distances to, and heights of, the objects are then estimated. Further screening is performed by calculating an error criterion for each track and selecting a subset satisfying the error criterion. The remaining tracks are used to generate a camera reference horizon estimate Ŷ0.

Description

    RELATED APPLICATIONS
  • [0001]
    This application is related to U.S. patent application Ser. No. 09/886,931, titled “VISION-BASED HIGHWAY OVERHEAD STRUCTURE DETECTION SYSTEM”, filed with the U.S. Patent and Trademark Office on Jun. 20, 2001, which is incorporated by reference herein in its entirety.
  • BACKGROUND
  • [0002]
    (1) Technical Field
  • [0003]
    The present invention relates to image and video processing. More specifically, the present invention relates to a method and apparatus for estimating the location of a camera reference horizon.
  • [0004]
    (2) Discussion
  • [0005]
    There has been notable progress in the development of fully autonomous automobiles capable of navigating through traffic without human intervention. For example, systems may use sensing technology to warn drivers of impending collisions, or even take control of a vehicle in certain situations when the driver either ignores or cannot heed a warning. While fully autonomous vehicles will probably exist at some point, existing vehicle systems still have many challenges to overcome before they are practical.
  • [0006]
    Some collision warning systems use a radar-based detector, combined with a vision-based lane-sensing module, to detect and track vehicles ahead of the host vehicle. The radar-based system monitors the moving pattern of all objects tracked by the radar sensor to determine potential threats along the host's path. The host's path is provided by the lane module. The radar sensor has a limitation in that it detects not only moving and stationary vehicles, but also many fixed roadway infrastructures, such as overpasses and overhead signs. A collision warning system that provides frequent false alarms can be a nuisance rather than a help to drivers.
  • [0007]
    Typical radar sensors are designed to have a small vertical field of view (VFOV) of about 5 degrees in an effort to avoid detection of overhead objects. However, even at a detection range of 120 meters, some overhead objects are still routinely detected. There are a number of possible explanations for the false alarms, including misalignment of the radar axis relative to the ground, often as a result of the radar transmitter being aimed too high. Other factors include ground reflections, which can create “ghosting”. Additional possible sources of error include radar side lobes and certain types of terrain. The terrain-based error sources arise because overhead objects may actually be within the vertical field of view due to the slope of the road. Therefore, redesigning a radar sensor to provide a narrower vertical beam is unlikely to solve the problem completely. Additionally, many of these solutions could result in a less robust system that might miss actual obstructions and may still generate false alarms. There is a need for a robust system that effectively warns users when there is a potential for collision while minimizing or eliminating false alarms.
  • [0008]
    For such a system to function optimally, it would be desirable to provide a subsystem for estimating the camera reference horizon in order to assist in identifying and eliminating overhead structures as objects that can trigger collision warnings.
  • SUMMARY
  • [0009]
    The present invention provides a method, an apparatus, and a computer program product for generating a camera reference horizon estimate. In a first aspect, the operations of the invention comprise:
  • [0010]
    receiving a sequence of images including projections of objects thereon, with projections including horizontal edges;
  • [0011]
    estimating vertical motion occurring in the sequence of images and storing the estimated vertical motion as a motion history;
  • [0012]
    computing the projection of the horizontal edges in the images;
  • [0013]
    segmenting the projected horizontal edges in the images;
  • [0014]
    compensating for motion in the images using the estimated vertical motion from the motion history;
  • [0015]
    tracking the projections of the horizontal edges from image to image to generate tracks representing objects and maintaining an active track storage by adding more data to update the active track storage from each subsequent image;
  • [0016]
    screening the generated tracks using screening criteria to produce a subset of tracks in order to enhance track reliability;
  • [0017]
    receiving the subset of tracks from the screening operation and estimating distances to the objects represented by the tracks and heights of the objects represented by the tracks;
  • [0018]
    further screening the subset of tracks by calculating an error criterion for each track and selecting a subset of remaining tracks satisfying the error criterion; and
  • [0019]
    generating a camera reference horizon estimate Ŷ0 from the remaining tracks.
  • [0020]
    In another aspect, the screening criteria used in the screening operation include rejecting those tracks that are below the current reference horizon; determining a track length for each track and keeping only those tracks that have track lengths in excess of a predetermined track length threshold; determining the dynamic range of the tracks and keeping only those tracks that have a dynamic range exceeding a dynamic range threshold; and determining an abnormality value for each track and ignoring tracks exceeding an abnormality threshold.
  • [0021]
    In a still further aspect, in the operation of further screening the subset of tracks, the error criterion used includes calculating a root-mean-squared fitting error for each track using the estimated distance and height, and excluding those tracks whose root-mean-squared fitting error exceeds a predetermined threshold.
  • [0022]
    In yet another aspect, the operations of the invention further comprise maintaining an active camera reference horizon estimate Ŷ0; and updating the active camera reference horizon estimate Ŷ0 with the generated camera reference horizon estimate based on the fulfillment of a criteria set.
  • [0023]
    In a further aspect, the images include image tops, and the operation of updating the active camera reference horizon estimate Ŷ0 is triggered based on the fulfillment of the criteria set including: when a longest track exists in a number of images in the sequence of images, where the number exceeds a predetermined threshold; and when the longest track is within a predetermined distance from an image top.
  • [0024]
    In a yet further aspect, the operation of estimating distances to the objects represented by the tracks and heights of the objects represented by the tracks is performed by estimating the parameters for distance D̂ and height Ĥ in the linear system

    $$A^{t}A\begin{bmatrix}\hat{D}\\ \hat{H}\end{bmatrix}=A^{t}B,$$

  • [0025]
    where the matrices A^{t}A and A^{t}B are:

    $$A^{t}A=\begin{bmatrix}\sum_{i}(Y_{0}-r_{i})^{2} & -f\sum_{i}(Y_{0}-r_{i})\\ -f\sum_{i}(Y_{0}-r_{i}) & Mf^{2}\end{bmatrix}=\begin{bmatrix}U & V\\ V & Mf^{2}\end{bmatrix},\qquad A^{t}B=\begin{bmatrix}\sum_{i}d_{i}(Y_{0}-r_{i})^{2}\\ -f\sum_{i}d_{i}(Y_{0}-r_{i})\end{bmatrix}=\begin{bmatrix}W\\ Z\end{bmatrix},$$

    where

    $$\begin{aligned}U&=\sum_{i}(Y_{0}-r_{i})^{2}=\sum_{i}r_{i}^{2}-2Y_{0}\sum_{i}r_{i}+MY_{0}^{2},\\ V&=-f\sum_{i}(Y_{0}-r_{i})=f\Big(\sum_{i}r_{i}-MY_{0}\Big),\\ W&=\sum_{i}d_{i}(Y_{0}-r_{i})^{2}=Y_{0}^{2}\sum_{i}d_{i}-2Y_{0}\sum_{i}d_{i}r_{i}+\sum_{i}d_{i}r_{i}^{2},\\ Z&=-f\sum_{i}d_{i}(Y_{0}-r_{i})=f\Big(\sum_{i}d_{i}r_{i}-Y_{0}\sum_{i}d_{i}\Big),\end{aligned}$$

    and where the sum over i runs from i = 1 to M; f is a camera focal length, M is a length of a track, r_i is a track location in an image, Y_0 is the camera horizon, and d_i is the distance a vehicle hosting the camera has traveled since the first image;
  • [0026]
    thus, the matrices AtA and AtB can be updated incrementally as the HEP track length increases from M to M+1.
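    As an illustration of this incremental update, the running sums U, V, W, and Z can be folded in one track point at a time and the 2×2 system solved in closed form. The sketch below is not from the patent; the function names and the synthetic focal length, horizon, and object geometry are assumptions for demonstration only.

```python
def update_normal_eqs(state, r_i, d_i, Y0, f):
    """Fold one new track point (r_i, d_i) into the running sums
    U, V, W, Z (notation from the text), so AtA and AtB need not be
    rebuilt as the track length grows from M to M+1."""
    y = Y0 - r_i
    state["U"] += y * y            # U = sum (Y0 - r_i)^2
    state["V"] += -f * y           # V = -f * sum (Y0 - r_i)
    state["W"] += d_i * y * y      # W = sum d_i (Y0 - r_i)^2
    state["Z"] += -f * d_i * y     # Z = -f * sum d_i (Y0 - r_i)
    state["M"] += 1

def solve_distance_height(state, f):
    """Solve AtA [D, H]^t = AtB for the symmetric 2x2 system by
    Cramer's rule: AtA = [[U, V], [V, M f^2]], AtB = [W, Z]."""
    a, b = state["U"], state["V"]
    c = state["M"] * f * f
    det = a * c - b * b
    D_hat = (state["W"] * c - b * state["Z"]) / det
    H_hat = (a * state["Z"] - b * state["W"]) / det
    return D_hat, H_hat
```

    With exact synthetic data generated from the projection model (Y0 − r_i)(D − d_i) = fH, the solve recovers the distance D and height H that generated the track.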
  • [0027]
    In another aspect of the present invention, in the operation of generating a camera reference horizon estimate Ŷ0 from the remaining tracks, the reference horizon is determined by solving for Ŷ0 in the following relationship:

    $$\big((C^{(N)})^{t}C^{(N)}\big)\begin{bmatrix}\hat{D}\\ \hat{Y}_{0}\end{bmatrix}=(C^{(N)})^{t}G^{(N)},$$

    where the matrices (C^{(N)})^{t}C^{(N)} and (C^{(N)})^{t}G^{(N)} can be written as:

    $$(C^{(N)})^{t}C^{(N)}=\begin{bmatrix}R_{1}&\cdots&0&\cdots&0&P_{1}\\ \vdots&\ddots&&&&\vdots\\ 0&\cdots&R_{k}&\cdots&0&P_{k}\\ \vdots&&&\ddots&&\vdots\\ 0&\cdots&0&\cdots&R_{N}&P_{N}\\ P_{1}&\cdots&P_{k}&\cdots&P_{N}&Q\end{bmatrix}\qquad\text{and}\qquad (C^{(N)})^{t}G^{(N)}=\begin{bmatrix}S_{1}&\cdots&S_{k}&\cdots&S_{N}&T\end{bmatrix}^{t},$$

    where

    $$\begin{aligned}R_{k}&=\sum_{(i,j)\in\Theta_{k}}(r_{kj}-r_{ki})^{2},\\ P_{k}&=\sum_{(i,j)\in\Theta_{k}}(r_{kj}-r_{ki})(d_{kj}-d_{ki}),\\ Q&=\sum_{k=1}^{N}q_{k},\qquad q_{k}=\sum_{(i,j)\in\Theta_{k}}(d_{kj}-d_{ki})^{2},\\ S_{k}&=\sum_{(i,j)\in\Theta_{k}}(r_{kj}-r_{ki})(r_{kj}d_{kj}-r_{ki}d_{ki}),\\ T&=\sum_{k=1}^{N}t_{k},\qquad t_{k}=\sum_{(i,j)\in\Theta_{k}}(d_{kj}-d_{ki})(r_{kj}d_{kj}-r_{ki}d_{ki}),\end{aligned}$$

    for k = 1, …, N, with

    $$\Theta_{k}=\{(i,j)\;|\;j=i+s,\ 1\le i,j\le M_{k}\},$$
  • [0028]
    where:
  • [0029]
    i and j are matrix index variables,
  • [0030]
    Θk is an index set including all possible pairings of {i,j} that are separated by a fixed number of frames s,
  • [0031]
    k is an index variable of tracks where the total number of tracks is represented by N,
  • [0032]
    Mk is the length of the kth track among a total of N tracks,
  • [0033]
    rkj is the track position in image row coordinates at the jth image frame of the kth track;
  • [0034]
    rki is the track position in image row coordinates at the ith image frame of the kth track;
  • [0035]
    dkj is the distance traveled by the vehicle at the jth frame of the kth track relative to the first frame, with dk1 = 0 for all k;
  • [0036]
    dki is the distance traveled by the vehicle at the ith frame of the kth track relative to the first frame, with dk1 = 0 for all k; and
  • [0037]
    whereby the terms Rk, Pk, qk, Sk, and tk are computed incrementally as new track points are added and new pairs of {i,j} become part of the index set Θk.
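    To make the joint system concrete: each pair (i, j) in Θk contributes one linear equation, (r_kj − r_ki)·D_k + (d_kj − d_ki)·Y_0 = r_kj·d_kj − r_ki·d_ki, obtained by eliminating the object height between frames i and j, and the terms Rk, Pk, Q, Sk, and T are the normal-equation sums of those rows. The sketch below is an illustrative reconstruction, not the patent's code; the synthetic track data and the frame separation s are assumptions.

```python
import numpy as np

def estimate_horizon(tracks, s=1):
    """Assemble (C^t C) [D_1..D_N, Y0]^t = C^t G from the pair set
    Theta_k = {(i, i+s)} and solve for the per-track distances and the
    shared reference horizon Y0.  `tracks` is a list of (r, d) pairs,
    where r[i] is the image row of the track at frame i and d[i] the
    distance traveled by the host vehicle at that frame."""
    N = len(tracks)
    A = np.zeros((N + 1, N + 1))
    g = np.zeros(N + 1)
    for k, (r, d) in enumerate(tracks):
        for i in range(len(r) - s):
            j = i + s
            dr = r[j] - r[i]                # row change over s frames
            dd = d[j] - d[i]                # distance change over s frames
            rhs = r[j] * d[j] - r[i] * d[i]
            A[k, k] += dr * dr              # R_k
            A[k, N] += dr * dd              # P_k (last column)
            A[N, k] += dr * dd              # P_k (bottom row)
            A[N, N] += dd * dd              # Q
            g[k] += dr * rhs                # S_k
            g[N] += dd * rhs                # T
    x = np.linalg.solve(A, g)
    return x[:N], x[N]                      # per-track D_hat, Y0_hat
```

    With exact tracks generated from (Y0 − r)(D_k − d) = fH_k, the solve recovers both the per-track distances and the common horizon.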
  • [0038]
    The invention may be implemented as a method, typically performed on a data processing system; as a computer program product in the form of computer-readable instructions stored on a computer-readable medium; and as an apparatus for performing the above-mentioned operations. The apparatus comprises a memory for storing data and processing results, and a processor operatively coupled with the memory, the processor including means for receiving a sequence of images including projections of objects thereon, with the projections including horizontal edges. The apparatus further comprises means for performing the operations mentioned with respect to the various aspects discussed above.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0039]
    The objects, features and advantages of the present invention will be apparent from the following detailed descriptions of the various aspects of the invention in conjunction with reference to the following drawings.
  • [0040]
    FIG. 1 is a block diagram showing the functional blocks (steps) of the present invention;
  • [0041]
    FIG. 2 is a flow diagram showing the general steps in the method of the present invention, presenting a simplified view of the operations depicted in FIG. 1;
  • [0042]
    FIG. 3 is a block diagram depicting the components of a computer system used in conjunction with the present invention;
  • [0043]
    FIG. 4 is an illustrative diagram of a computer program product embodying the present invention;
  • [0044]
    FIG. 5(a) is a graph depicting the estimated horizon over a series of images, with the vertical axis indicating the estimated horizon location in image rows and the horizontal axis indicating the image frame number; and
  • [0045]
    FIG. 5(b) is an image of the horizontal edge projection tracks used in the estimation.
  • DETAILED DESCRIPTION
  • [0046]
    The present invention relates to image and video processing. More specifically, the present invention relates to a method and apparatus for estimating the location of a camera reference horizon. The following description, taken in conjunction with the referenced drawings, is presented to enable one of ordinary skill in the art to make and use the invention and to incorporate it in the context of particular applications. Various modifications, as well as a variety of uses in different applications, will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to a wide range of aspects. Thus, the present invention is not intended to be limited to the aspects presented, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein. Furthermore, it should be noted that unless explicitly stated otherwise, the figures included herein are illustrated diagrammatically and without any specific scale, as they are provided as qualitative illustrations of the concepts of the present invention.
  • [0047]
    In order to provide a working frame of reference, a glossary of terms used in the description and claims is first given as a central resource for the reader. Next, an overview of the present invention is provided, followed by a discussion of its various physical aspects and of the specific details. Finally, an example implementation is discussed in order to aid in providing a clear understanding of an application of the present invention.
  • [0048]
    (1) Glossary
  • [0049]
    Before describing the specific details of the present invention, a centralized location is provided in which various terms used herein and in the claims are defined. The glossary provided is intended to provide the reader with a general understanding of the intended meaning of the terms, but is not intended to convey the entire scope of each term. Rather, the glossary is intended to supplement the rest of the specification in more clearly explaining the terms used.
  • [0050]
    Abnormality—Screening criteria used in the first track screener include checking tracks for abnormalities by determining an abnormality value for each track and eliminating/ignoring tracks having an abnormality value in excess of an abnormality threshold. Many factors may go into the calculation of an abnormality, and the factors will vary depending on the goals of a particular application. Examples of abnormality factors include a track registering a negative height or distance value, an error vector associated with a track being overly large, etc. The abnormality value provides a check on the data extracted from the images in order to eliminate erroneous tracks.
  • [0051]
    Camera—This term is used herein to refer generally to an image source capable of supplying a sequence of images. In the most general sense, a “camera” as used herein could be a database of stored images. However, in most applications, a camera is an imaging device such as a CCD video camera. The camera may operate in any area of the spectrum, and may, for example, be an optical camera, a radar-based camera, an IR camera, or even an acoustic imaging system. The important characteristic of the camera is its ability to provide a sequence of images for processing.
  • [0052]
    Means—The term “means” as used with respect to this invention generally indicates a set of operations to be performed on a computer. Non-limiting examples of “means” include computer program code (source or object code) and “hard-coded” electronics. The “means” may be stored in the memory of a computer or on a computer readable medium.
  • [0053]
    (2) Overview
  • [0054]
    In one aspect, the present invention provides a method and apparatus for calibrating the reference horizon of a video camera mounted on a vehicle, such as an automobile. By tracking certain image features in a digitized image sequence from a video camera, the method automatically computes the reference horizon of the video camera images. In this aspect, the reference horizon of an in-vehicle video camera is calibrated for tasks such as lane tracking and collision warning. The reference horizon is one of the important system parameters used by vision systems to perform accurate detection and tracking of objects in video images. Rather than relying on other instruments, the present invention provides real-time updates to the reference horizon. The effective reference horizon of a vehicle may change under a variety of vehicle load and working conditions. Therefore, a fixed and accurate reference horizon does not exist, making it necessary to estimate the reference horizon in a moving vehicle on the road. Since the reference horizon is related to the stable-state vehicle pitch angle, this invention allows one to avoid the difficult task of calibrating the vehicle pitch every time a trip is begun.
  • [0055]
    The present invention will be discussed throughout this description in terms of its use for real-time overhead structure detection. The major functional blocks of the overhead structure detection are shown in FIG. 1, and are discussed briefly in order to facilitate further discussion of the horizon estimation technique. Note that the solid lines indicate the process or action/data flow and the dotted lines represent data flow only. The overhead structure detection system receives digitized video images 100 input from an in-vehicle video camera and a digitizer. Camera parameters such as color vs. black and white, resolution, and frame-rate are selected based on the goals of a particular implementation. The vertical motion estimation module 102 uses the current horizon estimate 130 and computes the relative image motion of two consecutive image frames in the vertical direction using a one-dimensional optical flow approach. Based on the relative motion, a vertical motion offset for each image frame relative to some reference frame is computed and stored in a motion history 104. After the vertical motion has been estimated, a determination regarding whether to reset the motion history 122 is undertaken. If, for example, the motion exceeds a predetermined threshold, then the motion reset 122 is performed, sending a reset signal 124 to reset both the motion history 104 and the track history 110. In the case of a motion reset, the invention continues by receiving digitized video images 100, and proceeds again from block 100.
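    The patent does not spell out the one-dimensional optical flow computation, so the sketch below uses a simple stand-in: it estimates the vertical shift between consecutive frames by matching row-intensity profiles, and keeps a motion history with a reset threshold. The function names, the profile-matching approach, and the threshold value are illustrative assumptions.

```python
import numpy as np

def vertical_shift(prev, curr, max_shift=8):
    """Estimate vertical image motion (in rows) between two frames by
    matching 1-D row-intensity profiles; a positive result s means the
    content of `curr` matches `prev` shifted up by s rows."""
    p = prev.mean(axis=1)                  # collapse each frame to a row profile
    c = curr.mean(axis=1)
    n = len(p)
    best_s, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            a, b = p[s:], c[:n - s] if s > 0 else c
        else:
            a, b = p[:n + s], c[-s:]
        err = float(np.mean((a - b) ** 2))
        if err < best_err:
            best_s, best_err = s, err
    return best_s

class MotionHistory:
    """Accumulate per-frame shifts into offsets relative to a reference
    frame; large accumulated motion triggers a reset, as in block 122."""
    def __init__(self, reset_threshold=20):
        self.offsets = [0]
        self.reset_threshold = reset_threshold
    def add(self, shift):
        off = self.offsets[-1] + shift
        if abs(off) > self.reset_threshold:
            self.offsets = [0]             # reset the motion history
            return True                    # signal that a reset occurred
        self.offsets.append(off)
        return False
```

    In the patent's pipeline the real module would also use the current horizon estimate 130 to window the computation; that dependency is omitted here for brevity.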
  • [0056]
    Typically operating simultaneously with the operations described above, a windowed, gradient-based horizontal edge projection (HEP) is carried out for each image 106. The vertical gradient is computed using a vertical Sobel gradient operator in a window. A non-limiting example of a typical window for 240×320 images would be a window of 50-pixel width with the same height as the image, centered horizontally with respect to the image. The gradient image is then projected (summed over) for each row to form the HEP. This technique provides a fast, reliable HEP tracking technique by using raw gradient measurements, rather than segmented edge pixels in the HEPs. Note that the vertical motion estimation 102 and the gradient-based HEP 106 may be performed in any order or simultaneously, depending on the goals of a particular application. The “no” line from the motion reset block 122 to the arrow between blocks 106 and 108 is intended to demonstrate that the gradient-based HEP operation 106 and the vertical motion estimation 102 may be performed in parallel prior to operation of the HEP segmentation and tracking module 108.
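    A minimal sketch of the windowed, gradient-based HEP described above, assuming a 3×3 vertical Sobel kernel applied inside a horizontally centered window and a row-wise sum of the absolute gradient. The 50-pixel window width follows the example in the text; taking the absolute value before projecting is an illustrative choice.

```python
import numpy as np

def horizontal_edge_projection(img, win_width=50):
    """Windowed, gradient-based HEP: apply a vertical Sobel operator in
    a horizontally centered window and sum each row of the gradient
    magnitude, giving one projection value per image row."""
    h, w = img.shape
    lo = (w - win_width) // 2
    window = img[:, lo:lo + win_width].astype(float)
    # 3x3 vertical Sobel kernel: responds to horizontal edges
    k = np.array([[-1.0, -2.0, -1.0],
                  [ 0.0,  0.0,  0.0],
                  [ 1.0,  2.0,  1.0]])
    g = np.zeros_like(window)
    for dy in range(3):
        for dx in range(3):
            g[1:-1, 1:-1] += k[dy, dx] * window[dy:h - 2 + dy, dx:win_width - 2 + dx]
    return np.abs(g).sum(axis=1)           # project (sum over) each row
```

    For a 240×320 frame this yields a 240-element profile whose peaks mark strong horizontal edges, ready for the segmentation and tracking module.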
  • [0057]
    The HEP segmentation and tracking module 108 extracts the peaks in the HEP corresponding to significant horizontal edges coming from the images of overhead structures and tracks the locations of these peaks in the image across the video sequence. The result is a set of linked HEP peak locations across the image sequence, termed HEP tracks. The HEP segmentation and tracking module 108 also uses the estimated vertical motion from the vertical motion estimation block 102 to help tracking and to compensate for the track location in the image and uses the host vehicle speed to compute the traveled distance for each HEP track. The track history 110 stores this information for later use.
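    The peak extraction and frame-to-frame linking might be sketched as follows. The greedy nearest-neighbour association, the gate size, and the strength threshold are illustrative assumptions, not the patent's exact procedure; the motion offset comes from the motion history, as described above.

```python
def extract_peaks(hep, min_strength):
    """Pick local maxima of the HEP profile above a strength threshold;
    each peak row is a candidate horizontal-edge location."""
    return [r for r in range(1, len(hep) - 1)
            if hep[r] > min_strength and hep[r] >= hep[r - 1] and hep[r] > hep[r + 1]]

def link_peaks(tracks, peaks, motion_offset, gate=3):
    """Greedily link this frame's peaks to open tracks after
    compensating each track's last position with the estimated vertical
    motion; unmatched peaks start new tracks (HEP tracks)."""
    used = set()
    for tr in tracks:
        pred = tr[-1] + motion_offset      # motion-compensated prediction
        cand = [p for p in peaks if p not in used and abs(p - pred) <= gate]
        if cand:
            best = min(cand, key=lambda p: abs(p - pred))
            tr.append(best)
            used.add(best)
    for p in peaks:
        if p not in used:
            tracks.append([p])
    return tracks
```

    Each resulting list of linked peak rows is one HEP track; traveled distance per frame would be attached alongside, from the host vehicle speed.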
  • [0058]
    The HEP tracks in the track history 110 go through a first track screener block 112 before being used for parameter estimation (note that the operation of the first track screener block 112 is shown as the next action block after the HEP segmentation and tracking module 108). The first track screener block 112 implements a set of criteria to select HEP tracks. These criteria, or constraints, include: (1) Using only the tracks above a reference horizon (usually the center of the image) for parameter estimation. This eliminates most of the noisy tracks caused by distant objects and shadows cast on the ground by overhead structures and vehicles on the road. (2) Checking the results of the estimated parameters for abnormality. This is accomplished, for example, by checking the results of height and distance for negative values and ignoring the corresponding tracks, and, for the remaining tracks, checking the standard deviation of the error vector and determining whether it is a reasonably small value. This ensures that the estimated parameters fit well with the track points.
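    The first-screener constraints enumerated above can be sketched as a simple filter over the stored HEP tracks. The minimum-length and dynamic-range thresholds below are placeholder values; the abnormality checks (negative height or distance, large error vectors) are applied after parameter estimation, as the text describes, and so are not shown here.

```python
def first_screen(tracks, horizon_row, min_len=10, min_dyn_range=5):
    """Keep only tracks that stay above the current reference horizon
    (smaller row index = higher in the image), persist long enough, and
    span enough image rows (dynamic range)."""
    kept = []
    for tr in tracks:
        above = all(r < horizon_row for r in tr)
        long_enough = len(tr) >= min_len
        dynamic = (max(tr) - min(tr)) >= min_dyn_range
        if above and long_enough and dynamic:
            kept.append(tr)
    return kept
```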
  • [0059]
    Parameter estimation, performed via an overhead distance and height estimator 114, takes the track histories of eligible HEP tracks (filtered by the first track screener 112) from the track history 110, in combination with information regarding the camera focal length 115 and the current horizon estimate 130, and computes the distance to the host vehicle and the height of the underlying overhead structure corresponding to the HEP tracks. The estimated values, along with the estimation error measures, are passed to a fusion module 116, which combines vision detections of vehicles and overhead structures with those from radar or other detection techniques in order to provide a robust obstacle detection and avoidance system. Note that the eligible HEP tracks may be received directly from the first track screener 112 or they may be received from the track history 110. The mechanics of the data transfer are considered design choices that may vary from implementation to implementation. This same idea applies to the receipt of tracks by the second track screener 118 as well as to other data flows throughout the diagram.
  • [0060]
    The second track screener 118 acts as a finer screener (filter) for the HEP tracks used in parameter estimation by allowing only certain HEP tracks to be used in combination with information regarding the camera focal length 115, for reference horizon estimation by the reference horizon estimator 120. It does so by imposing a limit on the error measure computed in the parameter estimation for each track.
  • [0061]
    After a reference horizon estimate has been generated by the reference horizon estimator 120, a determination may be made, based on an update criteria set, whether to update the reference horizon estimate 126. If the criteria for an update are not met, the new reference horizon estimate is discarded and the process begins again at the image inputting block 100. On the other hand, if the criteria are met, a reference horizon block 128 is activated, updating the active reference horizon estimate 130. The motion history 104 is also reset via resetting block 124, and the process resets, starting with the image input block 100.
  • [0062]
    A flow diagram showing the general steps in the method of the present invention is shown in FIG. 2. Note that the steps shown in FIG. 2 represent the operations discussed with respect to FIG. 1, but avoid detail with regard to the data flow. The steps shown in FIG. 2 depict a method for estimating a camera reference horizon Ŷ0. After starting 200, a sequence of images is received 202. The images include projections of objects, with the projections, in turn, including horizontal edges. Next, vertical motion occurring in the sequence of images is estimated 204, and horizontal edge projections are computed 206. Note that the steps of estimating the vertical motion 204 and of computing the horizontal edge projections 206 may be performed in any order or simultaneously. After the horizontal edge projections have been computed 206, they are segmented in a step of segmenting the projected horizontal edges 208. Additionally, the motion estimate from the vertical motion estimating step 204 is used in a step for compensating for motion in the images 210. Note that during the vertical motion estimating step 204, the vertical motion estimates are stored in a motion history, from which they are recalled during the compensating step 210.
  • [0063]
    Over the sequence of images, the projections of the horizontal edges are tracked to generate tracks representing objects 212. The tracks are maintained in an active track storage by adding more data to update the active track storage from each subsequent image. The stored tracks are screened 214 (using the first track screener 112, as discussed with respect to FIG. 1) using screening criteria to produce a subset of tracks in order to enhance track reliability. The subset of tracks from the screening step 214 is used for estimating distances to the objects represented by the tracks as well as the heights of the objects represented by the tracks 216.
  • [0064]
    After the distances to and heights of the objects have been estimated, the subset of tracks is further screened in a further screening step 218, in which an error criterion is calculated for each track and a subset of remaining tracks satisfying the error criterion is selected. Finally, the tracks remaining after the further screening step 218 are used for generating a camera reference horizon estimate Ŷ0 220.
  • [0065]
    Typically, the screening criteria used in the screening step 214 cause rejection of those tracks that are below the current reference horizon. The screening criteria also include determining a track length for each track and keeping only those tracks that have track lengths in excess of a predetermined track length threshold; determining the dynamic range of the tracks and keeping only those tracks that have a dynamic range exceeding a dynamic range threshold; and determining an abnormality value for each track and ignoring tracks exceeding an abnormality threshold.
  • [0066]
    After the generation of the camera reference horizon estimate Ŷ0, additional steps may be performed for maintaining an active camera reference horizon estimate Ŷ0. In the first additional step, a decision is made whether to update the active camera reference horizon estimate Ŷ0 with the generated camera reference horizon estimate 222 based on the fulfillment of a criteria set. If the decision is made to update the camera reference horizon estimate Ŷ0, then the update is performed 224. If not, the current estimate is maintained, and the method continues with the receipt of additional images 202. The criteria set used for triggering a reference horizon estimate update results in an update when (1) the longest track in the sequence of images exists over a predetermined number of images (meaning that the longest track is likely not close enough to warrant inclusion in collision warning decisions, etc.), and (2) the longest track is within a predetermined distance from the image top. These criteria are used in combination to filter out tracks unlikely to be of interest for collision detection.
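    The two-part update trigger described above might be expressed as follows. The threshold values are illustrative assumptions, and 'within a predetermined distance from the image top' is interpreted here as the track's smallest row index falling within a margin of row 0.

```python
def should_update_horizon(tracks, n_frames_threshold=30, top_margin=20):
    """Update trigger: the longest track must persist for more than a
    threshold number of frames AND reach within `top_margin` rows of
    the image top (row 0)."""
    if not tracks:
        return False
    longest = max(tracks, key=len)
    return len(longest) > n_frames_threshold and min(longest) <= top_margin
```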
  • [0067]
    Next various “physical” aspects of the present invention will be described, followed by a more detailed description of the invention.
  • [0068]
    (3) Physical Aspects of the Present Invention
  • [0069]
    The present invention has three principal "physical" aspects. The first is an image-based horizon estimation apparatus, typically, but not limited to, a computer system operating software or in the form of a "hard-coded" instruction set. The second physical aspect is a method, typically in the form of software, operated using a data processing system (computer). The third principal physical aspect is a computer program product. The computer program product generally represents computer readable code stored on a computer readable medium such as an optical storage device, e.g., a compact disc (CD) or digital versatile disc (DVD), or a magnetic storage device such as a floppy disk or magnetic tape. Other, non-limiting examples of computer readable media include hard disks and flash-type memories. These aspects will be described in more detail below.
  • [0070]
    A block diagram depicting the components of a computer system used in the present invention is provided in FIG. 3. The data processing system 300 comprises an input 302 for images from an imaging device such as a CCD camera. The output 304 is connected with the processor for providing output to a user or to other devices or programs for use therein. Typically, the output is used for fusion with other information in an automotive context. The input 302 and the output 304 are both coupled with a processor 306, which may be a general-purpose computer processor or a specialized processor designed specifically for use with the present invention. The processor 306 is coupled with a memory 308 to permit storage of data and software to be manipulated by commands to the processor.
  • [0071]
    An illustrative diagram of a computer program product embodying the present invention is depicted in FIG. 4. The computer program product 400 is depicted as an optical disk such as a CD or DVD. However, as mentioned previously, the computer program product generally represents computer readable code stored on any compatible computer readable medium.
  • [0072]
    (4) The Reference Horizon Estimation Technique
  • [0073]
    Now, more details of the reference horizon estimation technique are presented. A 1-dimensional imaging model may be used in a parameter estimation algorithm to describe the trajectory of HEP tracks from overhead structures:

$$y_i = \frac{fH}{Z_i}, \quad i = 0, \ldots, M-1, \qquad (1)$$
  • [0074]
    where yi is the track point's Y coordinate (assuming that the Y axis points up) at frame i, H is the underlying object's height (above the camera), Zi is the distance of the object to the host vehicle at frame i, and M is the number of points in the track, i.e., the length of the track. Letting ri be the track location in image row in frame i, yi can be written as
  • $y_i = Y_0 - r_i$,  (2)
  • [0075]
    where Y0 is the reference horizon in the image. Also, di represents the distance the host vehicle has traveled since frame 0, with d0=0. Zi can be expressed in terms of di and D, the distance of the host vehicle at frame 0 to the object, as:
  • $Z_i = D - d_i$  (3)
  • [0076]
    Substituting Equations (2) and (3) above into Equation (1), the equation for the parameter estimation algorithm may be obtained:
  • $(Y_0 - r_i)D - fH = (Y_0 - r_i)d_i, \quad i = 0, \ldots, M-1$  (4)
  • [0077]
    These equations can be solved using a least-squares (LSQ) approach for the unknowns of D and H. Obviously, this approach relies on having an accurate estimation of the current camera reference horizon Y0 besides the camera focal length f. An error in the reference horizon will result in a bias in the estimated values for the distance and height of the overhead structure. Unlike the camera focal length, which is generally fixed and can be calibrated relatively accurately off-line, the reference horizon can change due to changes in vehicle load and other mechanical conditions that affect the vehicle pitch. Furthermore, the initial vehicle pitch, which is generally unknown, also introduces a systematic bias to the value of Y0.
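For concreteness, the least-squares solve of Equation (4) for D and H, with Y0 and f known, reduces to a 2×2 system of normal equations. The following is an illustrative sketch, not the patent's implementation:

```python
def estimate_distance_height(rows, dists, y0, f):
    """Least-squares solve of (Y0 - r_i) D - f H = (Y0 - r_i) d_i
    for D (initial distance to the structure) and H (height above
    the camera). rows: r_i; dists: d_i (distance since frame 0)."""
    # Accumulate the 2x2 normal equations A^tA [D H]^t = A^tB.
    a11 = a12 = a22 = b1 = b2 = 0.0
    for r, d in zip(rows, dists):
        c1, c2 = (y0 - r), -f           # one row of matrix A
        rhs = (y0 - r) * d              # matching entry of B
        a11 += c1 * c1; a12 += c1 * c2; a22 += c2 * c2
        b1 += c1 * rhs; b2 += c2 * rhs
    det = a11 * a22 - a12 * a12
    D = (b1 * a22 - b2 * a12) / det     # Cramer's rule on the 2x2 system
    H = (a11 * b2 - a12 * b1) / det
    return D, H
```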
  • [0078]
    In the following subsections, the techniques for the estimation of the reference horizon Y0 using both single and multiple HEP tracks are discussed.
  • [0079]
    (a) Single HEP Track Technique
  • [0080]
    To solve for the reference horizon Y0, Equation (4) serves as a starting point. Equation (4) represents a set of non-linear equations in the parameters D, H, and Y0, so it is not possible to solve directly for the parameters using a linear least-squares approach. However, if two of these equations are combined by subtracting one from the other, a linear equation is obtained in only two of the three variables:
  • $(Y_0 - r_i)D - (Y_0 - r_j)D = (Y_0 - r_i)d_i - (Y_0 - r_j)d_j, \quad i, j = 0, \ldots, M-1, \; i \neq j$,  (5)
  • [0081]
    or
  • $(r_j - r_i)D + (d_j - d_i)Y_0 = r_j d_j - r_i d_i, \quad i, j = 0, \ldots, M-1, \; i \neq j$.  (6)
  • [0082]
    Putting the above in matrix form results in:

$$C \begin{bmatrix} D \\ Y_0 \end{bmatrix} = G, \quad \text{where } C = \begin{bmatrix} r_j - r_i & d_j - d_i \end{bmatrix}, \; G = \begin{bmatrix} r_j d_j - r_i d_i \end{bmatrix}. \qquad (7)$$
  • [0083]
    Now it is straightforward to estimate a solution to these over-constrained equations using a linear least-squares approach:

$$\begin{bmatrix} \hat{D} \\ \hat{Y}_0 \end{bmatrix} = (C^t C)^{-1} C^t G, \qquad (8)$$
  • [0084]
    where D̂ and Ŷ0 are estimates of D and Y0, respectively.
  • [0085]
    Now the size of matrix C and vector G can be decided. It is possible to have as many as M−1 independent equations in the form of Equation (6) out of the M(M−1)/2 possibilities (choosing 2 out of M). However, not all these equations are "created equal." The goal in choosing these equations is to conserve computational resources while working with enough constraints to provide proper "resolution." An example of how to reduce the number of equations is discussed below. However, it will be apparent to one of skill in the art that a number of other techniques or criteria may be used in equation selection.
  • [0086]
    The following set of track point indices is used to select the equations in Equation (6) that will go into the least-squares solution in Equation (8):

$$i = 0, \ldots, \frac{M}{2}, \quad j = i + \frac{M}{2}, \qquad (9)$$
  • [0087]
    that is, a “frame distance” of about half the length of the HEP track is used. Although this choice provides only about half of the possible independent equations, it is enough in most cases. The reason for choosing these equations is that the separation between i and j should be made as large as possible. This makes the difference term (r_i − r_j) in Equation (6) larger, thereby reducing the relative noise level introduced by the track location (r_i) measurements, which is independent of the magnitude of the actual measurement. Other sets of track point indices may be chosen for equation selection, depending on the goals of a particular implementation, as will be apparent to one of skill in the art.
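The single-track estimator can then be sketched as: form the index pairs of Equation (9), accumulate the 2×2 normal equations for Equation (8), and solve for D and Y0. Names and structure here are illustrative:

```python
def estimate_horizon_single(rows, dists):
    """Single-track least-squares estimate of (D, Y0) from the
    equations of Equation (6), pairing frames i and j = i + M//2
    as in Equation (9)."""
    M = len(rows)
    half = M // 2
    a11 = a12 = a22 = b1 = b2 = 0.0
    for i in range(M - half):
        j = i + half
        c1 = rows[j] - rows[i]                      # coefficient of D
        c2 = dists[j] - dists[i]                    # coefficient of Y0
        g = rows[j] * dists[j] - rows[i] * dists[i]  # right-hand side
        a11 += c1 * c1; a12 += c1 * c2; a22 += c2 * c2
        b1 += c1 * g;  b2 += c2 * g
    det = a11 * a22 - a12 * a12
    D = (b1 * a22 - b2 * a12) / det
    y0 = (a11 * b2 - a12 * b1) / det
    return D, y0
```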
  • [0088]
    (b) Multiple HEP Track Technique
  • [0089]
    Equation (8) provides one estimate of Y0 for each HEP track, and these estimates must be combined to form a single estimate for Y0. Alternatively, all the eligible HEP tracks may be considered together and solved for a single Y0. Since the Y0s in Equation (7) for different tracks are the same parameter, while the Ds are different, when equations generated according to Equation (7) are combined for different tracks, the following equations result:

$$C^{(N)} \begin{bmatrix} \mathbf{D} \\ Y_0 \end{bmatrix} = G^{(N)}, \qquad (10)$$
  • [0090]
    where
  • $\mathbf{D} = [D_1 \; \cdots \; D_k \; \cdots \; D_N]^t$  (11)
  • [0091]
    and Dk is the distance of the host vehicle to the overhead structure for the k-th HEP track, 1 ≤ k ≤ N. The matrix C(N) is formed by arranging the first columns of the C matrices in Equation (7) for all tracks in a staircase fashion, moving the second columns of the C matrices to form the last column, and filling the rest with zeros:

$$C^{(N)} = \begin{bmatrix} r_{1j} - r_{1i} & \cdots & 0 & \cdots & 0 & d_{1j} - d_{1i} \\ 0 & \cdots & r_{kj} - r_{ki} & \cdots & 0 & d_{kj} - d_{ki} \\ 0 & \cdots & 0 & \cdots & r_{Nj} - r_{Ni} & d_{Nj} - d_{Ni} \end{bmatrix}. \qquad (12)$$
  • [0092]
    The vector G(N) is formed by stacking the G vectors from Equation (7) for all tracks:

$$G^{(N)} = \begin{bmatrix} r_{1j} d_{1j} - r_{1i} d_{1i} \\ \vdots \\ r_{kj} d_{kj} - r_{ki} d_{ki} \\ \vdots \\ r_{Nj} d_{Nj} - r_{Ni} d_{Ni} \end{bmatrix} \qquad (13)$$
  • [0093]
    The least-squares solution to Equation (10) is:

$$\begin{bmatrix} \hat{\mathbf{D}} \\ \hat{Y}_0 \end{bmatrix} = \left( (C^{(N)})^t C^{(N)} \right)^{-1} (C^{(N)})^t G^{(N)} \qquad (14)$$
  • [0094]
    The algorithm based on Equation (14) is termed the “global” algorithm since it uses all eligible HEP tracks.
  • [0095]
    The equations going into Equation (10) can still be selected based on Equation (9). This method, however, is not practical in the real-time environment because M (or Mk for track k) is always changing (incrementing), and all of the computation involved in preparing matrices C(N) and G(N) must be repeated at every frame. Therefore it is desirable to use a fixed frame distance between j and i. That is,
  • s=j−i  (15)
  • [0096]
    for all tracks. This way new equations can be continually added into matrices C(N) and G(N) as more points are gathered in each track, without starting all over at every frame. A typical value for s, for example, is 20 (frames).
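A sketch of the global algorithm under these choices: every pair (i, i+s) in every eligible track contributes one equation, a single Y0 is shared across tracks, and the arrow-structured normal equations of Equation (14) are solved by Gaussian elimination. This is an illustrative reimplementation, not the patent's code:

```python
def _solve(aug):
    """Gaussian elimination with partial pivoting on an augmented matrix."""
    n = len(aug)
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[piv] = aug[piv], aug[col]
        for r in range(col + 1, n):
            m = aug[r][col] / aug[col][col]
            for c in range(col, n + 1):
                aug[r][c] -= m * aug[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        acc = sum(aug[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (aug[r][n] - acc) / aug[r][r]
    return x

def estimate_horizon_global(tracks, s=20):
    """Global estimate: one shared Y0, one D_k per track.
    tracks: list of (rows, dists); pairs are (i, i+s) per Equation (15).
    Builds the normal equations of Equation (10) directly."""
    n = len(tracks)
    # Unknowns D_1..D_N then Y0: (n+1) x (n+2) augmented normal matrix.
    aug = [[0.0] * (n + 2) for _ in range(n + 1)]
    for k, (rows, dists) in enumerate(tracks):
        for i in range(len(rows) - s):
            j = i + s
            c_d = rows[j] - rows[i]      # coefficient of D_k
            c_y = dists[j] - dists[i]    # coefficient of Y0
            g = rows[j] * dists[j] - rows[i] * dists[i]
            aug[k][k] += c_d * c_d       # R_k
            aug[k][n] += c_d * c_y       # P_k
            aug[n][k] += c_d * c_y
            aug[n][n] += c_y * c_y       # accumulates Q
            aug[k][n + 1] += c_d * g     # S_k
            aug[n][n + 1] += c_y * g     # accumulates T
    x = _solve(aug)
    return x[:n], x[n]                   # ([D_1..D_N], Y0)
```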
  • [0097]
    The reference horizon estimation results, obtained using the global algorithm outlined here, are shown in FIG. 5. In this figure, the horizon estimation is carried out approximately between frames 180 and 375. Note that FIG. 5(a) shows the estimated reference horizon and that FIG. 5(b) is an image of the HEP tracks used in the estimation. The estimated reference horizon has a biased and noisy start, but quickly converges to a stable value. Toward the end of the HEP tracks, there is a small blip caused by the termination of some of the HEP tracks.
  • [0098]
    (5) Use of the HEP Track Techniques in an Overhead Structure Detection System
  • [0099]
    As can be seen from FIG. 5, the solution provided by Equation (14) still tends to be noisy. The reasons are the noise in the tracks and the effect of adding or removing a HEP track or tracks to or from the algorithm. The problem with adding or removing tracks can be solved or at least alleviated by use of a weighting scheme in which the weights for the newly added tracks would gradually be increased, while those being removed would gradually be decreased. In the implementation discussed in this section, the problem is solved from a system perspective because the goal of estimating the reference horizon is to obtain reliable distance and height estimates for the overhead structures. The approach is as follows. First, only the “good” HEP tracks are chosen for the reference horizon estimation. Second, a set of intuitive criteria is applied regarding when the estimated reference horizon may actually be used. Third, a set of techniques is developed to ensure that the HEP tracking, the parameter estimation, and the reference horizon estimation algorithms produce reliable results. These methods are discussed in detail in the following subsections.
  • [0100]
    (a) The Second Track Screener and Reference Horizon Updating
  • [0101]
    It is important to ensure that the tracks used for reference horizon estimation are good tracks in the sense that the tracks fit the desired mathematical model and contain minimal noise. This is achieved by using the second track screener 118, as was shown in FIG. 1. In the least-squares (LSQ) parameter estimation block, also known as the overhead distance and height estimator 114, a root-mean-squared-error (RMSE) is computed for each track. The role of second track screener 118 is to examine the RMSEs of the tracks to ensure that the track trajectory fits well with the model for the parameter estimation algorithm and, hence, will provide good results with the reference horizon estimation algorithm. To do this, the second track screener 118 imposes a maximum RMSE on all tracks going into the reference horizon estimation algorithm.
  • [0102]
    In this way, the global algorithm generally produces very good estimates for the reference horizon. Next, given that good estimates for the reference horizon exist, they must be put to use. One way of using the reference horizon estimates is to use the newly estimated value in the next frame immediately in the overhead distance and height estimator 114 (the LSQ parameter estimation block). However, doing so runs the risk of producing unstable parameter estimation output (due to their sensitivity to changes in Y0). A more cautious approach is taken in this implementation, with the reference horizon being updated only when the following criteria are met:
  • [0103]
    The longest HEP track is longer than a certain threshold (e.g. 100 frames)
  • [0104]
    The longest HEP track is near the top of the image
  • [0105]
    The purpose of the first criterion above is to avoid short and unreliable tracks. The second criterion is designed to seize the most reliable estimate of the reference horizon: when a HEP track reaches near the top of the image, the track has arrived at its longest possible length. That is when the algorithm can benefit most from the HEP track, because the longer the track is, the tighter the constraint the track places on the model.
  • [0106]
    In this scheme, updating Y0 means replacing the old value with the newly estimated value. However, there are many other ways to update the value of Y0 that the system uses for parameter estimation. One possibility is to use a weighted average based on certain error, noise, or track length measurements. It is also possible to apply Kalman filtering or other filtering techniques directly to the estimated Y0.
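One of the alternative update rules mentioned above, a weighted average of the old and new estimates, is trivial to sketch. The weight is an assumed tuning parameter, not a value from the patent:

```python
def update_horizon(current_y0, new_y0, new_weight=0.3):
    """Blend the active horizon estimate with the newly generated one.
    new_weight in [0, 1]: 1.0 reproduces outright replacement,
    smaller values damp sudden changes in Y0."""
    return (1.0 - new_weight) * current_y0 + new_weight * new_y0
```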
  • [0107]
    (b) HEP Tracking Reset and Motion Reset
  • [0108]
    Once Y0 has been updated 128, the parameter estimation cannot simply continue, because the parameter estimation depends on the value of Y0. If Y0 is changed, there is likely a discontinuity in the output of the parameter estimation due to the change in Y0. This situation may not be optimal for other modules or systems that use this information (such as the fusion module 116). In this case, it is desirable to terminate all HEP tracks and start anew when Y0 is updated. This is called HEP tracking reset. Resetting all tracks may seem an extreme measure from the perspective of the overall system (e.g., the fusion module 116 will suddenly lose overhead track information). In reality, though, due to the criteria outlined above, when a HEP tracking reset occurs, the host vehicle will have been very close to the overhead structure. Therefore any decision that must be made by an obstacle avoidance system should already have been made.
  • [0109]
    Another situation when HEP tracking reset happens is when there is a change in road slope. This instance of the parameter estimation algorithm assumes that the host vehicle is traveling on a flat road. However, when the vehicle goes from a flat section of the road into a down- or up-hill slope, or vice versa, this assumption is violated and the parameters estimated will not be accurate. Such transitions should be detected, and the current HEP tracks should be abandoned by resetting. Once a road slope transition is detected, all current HEP tracks are reset, effectively segmenting potentially very long HEP tracks into pieces corresponding to flat road sections. This not only helps parameter estimation, it also ensures that the model and assumptions for the reference horizon estimation are still valid.
  • [0110]
    One approach to road slope transition detection is to monitor the cumulative motion offset. The cumulative motion offset is the amount of vertical motion estimated at the current image frame relative to a certain reference frame in the past, and is used to perform motion compensation for the HEP tracks and tracking. If the road surface is flat, this offset value should stay relatively small, indicating that the vertical motion is mainly caused by uneven road surface. When the host vehicle goes from a flat road section into a section with a different slope, or a section with changing slope, the absolute value of the cumulative motion offset will be high and stay high. A simple threshold measured in image pixels may be applied to the absolute cumulative motion offset in order to decide if a slope transition has occurred. This method may be further improved, for example, by using the past history of the cumulative motion offset (the motion history 104) to determine the road slope transition more accurately and more robustly. When HEP track reset occurs, the motion history 104 is also reset, and the cumulative motion offset is set to zero.
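A minimal sketch of threshold-based slope-transition detection as described above. The threshold, the window length, and the "stays high" test are illustrative assumptions:

```python
def slope_transition(offset_history, threshold=5.0, window=10):
    """Declare a road-slope transition when the absolute cumulative
    motion offset (in pixels) has exceeded `threshold` for the last
    `window` frames, i.e., it is both high and staying high."""
    recent = offset_history[-window:]
    return len(recent) == window and all(abs(o) > threshold for o in recent)
```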
  • [0111]
    (6) Real-Time Implementation Example
  • [0112]
    Techniques that implement both the parameter estimation and the global reference horizon estimation suitable for real-time execution are discussed in this section. These techniques emphasize the theme of incremental updating to save computation needed to accomplish the tasks. In the following subsections, the design of incremental parameter estimation and the global reference horizon estimation are discussed. The technique for parameter estimation is discussed first since the RMSE is needed for screening tracks used for the reference horizon estimation.
  • [0113]
    (a) LSQ Parameter (Overhead Distance and Height) Estimation
  • [0114]
    Equation (4) for the LSQ parameter estimation may be written in matrix form as follows:

$$A \begin{bmatrix} D \\ H \end{bmatrix} = B, \qquad (16)$$
  • [0115]
    where matrix A (M by 2) and the column vector B (size M) are made up of the coefficients of Equation (4):

$$A = \begin{bmatrix} Y_0 - r_1 & -f \\ \vdots & \vdots \\ Y_0 - r_i & -f \\ \vdots & \vdots \\ Y_0 - r_M & -f \end{bmatrix}, \quad B = \begin{bmatrix} (Y_0 - r_1)d_1 \\ \vdots \\ (Y_0 - r_i)d_i \\ \vdots \\ (Y_0 - r_M)d_M \end{bmatrix}. \qquad (17)$$
  • [0116]
    To compute the LSQ solutions for D and H, Equation (16) is multiplied by the transpose of matrix A, and the resulting set of linear equations is solved using Gaussian elimination:

$$A^t A \begin{bmatrix} \hat{D} \\ \hat{H} \end{bmatrix} = A^t B. \qquad (18)$$
  • [0117]
    Next, equations for computing the matrices AtA and AtB are derived incrementally. From Equation (17),

$$A^t A = \begin{bmatrix} \sum_i (Y_0 - r_i)^2 & -f \sum_i (Y_0 - r_i) \\ -f \sum_i (Y_0 - r_i) & M f^2 \end{bmatrix} \equiv \begin{bmatrix} U & V \\ V & M f^2 \end{bmatrix}, \qquad (19)$$

$$A^t B = \begin{bmatrix} \sum_i d_i (Y_0 - r_i)^2 \\ -f \sum_i d_i (Y_0 - r_i) \end{bmatrix} \equiv \begin{bmatrix} W \\ Z \end{bmatrix}, \qquad (20)$$

    where

$$\begin{aligned} U &= \sum_i (Y_0 - r_i)^2 = \sum_i r_i^2 - 2 Y_0 \sum_i r_i + M Y_0^2 \\ V &= -f \sum_i (Y_0 - r_i) = f\Bigl(\sum_i r_i - M Y_0\Bigr) \\ W &= \sum_i d_i (Y_0 - r_i)^2 = Y_0^2 \sum_i d_i - 2 Y_0 \sum_i d_i r_i + \sum_i d_i r_i^2 \\ Z &= -f \sum_i d_i (Y_0 - r_i) = f\Bigl(\sum_i d_i r_i - Y_0 \sum_i d_i\Bigr) \end{aligned} \qquad (21)$$
  • [0118]
    All of the summations are for i from 1 to M. As can be seen, if Y0 is fixed and known, then the terms U, V, W, and Z in the matrices AtA and AtB can be updated incrementally as the HEP track length increases from M to M+1. Otherwise, if Y0 is not known or may change later, then the components of these terms can be updated incrementally, and the terms themselves can be computed once Y0 is known. Additionally, it is noteworthy that there are many common terms in the expressions of U, V, W, and Z; therefore, only one copy of these terms need be kept to avoid duplicate computations.
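The incremental bookkeeping can be sketched as a small accumulator that maintains the component sums and forms U, V, W, and Z for any supplied Y0 (names are illustrative):

```python
class IncrementalLSQ:
    """Maintains the component sums behind U, V, W, Z of Equation (21),
    so A^tA and A^tB can be formed for any Y0 without re-scanning the
    track. Each new track point updates the sums in O(1)."""
    def __init__(self, f):
        self.f = f
        self.M = 0
        self.sr = self.sr2 = 0.0      # sum r_i, sum r_i^2
        self.sd = self.sdr = 0.0      # sum d_i, sum d_i r_i
        self.sdr2 = 0.0               # sum d_i r_i^2

    def add(self, r, d):
        self.M += 1
        self.sr += r; self.sr2 += r * r
        self.sd += d; self.sdr += d * r
        self.sdr2 += d * r * r

    def terms(self, y0):
        U = self.sr2 - 2 * y0 * self.sr + self.M * y0 * y0
        V = self.f * (self.sr - self.M * y0)
        W = y0 * y0 * self.sd - 2 * y0 * self.sdr + self.sdr2
        Z = self.f * (self.sdr - y0 * self.sd)
        return U, V, W, Z
```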
  • [0119]
    The RMSE for Equation (18) is:

$$E_{RMS} = \sqrt{\frac{1}{M} E^t E}, \quad \text{where } E = A \begin{bmatrix} \hat{D} \\ \hat{H} \end{bmatrix} - B. \qquad (22)$$
  • [0120]
    Letting $P = [\hat{D} \; \hat{H}]^t$ and expanding the term $E^t E$ generates
  • $E^t E = P^t A^t A P - 2 P^t A^t B + B^t B$.  (23)
  • [0121]
    Incremental updating formulas already exist for AtA and AtB (Equation (19) and Equation (20)). From Equation (17), $B^t B$ can be written as:

$$B^t B = \sum_i (Y_0 - r_i)^2 d_i^2 = Y_0^2 \sum_i d_i^2 - 2 Y_0 \sum_i r_i d_i^2 + \sum_i r_i^2 d_i^2 \qquad (24)$$
  • [0122]
    Thus, $B^t B$ can also be computed incrementally.
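Given the accumulated matrices, the RMSE of Equation (22) follows from the expansion in Equation (23) without ever storing the residual vector; a sketch:

```python
def rmse_from_sums(AtA, AtB, BtB, D, H, M):
    """RMSE of Equation (22), computed from the accumulated matrices
    A^tA (2x2 nested list), A^tB (length-2 list), and the scalar B^tB
    via E^tE = P^t A^tA P - 2 P^t A^tB + B^tB with P = [D H]^t."""
    (a11, a12), (_, a22) = AtA
    b1, b2 = AtB
    ete = (D * (a11 * D + a12 * H) + H * (a12 * D + a22 * H)
           - 2.0 * (D * b1 + H * b2) + BtB)
    return (max(ete, 0.0) / M) ** 0.5   # clamp tiny negatives from rounding
```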
  • [0123]
    (b) Reference Horizon Estimation
  • [0124]
    The LSQ solution (Equation (14)) for the reference horizon estimation can be obtained by applying Gaussian elimination to the following equations:

$$\left( (C^{(N)})^t C^{(N)} \right) \begin{bmatrix} \hat{\mathbf{D}} \\ \hat{Y}_0 \end{bmatrix} = (C^{(N)})^t G^{(N)}. \qquad (25)$$
  • [0125]
    As before, the equations for the matrices (C(N))tC(N) and (C(N))tG(N) can be written as follows:

$$(C^{(N)})^t C^{(N)} = \begin{bmatrix} R_1 & \cdots & 0 & \cdots & 0 & P_1 \\ 0 & \cdots & R_k & \cdots & 0 & P_k \\ 0 & \cdots & 0 & \cdots & R_N & P_N \\ P_1 & \cdots & P_k & \cdots & P_N & Q \end{bmatrix} \qquad (26)$$

    and

$$(C^{(N)})^t G^{(N)} = \begin{bmatrix} S_1 & \cdots & S_k & \cdots & S_N & T \end{bmatrix}^t, \qquad (27)$$

    where

$$\begin{aligned} R_k &= \sum_{(i,j) \in \Theta_k} (r_{kj} - r_{ki})^2 \\ P_k &= \sum_{(i,j) \in \Theta_k} (r_{kj} - r_{ki})(d_{kj} - d_{ki}) \\ Q &= \sum_{k=1}^{N} q_k, \quad q_k = \sum_{(i,j) \in \Theta_k} (d_{kj} - d_{ki})^2 \\ S_k &= \sum_{(i,j) \in \Theta_k} (r_{kj} - r_{ki})(r_{kj} d_{kj} - r_{ki} d_{ki}) \\ T &= \sum_{k=1}^{N} t_k, \quad t_k = \sum_{(i,j) \in \Theta_k} (d_{kj} - d_{ki})(r_{kj} d_{kj} - r_{ki} d_{ki}) \end{aligned} \qquad k = 1, \ldots, N, \quad \Theta_k = \{(i,j) \mid j = i + s, \; 1 \le i, j \le M_k\} \qquad (28)$$
  • [0126]
    where:
  • [0127]
    i and j are matrix index variables,
  • [0128]
    θk is an index set including all possible pairings of {i,j} that are separated by a fixed number of frames s,
  • [0129]
    k is an index variable of tracks where the total number of tracks is represented by N,
  • [0130]
    Mk is the length of the kth track among a total of N tracks,
  • [0131]
    rkj is the track position in image row coordinates at the jth image frame of the kth track;
  • [0132]
    rki is the track position in image row coordinates at the ith image frame of the kth track;
  • [0133]
    dkj is the distance traveled by the vehicle at the jth frame of the kth track relative to the first frame, with dk1=0 for all k;
  • [0134]
    dki is the distance traveled by the vehicle at the ith frame of the kth track relative to the first frame, with dk1=0 for all k; and
  • [0135]
    where the terms Rk, Pk, qk, Sk, and tk are computed incrementally as new track points are added and new pairs of {i,j} become part of the index set θk.
  • [0136]
    In the above equations, the index set θk includes all possible pairings of {i,j} that are separated by a fixed number of frames s (Equation (15)). As can be seen here, the terms Rk, Pk, qk, Sk, and tk can all be computed incrementally as new HEP track points are added and new pairs of {i,j} become part of the index set θk.
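The incremental update of these terms can be sketched as follows: each new track point (r, d) with 0-based index m completes exactly one new pair (m − s, m) once m ≥ s, per the fixed frame distance of Equation (15). This is an illustrative sketch; index handling is 0-based for convenience:

```python
class TrackAccumulator:
    """Maintains R_k, P_k, q_k, S_k, t_k of Equation (28) incrementally
    for one track: each new point completes one pair (m - s, m)."""
    def __init__(self, s):
        self.s = s
        self.rows, self.dists = [], []
        self.R = self.P = self.q = self.S = self.t = 0.0

    def add(self, r, d):
        self.rows.append(r); self.dists.append(d)
        m = len(self.rows) - 1
        if m >= self.s:
            i = m - self.s                        # the new pair is (i, m)
            cr = r - self.rows[i]                 # r_kj - r_ki
            cd = d - self.dists[i]                # d_kj - d_ki
            g = r * d - self.rows[i] * self.dists[i]  # r_kj d_kj - r_ki d_ki
            self.R += cr * cr
            self.P += cr * cd
            self.q += cd * cd
            self.S += cr * g
            self.t += cd * g
```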

Claims (57)

    What is claimed is:
  1. A method for generating a camera reference horizon estimate Ŷ0, the method comprising steps of:
    receiving a sequence of images including projections of objects thereon, with projections including horizontal edges;
    estimating vertical motion occurring in the sequence of images and storing the estimated vertical motion as a motion history;
    computing the projection of the horizontal edges in the images;
    segmenting the projected horizontal edges in the images;
    compensating for motion in the images using the estimated vertical motion from the motion history;
    tracking the projections of the horizontal edges from image to image to generate tracks representing objects and maintaining an active track storage by adding more data to update the active track storage from each subsequent image;
    screening the generated tracks using screening criteria to produce a subset of tracks in order to enhance track reliability;
    receiving the subset of tracks from the screening step and estimating distances to the objects represented by the tracks and heights of the objects represented by the tracks;
    further screening the subset of tracks by calculating an error criteria for each track and selecting a subset of remaining tracks satisfying the error criteria; and
    generating a camera reference horizon estimate Ŷ0 from the remaining tracks.
  2. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 1, wherein the screening criteria used in the screening step include rejecting those tracks that are below the current reference horizon; determining a track length for each track and keeping only those tracks that have track lengths in excess of a predetermined track length threshold; determining the dynamic range of the tracks and keeping only those tracks that have a dynamic range exceeding a dynamic range threshold; and determining an abnormality value for each track and ignoring tracks exceeding an abnormality threshold.
  3. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 2, wherein in the step of further screening the subset of tracks, the error criteria used includes calculating a root-mean-squared fitting error for each track using the estimated distance and height, and excluding those tracks having a root-mean-squared fitting error exceeding a predetermined threshold.
  4. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 3, further comprising steps of:
    maintaining an active camera reference horizon estimate Ŷ0; and
    updating the active camera reference horizon estimate Ŷ0 with the generated camera reference horizon estimate based on the fulfillment of a criteria set.
  5. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 4, wherein the images include image tops, and wherein the step of updating the active camera reference horizon estimate Ŷ0 is triggered based on the fulfillment of the criteria set including:
    when a longest track exists in a number of images in the sequence of images, where the number exceeds a predetermined threshold; and
    when the longest track is within a predetermined distance from an image top.
  6. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 5, wherein the step of estimating distances to the objects represented by the tracks and heights of the objects represented by the tracks is performed by estimating the parameters for distance D̂ and height Ĥ in the linear set,
    $$A^t A \begin{bmatrix} \hat{D} \\ \hat{H} \end{bmatrix} = A^t B,$$
    where matrices $A^t A$ and $A^t B$ are:
    $$A^t A = \begin{bmatrix} \sum_i (Y_0 - r_i)^2 & -f \sum_i (Y_0 - r_i) \\ -f \sum_i (Y_0 - r_i) & M f^2 \end{bmatrix} \equiv \begin{bmatrix} U & V \\ V & M f^2 \end{bmatrix}, \quad A^t B = \begin{bmatrix} \sum_i d_i (Y_0 - r_i)^2 \\ -f \sum_i d_i (Y_0 - r_i) \end{bmatrix} \equiv \begin{bmatrix} W \\ Z \end{bmatrix},$$
    where
    $$\begin{aligned} U &= \sum_i (Y_0 - r_i)^2 = \sum_i r_i^2 - 2 Y_0 \sum_i r_i + M Y_0^2 \\ V &= -f \sum_i (Y_0 - r_i) = f\Bigl(\sum_i r_i - M Y_0\Bigr) \\ W &= \sum_i d_i (Y_0 - r_i)^2 = Y_0^2 \sum_i d_i - 2 Y_0 \sum_i d_i r_i + \sum_i d_i r_i^2 \\ Z &= -f \sum_i d_i (Y_0 - r_i) = f\Bigl(\sum_i d_i r_i - Y_0 \sum_i d_i\Bigr) \end{aligned}$$
    and where the sum $\sum_i$ is for $i = 1$ to $M$;
    f is a camera focal length,
    M is a length of a track,
    ri is a track location in an image,
    Y0 is the camera horizon, and
    di is the distance a vehicle hosting the camera has traveled since the first image;
    whereby the matrices AtA and AtB can be updated incrementally as the HEP track length increases from M to M+1.
  7. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 6, wherein in the step of generating a camera reference horizon estimate Ŷ0 from the remaining tracks, the reference horizon is determined by solving for Ŷ0 in the following relationship:
    $$\left( (C^{(N)})^t C^{(N)} \right) \begin{bmatrix} \hat{\mathbf{D}} \\ \hat{Y}_0 \end{bmatrix} = (C^{(N)})^t G^{(N)},$$
    where the equations for matrices $(C^{(N)})^t C^{(N)}$ and $(C^{(N)})^t G^{(N)}$ can be written as:
    $$(C^{(N)})^t C^{(N)} = \begin{bmatrix} R_1 & \cdots & 0 & \cdots & 0 & P_1 \\ 0 & \cdots & R_k & \cdots & 0 & P_k \\ 0 & \cdots & 0 & \cdots & R_N & P_N \\ P_1 & \cdots & P_k & \cdots & P_N & Q \end{bmatrix}, \quad (C^{(N)})^t G^{(N)} = \begin{bmatrix} S_1 & \cdots & S_k & \cdots & S_N & T \end{bmatrix}^t,$$
    where
    $$\begin{aligned} R_k &= \sum_{(i,j) \in \Theta_k} (r_{kj} - r_{ki})^2 \\ P_k &= \sum_{(i,j) \in \Theta_k} (r_{kj} - r_{ki})(d_{kj} - d_{ki}) \\ Q &= \sum_{k=1}^{N} q_k, \quad q_k = \sum_{(i,j) \in \Theta_k} (d_{kj} - d_{ki})^2 \\ S_k &= \sum_{(i,j) \in \Theta_k} (r_{kj} - r_{ki})(r_{kj} d_{kj} - r_{ki} d_{ki}) \\ T &= \sum_{k=1}^{N} t_k, \quad t_k = \sum_{(i,j) \in \Theta_k} (d_{kj} - d_{ki})(r_{kj} d_{kj} - r_{ki} d_{ki}) \end{aligned} \qquad k = 1, \ldots, N, \quad \Theta_k = \{(i,j) \mid j = i + s, \; 1 \le i, j \le M_k\}$$
    where:
    i and j are matrix index variables,
    θk is an index set including all possible pairings of {i,j} that are separated by a fixed number of frames s,
    k is an index variable of tracks where the total number of tracks is represented by N,
    Mk is the length of the kth track among a total of N tracks,
    rkj is the track position in image row coordinates at the jth image frame of the kth track;
    rki is the track position in image row coordinates at the ith image frame of the kth track;
    dkj is a distance traveled by a vehicle at the jth frame of the kth track relative to the first frame, with dk1=0 for all k;
    dki is a distance traveled by a vehicle at the ith frame of the kth track relative to the first frame, with dk1=0 for all k; and
    whereby the terms Rk, Pk, qk, Sk, and tk are computed incrementally as new track points are added and new pairs of {i,j} become part of the index set θk.
  8. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 1, wherein in the step of further screening the subset of tracks, the error criteria used includes calculating a root-mean-squared fitting error for each track using the estimated distance and height, and excluding those tracks having a root-mean-squared fitting error exceeding a predetermined threshold.
  9. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 8, further comprising steps of:
    maintaining an active camera reference horizon estimate Ŷ0; and
    updating the active camera reference horizon estimate Ŷ0 with the generated camera reference horizon estimate based on the fulfillment of a criteria set.
  10. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 9, wherein the images include image tops, and wherein the step of updating the active camera reference horizon estimate Ŷ0 is triggered based on the fulfillment of the criteria set including:
    when a longest track exists in a number of images in the sequence of images, where the number exceeds a predetermined threshold; and
    when the longest track is within a predetermined distance from an image top.
  11. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 10, wherein the step of estimating distances to the objects represented by the tracks and heights of the objects represented by the tracks is performed by estimating the parameters for distance D̂ and height Ĥ in the linear set,
    $$A^t A \begin{bmatrix} \hat{D} \\ \hat{H} \end{bmatrix} = A^t B,$$
    where matrices $A^t A$ and $A^t B$ are:
    $$A^t A = \begin{bmatrix} \sum_i (Y_0 - r_i)^2 & -f \sum_i (Y_0 - r_i) \\ -f \sum_i (Y_0 - r_i) & M f^2 \end{bmatrix} \equiv \begin{bmatrix} U & V \\ V & M f^2 \end{bmatrix}, \quad A^t B = \begin{bmatrix} \sum_i d_i (Y_0 - r_i)^2 \\ -f \sum_i d_i (Y_0 - r_i) \end{bmatrix} \equiv \begin{bmatrix} W \\ Z \end{bmatrix},$$
    where
    $$\begin{aligned} U &= \sum_i (Y_0 - r_i)^2 = \sum_i r_i^2 - 2 Y_0 \sum_i r_i + M Y_0^2 \\ V &= -f \sum_i (Y_0 - r_i) = f\Bigl(\sum_i r_i - M Y_0\Bigr) \\ W &= \sum_i d_i (Y_0 - r_i)^2 = Y_0^2 \sum_i d_i - 2 Y_0 \sum_i d_i r_i + \sum_i d_i r_i^2 \\ Z &= -f \sum_i d_i (Y_0 - r_i) = f\Bigl(\sum_i d_i r_i - Y_0 \sum_i d_i\Bigr) \end{aligned}$$
    and where the sum $\sum_i$ is for $i = 1$ to $M$;
    f is a camera focal length,
    M is a length of a track,
    ri is a track location in an image,
    Y0 is the camera horizon, and
    di is the distance a vehicle hosting the camera has traveled since the first image;
    whereby the matrices AtA and AtB can be updated incrementally as the HEP track length increases from M to M+1.
  12. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 11, wherein in the step of generating a camera reference horizon estimate Ŷ0 from the remaining tracks, the reference horizon is determined by solving for Ŷ0 in the following relationship:
    $$\left( (C^{(N)})^t C^{(N)} \right) \begin{bmatrix} \hat{\mathbf{D}} \\ \hat{Y}_0 \end{bmatrix} = (C^{(N)})^t G^{(N)},$$
    where the equations for matrices $(C^{(N)})^t C^{(N)}$ and $(C^{(N)})^t G^{(N)}$ can be written as:
    $$(C^{(N)})^t C^{(N)} = \begin{bmatrix} R_1 & \cdots & 0 & \cdots & 0 & P_1 \\ 0 & \cdots & R_k & \cdots & 0 & P_k \\ 0 & \cdots & 0 & \cdots & R_N & P_N \\ P_1 & \cdots & P_k & \cdots & P_N & Q \end{bmatrix}, \quad (C^{(N)})^t G^{(N)} = \begin{bmatrix} S_1 & \cdots & S_k & \cdots & S_N & T \end{bmatrix}^t,$$
    where
    $$\begin{aligned} R_k &= \sum_{(i,j) \in \Theta_k} (r_{kj} - r_{ki})^2 \\ P_k &= \sum_{(i,j) \in \Theta_k} (r_{kj} - r_{ki})(d_{kj} - d_{ki}) \\ Q &= \sum_{k=1}^{N} q_k, \quad q_k = \sum_{(i,j) \in \Theta_k} (d_{kj} - d_{ki})^2 \\ S_k &= \sum_{(i,j) \in \Theta_k} (r_{kj} - r_{ki})(r_{kj} d_{kj} - r_{ki} d_{ki}) \\ T &= \sum_{k=1}^{N} t_k, \quad t_k = \sum_{(i,j) \in \Theta_k} (d_{kj} - d_{ki})(r_{kj} d_{kj} - r_{ki} d_{ki}) \end{aligned} \qquad k = 1, \ldots, N, \quad \Theta_k = \{(i,j) \mid j = i + s, \; 1 \le i, j \le M_k\},$$
    where:
    i and j are matrix index variables,
    θk is an index set including all possible pairings of {i,j} that are separated by a fixed number of frames s,
    k is an index variable of tracks where the total number of tracks is represented by N,
    Mk is the length of the kth track among a total of N tracks,
    rkj is the track position in image row coordinates at a jth image frame of the kth track;
    rki is the track position in image row coordinates at a ith image frame of the kth track;
    dkj is a distance traveled by a vehicle at the jth frame of the kth track relative to the first frame, with dkl=0 for all k;
    dki is a distance traveled by a vehicle at the ith frame of the kth track relative to the first frame, with dkl=0 for all k; and
    whereby the terms Rk, Pk, qk, Sk, and tk are computed incrementally as new track points are added and new pairs of {i,j} become part of the index set θk.
  13. 13. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 1, further comprising steps of:
    maintaining an active camera reference horizon estimate Ŷ0; and
    updating the active camera reference horizon estimate Ŷ0 with the generated camera reference horizon estimate based on the fulfillment of a criteria set.
  14. 14. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 13, wherein the images include image tops, and wherein in the step of updating the active camera reference horizon estimate Ŷ0 is triggered based on the fulfillment of a the criteria set including:
    when a longest track exists in a number of images in the sequence of images, where the number exceeds a predetermined threshold; and
    when the longest track is within a predetermined distance from an image top.
  15. 15. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 14, wherein the step of estimating distances to the objects represented by the tracks and heights of the objects represented by the tracks is performed by estimating the parameters for distance {circumflex over (D)} and height Ĥ in the linear set,
    A t A [ D ^ H ^ ] = A t B ,
    Figure US20040125210A1-20040701-M00028
    where matrices AtA and AtB are:
    A t A = [ i ( Y 0 - r i ) 2 - f i ( Y 0 - r i ) - f i ( Y 0 - r i ) M f 2 ] [ U V V M f 2 ] , and A t B = [ i ( d i ( Y 0 - r i ) ) 2 - f i d i ( Y 0 - r i ) ] [ W Z ] , where U = i ( Y 0 - r i ) 2 = i r i 2 - 2 Y 0 i r i + M Y 0 V = - f i ( Y 0 - r i ) = f ( i r i - M Y 0 ) W = i ( d i ( Y 0 - r i ) ) 2 = Y 0 2 i d i 2 - 2 Y 0 i d i r i + i d i r i 2 Z = - f i d i ( Y 0 - r i ) = f ( i d i r i - Y 0 i d i ) } , and
    Figure US20040125210A1-20040701-M00029
    f is a camera focal length,
    M is a length of a track,
    ri is a track location in an image,
    Y0 is the camera horizon, and
    di is the distance a vehicle hosting the camera has traveled since the first image;
    whereby the matrices AtA and AtB can be updated incrementally as the HEP track length increases from M to M+1.
  16. 16. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 15, wherein in the step of generating a camera reference horizon estimate Ŷ0 from the remaining tracks, the reference horizon is determined by solving for Ŷ0 in the following relationship:
    ( ( C ( N ) ) t C ( N ) ) [ D ^ Y ^ 0 ] = ( C ( N ) ) t G ( N ) ,
    Figure US20040125210A1-20040701-M00030
    where the equations for matrices (C(N))t C(N) and (C(N))t G(N) can be written as:
    ( C ( N ) ) t C ( N ) = [ R 1 0 0 P 1 0 0 0 0 R k 0 P k 0 0 0 0 0 R N P N P 1 P k P N Q ] and ( C ( N ) ) t G ( N ) = [ S 1 S k S N T ] t , where R k = ( i , j ) Θ k ( r kj - r ki ) 2 P k = ( i , j ) Θ k ( r kj - r ki ) ( d kj - d ki ) Q = k = 1 N q k , q k = ( i , j ) Θ k ( d kj - d ki ) 2 S k = ( i , j ) Θ k ( r kj - r ki ) ( r kj d kj - r kj d ki ) T = k = 1 N t k , t k = ( i , j ) Θ k ( d kj - d ki ) ( r kj d kj - r kj d ki ) } k = 1 , , N Θ k = { ( i , j ) | j = i + s , 1 <= i , j <= M k }
    Figure US20040125210A1-20040701-M00031
    where:
    i and j are matrix index variables,
    θk is an index set including all possible pairings of {i,j} that are separated by a fixed number of frames s,
    k is an index variable of tracks where the total number of tracks is represented by N,
    Mk is the length of the kth track among a total of N tracks,
    rkj is the track position in image row coordinates at a jth image frame of the kth track;
    rki is the track position in image row coordinates at a ith image frame of the kth track;
    dkj is a distance traveled by a vehicle at the jth frame of the kth track relative to the first frame, with dkl=0 for all k;
    dki is a distance traveled by a vehicle at the ith frame of the kth track relative to the first frame, with dkl=0 for all k; and
    whereby the terms Rk, Pk, qk, Sk, and tk are computed incrementally as new track points are added and new pairs of {i,j} become part of the index set θk.
  17. 17. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 1, wherein the step of estimating distances to the objects represented by the tracks and heights of the objects represented by the tracks is performed by estimating the parameters for distance {circumflex over (D)} and height Ĥ in the linear set,
    A t A [ D ^ H ^ ] = A t B ,
    Figure US20040125210A1-20040701-M00032
    where matrices AtA and AtB are:
    A t A = [ i ( Y 0 - r i ) 2 - f i ( Y 0 - r i ) - f i ( Y 0 - r i ) M f 2 ] [ U V V M f 2 ] , and A t B = [ i ( d i ( Y 0 - r i ) ) 2 - f i d i ( Y 0 - r i ) ] [ W Z ] , where U = i ( Y 0 - r i ) 2 = i r i 2 - 2 Y 0 i r i + M Y 0 V = - f i ( Y 0 - r i ) = f ( i r i - M Y 0 ) W = i ( d i ( Y 0 - r i ) ) 2 = Y 0 2 i d i 2 - 2 Y 0 i d i r i + i d i r i 2 Z = - f i d i ( Y 0 - r i ) = f ( i d i r i - Y 0 i d i ) } , and
    Figure US20040125210A1-20040701-M00033
    f is a camera focal length,
    M is a length of a track,
    ri is a track location in an image,
    Y0 is the camera horizon, and
    di is the distance a vehicle hosting the camera has traveled since the first image;
    whereby the matrices AtA and AtB can be updated incrementally as the HEP track length increases from M to M+1.
  18. 18. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 17, wherein in the step of generating a camera reference horizon estimate Ŷ0 from the remaining tracks, the reference horizon is determined by solving for Ŷ0 in the following relationship:
    ( ( C ( N ) ) t C ( N ) ) [ D ^ Y ^ 0 ] = ( C ( N ) ) t G ( N ) ,
    Figure US20040125210A1-20040701-M00034
    where the equations for matrices (C(N))t C(N) and (C(N))t G(N) can be written as:
    ( C ( N ) ) t C ( N ) = [ R 1 0 0 P 1 0 0 0 0 R k 0 P k 0 0 0 0 0 R N P N P 1 P k P N Q ] and ( C ( N ) ) t G ( N ) = [ S 1 S k S N T ] t , where R k = ( i , j ) Θ k ( r kj - r ki ) 2 P k = ( i , j ) Θ k ( r kj - r ki ) ( d kj - d ki ) Q = k = 1 N q k , q k = ( i , j ) Θ k ( d kj - d ki ) 2 S k = ( i , j ) Θ k ( r kj - r ki ) ( r kj d kj - r kj d ki ) T = k = 1 N t k , t k = ( i , j ) Θ k ( d kj - d ki ) ( r kj d kj - r kj d ki ) } k = 1 , , N Θ k = { ( i , j ) | j = i + s , 1 <= i , j <= M k }
    Figure US20040125210A1-20040701-M00035
    where:
    i and j are matrix index variables,
    θk is an index set including all possible pairings of {i,j} that are separated by a fixed number of frames s,
    k is an index variable of tracks where the total number of tracks is represented by N,
    Mk is the length of the kth track among a total of N tracks,
    rkj is the track position in image row coordinates at a jth image frame of the kth track;
    rki is the track position in image row coordinates at a ith image frame of the kth track;
    dkj is a distance traveled by a vehicle at the jth frame of the kth track relative to the first frame, with dkl=0 for all k;
    dki is a distance traveled by a vehicle at the ith frame of the kth track relative to the first frame, with dkl=0 for all k; and
    whereby the terms Rk, Pk, qk, Sk, and tk are computed incrementally as new track points are added and new pairs of {i,j} become part of the index set θk.
  19. 19. A method for generating a camera reference horizon estimate Ŷ0, as set forth in claim 1, wherein in the step of generating a camera reference horizon estimate Ŷ0 from the remaining tracks, the reference horizon is determined by solving for Ŷ0 in the following relationship:
    ( ( C ( N ) ) t C ( N ) ) [ D ^ Y ^ 0 ] = ( C ( N ) ) t G ( N ) ,
    Figure US20040125210A1-20040701-M00036
    where the equations for matrices (C(N))t C(N) and (C(N))t G(N) can be written as:
    ( C ( N ) ) t C ( N ) = [ R 1 0 0 P 1 0 0 0 0 R k 0 P k 0 0 0 0 0 R N P N P 1 P k P N Q ] and ( C ( N ) ) t G ( N ) = [ S 1 S k S N T ] t , where R k = ( i , j ) Θ k ( r kj - r ki ) 2 P k = ( i , j ) Θ k ( r kj - r ki ) ( d kj - d ki ) Q = k = 1 N q k , q k = ( i , j ) Θ k ( d kj - d ki ) 2 S k = ( i , j ) Θ k ( r kj - r ki ) ( r kj d kj - r kj d ki ) T = k = 1 N t k , t k = ( i , j ) Θ k ( d kj - d ki ) ( r kj d kj - r kj d ki ) } k = 1 , , N Θ k = { ( i , j ) | j = i + s , 1 <= i , j <= M k }
    Figure US20040125210A1-20040701-M00037
    where:
    i and j are matrix index variables,
    θk is an index set including all possible pairings of {i,j} that are separated by a fixed number of frames s,
    k is an index variable of tracks where the total number of tracks is represented by N,
    Mk is the length of the kth track among a total of N tracks,
    rkj is the track position in image row coordinates at a jth image frame of the kth track;
    rki is the track position in image row coordinates at a ith image frame of the kth track;
    dkj is a distance traveled by a vehicle at the jth frame of the kth track relative to the first frame, with dkl=0 for all k;
    dki is a distance traveled by a vehicle at the ith frame of the kth track relative to the first frame, with dkl=0 for all k; and
    whereby the terms Rk, Pk, qk, Sk, and tk are computed incrementally as new track points are added and new pairs of {i,j} become part of the index set θk.
  20. 20. A computer program product for generating a camera reference horizon estimate Ŷ0, the computer program product comprising means, residing on a computer-readable medium, for:
    receiving a sequence of images including projections of objects thereon, with projections including horizontal edges;
    estimating vertical motion occurring in the sequence of images and storing the estimated vertical motion as a motion history;
    computing the projection of the horizontal edges in the images;
    segmenting the projected horizontal edges in the images;
    compensating for motion in the images using the estimated vertical motion from the motion history;
    tracking the projections of the horizontal edges from image to image to generate tracks representing objects and maintaining an active track storage by adding more data to update the active track storage from each subsequent image;
    screening the generated tracks using screening criteria to produce a subset of tracks in order to enhance track reliability;
    receiving the subset of tracks from the means for screening and estimating distances to the objects represented by the tracks and heights of the objects represented by the tracks;
    further screening the subset of tracks by calculating an error criteria for each track and selecting a subset of remaining tracks satisfying the error criteria; and
    generating a camera reference horizon estimate Ŷ0 from the remaining tracks.
  21. 21. A computer program product for generating a camera reference horizon estimate Ŷ0, as set forth in claim 20, wherein the screening criteria used by the means for screening include rejecting those tracks that are below the current reference horizon; determining a track length for each track and keeping only those tracks that have track lengths in excess of a predetermined track length threshold; determining the dynamic range of the tracks and keeping only those tracks that have a dynamic range exceeding a dynamic range threshold; and determining an abnormality value for each track and ignoring tracks exceeding an abnormality threshold.
  22. 22. A computer program product for generating a camera reference horizon estimate Ŷ0 as set forth in claim 21, wherein in the means for further screening the subset of tracks, the error criteria used includes calculating a root-mean-squared fitting error for each track using the estimated distance and height, and excluding those tracks having a root-mean-squared fitting error exceeding a predetermined threshold.
  23. 23. A computer program product for generating a camera reference horizon estimate Ŷ0, as set forth in claim 22, further comprising means for:
    maintaining an active camera reference horizon estimate Ŷ0; and
    updating the active camera reference horizon estimate Ŷ0 with the generated camera reference horizon estimate based on the fulfillment of a criteria set.
  24. 24. A computer program product for generating a camera reference horizon estimate Ŷ0, as set forth in claim 23, wherein the images include image tops, and wherein in the means for updating the active camera reference horizon estimate Ŷ0 is triggered based on the fulfillment of a the criteria set including:
    when a longest track exists in a number of images in the sequence of images, where the number exceeds a predetermined threshold; and
    when the longest track is within a predetermined distance from an image top.
  25. 25. A computer program product for generating a camera reference horizon estimate Ŷ0, as set forth in claim 24, wherein the means for estimating distances to the objects represented by the tracks and heights of the objects represented by the tracks is performed by estimating the parameters for distance {circumflex over (D)} and height Ĥ in the linear set,
    A t A [ D ^ H ^ ] = A t B ,
    Figure US20040125210A1-20040701-M00038
    where matrices AtA and AtB are:
    A t A = [ i ( Y 0 - r i ) 2 - f i ( Y 0 - r i ) - f i ( Y 0 - r i ) M f 2 ] [ U V V M f 2 ] , and A t B = [ i ( d i ( Y 0 - r i ) ) 2 - f i d i ( Y 0 - r i ) ] [ W Z ] , where U = i ( Y 0 - r i ) 2 = i r i 2 - 2 Y 0 i r i + M Y 0 V = - f i ( Y 0 - r i ) = f ( i r i - M Y 0 ) W = i ( d i ( Y 0 - r i ) ) 2 = Y 0 2 i d i 2 - 2 Y 0 i d i r i + i d i r i 2 Z = - f i d i ( Y 0 - r i ) = f ( i d i r i - Y 0 i d i ) } , and
    Figure US20040125210A1-20040701-M00039
    f is a camera focal length,
    M is a length of a track,
    ri is a track location in an image,
    Y0 is the camera horizon, and
    di is the distance a vehicle hosting the camera has traveled since the first image;
    whereby the matrices AtA and AtB can be updated incrementally as the HEP track length increases from M to M+1.
  26. 26. A computer program product for generating a camera reference horizon estimate Ŷ0 as set forth in claim 25, wherein in the means for generating a camera reference horizon estimate Ŷ0 from the remaining tracks, the reference horizon is determined by solving for Ŷ0 in the following relationship:
    ( ( C ( N ) ) t C ( N ) ) [ D ^ Y ^ 0 ] = ( C ( N ) ) t G ( N ) ,
    Figure US20040125210A1-20040701-M00040
    where the equations for matrices (C(N))t C(N) and (C(N))t G(N) can be written as:
    ( C ( N ) ) t C ( N ) = [ R 1 0 0 P 1 0 0 0 0 R k 0 P k 0 0 0 0 0 R N P N P 1 P k P N Q ] and ( C ( N ) ) t G ( N ) = [ S 1 S k S N T ] t , where R k = ( i , j ) Θ k ( r kj - r ki ) 2 P k = ( i , j ) Θ k ( r kj - r ki ) ( d kj - d ki ) Q = k = 1 N q k , q k = ( i , j ) Θ k ( d kj - d ki ) 2 S k = ( i , j ) Θ k ( r kj - r ki ) ( r kj d kj - r kj d ki ) T = k = 1 N t k , t k = ( i , j ) Θ k ( d kj - d ki ) ( r kj d kj - r kj d ki ) } k = 1 , , N Θ k = { ( i , j ) | j = i + s , 1 <= i , j <= M k }
    Figure US20040125210A1-20040701-M00041
    where:
    i and j are matrix index variables,
    θk is an index set including all possible pairings of {i,j} that are separated by a fixed number of frames s,
    k is an index variable of tracks where the total number of tracks is represented by N,
    Mk is the length of the kth track among a total of N tracks,
    rkj is the track position in image row coordinates at a jth image frame of the kth track;
    rki is the track position in image row coordinates at a ith image frame of the kth track;
    dkj is a distance traveled by a vehicle at the jth frame of the kth track relative to the first frame, with dkl=0 for all k;
    dki is a distance traveled by a vehicle at the ith frame of the kth track relative to the first frame, with dkl=0 for all k; and
    whereby the terms Rk, Pk, qk, Sk, and tk are computed incrementally as new track points are added and new pairs of {i,j} become part of the index set θk.
  27. 27. A computer program product for generating a camera reference horizon estimate Ŷ0, as set forth in claim 20, wherein in the means for further screening the subset of tracks, the error criteria used includes calculating a root-mean-squared fitting error for each track using the estimated distance and height, and excluding those tracks having a root-mean-squared fitting error exceeding a predetermined threshold.
  28. 28. A computer program product for generating a camera reference horizon estimate Ŷ0, as set forth in claim 27, further comprising means for:
    maintaining an active camera reference horizon estimate Ŷ0; and
    updating the active camera reference horizon estimate Ŷ0 with the generated camera reference horizon estimate based on the fulfillment of a criteria set.
  29. 29. A computer program product for generating a camera reference horizon estimate Ŷ0, as set forth in claim 28, wherein the images include image tops, and wherein in the means for updating the active camera reference horizon estimate Ŷ0 is triggered based on the fulfillment of a the criteria set including:
    when a longest track exists in a number of images in the sequence of images, where the number exceeds a predetermined threshold; and
    when the longest track is within a predetermined distance from an image top.
  30. 30. A computer program product for generating a camera reference horizon estimate Y, as set forth in claim 29, wherein the means for estimating distances to the objects represented by the tracks and heights of the objects represented by the tracks is performed by estimating the parameters for distance {circumflex over (D)} and height Ĥ in the linear set,
    A t A [ D ^ H ^ ] = A t B ,
    Figure US20040125210A1-20040701-M00042
    where matrices AtA and AtB are:
    A t A = [ i ( Y 0 - r i ) 2 - f i ( Y 0 - r i ) - f i ( Y 0 - r i ) M f 2 ] [ U V V M f 2 ] , and A t B = [ i ( d i ( Y 0 - r i ) ) 2 - f i d i ( Y 0 - r i ) ] [ W Z ] , where U = i ( Y 0 - r i ) 2 = i r i 2 - 2 Y 0 i r i + M Y 0 V = - f i ( Y 0 - r i ) = f ( i r i - M Y 0 ) W = i ( d i ( Y 0 - r i ) ) 2 = Y 0 2 i d i 2 - 2 Y 0 i d i r i + i d i r i 2 Z = - f i d i ( Y 0 - r i ) = f ( i d i r i - Y 0 i d i ) } , and
    Figure US20040125210A1-20040701-M00043
    f is a camera focal length,
    M is a length of a track,
    ri is a track location in an image,
    Y0 is the camera horizon, and
    di is the distance a vehicle hosting the camera has traveled since the first image;
    whereby the matrices AtA and AtB can be updated incrementally as the HEP track length increases from M to M+1.
  31. 31. A computer program product for generating a camera reference horizon estimate Ŷ0, as set forth in claim 30, wherein in the means for generating a camera reference horizon estimate Ŷ0 from the remaining tracks, the reference horizon is determined by solving for Ŷ0 in the following relationship:
    ( ( C ( N ) ) t C ( N ) ) [ D ^ Y ^ 0 ] = ( C ( N ) ) t G ( N ) ,
    Figure US20040125210A1-20040701-M00044
    where the equations for matrices (C(N))t C(N) and (C(N))t G(N) can be written as:
    ( C ( N ) ) t C ( N ) = [ R 1 0 0 P 1 0 0 0 0 R k 0 P k 0 0 0 0 0 R N P N P 1 P k P N Q ] and ( C ( N ) ) t G ( N ) = [ S 1 S k S N T ] t , where R k = ( i , j ) Θ k ( r kj - r ki ) 2 P k = ( i , j ) Θ k ( r kj - r ki ) ( d kj - d ki ) Q = k = 1 N q k , q k = ( i , j ) Θ k ( d kj - d ki ) 2 S k = ( i , j ) Θ k ( r kj - r ki ) ( r kj d kj - r kj d ki ) T = k = 1 N t k , t k = ( i , j ) Θ k ( d kj - d ki ) ( r kj d kj - r kj d ki ) } k = 1 , , N Θ k = { ( i , j ) | j = i + s , 1 <= i , j <= M k }
    Figure US20040125210A1-20040701-M00045
    where:
    i and j are matrix index variables,
    θk is an index set including all possible pairings of {i,j} that are separated by a fixed number of frames s,
    k is an index variable of tracks where the total number of tracks is represented by N,
    Mk is the length of the kth track among a total of N tracks,
    rkj is the track position in image row coordinates at a jth image frame of the kth track;
    rki is the track position in image row coordinates at a ith image frame of the kth track;
    dkj is a distance traveled by a vehicle at the jth frame of the kth track relative to the first frame, with dkl=0 for all k;
    dki is a distance traveled by a vehicle at the ith frame of the kth track relative to the first frame, with dkl=0 for all k; and
    whereby the terms Rk, Pk, qk, Sk, and tk are computed incrementally as new track points are added and new pairs of {i,j} become part of the index set θk.
  32. 32. A computer program product for generating a camera reference horizon estimate Ŷ0 as set forth in claim 20, further comprising means for:
    maintaining an active camera reference horizon estimate Ŷ0; and
    updating the active camera reference horizon estimate Ŷ0 with the generated camera reference horizon estimate based on the fulfillment of a criteria set.
  33. 33. A computer program product for generating a camera reference horizon estimate Ŷ0, as set forth in claim 32, wherein the images include image tops, and wherein in the means for updating the active camera reference horizon estimate Ŷ0, is triggered based on the fulfillment of a the criteria set including:
    when a longest track exists in a number of images in the sequence of images, where the number exceeds a predetermined threshold; and
    when the longest track is within a predetermined distance from an image top.
  34. 34. A computer program product for generating a camera reference horizon estimate Ŷ0, as set forth in claim 33, wherein the means for estimating distances to the objects represented by the tracks and heights of the objects represented by the tracks is performed by estimating the parameters for distance {circumflex over (D)} and height Ĥ in the linear set,
    A t A [ D ^ H ^ ] = A t B ,
    Figure US20040125210A1-20040701-M00046
    where matrices AtA and AtB are:
    A t A = [ i ( Y 0 - r i ) 2 - f i ( Y 0 - r i ) - f i ( Y 0 - r i ) M f 2 ] [ U V V M f 2 ] , and A t B = [ i ( d i ( Y 0 - r i ) ) 2 - f i d i ( Y 0 - r i ) ] [ W Z ] , where U = i ( Y 0 - r i ) 2 = i r i 2 - 2 Y 0 i r i + M Y 0 V = - f i ( Y 0 - r i ) = f ( i r i - M Y 0 ) W = i ( d i ( Y 0 - r i ) ) 2 = Y 0 2 i d i 2 - 2 Y 0 i d i r i + i d i r i 2 Z = - f i d i ( Y 0 - r i ) = f ( i d i r i - Y 0 i d i ) } , and
    Figure US20040125210A1-20040701-M00047
    f is a camera focal length,
    M is a length of a track,
    ri is a track location in an image,
    Y0 is the camera horizon, and
    di is the distance a vehicle hosting the camera has traveled since the first image;
    whereby the matrices AtA and AtB can be updated incrementally as the HEP track length increases from M to M+1.
  35. 35. A computer program product for generating a camera reference horizon estimate Ŷ0, as set forth in claim 34, wherein in the means for generating a camera reference horizon estimate Ŷ0 from the remaining tracks, the reference horizon is determined by solving for Ŷ0 in the following relationship:
    ( ( C ( N ) ) t C ( N ) ) [ D ^ Y ^ 0 ] = ( C ( N ) ) t G ( N ) ,
    Figure US20040125210A1-20040701-M00048
    where the equations for matrices (C(N))t C(N) and (C(N))t G(N) can be written as:
    ( C ( N ) ) t C ( N ) = [ R 1 0 0 P 1 0 0 0 0 R k 0 P k 0 0 0 0 0 R N P N P 1 P k P N Q ] and ( C ( N ) ) t G ( N ) = [ S 1 S k S N T ] t , where R k = ( i , j ) Θ k ( r kj - r ki ) 2 P k = ( i , j ) Θ k ( r kj - r ki ) ( d kj - d ki ) Q = k = 1 N q k , q k = ( i , j ) Θ k ( d kj - d ki ) 2 S k = ( i , j ) Θ k ( r kj - r ki ) ( r kj d kj - r kj d ki ) T = k = 1 N t k , t k = ( i , j ) Θ k ( d kj - d ki ) ( r kj d kj - r kj d ki ) } k = 1 , , N Θ k = { ( i , j ) | j = i + s , 1 <= i , j <= M k }
    Figure US20040125210A1-20040701-M00049
    where:
    i and j are matrix index variables,
    θk is an index set including all possible pairings of {i,j} that are separated by a fixed number of frames s,
    k is an index variable of tracks where the total number of tracks is represented by N.
    Mk is the length of the kth track among a total of N tracks, rkj is the track position in image row coordinates at a jth image frame of the kth track;
    rki is the track position in image row coordinates at a ith image frame of the kth track;
    dki is a distance traveled by a vehicle at the jth frame of the kth track relative to the first frame, with dkl=0 for all k;
    dki is a distance traveled by a vehicle at the ith frame of the kth track relative to the first frame, with dkl=0 for all k; and
    whereby the terms Rk, Pk, qk, Sk, and tk are computed incrementally as new track points are added and new pairs of {i, j} become part of the index set θk.
  36. 36. A computer program product for generating a camera reference horizon estimate Ŷ0 as set forth in claim 20, wherein the means for estimating distances to the objects represented by the tracks and heights of the objects represented by the tracks is performed by estimating the parameters for distance {circumflex over (D)} and height Ĥ in the linear set,
    A t A [ D ^ H ^ ] = A t B ,
    Figure US20040125210A1-20040701-M00050
    where matrices AtA and AtB are:
    A t A = [ i ( Y 0 - r i ) 2 - f i ( Y 0 - r i ) - f i ( Y 0 - r i ) M f 2 ] [ U V V M f 2 ] , and A t B = [ i ( d i ( Y 0 - r i ) ) 2 - f i d i ( Y 0 - r i ) ] [ W Z ] , where U = i ( Y 0 - r i ) 2 = i r i 2 - 2 Y 0 i r i + M Y 0 V = - f i ( Y 0 - r i ) = f ( i r i - M Y 0 ) W = i ( d i ( Y 0 - r i ) ) 2 = Y 0 2 i d i 2 - 2 Y 0 i d i r i + i d i r i 2 Z = - f i d i ( Y 0 - r i ) = f ( i d i r i - Y 0 i d i ) } , and where  the  sum i is  for i = 1 to M ;
    Figure US20040125210A1-20040701-M00051
    f is a camera focal length,
    M is a length of a track,
    ri is a track location in an image,
    Y0 is the camera horizon, and
    di is the distance a vehicle hosting the camera has traveled since the first image;
    whereby the matrices AtA and AtB can be updated incrementally as the HEP track length increases from M to M+1.
  37. 37. A computer program product for generating a camera reference horizon estimate Ŷ0, as set forth in claim 36, wherein in the means for generating a camera reference horizon estimate Ŷ0 from the remaining tracks, the reference horizon is determined by solving for Ŷ0 in the following relationship:
    ( ( C ( N ) ) t C ( N ) ) [ D ^ Y ^ 0 ] = ( C ( N ) ) t G ( N ) ,
    Figure US20040125210A1-20040701-M00052
    where the equations for matrices (C(N))t C(N) and (C(N))t G(N) can be written as:
    ( C ( N ) ) t C ( N ) = [ R 1 0 0 P 1 0 0 0 0 R k 0 P k 0 0 0 0 0 R N P N P 1 P k P N Q ] and ( C ( N ) ) t G ( N ) = [ S 1 S k S N T ] t , where R k = ( i , j ) Θ k ( r kj - r ki ) 2 P k = ( i , j ) Θ k ( r kj - r ki ) ( d kj - d ki ) Q = k = 1 N q k , q k = ( i , j ) Θ k ( d kj - d ki ) 2 S k = ( i , j ) Θ k ( r kj - r ki ) ( r kj d kj - r kj d ki ) T = k = 1 N t k , t k = ( i , j ) Θ k ( d kj - d ki ) ( r kj d kj - r kj d ki ) } k = 1 , , N Θ k = { ( i , j ) | j = i + s , 1 <= i , j <= M k }
    Figure US20040125210A1-20040701-M00053
    where:
    i and j are matrix index variables,
    θk is an index set including all possible pairings of {i,j} that are separated by a fixed number of frames s,
    k is an index variable of tracks where the total number of tracks is represented by N,
    Mk is the length of the kth track among a total of N tracks,
    rkj is the track position in image row coordinates at a jth image frame of the kth track;
    rki is the track position in image row coordinates at a ith image frame of the kth track;
    dkj is a distance traveled by a vehicle at the jth frame of the kth track relative to the first frame, with dkl=0 for all k;
    dki is a distance traveled by a vehicle at the ith frame of the kth track relative to the first frame, with dkl=0 for all k; and
    whereby the terms Rk, Pk, qk, Sk, and tk are computed incrementally as new track points are added and new pairs of {i,j} become part of the index set θk.
  38. 38. A computer program product for generating a camera reference horizon estimate Ŷ0, as set forth in claim 20, wherein in the means for generating a camera reference horizon estimate Ŷ0 from the remaining tracks, the reference horizon is determined by solving for Ŷ0 in the following relationship:
    ( ( C ( N ) ) t C ( N ) ) [ D ^ Y ^ 0 ] = ( C ( N ) ) t G ( N ) ,
    Figure US20040125210A1-20040701-M00054
    where the equations for matrices (C(N))t C(N) and (C(N))t G(N) can be written as:
    ( C ( N ) ) t C ( N ) = [ R 1 0 0 P 1 0 0 0 0 R k 0 P k 0 0 0 0 0 R N P N P 1 P k P N Q ] and ( C ( N ) ) t G ( N ) = [ S 1 S k S N T ] t , where R k = ( i , j ) Θ k ( r kj - r ki ) 2 P k = ( i , j ) Θ k ( r kj - r ki ) ( d kj - d ki ) Q = k = 1 N q k , q k = ( i , j ) Θ k ( d kj - d ki ) 2 S k = ( i , j ) Θ k ( r kj - r ki ) ( r kj d kj - r kj d ki ) T = k = 1 N t k , t k = ( i , j ) Θ k ( d kj - d ki ) ( r kj d kj - r kj d ki ) } k = 1 , , N Θ k = { ( i , j ) | j = i + s , 1 <= i , j <= M k }
    Figure US20040125210A1-20040701-M00055
    where:
    i and j are matrix index variables,
    θk is an index set including all possible pairings of {i,j} that are separated by a fixed number of frames s,
    k is an index variable of tracks where the total number of tracks is represented by N,
    Mk is the length of the kth track among a total of N tracks,
    rkj is the track position in image row coordinates at a jth image frame of the kth track;
    rki is the track position in image row coordinates at a ith image frame of the kth track;
    dkj is a distance traveled by a vehicle at the jth frame of the kth track relative to the first frame, with dkl=0 for all k;
    dki is a distance traveled by a vehicle at the ith frame of the kth track relative to the first frame, with dkl=0 for all k; and
    whereby the terms Rk, Pk, qk, Sk, and tk are computed incrementally as new track points are added and new pairs of {i,j} become part of the index set θk.
  39. 39. An apparatus for generating a camera reference horizon estimate Ŷ0, the apparatus comprising:
    a memory for storing data and processing results;
    a processor operationally coupled with the memory, the processor including means for receiving a sequence of images including projections of objects thereon, with projections including horizontal edges, the processor including means for performing operations thereon, for:
    estimating vertical motion occurring in the sequence of images and storing the estimated vertical motion as a motion history;
    computing the projection of the horizontal edges in the images;
    segmenting the projected horizontal edges in the images;
    compensating for motion in the images using the estimated vertical motion from the motion history;
    tracking the projections of the horizontal edges from image to image to generate tracks representing objects and maintaining an active track storage by adding more data to update the active track storage from each subsequent image;
    screening the generated tracks using screening criteria to produce a subset of tracks in order to enhance track reliability;
    receiving the subset of tracks from the means for screening and estimating distances to the objects represented by the tracks and heights of the objects represented by the tracks;
    further screening the subset of tracks by calculating an error criteria for each track and selecting a subset of remaining tracks satisfying the error criteria; and
    generating a camera reference horizon estimate Ŷ0 from the remaining tracks.
40. An apparatus for generating a camera reference horizon estimate Ŷ0, as set forth in claim 39, wherein the screening criteria used by the means for screening include: rejecting those tracks that are below the current reference horizon; determining a track length for each track and keeping only those tracks whose track lengths exceed a predetermined track length threshold; determining the dynamic range of the tracks and keeping only those tracks whose dynamic range exceeds a dynamic range threshold; and determining an abnormality value for each track and ignoring tracks exceeding an abnormality threshold.
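The four screening criteria of claim 40 can be sketched as a single filter pass. This is a hypothetical illustration: the function name, the track representation as a dict with "rows" and "abnormality" keys, and all threshold values are assumptions; only the four tests themselves come from the claim. Note that in image row coordinates, "below the horizon" means rows numerically greater than the horizon row.

```python
# Hypothetical sketch of the track-screening step of claim 40. Thresholds are
# illustrative placeholders, not values from the patent.

def screen_tracks(tracks, horizon, min_len=10, min_range=5.0, max_abnormality=2.0):
    """Keep only tracks that pass all four reliability criteria."""
    kept = []
    for trk in tracks:
        rows = trk["rows"]                      # row coordinate per frame
        if min(rows) > horizon:                 # entirely below current horizon
            continue
        if len(rows) < min_len:                 # too short to be reliable
            continue
        if max(rows) - min(rows) < min_range:   # too little dynamic range
            continue
        if trk.get("abnormality", 0.0) > max_abnormality:
            continue                            # abnormal track shape
        kept.append(trk)
    return kept
```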
41. An apparatus for generating a camera reference horizon estimate Ŷ0, as set forth in claim 40, wherein, in the means for further screening the subset of tracks, the error criteria used include calculating a root-mean-squared fitting error for each track using the estimated distance and height, and excluding those tracks having a root-mean-squared fitting error exceeding a predetermined threshold.
42. An apparatus for generating a camera reference horizon estimate Ŷ0, as set forth in claim 41, further comprising means for:
    maintaining an active camera reference horizon estimate Ŷ0; and
    updating the active camera reference horizon estimate Ŷ0 with the generated camera reference horizon estimate based on the fulfillment of a criteria set.
43. An apparatus for generating a camera reference horizon estimate Ŷ0, as set forth in claim 42, wherein the images include image tops, and wherein the means for updating the active camera reference horizon estimate Ŷ0 is triggered based on the fulfillment of the criteria set, including:
    when a longest track exists in a number of images in the sequence of images, where the number exceeds a predetermined threshold; and
    when the longest track is within a predetermined distance from an image top.
44. An apparatus for generating a camera reference horizon estimate Ŷ0, as set forth in claim 43, wherein the means for estimating distances to the objects represented by the tracks and heights of the objects represented by the tracks does so by estimating the parameters for distance D̂ and height Ĥ in the linear system,
$$
A^t A \begin{bmatrix} \hat{D} \\ \hat{H} \end{bmatrix} = A^t B,
$$

where the matrices $A^tA$ and $A^tB$ are

$$
A^t A =
\begin{bmatrix}
\sum_i (Y_0 - r_i)^2 & -f \sum_i (Y_0 - r_i) \\
-f \sum_i (Y_0 - r_i) & M f^2
\end{bmatrix}
=
\begin{bmatrix} U & V \\ V & M f^2 \end{bmatrix},
\quad
A^t B =
\begin{bmatrix}
\sum_i d_i (Y_0 - r_i)^2 \\
-f \sum_i d_i (Y_0 - r_i)
\end{bmatrix}
=
\begin{bmatrix} W \\ Z \end{bmatrix},
$$

with

$$
\begin{aligned}
U &= \sum_i (Y_0 - r_i)^2 = \sum_i r_i^2 - 2 Y_0 \sum_i r_i + M Y_0^2, \\
V &= -f \sum_i (Y_0 - r_i) = f\Big(\sum_i r_i - M Y_0\Big), \\
W &= \sum_i d_i (Y_0 - r_i)^2 = Y_0^2 \sum_i d_i - 2 Y_0 \sum_i d_i r_i + \sum_i d_i r_i^2, \\
Z &= -f \sum_i d_i (Y_0 - r_i) = f\Big(\sum_i d_i r_i - Y_0 \sum_i d_i\Big),
\end{aligned}
$$

and where the sums over i run from i = 1 to M;
    f is a camera focal length,
    M is a length of a track,
    ri is a track location in an image,
    Y0 is the camera horizon, and
    di is the distance a vehicle hosting the camera has traveled since the first image;
    whereby the matrices AtA and AtB can be updated incrementally as the HEP track length increases from M to M+1.
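The incremental update named in the clause above can be sketched by keeping the five running sums that U, V, W, and Z are built from. This is an illustrative sketch, not the patent's implementation: the class name and method names are assumptions, and the closed-form 2x2 solve is a standard Cramer's-rule step, used here only to show how D̂ and Ĥ follow once the sums are current.

```python
# Hypothetical sketch of claim 44's incremental distance/height estimation.
# Adding one observation updates five scalar sums in O(1); AtA and AtB are
# then reassembled from U, V, W, Z and the 2x2 system is solved directly.

class DistanceHeightEstimator:
    def __init__(self, f, y0):
        self.f, self.y0 = f, y0   # focal length f, camera horizon Y0
        self.M = 0                # current track length
        self.sum_r = self.sum_r2 = 0.0
        self.sum_d = self.sum_dr = self.sum_dr2 = 0.0

    def add_observation(self, r, d):
        """Track row r and distance traveled d at the newest frame."""
        self.M += 1
        self.sum_r += r
        self.sum_r2 += r * r
        self.sum_d += d
        self.sum_dr += d * r
        self.sum_dr2 += d * r * r

    def solve(self):
        """Rebuild U, V, W, Z from the running sums and solve for (D, H)."""
        f, y0, M = self.f, self.y0, self.M
        U = self.sum_r2 - 2 * y0 * self.sum_r + M * y0 * y0
        V = f * (self.sum_r - M * y0)
        W = y0 * y0 * self.sum_d - 2 * y0 * self.sum_dr + self.sum_dr2
        Z = f * (self.sum_dr - y0 * self.sum_d)
        det = U * M * f * f - V * V          # determinant of [[U, V], [V, M f^2]]
        D = (M * f * f * W - V * Z) / det
        H = (U * Z - V * W) / det
        return D, H
```

The underlying per-frame relation is (Y0 - ri) D - f H = di (Y0 - ri), so observations synthesized from a known D and H should be recovered exactly.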
45. An apparatus for generating a camera reference horizon estimate Ŷ0, as set forth in claim 44, wherein, in the means for generating a camera reference horizon estimate Ŷ0 from the remaining tracks, the reference horizon is determined by solving for Ŷ0 in the following relationship:
$$
\big((C^{(N)})^t C^{(N)}\big) \begin{bmatrix} \hat{D} \\ \hat{Y}_0 \end{bmatrix} = (C^{(N)})^t G^{(N)}.
$$
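Because (C^(N))^t C^(N) in claim 38 has an "arrowhead" structure (diagonal Rk, border Pk, corner Q), the joint system can be solved for Ŷ0 without a general matrix solver. The following sketch is not from the patent; it applies standard block elimination to the structure the claim defines: substituting Dk = (Sk - Pk Ŷ0)/Rk into the last equation isolates Ŷ0.

```python
# Hypothetical sketch: solving the arrowhead normal equations of claim 45.
# Rows k = 1..N read  R_k D_k + P_k Y0 = S_k; the last row reads
# sum_k P_k D_k + Q Y0 = T. Eliminating each D_k gives a scalar
# equation for Y0, then the distances back-substitute.

def solve_horizon(R, P, S, Q, T):
    """R, P, S: per-track sums (length-N lists); Q, T: scalar sums."""
    denom = Q - sum(p * p / r for p, r in zip(P, R))
    y0 = (T - sum(p * s / r for p, s, r in zip(P, S, R))) / denom
    D = [(s - p * y0) / r for s, p, r in zip(S, P, R)]   # back-substitution
    return y0, D
```

This runs in O(N) per solve, which suits the incremental updates of Rk, Pk, Sk, Q, and T described in claim 38.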