RIGHTS IN INVENTION

[0001] This invention was made with support under Government Subcontract No. E80011 with Boeing Corp. under Prime Contract N00019-97-C-0009 with the Department of the Navy. The U.S. Government may have certain rights in this invention.
BACKGROUND OF THE INVENTION

[0002]
1. Field of the Invention

[0003]
The present invention relates to imaging systems. More specifically, the present invention relates to correlation-based imaging target trackers.

[0004]
2. Description of the Related Art

[0005]
An autotracker is a device which locates a target in each of a sequence of images and generates commands to maintain a sensor line-of-sight on the target. Correlation trackers generate servo commands based on the position of the target. The tracker measures this position by finding the shift of the input image that maximizes its cross-correlation with a reference image formed by averaging recent past input images.

[0006]
During a target tracking operation, it often happens that the line-of-sight from the sensor to the target becomes temporarily obscured. For example, in an air-to-ground scenario, a cloud or a building may pass between the sensor and the target, temporarily blocking the line-of-sight. One important function of the autotracker is to determine when the target has been obscured, thereby detecting a ‘break-lock’ condition. The tracker is then commanded into a ‘coast mode’ until the target is again visible. In coast mode the sensor is pointed open-loop at the predicted position of the target based on its recent motion. While in coast mode the tracker must continually monitor the input image to determine when the target is again visible so that closed-loop tracking may resume.

[0007]
Prior attempts to detect break-lock made use of either: 1) a threshold on a statistical average of the residuals after cross-correlation (see, for example, K. Plakas and E. Trucco, “Developing a Real-Time, Robust, Video Tracker,” 0-7803-6551-8/00, IEEE), or 2) a threshold on the change in the difference between the input image and the correlation reference map. Unfortunately, these approaches have proven to be too sensitive to certain image disturbances which commonly occur in dynamic tracking situations. This is due to the reliance by these approaches on an assumption that the difference between the input image and reference map pixel values has a Gaussian density. While this may be true in an idealized situation, in practical tracking situations there are several types of image disturbances which cause the Gaussian assumption to falter. One such disturbance is due to a change of the sensor position relative to the tracked target, which causes the target's image to change in size and shape. Another is due to uncommanded sensor line-of-sight deviations, such as jitter, which cause image blurring. These image disturbances will cause the existing break-lock detection methods to falsely declare a break-lock condition when continued tracking is still feasible.

[0008]
Hence, a need exists in the art for an improved system or method for detecting break-lock for correlation trackers.
SUMMARY OF THE INVENTION

[0009]
The need in the art is addressed by the system and method for detecting break-lock for correlation trackers of the present invention. The inventive system includes a first circuit for computing the energy E_L in a residual image, a second circuit for computing the energy E_I in an input image, and a third circuit for computing a residual metric R_L based on the residual image energy E_L scaled by the input image energy E_I.

[0010]
In the illustrative embodiment, the system further includes a fourth circuit for comparing the residual metric R_L to a threshold T_R, and a fifth circuit for setting a break-lock signal based on the output of the fourth circuit.

[0011]
The present invention addresses the problems associated with the prior art by comparing the residual energy to the energy in the input image, rather than the current method of comparing the residual energy to an absolute threshold. By scaling the residual energy by the input image energy, a more robust statistic is obtained which allows closed-loop tracking to continue as long as even a small amount of match exists between the input and reference images. The invention has been designed to quickly and accurately determine actual break-lock conditions, such as obscurations and extreme uncommanded line-of-sight deviations, while enabling continued tracking through high sensor-to-target relative position rates and sensor line-of-sight jitter.
BRIEF DESCRIPTION OF THE DRAWINGS

[0012]
FIG. 1 is a block diagram of an autotracker system.

[0013]
FIG. 2 is a block diagram of the components of an autotracker designed in accordance with the teachings of the present invention.

[0014]
FIG. 3 is a flow chart illustrating the break-lock detection method of the present invention.

[0015]
FIG. 4 is a block diagram showing a hardware implementation of the autotracker system of the present invention.
DESCRIPTION OF THE INVENTION

[0016]
Illustrative embodiments and exemplary applications will now be described with reference to the accompanying drawings to disclose the advantageous teachings of the present invention.

[0017]
While the present invention is described herein with reference to illustrative embodiments for particular applications, it should be understood that the invention is not limited thereto. Those having ordinary skill in the art and access to the teachings provided herein will recognize additional modifications, applications, and embodiments within the scope thereof and additional fields in which the present invention would be of significant utility.

[0018]
FIG. 1 is a block diagram of an autotracker system 10. A monochromatic sensor 12, mounted in a gimbal 14, generates a two-dimensional image, I(r,c), of the target. Example sensors are an infrared (IR) sensor or a TV camera. This image is processed by the autotracker 16, which generates commands to the servo motors 18 to maintain the sensor line-of-sight on the target. In a correlation tracker, the position of the target is measured by finding the shift of the input image that maximizes its cross-correlation with a reference image formed by averaging recent past input images.

[0019]
The present invention is a system and method for break-lock detection for use in correlation autotracker systems. It uses statistics generated by the correlation process to compute a residual metric that is used to determine, while tracking, when the target is no longer present in the field of view, and, while coasting, when the target is again visible in the field of view. The invention addresses the problems of the prior art by comparing the residual energy to the energy in the input image, rather than the current method of comparing the residual energy to an absolute threshold. By scaling the residual energy by the input image energy, a more robust statistic is obtained which allows closed-loop tracking to continue as long as even a small amount of match exists between the input and reference images. The invention has been designed to quickly and accurately determine actual break-lock conditions, such as obscurations and extreme uncommanded line-of-sight deviations, while enabling continued tracking through high sensor-to-target relative position rates and sensor line-of-sight jitter.

[0020]
FIG. 2 is a block diagram of the components of an autotracker 16 designed in accordance with the teachings of the present invention. A correlation pixel processing function 20 maintains a reference map and cross-correlates this map with each input image to determine the track errors and a residual metric. The track errors are used by the closed-loop control function 22 to compute servo commands to keep the tracked target in the center of the sensor field of view. The residual metric, the subject of this patent, is used by status processing 24 to derive a break-lock signal. This break-lock signal is used by the closed-loop control function 22 to determine when it should point the sensor based on the track errors (closed-loop mode) or when it should ignore the track errors and point using the predicted target position (coast mode).

[0021]
While this method can be applied to any correlation tracker, the invention will now be described with reference to an illustrative application to the Fitts correlation pixel processing algorithm described in U.S. Pat. No. 4,133,004, issued Jan. 2, 1979, entitled “Video Correlation Tracker”, the teachings of which are incorporated herein by reference. In accordance with the Fitts teachings, a correlation pixel processing algorithm generates a reference map, denoted M(r,c), in pixel format by averaging input images prior to the current time t. (It is assumed herein that all images mentioned have N_rows rows and N_cols columns of pixels for a total of N_pixels = N_rows × N_cols pixels. Thus, the row numbers, r, have values in the range from 1 to N_rows inclusive, and the column numbers, c, have values in the range from 1 to N_cols inclusive.)

[0022]
The correlation pixel processing function 20 correlates the reference map M(r,c) with the input image at time t, denoted I(r,c), to determine the track error, denoted

$$\delta = \begin{bmatrix} \delta_r \\ \delta_c \end{bmatrix},$$

[0023]
which is the estimated shift of the input image relative to the reference image. As described in the Fitts correlation patent, in the process of computing the track error the Fitts algorithm computes gradient images that are approximations to the image spatial derivatives in the row and column directions. The gradient image in the row direction, denoted G_r(r,c), is given by:

$$G_r(r,c) = \frac{M(r+1,c) - M(r-1,c)}{2} \qquad [1]$$

[0024]
and the gradient image in the column direction, denoted G_c(r,c), is given by:

$$G_c(r,c) = \frac{M(r,c+1) - M(r,c-1)}{2} \qquad [2]$$
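For illustration only, the central-difference gradients of Eqs. 1 and 2 can be sketched as follows. This is not part of the patent; the use of NumPy and the replication of border pixels at the image edges are assumptions of this sketch.

```python
import numpy as np

def gradient_images(M):
    """Gradient images of the reference map M per Eqs. 1 and 2.

    Edge pixels are handled by replicating the border, an assumption
    not specified in the text.
    """
    Mp = np.pad(M.astype(float), 1, mode="edge")   # replicate borders
    G_r = (Mp[2:, 1:-1] - Mp[:-2, 1:-1]) / 2.0     # (M(r+1,c) - M(r-1,c)) / 2
    G_c = (Mp[1:-1, 2:] - Mp[1:-1, :-2]) / 2.0     # (M(r,c+1) - M(r,c-1)) / 2
    return G_r, G_c
```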

[0025]
The residual image, denoted L(r,c), is defined to be the difference between the input image and the reference map shifted by δ′, where

$$\delta' = \begin{bmatrix} \delta'_r \\ \delta'_c \end{bmatrix}$$

[0026]
is the computed track error before drift compensation, as described in the Fitts algorithm patent. The shifted reference map, denoted M′(r,c), may be approximated using a first-order Taylor expansion by:

$$M'(r,c) = M(r,c) + G_r(r,c)\,\delta'_r + G_c(r,c)\,\delta'_c \qquad [3]$$

[0027]
Thus, the residual image can be approximated by:

$$L(r,c) = I(r,c) - M'(r,c) = I(r,c) - M(r,c) - G_r(r,c)\,\delta'_r - G_c(r,c)\,\delta'_c, \qquad [4]$$

or

$$L(r,c) = D(r,c) - G_r(r,c)\,\delta'_r - G_c(r,c)\,\delta'_c \qquad [5]$$

[0028]
where D(r,c) is the difference image defined by:

$$D(r,c) = I(r,c) - M(r,c) \qquad [6]$$
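The residual image of Eqs. 5 and 6 can be sketched with a hypothetical helper (not part of the patent; NumPy arrays are assumed for the input image, reference map, and gradient images):

```python
import numpy as np

def residual_image(I, M, G_r, G_c, delta_r, delta_c):
    """Residual image L(r,c) per Eqs. 5 and 6: the difference image minus
    the first-order correction for the track error (delta_r, delta_c)."""
    D = I.astype(float) - M.astype(float)      # difference image, Eq. 6
    return D - G_r * delta_r - G_c * delta_c   # Eq. 5
```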

[0029]
The energy in the residual image, denoted E_L, gives an indication of the relative amount of unexplained variation in the residual image, and is defined as:

$$E_L = \sum_{(r,c)\in \mathrm{Gate}} \left( L(r,c) - \bar{L} \right)^2, \qquad [7]$$

[0030]
where $\bar{L}$ is the average of the residual image pixel values given by:

$$\bar{L} = \frac{1}{N_{\mathrm{pixels}}} \sum_{(r,c)\in \mathrm{Gate}} L(r,c) \qquad [8]$$

[0031]
and Gate represents the set of (r,c) values for which 1 ≤ r ≤ N_rows and 1 ≤ c ≤ N_cols.

[0032]
Similarly, the energy in the input image, denoted E_I, is defined to be:

$$E_I = \sum_{(r,c)\in \mathrm{Gate}} \left( I(r,c) - \bar{I} \right)^2, \qquad [9]$$

[0033]
where $\bar{I}$ is the average of the input image pixel values and is given by:

$$\bar{I} = \frac{1}{N_{\mathrm{pixels}}} \sum_{(r,c)\in \mathrm{Gate}} I(r,c) \qquad [10]$$

[0034]
The residual metric, denoted R_L, is then defined to be:

$$R_L = \frac{E_L}{E_I} \qquad [11]$$

[0035]
and represents the relative amount of the energy in the input image that is not explained by modeling it as the shifted reference image.
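A minimal sketch of the energy and metric computations of Eqs. 7 through 11 (not part of the patent; the function names are hypothetical, and the Gate is taken to be the entire image):

```python
import numpy as np

def image_energy(X):
    """Energy of an image over the gate: the sum of squared deviations
    from the mean pixel value (Eqs. 7-10)."""
    X = X.astype(float)
    return float(np.sum((X - X.mean()) ** 2))

def residual_metric(L, I):
    """Residual metric R_L = E_L / E_I (Eq. 11): the fraction of
    input-image energy not explained by the shifted reference map."""
    return image_energy(L) / image_energy(I)
```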

[0036]
In practice the energy in the residual image can be approximated by the energy in the difference image because the image shifts are relatively small, and the change they induce can often be neglected. Let the energy in the difference image, denoted E_D, be defined as:

$$E_D = \sum_{(r,c)\in \mathrm{Gate}} \left( D(r,c) - \bar{D} \right)^2 \qquad [12]$$

[0037]
where $\bar{D}$ is the average of the difference image pixel values given by:

$$\bar{D} = \frac{1}{N_{\mathrm{pixels}}} \sum_{(r,c)\in \mathrm{Gate}} D(r,c) \qquad [13]$$

[0038]
The difference residual metric, denoted R_D, is then defined to be:

$$R_D = \frac{E_D}{E_I} \qquad [14]$$

[0039]
The residual metric, either R_D or R_L, is used by the status processing function 24 to determine the value of the break-lock signal. If the shifted reference image exactly matches the input image, then the residual metric will very nearly equal zero. As the degree of match decreases, the residual metric grows. Values of this metric close to or greater than one indicate a complete mismatch between the reference image and the input image. Therefore, a threshold, T_R, on the residual metric is used to derive the break-lock signal. If the residual metric is less than or equal to T_R, the match between the reference image and the input image is considered good enough to continue tracking, and the break-lock signal is set to FALSE, indicating that closed-loop tracking is to continue. If the residual metric is greater than T_R, then the match has significantly degraded, and the break-lock signal is set to TRUE, indicating that the closed-loop control should enter coast mode. Empirical experience has shown that tracking can reliably continue with T_R values up to 0.9.
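The thresholding logic described above can be sketched as follows (a hypothetical helper, not part of the patent; the default T_R = 0.9 reflects the empirical upper bound mentioned in the text):

```python
def break_lock(residual_metric, T_R=0.9):
    """Break-lock decision: TRUE (enter coast mode) when the residual
    metric exceeds the threshold T_R, FALSE (continue closed-loop
    tracking) otherwise."""
    return residual_metric > T_R
```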

[0040]
FIG. 3 is a flow chart illustrating the break-lock detection method 60 of the present invention. First, the energy E_L in the residual image is determined (50), and the energy E_I in the input image is determined (52). In the illustrative embodiment, E_L is computed using Eqn. 7, or estimated using Eqn. 12. In the illustrative embodiment, E_I is computed using Eqn. 9. Next, the residual metric R_L = E_L/E_I is computed (54). Then, the residual metric R_L is compared to a threshold T_R (56). Finally, if the residual metric R_L was greater than the threshold T_R, then the break-lock signal is set to TRUE; otherwise, it is set to FALSE (58).
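The steps of FIG. 3, using the difference-image estimate of Eq. 12 for E_L, can be combined into one hypothetical end-to-end sketch (not the patent's implementation; NumPy and a whole-image gate are assumptions):

```python
import numpy as np

def detect_break_lock(I, M, T_R=0.9):
    """Sketch of the method of FIG. 3: compute E_D (step 50, Eq. 12) and
    E_I (step 52, Eq. 9), form the metric (54), and threshold it (56, 58)."""
    D = I.astype(float) - M.astype(float)       # difference image, Eq. 6
    E_D = float(np.sum((D - D.mean()) ** 2))    # estimate of E_L, Eq. 12
    E_I = float(np.sum((I - I.mean()) ** 2))    # input image energy, Eq. 9
    return (E_D / E_I) > T_R                    # TRUE -> enter coast mode
```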

[0041]
FIG. 4 is a block diagram showing a hardware implementation of the autotracker system 10 of the present invention. A sensor 12 mounted in a gimbal 14 receives visible or infrared images, which are then processed by a video processor 30. The video processor 30 formats the received images into digital video and, for infrared, performs non-uniformity correction and dead cell replacement. The digital images are then sent to an image processor 32, which computes the track errors and the residual metric. The track errors are used by a servo control processor 34 to generate the torque commands for the sensor gimbal hardware 14 to keep the tracked target in the center of the sensor field of view. The residual metric is used by a system control processor 36 to derive a break-lock signal. This break-lock signal is used by the servo control processor 34 to determine when it should point the sensor based on the track errors (closed-loop mode) or when it should ignore the track errors and point using the predicted target position (coast mode). The system control processor 36 controls the operation of the entire autotracker system and may interface to an aircraft.

[0042]
Thus, the present invention has been described herein with reference to a particular embodiment for a particular application. Those having ordinary skill in the art and access to the present teachings will recognize additional modifications, applications and embodiments within the scope thereof.

[0043]
It is therefore intended by the appended claims to cover any and all such applications, modifications and embodiments within the scope of the present invention.

[0044]
Accordingly,