EP1704510A1 - A method and system for adaptive target detection - Google Patents

A method and system for adaptive target detection

Info

Publication number
EP1704510A1
EP1704510A1 (EP application EP04816720A)
Authority
EP
European Patent Office
Prior art keywords
image data
sub
view
field
predetermined
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP04816720A
Other languages
German (de)
French (fr)
Other versions
EP1704510A4 (en)
Inventor
Hai-Wen Chen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lockheed Martin Corp
Lockheed Martin Missiles and Fire Control
Original Assignee
Lockheed Missiles and Space Co Inc
Lockheed Martin Missiles and Fire Control
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lockheed Missiles and Space Co Inc, Lockheed Martin Missiles and Fire Control filed Critical Lockheed Missiles and Space Co Inc
Publication of EP1704510A1 publication Critical patent/EP1704510A1/en
Publication of EP1704510A4 publication Critical patent/EP1704510A4/en
Withdrawn legal-status Critical Current


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/782Systems for determining direction or deviation from predetermined direction
    • G01S3/785Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
    • G01S3/786Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically
    • G01S3/7864T.V. type tracking systems
    • G01S3/7865T.V. type tracking systems using correlation of the live video image with a stored image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration by the use of local operators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/255Detecting or recognising potential candidate objects based on visual cues, e.g. shapes

Definitions

  • the present invention relates generally to image processing. It particularly relates to an image processing target detection system and method that uses adaptive spatial filtering and time-differencing processes to detect and track targets within various background environments.
  • Passive IR (Infrared) sensors are widely used to detect the energy emitted from targets, backgrounds, incoming threats, and the atmosphere for a plurality of applications including military surveillance, missile target and detection systems, crop and forest management, weather forecasting, and other applications.
  • the measures of performance for passive IR sensors include signal-to-noise ratio (S/N), radiation contrast, noise-equivalent temperature difference (NEDT), minimum resolvable temperature difference, and other parameters. These sensors may be designed to enhance one or more of these parameters for optimum performance during a particular application.
  • IRST sensors are commonly designed to operate with a small noise-equivalent temperature difference (NEDT) to detect small target-to-background contrast temperatures, and therefore heavy background clutter may strongly hinder accurate target detection and tracking and lead to a higher probability of false alarm (Pfa).
  • a common target detection and tracking scenario for military applications may be a fighter jet 109 attempting to detect and track incoming fighter jets 122 and/or incoming missiles (bombs) 124 that may be enemy-controlled.
  • FIG. 1B illustrates an exemplary SpatialIRST image processing system 100 found in the prior art.
  • an image 102 input from an IR sensor (not shown) is initially spatially convolved by a matched filter 104 to generate a spatially filtered image output.
  • the matched filter 104 may be generally designed using a well-known system point spread function (PSF) since at a long distance an incoming airborne target may be considered as a point radiant source.
  • a point spread function maps the intensity distribution for the received signal at the sensor generated from the point source of light (airborne target at a long distance).
  • the spatially filtered output may be divided by a local background estimation (provided by an estimator 106) using a divider 108 which provides an output image to a CFAR (constant false alarm rate) detector 110.
  • a CFAR detector allows for setting of one or more detection threshold levels to provide a maximum (tolerable) false alarm rate.
  • the detector 110 provides an output signal 112 indicating detection.
  • However, SpatialIRST may produce numerous false alarms when the background clutter contains high spatial frequency components. Also, when the background contains both low and heavy clutter sub-regions, traditional SpatialIRST may produce increased false alarms for the heavy clutter sub-regions, which also reduces the probability of detection for the low clutter sub-regions. For light or medium background clutter, the SpatialIRST system generally works well to detect and track targets, but performance suffers with heavy to extremely heavy background clutter (e.g., urban and earth object clutter), leading to a high Pfa.
  • FIG. 2 illustrates an exemplary ChangeIRST image processing system 200 found in the prior art.
  • a reference image (current image frame) 202 and a previous image (the search image) 204 are filtered using a high-pass filter 206 and registered pixel-wise using a registering device 208 at a particular revisit time (RT).
  • Pixel registration is a well-known technique to align the received images for the same scene.
  • a base image is used as a comparison reference for at least one other (input) image, and the registration process brings the input image into alignment with the base image by applying a spatial transformation to the input image.
  • the registered search image may be subtracted from the reference image to suppress background clutter, and the output difference image may be fed to a CFAR (constant false alarm rate) detector 212 to generate a detection output signal 214.
  • Alternatively, another ChangeIRST image processing system 300 found in the prior art may be used, as shown in FIG. 3. During operation of the alternative arrangement 300, an original large image 302 is under-sampled using a sampler 304 into a smaller matrix containing match point elements.
  • This alternative ChangeIRST arrangement 300 uses a multi-resolution approach to reduce the throughput (computing load) requirement for the image registration. However, the registration accuracy is decreased.
  • the method and system of the present invention overcome the previously mentioned problems by providing a target detection and tracking system capable of providing adaptive image processing for an IRST sensor system.
  • the adaptive image processing includes an adaptive spatial filtering technique that uses high-pass filtering and adaptive thresholding to reduce the false alarm rate in the presence of background clutter containing high spatial frequency components.
  • the adaptive spatial filtering technique may be combined with a spot time-differencing technique that performs time-differencing processing only for areas of detection in high clutter sub-regions based on the adaptive spatial filtering results, which maintains a low false alarm rate for light clutter sub-regions.
  • FIG. 1A is a block diagram of an exemplary target detection and tracking scenario for military applications found in the prior art
  • Fig. 1B is a block diagram of an exemplary target detection image processing system using spatial filtering found in the prior art
  • Fig. 2 is a block diagram of an exemplary target detection image processing system using time-differencing found in the prior art
  • FIG. 3 is a block diagram of an exemplary, alternative target detection image processing system using time-differencing found in the prior art
  • FIG. 4 is a flow process diagram of an exemplary adaptive IRST image processing system in accordance with an embodiment of the present invention.
  • FIG. 5 is a block diagram of an exemplary adaptive IRST image processing system using adaptive spatial filtering in accordance with an embodiment of the present invention.
  • FIG. 6 is a block diagram of an exemplary adaptive IRST image processing system using spot time-differencing in accordance with an embodiment of the present invention.
  • FIG. 7 shows an illustration of exemplary background clutter images in accordance with an embodiment of the present invention.
  • FIG. 8 shows an exemplary point spread function of the optical IRST system in accordance with an embodiment of the present invention.
  • FIG. 9 is an exemplary illustration of target locations in an IRST system in accordance with an embodiment of the present invention.
  • Figs. 10-14 show graphs with exemplary IRST sensor performance sensitivity curves for adaptive spatial filtering and spot time-differencing in accordance with an embodiment of the present invention.
  • Fig. 4 is a flow process diagram of an exemplary adaptive IRST image processing system in accordance with an embodiment of the present invention.
  • a controller may be used to control the flow process steps of the IRST imaging system.
  • a reference (current) image frame and a search (previous) image frame may be input, from an IRST sensor, into the system using a receiver and undergo image pre-processing including noise filtering and other pre-processing.
  • the reference image may be received at a time (t) and the previous image may be received at a previous time (t - n).
  • the reference image is input to an adaptive spatial filtering path (further described below in reference to FIG. 5) for detection of an object within the sensor field of view (e.g., impending threat such as launched missiles, etc.).
  • a decision block is reached where it is determined whether the background clutter in the field of view qualifies as high (heavy) clutter in accordance with a predetermined threshold.
  • spot time-differencing processing (spot ChangeIRST) is performed on the reference and search images to reduce stationary detections due to clutter (such as buildings and rocks) and to pass moving detections (such as airborne targets).
  • the confirmation detection from the spot time-difference step may be combined with the detections with low clutter ("no" decision at step 406) from the spatial filtering step (step 404) to produce a detection summation output.
  • extended image processing including classification, identification, and tracking may occur using the summation detection result and the reference image as inputs to initiate and maintain tracking of the detected object.
  • Fig. 5 is a block diagram of an exemplary adaptive IRST image processing system 500 using adaptive spatial filtering in accordance with an embodiment of the present invention.
  • adaptive IRST image processing system 500 may be used for the detection/search scenario illustrated in FIG. 1A to replace the prior art systems 100, 200, 300 shown in FIGs. 1B, 2, 3.
  • a controller 509 may be used to control the operation of the system 500.
  • a reference (current) image frame 502 may be input from an IRST sensor field of view (not shown) to a spatial matching filter 504 using a receiver 507.
  • spatial filter 504 may perform high-pass filtering using a smaller template (incoming pixel frame size for the filter), which enables faster detection by requiring less processing than larger size templates.
  • the filter 504 operates to use a previously detected object (e.g., tank) as the center for the succeeding pixel frame of a limited size (smaller template) which accelerates accurate correlation and detection.
  • spatial filter 504 may subtract the original image from a local mean to function as an anti-mean high-pass filter.
  • a background estimator 506 may estimate the noise of the background clutter of the IRST sensor field of view using the same anti-mean filter 504 or using a different high-pass filter (e.g., the filter of a point spread function), and divide (using divider 508) the filtered image data input by the background noise estimation to produce an output image signal input to a CFAR (constant false alarm rate) detector 510.
  • a CFAR detector allows for setting of one or more detection threshold levels to provide a maximum (tolerable) false alarm rate for the system 500.
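The threshold-setting role of the CFAR detector can be illustrated numerically. Assuming a Gaussian noise model (an assumption made purely for illustration; the patent does not specify the noise statistics), the threshold that yields a desired maximum tolerable false alarm rate follows from inverting the Gaussian tail probability:

```python
import math

def cfar_threshold(sigma, pfa):
    """Illustrative CFAR threshold under an assumed Gaussian noise model:
    find t such that P(noise > t) = pfa, i.e. 0.5*erfc(t/(sigma*sqrt(2))) = pfa."""
    lo, hi = 0.0, 20.0 * sigma
    for _ in range(80):  # bisection; the tail probability is monotone decreasing in t
        mid = 0.5 * (lo + hi)
        if 0.5 * math.erfc(mid / (sigma * math.sqrt(2))) > pfa:
            lo = mid  # tail still too large: raise the threshold
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For example, a tolerable Pfa of about 1.35E-3 corresponds to a threshold of roughly three noise standard deviations.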
  • anti-mean filter 504 with a smaller template may reduce the false alarm rate when the background clutter of the sensor field of view contains high frequency components.
  • the reference image data 502 may be input to a local/regional sigma (standard noise deviation) estimator 512 to help estimate the standard deviation for noise within the background clutter for the field of view.
  • the estimator 512 divides the image data 502 into a plurality of different spatial sub-regions and determines (measures) the SNR and standard noise deviation for each sub-region, including a local region.
  • threshold device 514 may set the SNR threshold levels for each sub-region based on the measurements of the estimator 512.
  • CFAR detector 510 may receive the noise estimation and SNR threshold levels, along with the filtered/divided image data signal output, to determine whether an object is detected (e.g., predetermined threat target) within the sensor field of view and produces a detection output signal 516.
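A rough sketch of the local/regional sigma estimation and per-sub-region thresholding described for estimator 512 and threshold device 514 follows. The block size and the CFAR multiplier `k` are assumed values, not taken from the patent:

```python
import numpy as np

def adaptive_thresholds(image, block=8, k=4.0):
    """Split the frame into block x block sub-regions, estimate each region's
    noise standard deviation, and flag pixels above a per-region threshold.
    `block` and `k` are illustrative parameters."""
    rows, cols = image.shape
    detections = np.zeros_like(image, dtype=bool)
    for r0 in range(0, rows, block):
        for c0 in range(0, cols, block):
            region = image[r0:r0 + block, c0:c0 + block]
            sigma = max(region.std(), 1e-6)       # local noise estimate
            thresh = region.mean() + k * sigma    # adaptive sub-region threshold
            detections[r0:r0 + block, c0:c0 + block] = region > thresh
    return detections
```

Because each sub-region gets its own threshold, a bright pixel in a quiet region can be detected while heavy clutter elsewhere does not flood the detector with false alarms.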
  • image processing may continue using the spot time-differencing system 600 of FIG. 6.
  • Fig. 6 is a block diagram of an exemplary adaptive IRST image processing system 600 using spot time-differencing in accordance with an embodiment of the present invention.
  • As shown in FIG. 6, the reference image 502 and a search (previous) image 601 input to the spatial filter 504 of system 500 may also be input to a high-pass filter/background estimator device 602 for filtering and estimating of the noise level for the background clutter across the plurality of sub-regions within the sensor field of view.
  • the processing of system 600 continues if high clutter is determined (step 406 from FIG. 4) for the particular sub-regions since, advantageously, spot time-differencing will be applied for detection confirmation only in high background clutter sub-regions.
  • the filtered reference and search image data 502, 601 are input to a registrator 604, which registers the pixel data of the input images 502, 601 to properly align images from the same scene (field of view).
  • the registrator 604 compares the input image data 502, 601 with base image data to determine whether spatial transformation of the input image data is necessary for proper alignment with the base image data. Thereafter, a differencer 606 may subtract the search image 601 from the reference image 502 to suppress background clutter, and the output difference image 603 is fed to a CFAR detector 608 to generate a detection output signal 609 indicating whether an object (e.g., predetermined threat target) is detected in the sensor field of view.
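The spot time-differencing step might be sketched as follows, confirming detections only inside small windows ("spots") around prior spatial detections so that stationary clutter cancels while moving targets survive. The window size and the k-sigma confirmation rule are assumptions for illustration:

```python
import numpy as np

def spot_time_difference(reference, search, detections, win=5, k=3.0):
    """Difference the frames only inside small windows around prior spatial
    detections; confirm a detection when the local difference energy stands out.
    `win` and `k` are illustrative parameters."""
    half = win // 2
    diff = reference.astype(float) - search.astype(float)
    noise = max(np.median(np.abs(diff)) / 0.6745, 1e-6)  # robust sigma estimate
    confirmed = []
    for (r, c) in detections:
        r0, r1 = max(r - half, 0), min(r + half + 1, reference.shape[0])
        c0, c1 = max(c - half, 0), min(c + half + 1, reference.shape[1])
        # A moving target leaves a strong residue in the difference chip;
        # stationary clutter (buildings, rocks) largely cancels.
        if np.max(np.abs(diff[r0:r1, c0:c1])) > k * noise:
            confirmed.append((r, c))
    return confirmed
```

Only the detection chips are differenced, so the throughput cost scales with the number of candidate detections rather than the full frame.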
  • FIG. 7 shows an illustration of exemplary background clutter images in accordance with an embodiment of the present invention that may require system 500, 600 for accurate target detection.
  • the images of FIGs. 7A, 7B show background clutter of a mountain view where FIG. 7B shows an image (search image) 704 collected one frame before the image (reference image) 702 in FIG. 7A where the revisit-time between the two images may be approximately 0.33 seconds.
  • FIG. 7C shows the difference image 706 obtained by subtracting the search image 704 of FIG. 7B from the reference image 702 of FIG. 7A.
  • the difference image 706 may produce reduced background clutter (reduced standard noise deviation) to provide a higher probability of detection of the airborne target.
  • FIG. 8 shows an exemplary point spread function (PSF) 800 of the optical IRST system in accordance with an embodiment of the present invention.
  • the PSF is created by considering the incoming airborne target (at a far distance away) as a point radiant (light) source, and mapping the intensity distribution for the received signal at the sensor.
  • the target may be considered a point source (sub-pixel detection).
  • FIG. 8 shows the PSF 800 representing the energy distribution of the IRST sensor (e.g., IR focal plane array - IR FPA) after a point source (e.g., small military fighter) passes through the optical lens of the system.
  • PSF 800 represents the degree of degradation as the light passes through the optical lens of the system since the system optics are not perfect.
  • a PSF with a contrast of 16 counts may be related to an SMF (small military fighter) target signature at a distance of 69 km from the IRST sensor.
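For illustration only, a PSF of this kind may be modelled as a 2-D Gaussian; the width `sigma` is an assumption, and the 16-count peak echoes the SMF signature mentioned above (the patent's actual optics are not specified here):

```python
import numpy as np

def gaussian_psf(size=7, sigma=1.2, peak=16.0):
    """Illustrative PSF: a point source blurred by the optics, modelled as a
    2-D Gaussian, scaled so the peak contrast equals `peak` counts."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return peak * psf / psf.max()
```

A kernel like this could serve both as a visualisation of the energy spread across the focal plane array and as the template for a matched filter.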
  • Fig. 9 is an exemplary illustration of 25 SMF target locations 900 with contrasts of 16 counts randomly (with a uniform distribution) inserted into the reference image 702 of FIG. 7A.
  • Figs. 10-14 show graphs with exemplary IRST sensor performance sensitivity curves for adaptive spatial filtering and spot time-differencing in accordance with an embodiment of the present invention.
  • the comparative receiver operating characteristic (ROC) performance (curve) for each one of the plurality of sensors may be generated using likelihood (probability) functions to represent sensor information during target tracking such as target detections, no detections, measured SNRs, and other sensor information obtained from sensor measurements, observations, or other sensor data outputs.
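ROC generation can be sketched as a threshold sweep over detection scores; the score arrays here stand in for the likelihood functions mentioned above, and all names are illustrative:

```python
import numpy as np

def roc_curve(target_scores, background_scores, thresholds):
    """Sweep a detection threshold and record (Pfa, Pd) pairs from the score
    distributions of targets and background pixels."""
    points = []
    for t in thresholds:
        pd = np.mean(target_scores > t)       # probability of detection
        pfa = np.mean(background_scores > t)  # probability of false alarm
        points.append((pfa, pd))
    return points
```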
  • FIGs. 10a, 10b show the receiver operating characteristic (ROC) curves for system 500 using three different anti-mean filters.
  • the three different filters may be 1 X 3 row, 3 X 3 square, and 3 X 1 column.
  • the row filter may generate the best performance with a Pd (probability of detection) of approximately 95% and a Pfa (probability of false alarm) of approximately 4E-4.
  • FIGs. 11a, 11b show the detection performance for the row filter. As shown in these figures, the row filter generates high Pd and low Pfa.
  • FIGs. 12, 13 show the ROC curves for background clutter of medium urban earth (FIG. 12) and heavy rural earth (FIG. 13).
  • Using a row filter and an SMF (small military fighter) at a range of 69 km, systems 500, 600 reduce Pfa to approximately 5E-4 and increase Pd to approximately 80% as shown in FIG. 12. Also, the false-alarm number is approximately 240/frame for a 724 X 724 FPA (focal plane array). As shown in FIG. 13, systems 500, 600 reduce Pfa to 5.7E-3 and increase Pd to approximately 80%, and the false-alarm number is approximately 3010/frame for a 724 X 724 FPA.
  • FIG. 14 shows the ROC curves 1400 for a medium urban earth background clutter for exemplary embodiments of systems 500, 600.
  • a high Pd of 80% is maintained, and the Pfa is reduced to 3.5E-3 for the adaptive spatial filtering (SP-IRST) of system 500, and further reduced to 9.0E-4 for the spot time-differencing (TD-IRST) that is applied by system 600.
  • a plurality of advantages may be provided by the invention described herein.
  • Using an adaptive spatial filtering and spot time-differencing digital image processing algorithm, faster throughput may be realized for heavy background clutter in the reference image data, as the number of time-differencing correlations is greatly reduced by limiting the time-differencing application to a heavy clutter environment.
  • local and regional noise standard deviation estimation allows adaptive generation of sub-region SNR thresholds to reduce false alarms and increase the probability of detection in the sub-regions of the reference image data. Further advantages, including higher image registration accuracy, may also be realized.

Abstract

A target detection and tracking system provides adaptive image processing for an IRST sensor system. The adaptive image processing includes an adaptive spatial filtering (104) technique that uses high-pass filtering (206) and adaptive thresholding to reduce the false alarm rate in the presence of background clutter containing high spatial frequency components. The adaptive spatial filtering technique may be combined with a spot time-differencing technique that performs time-differencing processing only for areas of detection in high clutter (406) sub-regions based on the adaptive spatial filtering results which maintains a low false alarm rate for light clutter sub-regions.

Description

A METHOD AND SYSTEM FOR ADAPTIVE TARGET DETECTION
Technical Field [0001] The present invention relates generally to image processing. It particularly relates to an image processing target detection system and method that uses adaptive spatial filtering and time-differencing processes to detect and track targets within various background environments.
Background of the Invention [0002] Passive IR (Infrared) sensors are widely used to detect the energy emitted from targets, backgrounds, incoming threats, and the atmosphere for a plurality of applications including military surveillance, missile target and detection systems, crop and forest management, weather forecasting, and other applications. The measures of performance for passive IR sensors include signal-to-noise ratio (S/N), radiation contrast, noise-equivalent temperature difference (NEDT), minimum resolvable temperature difference, and other parameters. These sensors may be designed to enhance one or more of these parameters for optimum performance during a particular application.
[0003] Particularly, one type of passive IR sensor, the IRST sensor (Infrared search and track), locates and tracks objects by capturing the energy emitted within the field of view (FOV) or field of regard (FOR) of the sensor. However, IRST sensors are commonly designed to operate with a small noise-equivalent temperature difference (NEDT) to detect small target-to-background contrast temperatures, and therefore heavy background clutter may strongly hinder accurate target detection and tracking and lead to a higher probability of false alarm (Pfa). Importantly for threat detection applications, it is useful for the IRST sensor to detect, declare, and track airborne targets at a long distance (usually larger than 50 km) before the threat can see the intended target, and therefore the IRST sensor performance may be enhanced using a large instantaneous field of view (e.g., 360-degree hemisphere in azimuth and 50 to 90 degrees in elevation). However, the large number of scene pixels produced by an IRST sensor may require computer-controlled image data processing to separate the large number of false targets from the true targets. As shown in FIG. 1A, a common target detection and tracking scenario for military applications may be a fighter jet 109 attempting to detect and track incoming fighter jets 122 and/or incoming missiles (bombs) 124 that may be enemy-controlled.
[0004] Commonly, the IRST sensor uses two image data processing techniques for target (threat) detection and tracking which include SpatialIRST and ChangeIRST. FIG. 1B illustrates an exemplary SpatialIRST image processing system 100 found in the prior art. During operation, an image 102 input from an IR sensor (not shown) is initially spatially convolved by a matched filter 104 to generate a spatially filtered image output. The matched filter 104 may be generally designed using a well-known system point spread function (PSF) since at a long distance an incoming airborne target may be considered as a point radiant source. A point spread function maps the intensity distribution for the received signal at the sensor generated from the point source of light (airborne target at a long distance). The spatially filtered output may be divided by a local background estimation (provided by an estimator 106) using a divider 108 which provides an output image to a CFAR (constant false alarm rate) detector 110. Use of a CFAR detector allows for setting of one or more detection threshold levels to provide a maximum (tolerable) false alarm rate. The detector 110 provides an output signal 112 indicating detection.
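The SpatialIRST chain just described (matched filtering with the PSF, division by a local background estimate, CFAR thresholding) can be sketched as follows. This is a minimal illustration, not the patented implementation: the window sizes, the use of a local standard deviation as the background estimate, and the median-based threshold are all assumptions.

```python
import numpy as np

def spatial_irst(image, psf, k=5.0, bg_box=9):
    """Sketch of the SpatialIRST chain: matched filter -> background
    normalisation -> CFAR-style threshold. Parameters are illustrative."""
    # Matched filtering: correlate the image with the system PSF.
    pad = psf.shape[0] // 2
    padded = np.pad(image, pad, mode='reflect')
    filtered = np.zeros_like(image, dtype=float)
    for r in range(image.shape[0]):
        for c in range(image.shape[1]):
            filtered[r, c] = np.sum(
                padded[r:r + psf.shape[0], c:c + psf.shape[1]] * psf)
    # Local background estimate: local standard deviation in a bg_box window.
    bpad = bg_box // 2
    bpadded = np.pad(image, bpad, mode='reflect')
    bg = np.ones_like(image, dtype=float)
    for r in range(image.shape[0]):
        for c in range(image.shape[1]):
            bg[r, c] = max(bpadded[r:r + bg_box, c:c + bg_box].std(), 1e-6)
    snr = filtered / bg  # divider output: SNR-like image
    # CFAR-style detection: threshold tied to the scene's median level.
    thresh = k * np.median(np.abs(snr))
    return snr > thresh
```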
[0005] However, SpatialIRST may produce numerous false alarms when the background clutter contains high spatial frequency components. Also, when the background contains both low and heavy clutter sub-regions, traditional SpatialIRST may produce increased false alarms for the heavy clutter sub-regions which also reduces the probability of detection for the low clutter sub-regions. [0006] For light or medium background clutter, generally the SpatialIRST system works well to detect and track targets, but performance suffers with heavy to extremely heavy background clutter (e.g., urban and earth object clutter) leading to a high Pfa. Under these conditions, commonly a ChangeIRST image processing system may be used which employs a temporal time-differencing image processing technique that is useful for moving (e.g., airborne) targets. FIG. 2 illustrates an exemplary ChangeIRST image processing system 200 found in the prior art. During operation, a reference image (current image frame) 202 and a previous image (the search image) 204 are filtered using a high-pass filter 206 and registered pixel-wise using a registering device 208 at a particular revisit time (RT). Pixel registration is a well-known technique to align the received images for the same scene. Commonly, a base image is used as a comparison reference for at least one other (input) image, and the registration process brings the input image into alignment with the base image by applying a spatial transformation to the input image. Using a subtractor 210, the registered search image may be subtracted from the reference image to suppress background clutter, and the output difference image may be fed to a CFAR (constant false alarm rate) detector 212 to generate a detection output signal 214. [0007] Alternatively, another ChangeIRST image processing system 300 found in the prior art may be used, as shown in FIG. 3.
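The ChangeIRST registration-and-subtraction step can be sketched like this. The brute-force integer-shift search is an illustrative stand-in for registering device 208; real systems would use sub-pixel registration:

```python
import numpy as np

def register_and_difference(reference, search, max_shift=3):
    """Find the integer pixel shift that best aligns the search frame to the
    reference (SSD search over a small window), then subtract to suppress
    static background clutter. `max_shift` is an assumed search radius."""
    best = (0, 0)
    best_err = np.inf
    for dr in range(-max_shift, max_shift + 1):
        for dc in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(search, dr, axis=0), dc, axis=1)
            err = np.sum((reference - shifted) ** 2)
            if err < best_err:
                best_err, best = err, (dr, dc)
    aligned = np.roll(np.roll(search, best[0], axis=0), best[1], axis=1)
    return reference - aligned, best
```

After alignment, stationary clutter cancels in the difference image, leaving residues where something moved between revisits.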
During operation of the alternative arrangement 300, an original large image 302 is under-sampled using a sampler 304 into a smaller matrix containing match point elements. These match point elements are registered using registering device 208, and the registration locations are interpolated back to the original space-domain. After interpolation, operation continues similar to FIG. 2 with the subtractor 210 to generate a difference signal input to CFAR detector 212. This alternative ChangeIRST arrangement 300 uses a multi-resolution approach to reduce the throughput (computing load) requirement for the image registration. However, the registration accuracy is decreased.
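The multi-resolution idea of arrangement 300 (register heavily under-sampled images, then map the shift back to the full-resolution grid) might be sketched as below; note how the recovered shift is quantised to multiples of the sampling factor, which is exactly the accuracy loss the text mentions:

```python
import numpy as np

def coarse_register(reference, search, factor=4, max_shift=2):
    """Register under-sampled images, then scale the shift back to the
    original pixel grid. `factor` and `max_shift` are illustrative."""
    ref_small = reference[::factor, ::factor]
    srch_small = search[::factor, ::factor]
    best, best_err = (0, 0), np.inf
    for dr in range(-max_shift, max_shift + 1):
        for dc in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(srch_small, dr, axis=0), dc, axis=1)
            err = np.sum((ref_small - shifted) ** 2)
            if err < best_err:
                best_err, best = err, (dr, dc)
    # Interpolate the coarse shift back to the original space-domain.
    return best[0] * factor, best[1] * factor
```

The coarse search visits far fewer shift candidates and pixels, which is the throughput saving claimed for this arrangement.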
[0008] Therefore, due to the disadvantages of current image processing techniques for IRST sensors, there is a need to provide an adaptive image processing system that provides high probability of detection in various background environments (light, medium, or heavy clutter) while maintaining low probability of false alarm.
Summary of the Invention [0009] The method and system of the present invention overcome the previously mentioned problems by providing a target detection and tracking system capable of providing adaptive image processing for an IRST sensor system. The adaptive image processing includes an adaptive spatial filtering technique that uses high-pass filtering and adaptive thresholding to reduce the false alarm rate in the presence of background clutter containing high spatial frequency components. The adaptive spatial filtering technique may be combined with a spot time-differencing technique that performs time-differencing processing only for areas of detection in high clutter sub-regions based on the adaptive spatial filtering results, which maintains a low false alarm rate for light clutter sub-regions.
Brief Description of the Drawings [00010] Fig. 1A is a block diagram of an exemplary target detection and tracking scenario for military applications found in the prior art; [00011] Fig. 1B is a block diagram of an exemplary target detection image processing system using spatial filtering found in the prior art; [00012] Fig. 2 is a block diagram of an exemplary target detection image processing system using time-differencing found in the prior art;
[00013] Fig. 3 is a block diagram of an exemplary, alternative target detection image processing system using time-differencing found in the prior art;
[00014] Fig. 4 is a flow process diagram of an exemplary adaptive IRST image processing system in accordance with an embodiment of the present invention.
[00015] Fig. 5 is a block diagram of an exemplary adaptive IRST image processing system using adaptive spatial filtering in accordance with an embodiment of the present invention.
[00016] Fig. 6 is a block diagram of an exemplary adaptive IRST image processing system using spot time-differencing in accordance with an embodiment of the present invention.
[00017] Fig. 7 shows an illustration of exemplary background clutter images in accordance with an embodiment of the present invention.
[00018] Fig. 8 shows an exemplary point spread function of the optical IRST system in accordance with an embodiment of the present invention.
[00019] Fig. 9 is an exemplary illustration of target locations in an IRST system in accordance with an embodiment of the present invention.
[00020] Figs. 10-14 show graphs with exemplary IRST sensor performance sensitivity curves for adaptive spatial filtering and spot time-differencing in accordance with an embodiment of the present invention.
Detailed Description
[00021] Fig. 4 is a flow process diagram of an exemplary adaptive IRST image processing system in accordance with an embodiment of the present invention. Advantageously, a controller may be used to control the flow process steps of the IRST imaging system. At step 402, a reference (current) image frame and a search (previous) image frame may be input, from an IRST sensor, into the system using a receiver and undergo image pre-processing, including noise filtering.
[00022] In an exemplary embodiment, the reference image may be received at a time (t) and the previous image may be received at a previous time (t - n). At step 404, the reference image is input to an adaptive spatial filtering path (further described below in reference to FIG. 5) for detection of an object within the sensor field of view (e.g., an impending threat such as a launched missile). At step 406, a decision block is reached where it is determined whether the background clutter in the field of view qualifies as high (heavy) clutter in accordance with a predetermined threshold. If yes, then processing continues at step 408, where spot time-differencing processing (spot ChangeIRST) is performed on the reference and search images to reject stationary detections due to clutter (such as buildings and rocks) and to pass moving detections (such as airborne targets).
[00023] At step 410, the confirmation detections from the spot time-differencing step (step 408) may be combined with the low-clutter detections ("no" decision at step 406) from the spatial filtering step (step 404) to produce a detection summation output. At step 412, extended image processing, including classification, identification, and tracking, may occur using the summation detection result and the reference image as inputs to initiate and maintain tracking of the detected object.
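As an illustration of the routing in steps 404-410, the sketch below passes low-clutter detections straight through and confirms high-clutter detections by time-differencing. The function name, the boolean-array representation of detections, and both threshold values are assumptions for illustration, not the patent's implementation:

```python
import numpy as np

def adaptive_detect(reference, search, spatial_detections,
                    clutter_sigma, clutter_threshold=2.0, diff_threshold=2.0):
    """Route detections per the flow diagram (steps 406-410).

    Low-clutter frames: pass the spatial-filter detections through unchanged.
    High-clutter frames: confirm by spot time-differencing, keeping only
    detections that also appear as moving in the difference image.
    """
    if clutter_sigma <= clutter_threshold:        # step 406, "no" branch
        return spatial_detections
    diff = reference.astype(float) - search       # step 408: time-differencing
    moving = np.abs(diff) > diff_threshold        # stationary clutter cancels
    return spatial_detections & moving            # step 410: confirmed detections
```

In the high-clutter branch, a stationary object present in both frames cancels in the difference image, so only detections with frame-to-frame change survive.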
[00024] Fig. 5 is a block diagram of an exemplary adaptive IRST image processing system 500 using adaptive spatial filtering in accordance with an embodiment of the present invention. Advantageously, adaptive IRST image processing system 500 may be used for the detection/search scenario illustrated in FIG. 1A to replace the prior art systems 100, 200, 300 shown in FIGs. 1B, 2, 3. A controller 509 may be used to control the operation of the system 500. [00025] As shown in Fig. 5, a reference (current) image frame 502 may be input from an IRST sensor field of view (not shown) to a spatial matching filter 504 using a receiver 507. Advantageously, spatial filter 504 may perform high-pass filtering using a smaller template (the incoming pixel frame size for the filter), which enables faster detection by requiring less processing than larger templates. The filter 504 uses a previously detected object (e.g., a tank) as the center for the succeeding pixel frame of limited size (the smaller template), which accelerates accurate correlation and detection. Also, spatial filter 504 may subtract a local mean from the original image to function as an anti-mean high-pass filter.
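A minimal sketch of such an anti-mean high-pass filter, assuming a square template and edge padding (the function name and defaults are illustrative, not taken from the patent):

```python
import numpy as np

def anti_mean_filter(image, template=3):
    """Anti-mean high-pass filter: subtract the local mean from each pixel.

    A small template (e.g. 1x3 or 3x3) keeps the per-pixel cost low, which
    is what makes the smaller-template variant faster.
    """
    pad = template // 2
    padded = np.pad(image.astype(float), pad, mode="edge")
    h, w = image.shape
    local_mean = np.zeros((h, w))
    for dy in range(template):          # accumulate the template window sums
        for dx in range(template):
            local_mean += padded[dy:dy + h, dx:dx + w]
    local_mean /= template * template
    return image - local_mean
```

A flat background maps to zero, while a point-like target stands out above its neighborhood, which is the high-pass behavior the detector relies on.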
[00026] Additionally, a background estimator 506 may estimate the noise of the background clutter of the IRST sensor field of view using the same anti-mean filter 504 or a different high-pass filter (e.g., the filter of a point spread function), and divide (using divider 508) the filtered image data by the background noise estimation to produce an output image signal input to a CFAR (constant false alarm rate) detector 510. Use of a CFAR detector allows one or more detection threshold levels to be set to provide a maximum (tolerable) false alarm rate for the system 500. Advantageously, anti-mean filter 504 with a smaller template may reduce the false alarm rate when the background clutter of the sensor field of view contains high frequency components. [00027] Also, the reference image data 502 may be input to a local/regional sigma (noise standard deviation) estimator 512 to help estimate the standard deviation of the noise within the background clutter for the field of view. The estimator 512 divides the image data 502 into a plurality of different spatial sub-regions and determines (measures) the SNR and noise standard deviation for each sub-region, including a local region. A threshold device 514 may then set the SNR threshold levels for each sub-region based on the measurements of the estimator 512. The CFAR detector 510 may then receive the noise estimation and SNR threshold levels, along with the filtered/divided image data signal output, to determine whether an object (e.g., a predetermined threat target) is detected within the sensor field of view and produce a detection output signal 516. [00028] Following generation of the detection output signal 516, image processing may continue using the spot time-differencing system 600 of FIG. 6. Fig. 6 is a block diagram of an exemplary adaptive IRST image processing system 600 using spot time-differencing in accordance with an embodiment of the present invention.
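The per-sub-region sigma estimation and thresholding can be sketched as below. The 8×8 grid (consistent with the at-least-sixty-four sub-regions mentioned in the claims), the scale factor k, and all names are assumptions; the patent's CFAR detector additionally uses the background-divided image and per-region SNR thresholds:

```python
import numpy as np

def cfar_detect(filtered, n_sub=8, k=5.0):
    """Divide the filtered frame into an n_sub x n_sub grid of sub-regions,
    estimate the noise standard deviation in each, and declare detections
    where the response exceeds k * sigma (k sets the false-alarm rate)."""
    h, w = filtered.shape
    detections = np.zeros((h, w), dtype=bool)
    ys = np.linspace(0, h, n_sub + 1, dtype=int)
    xs = np.linspace(0, w, n_sub + 1, dtype=int)
    for i in range(n_sub):
        for j in range(n_sub):
            block = filtered[ys[i]:ys[i + 1], xs[j]:xs[j + 1]]
            sigma = block.std() + 1e-12        # guard against a constant block
            detections[ys[i]:ys[i + 1], xs[j]:xs[j + 1]] = block > k * sigma
    return detections
```

Because sigma is estimated per sub-region, a heavy-clutter sub-region automatically gets a higher absolute threshold, which is the adaptive behavior described above.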
As shown in FIG. 6, the reference image 502 and a search (previous) image 601 input to the spatial filter 504 of system 500 may also be input to a high-pass filter/background estimator device 602 for filtering and estimating the noise level of the background clutter across the plurality of sub-regions within the sensor field of view. The processing of system 600 continues only if high clutter is determined (step 406 of FIG. 4) for the particular sub-regions, since spot time-differencing is advantageously applied for detection confirmation in only high background clutter sub-regions. Next, the filtered reference and search image data 502, 601 are input to a registrator 604, which registers pixel data for the input image data 502, 601 for proper alignment of images from the same scene (field of view). The registrator 604 compares the input image data 502, 601 with base image data to determine whether spatial transformation of the input image data is necessary for proper alignment with the base image data. Thereafter, a differencer 606 may subtract the search image 601 from the reference image 502 to suppress background clutter, and the output difference image 603 is fed to a CFAR detector 608 to generate a detection output signal 609 indicating whether an object (e.g., a predetermined threat target) is detected in the sensor field of view.
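A simplified sketch of the register-then-difference step, using brute-force integer-shift registration as a stand-in for the registrator (the names, the wrap-around shift model, and the search range are assumptions; real systems use correlation with sub-pixel interpolation):

```python
import numpy as np

def register_shift(reference, search, max_shift=3):
    """Estimate the integer pixel shift that best aligns search to reference
    by exhaustively testing shifts and minimizing the squared difference."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = np.sum((reference - np.roll(search, (dy, dx), axis=(0, 1))) ** 2)
            if err < best_err:
                best_err, best = err, (dy, dx)
    return best

def spot_difference(reference, search):
    """Align the search frame to the reference frame, then subtract, so
    stationary background clutter cancels and moving targets remain."""
    dy, dx = register_shift(reference, search)
    return reference - np.roll(search, (dy, dx), axis=(0, 1))
```

After alignment, stationary clutter subtracts to near zero while a target that moved between frames leaves a residual, which the downstream CFAR detector then thresholds.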
[00029] As shown at step 412 of FIG. 4, extended image processing including classification, identification, and tracking may occur using the spatial filtering processing output, the time-difference detection output, and the original reference image data as inputs to initiate and maintain tracking of the detected object. [00030] Fig. 7 shows an illustration of exemplary background clutter images in accordance with an embodiment of the present invention that may require systems 500, 600 for accurate target detection. The images of FIGs. 7A, 7B show background clutter of a mountain view, where FIG. 7B shows an image (search image) 704 collected one frame before the image (reference image) 702 in FIG. 7A; the revisit time between the two images may be approximately 0.33 seconds. FIG. 7C shows the difference image 706 obtained by subtracting the search image 704 of FIG. 7B from the reference image 702 of FIG. 7A. For this example, the difference image 706 may produce reduced background clutter (reduced noise standard deviation) to provide a higher probability of detection of the airborne target.
[00031] Fig. 8 shows an exemplary point spread function (PSF) 800 of the optical IRST system in accordance with an embodiment of the present invention. The PSF is created by considering the incoming airborne target (at a far distance) as a point radiant (light) source, and mapping the intensity distribution of the received signal at the sensor. For an exemplary embodiment, to detect an SMF (small military fighter) target at a far distance (e.g., greater than 50 km), the target may be considered a point source (sub-pixel detection). FIG. 8 shows the PSF 800 representing the energy distribution of the IRST sensor (e.g., IR focal plane array - IR FPA) after a point source (e.g., a small military fighter) passes through the optical lens of the system. PSF 800 represents the degree of degradation as the light passes through the optical lens of the system, since the system optics are not perfect. For example, a PSF with a contrast of 16 counts (image counts-to-irradiance factor) may be related to an SMF target signature at a distance of 69 km from the IRST sensor. Fig. 9 is an exemplary illustration of 25 SMF target locations 900 with contrasts of 16 counts randomly (with a uniform distribution) inserted into the reference image 702 of FIG. 7A. [00032] Figs. 10-14 show graphs with exemplary IRST performance sensor sensitivity curves for adaptive spatial filtering and spot time-differencing in accordance with an embodiment of the present invention. Relying on predetermined measurements and analysis (e.g., testing and/or computer simulation of sensor operation), comparative receiver operating characteristic (ROC) curves for each sensor in the IRST system 500, 600 (for a multi-sensor system) may be calculated during a predetermined tracking period.
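The sub-pixel target insertion described for Figs. 8 and 9 can be sketched with a Gaussian stand-in for the measured optics PSF; the Gaussian shape, its width, and the function names are assumptions, and an actual system would use the measured PSF of its optics:

```python
import numpy as np

def gaussian_psf(size=5, sigma=1.0):
    """Gaussian stand-in for the optics point spread function, normalized
    so its peak is 1 (peak contrast then maps directly to image counts)."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return psf / psf.max()

def insert_point_target(image, row, col, contrast=16.0, psf=None):
    """Add a point target of the given peak contrast (in counts) to the
    background image, with its energy spread according to the PSF."""
    if psf is None:
        psf = gaussian_psf()
    k = psf.shape[0] // 2
    out = image.astype(float).copy()
    out[row - k:row + k + 1, col - k:col + k + 1] += contrast * psf
    return out
```

Inserting such synthetic targets at random locations into a real clutter image is a standard way to measure Pd/Pfa performance without flight data.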
The ROC performance (curve) for each of the plurality of sensors may be generated using likelihood (probability) functions to represent sensor information during target tracking, such as target detections, missed detections, measured SNRs, and other information obtained from sensor measurements, observations, or other sensor data outputs.
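An empirical ROC point set can be computed by sweeping a detection threshold over responses from target-present and target-absent data; the following sketch uses assumed names and synthetic scores:

```python
import numpy as np

def roc_points(target_scores, noise_scores, thresholds):
    """For each threshold, Pd is the fraction of target responses above it
    and Pfa the fraction of noise-only responses above it; sweeping the
    threshold traces out the ROC curve."""
    pd = np.array([(target_scores > t).mean() for t in thresholds])
    pfa = np.array([(noise_scores > t).mean() for t in thresholds])
    return pfa, pd
```

Raising the threshold moves the operating point toward lower Pfa at the cost of lower Pd, which is the trade-off the figures below quantify.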
[00033] FIGs. 10a, 10b show the receiver operating characteristic (ROC) curves for system 500 using three different anti-mean filters. For example, the three different filters may be 1 X 3 row, 3 X 3 square, and 3 X 1 column. As shown in these figures, the row filter may generate the best performance, with a Pd (probability of detection) of approximately 95% and a Pfa (probability of false alarm) of approximately 4E-4. Also, FIGs. 11a, 11b show the detection performance for the row filter. As shown in these figures, the row filter generates high Pd and low Pfa. [00034] FIGs. 12, 13 show the ROC curves for background clutter of medium urban earth (FIG. 12) and heavy rural earth (FIG. 13) for an exemplary embodiment of systems 500, 600. Using a row filter and an SMF (small military fighter) at a range of 69 km, systems 500, 600 reduce Pfa to approximately 5E-4 and increase Pd to approximately 80%, as shown in FIG. 12. Also, the false-alarm number is approximately 240/frame for a 724 X 724 FPA (focal plane array). As shown in FIG. 13, systems 500, 600 reduce Pfa to 5.7E-3 and increase Pd to approximately 80%, and the false-alarm number is approximately 3010/frame for a 724 X 724 FPA. By introducing a spot time-differencing process (implemented only if a particular clutter threshold is reached at step 406), only several hundred template registrations/correlations need to be performed for a medium clutter background, and only approximately 3,000 for a heavy clutter background. [00035] FIG. 14 shows the ROC curves 1400 for a medium urban earth background clutter for exemplary embodiments of systems 500, 600. As shown in FIG. 14, a high Pd of 80% is maintained, and the Pfa is reduced to 3.5E-3 for the adaptive spatial filtering (SP-IRST) of system 500, and further reduced to 9.0E-4 for the spot time-differencing (TD-IRST) applied by system 600. [00036] A plurality of advantages may be provided by the invention described herein.
Using an adaptive spatial filtering and spot time-differencing digital image processing algorithm, faster throughput may be realized for heavy background clutter in the reference image data, as the number of time-differencing correlations is greatly reduced by limiting the time-differencing application to a heavy clutter environment. Also, local and regional noise standard deviation estimation allows adaptive generation of sub-region SNR thresholds to reduce false alarms and increase probability of detection in the sub-regions of the reference image data. Further advantages may also be realized, including higher image registration accuracy.
[00037] Although the invention is primarily described herein using particular embodiments, it will be appreciated by those skilled in the art that modifications and changes may be made without departing from the spirit and scope of the present invention. As such, the method disclosed herein is not limited to what has been particularly shown and described herein, but rather the scope of the present invention is defined only by the appended claims.

Claims

What Is Claimed Is:
1. A method for detecting and tracking an object, comprising: receiving image data from a sensor field of view; dividing the image data into a plurality of different spatial sub-regions using a predetermined spatial filtering algorithm and determining SNR (signal-to-noise ratio) for each spatial sub-region; and detecting whether at least one object is within said field of view based on comparing each SNR to a predetermined threshold for each sub-region and determining whether one or more predetermined thresholds are satisfied.
2. The method of claim 1, wherein said receiving includes receiving image data output from an infrared search and track sensor having background clutter within the field of view.
3. The method of claim 1, wherein said dividing includes high-pass filtering the image data and dividing the filtered image data by a noise level estimation for background clutter within the field of view.
4. The method of claim 1, wherein said dividing includes high-pass filtering the image data with predetermined filter coefficients and dividing the filtered image data by the noise level estimation wherein the noise estimation being estimated using one of said high-pass filtering with the predetermined coefficients and high-pass filtering with different predetermined coefficients.
5. The method of claim 1, wherein said detecting includes determining said predetermined threshold for each sub-region based on at least one measured standard deviation for noise in background clutter within the field of view.
6. The method of claim 5, wherein said detecting includes determining said predetermined threshold for each sub-region based on the at least one measured standard deviation for noise to reduce probability of false alarm for a sub-region with a noise standard deviation higher than a predetermined threshold or increase probability of detection for a sub-region with a noise standard deviation lower than a predetermined threshold.
7. The method of claim 1, wherein said dividing includes dividing the image data into at least sixty-four sub-regions.
8. The method of claim 1, further comprising: for said at least one object detected within a field of view having background clutter greater than a predetermined threshold, comparing said image data with previously received image data using a high-pass filter and registering pixel data for each filtered set of image data; and detecting whether the previous detecting of said at least one object is accurate using difference of previously received, filtered image data with filtered image data.
9. The method of claim 8, wherein said detecting includes detecting accuracy of previous detection using said difference to reduce probability of false alarm.
10. A system for detecting and tracking an object, comprising: a receiver for receiving image data from a sensor field of view; a controller for dividing the image data into a plurality of different spatial sub-regions using a predetermined spatial filtering algorithm and determining SNR (signal-to-noise ratio) for each spatial sub-region; and a detector for detecting whether at least one object is within said field of view based on comparing each SNR to a predetermined threshold for each sub-region and determining whether one or more predetermined thresholds are satisfied.
11. The system of claim 10, wherein said receiver to receive image data output from an infrared search and track sensor having background clutter within the field of view.
12. The system of claim 10, wherein said controller to high-pass filter the image data and divide the filtered image data by a noise level estimation for background clutter within the field of view.
13. The system of claim 10, wherein said controller to high-pass filter the image data with predetermined filter coefficients and divide the filtered image data by the noise level estimation wherein the noise estimation being estimated using one of said high-pass filtering with the predetermined coefficients and high-pass filtering with different predetermined coefficients.
14. The system of claim 10, wherein said detector to detect the at least one object using said predetermined threshold for each sub-region based on at least one measured standard deviation for noise in background clutter within the field of view.
15. The system of claim 10, wherein for said at least one object detected within a field of view having background clutter greater than a predetermined threshold, said controller to compare said image data with previously received image data using a high-pass filter and register pixel data for each filtered set of image data; and said detector to detect whether the previous detecting of said at least one object is accurate using difference of previously received, filtered image data with filtered image data.
16. A machine-readable medium having stored thereon a plurality of executable instructions, the plurality of instructions comprising instructions to: receive image data from a sensor field of view; divide the image data into a plurality of different spatial sub-regions using a predetermined spatial filtering algorithm and determining SNR (signal-to-noise ratio) for each spatial sub-region; and detect whether an object is within said field of view based on comparing each SNR to a predetermined threshold for each sub-region and determining whether one or more predetermined thresholds are satisfied.
17. The medium of claim 16, wherein said instructions to receive include receiving image data output from an infrared search and track sensor having background clutter within the field of view.
18. The medium of claim 16, wherein said instructions to divide include high-pass filtering the image data and dividing the filtered image data by a noise level estimation for background clutter within the field of view.
19. The medium of claim 16, wherein said instructions to divide include high-pass filtering the image data with predetermined filter coefficients and dividing the filtered image data by the noise level estimation wherein the noise estimation being estimated using one of said high-pass filtering with the predetermined coefficients and high-pass filtering with different predetermined coefficients.
20. The medium of claim 16, wherein said instructions to detect include determining said predetermined threshold for each sub-region based on at least one measured standard deviation for noise in background clutter within the field of view.
EP04816720A 2003-12-31 2004-02-24 A method and system for adaptive target detection Withdrawn EP1704510A4 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US74821203A 2003-12-31 2003-12-31
PCT/US2004/005325 WO2005069197A1 (en) 2003-12-31 2004-02-24 A method and system for adaptive target detection

Publications (2)

Publication Number Publication Date
EP1704510A1 true EP1704510A1 (en) 2006-09-27
EP1704510A4 EP1704510A4 (en) 2009-09-09

Family

ID=34794657

Family Applications (1)

Application Number Title Priority Date Filing Date
EP04816720A Withdrawn EP1704510A4 (en) 2003-12-31 2004-02-24 A method and system for adaptive target detection

Country Status (2)

Country Link
EP (1) EP1704510A4 (en)
WO (1) WO2005069197A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7483551B2 (en) * 2004-02-24 2009-01-27 Lockheed Martin Corporation Method and system for improved unresolved target detection using multiple frame association
FR2975807B1 (en) 2011-05-23 2013-06-28 Sagem Defense Securite DETECTION AND TRACKING OF TARGETS IN A SERIES OF IMAGES
CN103632340B (en) * 2012-08-24 2017-11-24 原相科技股份有限公司 Object tracking device and its operating method
RU2525829C1 (en) * 2013-02-13 2014-08-20 Министерство обороны Российской Федерации Radar method of detecting law of variation of angular velocity of turning of tracked aerial object based on successively received signal reflections with carrier frequency adjustment
CN104766079B (en) * 2015-05-05 2018-12-07 四川九洲电器集团有限责任公司 A kind of remote method for detecting infrared puniness target
RU2678822C2 (en) * 2017-07-27 2019-02-04 Акционерное общество "Всероссийский научно-исследовательский институт радиотехники" Signals filtering method during the target detection and device for its implementation
CN109523575A (en) * 2018-11-12 2019-03-26 南通理工学院 Method for detecting infrared puniness target
CN110660065B (en) * 2019-09-29 2023-10-20 云南电网有限责任公司电力科学研究院 Infrared fault detection and identification algorithm
RU2756291C1 (en) * 2021-01-25 2021-09-29 Федеральное государственное казенное военное образовательное учреждение высшего образования "Военная академия войсковой противовоздушной обороны Вооруженных Сил Российской Федерации имени Маршала Советского Союза А.М. Василевского" Министерства обороны Российской Федерации Method for ensuring high resolution of a radio location apparatus in range by selecting the optimal inverse filter regularisation parameter

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0346985A2 (en) * 1988-06-17 1989-12-20 Philips Electronics Uk Limited Target detection systems
WO1998047102A2 (en) * 1997-04-17 1998-10-22 Raytheon Company Adaptive non-uniformity compensation algorithm
US5960097A (en) * 1997-01-21 1999-09-28 Raytheon Company Background adaptive target detection and tracking with multiple observation and processing stages
US6111975A (en) * 1991-03-22 2000-08-29 Sacks; Jack M. Minimum difference processor

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6192156B1 (en) * 1998-04-03 2001-02-20 Synapix, Inc. Feature tracking using a dense feature array


Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
AL BOVIK (ED) ED - BOVIK A (ED): "Handbook of Image and Video Processing, Passages" 1 January 2000 (2000-01-01), HANDBOOK OF IMAGE AND VIDEO PROCESSING; [COMMUNICATIONS, NETWORKING AND MULTIMEDIA], SAN DIEGO, CA : ACADEMIC PRESS, US, PAGE(S) 71 - 267,687 , XP002507635 * §3.1, 3.2 and 3.8 * *
CASPI Y ET AL: "Spatio-temporal alignment of sequences" IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, IEEE SERVICE CENTER, LOS ALAMITOS, CA, US, vol. 24, no. 11, 1 November 2002 (2002-11-01), pages 1409-1424, XP011095006 ISSN: 0162-8828 *
DATABASE INSPEC [Online] THE INSTITUTION OF ELECTRICAL ENGINEERS, STEVENAGE, GB; 23 August 1983 (1983-08-23), MAHMOODI A B ET AL: "Signal processing consideration for a millimeter wave seeker" XP002535025 Database accession no. 2375094 & MILLIMETER WAVE TECHNOLOGY II 23-24 AUG. 1983 SAN DIEGO, CA, USA, vol. 423, 23 August 1983 (1983-08-23), - 24 August 1983 (1983-08-24) pages 87-99, Proceedings of the SPIE - The International Society for Optical Engineering USA ISSN: 0277-786X *
HAI-WEN CHEN ET AL: "Integrated spatiotemporal multiple sensor fusion system design" PROCEEDINGS OF THE SPIE - THE INTERNATIONAL SOCIETY FOR OPTICAL ENGINEERING, SPIE, PO BOX 10 BELLINGHAM WA 98227-0010 USA, vol. 4731, 5 April 2002 (2002-04-05), pages 204-215, XP007908961 ISSN: 0277-786X *
JAE-SOO CHO ET AL: "Robust centroid target tracker based on new distance features in cluttered image sequences" IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, INFORMATION & SYSTEMS SOCIETY, TOKYO, JP, vol. E83-D, no. 12, 1 December 2000 (2000-12-01), pages 2142-2151, XP007908982 ISSN: 0916-8532 *
PENG JIA-XIONG ET AL: "Infrared background suppression for segmenting and detecting small target" ACTA ELECTRONICA SINICA CHINESE INST. ELECTRON CHINA, vol. 27, no. 12, December 1999 (1999-12), pages 47-51 , 58, XP8108017 ISSN: 0372-2112 *
See also references of WO2005069197A1 *

Also Published As

Publication number Publication date
WO2005069197A1 (en) 2005-07-28
EP1704510A4 (en) 2009-09-09

Similar Documents

Publication Publication Date Title
US7483551B2 (en) Method and system for improved unresolved target detection using multiple frame association
EP0399180B1 (en) Method and apparatus for search and tracking of targets
AU2004269298B2 (en) Target detection improvements using temporal integrations and spatial fusion
US20060132354A1 (en) Method of detecting a target
US20120242864A1 (en) Flash detection and clutter rejection processor
Wang et al. A robust infrared dim target detection method based on template filtering and saliency extraction
EP1704510A1 (en) A method and system for adaptive target detection
Li et al. DIM moving target detection using spatio-temporal anomaly detection for hyperspectral image sequences
US8558891B2 (en) Method of detecting an object in a scene comprising artifacts
EP1515160A1 (en) A target shadow detector for synthetic aperture radar
WO2016005738A1 (en) Method and system for surveillance using synthetic aperture radar images
Schwering et al. EO system concepts in the littoral
Diani et al. Joint striping noise removal and background clutter cancellation in IR naval surveillance systems
Chen et al. An automated data exploitation system for airborne sensors
Davey et al. Track before detect for space situation awareness
Chen et al. Robust extended target detection using nonlinear morphological operations
US7202940B1 (en) Method for detection of an object using correlation filters with score reaffirmation post processing
Kemper Jr et al. Imaging infrared seeker signal processing overview: image processing, adaptive thresholding, and track processing
Kim et al. Adjacent infrared multitarget detection using robust background estimation
Raji et al. Analgorithmic Framework for Automatic Detection and Tracking Moving Point Targets in IR Image Sequences.
Qiu et al. Amplitude-aided CPHD filter for multitarget tracking in infrared images
Wu et al. An Infrared Target Images Recognition and Processing Method Based on the Fuzzy Comprehensive Evaluation
Chen et al. Image domain moving target tracking with advanced image registration and time-differencing techniques
Kwon et al. Multisensor target detection using adaptive feature-based fusion
Lotspeich et al. Tracking Subpixel Targets with Critically Sampled Optics

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20060712

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR

DAX Request for extension of the european patent (deleted)
RIC1 Information provided on ipc code assigned before grant

Ipc: G06K 9/00 20060101AFI20050802BHEP

Ipc: G06T 5/20 20060101ALI20090722BHEP

A4 Supplementary search report drawn up and despatched

Effective date: 20090806

17Q First examination report despatched

Effective date: 20091207

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20150926