CN113034533B - Infrared small target detection method based on space-time stationarity - Google Patents

Infrared small target detection method based on space-time stationarity

Info

Publication number
CN113034533B
CN113034533B (application CN202110376498.1A)
Authority
CN
China
Prior art keywords
image
result
space
target
stationarity
Prior art date
Legal status
Active
Application number
CN202110376498.1A
Other languages
Chinese (zh)
Other versions
CN113034533A (en)
Inventor
刘洋
彭真明
代汶罡
邹睿颖
王远博
Current Assignee
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN202110376498.1A
Publication of CN113034533A
Application granted
Publication of CN113034533B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/136: Segmentation; Edge detection involving thresholding
    • G06T 3/00: Geometric image transformations in the plane of the image
    • G06T 3/40: Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4007: Scaling based on interpolation, e.g. bilinear interpolation
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/70: Denoising; Smoothing
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10048: Infrared image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an infrared small target detection method based on space-time stationarity, relating to the technical field of infrared image processing and target detection. Exploiting the spatial and temporal stationarity of the background component in infrared small target images, the method computes the time average of several frames preceding the current frame, subtracts this average from the current frame, applies threshold segmentation, filters impulse noise from the segmentation result, and performs a morphological closing operation to obtain a time stationarity target response diagram. Fast edge-preserving filtering of the current frame yields a space stationarity target response diagram. The two response diagrams are fused by an AND operation, and connected-domain analysis of the fusion result gives an estimate of the target centroid position, which is finally marked in the current frame as the detection result. The method offers good real-time performance and detection accuracy, and is robust for detecting infrared small targets under complex conditions such as radiation intensity changes, target scale changes, and background clutter interference.

Description

Infrared small target detection method based on space-time stationarity
Technical Field
The invention relates to the technical field of infrared image processing and target detection, in particular to an infrared small target detection method based on space-time stationarity.
Background
Infrared target detection is a key technology in infrared detection systems. It is of great significance and value for early warning, sea defense, and air defense systems in the field of military reconnaissance, and is also widely applied in civil fields such as medical imaging, robotics, autonomous driving, and traffic management. However, because the infrared imaging system is usually far from the target, the target occupies only a few to dozens of pixels in the image, and because infrared radiation is absorbed and scattered by the atmosphere, the contrast and signal-to-noise ratio of the infrared target are often low; the target is therefore easily submerged in noise and background clutter, which makes detection difficult. Here the signal-to-noise ratio refers to the ratio of the peak-to-peak value of the output signal of the camera's luminance channel to the effective value of the video clutter under standard illumination.
Existing infrared small target detection techniques are typically based on modeling of infrared images. An infrared image is generally considered to consist of three components: a target signal, a background signal, and a noise signal. The target signal usually has blurred edges and lacks effective features for description, but its local contrast is relatively high and it is weakly correlated with the background; the background signal is essentially stationary in space and time, exhibits slowly varying low-frequency behavior, and is highly correlated with the surrounding background; the noise signal is generally randomly distributed in time and space, independent of the background image, and uncorrelated between frames.
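In symbols, this additive model is often written as f(X, Y) = f_T(X, Y) + f_B(X, Y) + f_N(X, Y), where f_T, f_B and f_N denote the target, background and noise components, respectively (the notation here is chosen for illustration and is not taken verbatim from the patent).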
In the prior art, infrared small target detection techniques mainly fall into two categories: track before detect (TBD) and detect before track (DBT). A TBD algorithm generates a number of candidate tracks by tracking the small target, gradually eliminates false tracks according to certain criteria, and finally obtains an estimate of the target track and position; a DBT algorithm first estimates and suppresses the background to minimize the influence of background clutter, and then separates the target from the background and noise by threshold segmentation.
The above two kinds of methods have disadvantages: existing DBT algorithms have poor robustness and high false alarm rates in complex environments such as uneven radiation and background clutter interference, while TBD algorithms usually involve a large amount of computation and have poor real-time performance.
Disclosure of Invention
The invention provides an infrared small target detection method based on space-time stationarity, aiming at solving the problems of poor real-time performance and high false alarm rate of existing infrared small target detection algorithms in complex environments such as uneven radiation and background clutter interference. Its goals are to improve the signal-to-clutter ratio of the target, to realize real-time detection of infrared small targets, and to improve detection robustness and accuracy.
In order to achieve this purpose, the invention adopts the following technical scheme:
An infrared small target detection method based on space-time stationarity comprises the following steps:
step A: acquiring an image sequence for infrared small target detection, reading the nth frame image, preprocessing the nth frame image, and graying it to obtain a processing result;
step B: acquiring the k frame images before the nth frame, calculating the time average of these k frame images, and subtracting the calculated time average result from the result of step A to obtain a difference result;
step C: performing threshold segmentation on the difference result of step B and filtering out impulse noise to obtain a filtering result, then performing a morphological closing operation on the filtering result, and taking the obtained result as a time stationarity target response diagram;
step D: forming a plane-gray three-dimensional space by taking the gray value as an intensity dimension, mapping the result of step A to the plane-gray three-dimensional space, and performing average down-sampling to obtain a down-sampling result;
step E: performing three-dimensional Gaussian filtering on the down-sampling result of step D, and performing linear up-sampling on the filtered result to obtain an up-sampling result;
step F: carrying out normalization processing on the up-sampling result, and then carrying out threshold segmentation to obtain a space stationarity target response graph;
step G: performing an AND operation on the time stationarity target response graph and the space stationarity target response graph to obtain a space-time stationarity target response graph;
step H: carrying out connected domain analysis on the space-time stationarity target response graph to obtain an estimate of the target centroid position.
Preferably, the step B includes:
step B.1: defining the gray value of the pixel located at (X, Y) in the nth frame as f_n(X, Y), the average gray value at (X, Y) over the k frames before the nth frame is calculated as:
f̄(X, Y) = (1/k) · Σ_{i=n-k}^{n-1} f_i(X, Y)
step B.2: the pixel-wise difference image Diff between the current nth frame and the mean of the previous k frames is then calculated as:
Diff(X, Y) = f_n(X, Y) - f̄(X, Y)
preferably, the specific steps of step C are:
step C.1: carrying out threshold segmentation on the difference image Diff:
step C.1.1: calculating the maximum pixel value in the difference image Diff, μ = max(Diff(X, Y));
step C.1.2: then carrying out threshold segmentation on the difference image Diff according to the formula:
g(X, Y) = 1 if Diff(X, Y) ≥ α·μ, and g(X, Y) = 0 otherwise
where α is a constant and μ represents the largest pixel value in the difference image Diff; the threshold-segmented image g is thus obtained;
step C.2: filtering impulse noise from the obtained image g:
step C.2.1: a window of size c × c is specified and slid over the image g from top to bottom and from left to right; for each pixel value g(X, Y), the window centered on that pixel is taken, the pixel values within the window are sorted, and g(X, Y) is replaced by their median; after every pixel of g has been processed in this way, the image result h is obtained;
step C.3: performing a morphological closing operation on the image result h:
step C.3.1: a rectangular structuring element se of size 3 × 3 is selected; with ⊖ denoting the erosion operator and ⊕ denoting the dilation operator, the erosion and dilation of the image result h by se are:
(h ⊖ se)(X, Y) = min{ h(X + m, Y + n) - se(m, n) | (X + m, Y + n) ∈ D_h, (m, n) ∈ D_se }
(h ⊕ se)(X, Y) = max{ h(X - m, Y - n) + se(m, n) | (X - m, Y - n) ∈ D_h, (m, n) ∈ D_se }
where D_h and D_se are the domains of the image result h and of the rectangular structuring element se, respectively;
step C.3.2: the closing of the image result h by the rectangular structuring element se gives the time stationarity target response diagram T:
T = h • se = (h ⊕ se) ⊖ se
preferably, the specific steps of step D are:
step D.1: mapping the current frame image into the plane-gray three-dimensional space to form a three-dimensional homogeneous vector (wi, w), where:
i(X, Y, I) = f_n(X, Y)
w(X, Y, I) = δ(I - f_n(X, Y))
step D.2: calculating the minimum pixel value in the current frame image, I_min = min(f_n(X, Y));
step D.3: transforming the coordinate (X, Y, f_n(X, Y)) into its down-sampled form (x, y, ζ):
x = [X / s_s], y = [Y / s_s], ζ = [(f_n(X, Y) - I_min) / s_r]
where [·] is the rounding operator, s_s is the sampling rate of the spatial dimensions, s_r is the sampling rate of the intensity dimension, and ζ is the down-sampled intensity coordinate;
step D.4: the average down-sampled result of the vector (wi, w) is denoted (w_d i_d, w_d); the vector (w_d i_d, w_d) is initialized to the zero vector and then updated by:
(w_d i_d(x, y, ζ), w_d(x, y, ζ)) = (w i(X, Y, I), w(X, Y, I))
where the coordinate relationship between (x, y, ζ) and (X, Y, I) is given by the formula in step D.3.
Preferably, the specific steps of step E are:
step E.1: the vector (w_d i_d, w_d) is convolved with a three-dimensional Gaussian kernel K, with spatial standard deviation σ_s and intensity standard deviation σ_r, to obtain the vector (w_g i_g, w_g):
(w_g i_g, w_g) = K * (w_d i_d, w_d)
where a coordinate system is established with the geometric center of the Gaussian kernel as the origin, and the value of the kernel element at position (m, n, ε) is:
K(m, n, ε) = exp( -(m^2 + n^2) / (2σ_s^2) - ε^2 / (2σ_r^2) )
step E.2: linear interpolation is performed on the vector (w_g i_g, w_g) to obtain the up-sampled original-size result vector (W_b I_b, W_b).
Preferably, the specific steps of step F are:
step F.1: the result vector (W_b I_b, W_b) is normalized to obtain I_b, and the result map f_b is then obtained, according to:
I_b(X, Y, I) = W_b I_b(X, Y, I) / W_b(X, Y, I)
f_b(X, Y) = I_b(X, Y, I)
step F.2: threshold segmentation is performed on the result map f_b:
step F.2.1: the maximum pixel value in the result map f_b is calculated, μ_b = max(f_b(X, Y));
step F.2.2: f_b is thresholded to obtain the threshold-segmented image g_b:
g_b(X, Y) = 1 if f_b(X, Y) ≥ β·μ_b, and g_b(X, Y) = 0 otherwise
where β is a constant and μ_b represents the largest pixel value in the result map f_b.
Preferably, the specific steps of step G are:
step G.1: an AND operation is performed on the image g and the image g_b to obtain the space-time stationarity target response graph G:
G(X, Y) = g(X, Y) ∧ g_b(X, Y)
Preferably, the specific steps of step H are: connected domain analysis is performed on the space-time stationarity target response graph G, and the centroid position of each connected region is taken as the centroid position of a candidate target to obtain the detection result.
In summary, due to the adoption of the technical scheme, the invention has the beneficial effects that:
1. The invention comprehensively considers the space-time stationarity characteristics of the three signals (target, background and noise) in an infrared image and combines single-frame and multi-frame information, effectively suppressing background influence and noise interference and significantly improving the detection capability for dim and small infrared targets. It provides a feasible way to address the poor real-time performance, low robustness in complex scenes, and low detection accuracy under heavy noise and low signal-to-noise ratio that affect multi-frame-based infrared small target detection.
2. Aiming at the problem that a large number of false targets arise in infrared small target detection, the method uses an AND operation to select only candidate targets supported by both the time stationarity response diagram and the space stationarity response diagram, thereby significantly reducing the false alarm rate and enhancing real-time detection. Small targets can thus be detected robustly, efficiently and accurately in various complex scenes.
Drawings
FIG. 1 is a schematic flow diagram of the present invention;
FIG. 2 is an original image for infrared small target detection in embodiment 1 of the present invention;
FIG. 3 is the time response diagram of infrared small target detection in embodiment 1 of the present invention;
FIG. 4 is the spatial response diagram of infrared small target detection in embodiment 1 of the present invention;
FIG. 5 is the space-time fusion response diagram of infrared small target detection in embodiment 1 of the present invention;
FIG. 6 is the detection result diagram of infrared small target detection in embodiment 1 of the present invention.
Detailed Description
All of the features disclosed in this specification, or all of the steps in any method or process so disclosed, may be combined in any combination, except combinations of features and/or steps that are mutually exclusive.
The invention will be further described with reference to the accompanying drawings and specific embodiments.
Example 1:
Step A: acquiring an image sequence for infrared small target detection, reading the nth frame image, preprocessing the nth frame image, and graying it to obtain a processing result;
The image preprocessing mainly concerns the image data type: a color image is converted into a grayscale image, and the pixel values are converted to double precision.
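For illustration only, the preprocessing of step A might be sketched in Python with NumPy as follows; the luminance weights and the division by 255 for 8-bit input are assumptions of this sketch, not requirements of the patent.

    import numpy as np

    def preprocess(frame):
        """Step A sketch: gray the frame and convert it to double precision."""
        frame = np.asarray(frame, dtype=np.float64)
        if frame.ndim == 3:
            # Color image -> grayscale using the usual RGB luminance weights (assumption).
            frame = frame @ np.array([0.299, 0.587, 0.114])
        if frame.max() > 1.0:
            # im2double-style scaling for 8-bit input (assumption).
            frame = frame / 255.0
        return frame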
Step B: judging whether the frame number n is greater than k, with k taken as 5. If not, the time stationarity response diagram is initialized as an all-ones matrix, i.e., every pixel of the time stationarity response diagram is assigned the value 1; if n is greater than k, the k frame images before the nth frame are acquired, their time average is calculated, and the difference between the calculated time average and the result of step A is computed to obtain a difference result. Step B comprises the following steps:
Step B.1: defining the gray value of the pixel located at (X, Y) in the nth frame as f_n(X, Y), the average gray value at (X, Y) over the k frames before the nth frame is calculated as:
f̄(X, Y) = (1/k) · Σ_{i=n-k}^{n-1} f_i(X, Y)
Step B.2: the pixel-wise difference image Diff between the current nth frame and the mean of the previous k frames is then calculated as:
Diff(X, Y) = f_n(X, Y) - f̄(X, Y)
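A minimal NumPy sketch of step B, under the assumption that frames is a preprocessed (num_frames, H, W) float array; the function name temporal_difference is illustrative, not part of the patent.

    import numpy as np

    def temporal_difference(frames, n, k=5):
        """Step B sketch: subtract the mean of the k frames before frame n from frame n.
        For n <= k the embodiment instead initializes the time stationarity
        response diagram to an all-ones matrix."""
        mean_prev = frames[n - k:n].mean(axis=0)   # time average of the previous k frames
        return frames[n] - mean_prev               # difference image Diff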
Step C: threshold segmentation is carried out on the difference result of step B and impulse noise is filtered out to obtain a filtering result; a morphological closing operation is then carried out on the filtering result, and the result is used as the time stationarity target response diagram. The concrete steps of step C are as follows:
Step C.1: carrying out threshold segmentation on the difference image Diff:
Step C.1.1: calculating the maximum pixel value in the difference image Diff, μ = max(Diff(X, Y));
Step C.1.2: then carrying out threshold segmentation on the difference image Diff according to the formula:
g(X, Y) = 1 if Diff(X, Y) ≥ α·μ, and g(X, Y) = 0 otherwise
where α is a constant taken as 0.05, and μ represents the maximum pixel value in the difference image Diff; the threshold-segmented image g is thus obtained;
Step C.2: filtering impulse noise from the obtained image g:
Step C.2.1: a window of size c × c is specified, with c set to 3, and slid over the image g from top to bottom and from left to right; for each pixel value g(X, Y), the window centered on that pixel is taken, the pixel values within the window are sorted, and g(X, Y) is replaced by their median; after every pixel of g has been processed in this way, the image result h is obtained;
Step C.3: performing a morphological closing operation on the image result h:
Step C.3.1: a rectangular structuring element se of size 3 × 3 is selected; with ⊖ denoting the erosion operator and ⊕ denoting the dilation operator, the erosion and dilation of the image result h by se are:
(h ⊖ se)(X, Y) = min{ h(X + m, Y + n) - se(m, n) | (X + m, Y + n) ∈ D_h, (m, n) ∈ D_se }
(h ⊕ se)(X, Y) = max{ h(X - m, Y - n) + se(m, n) | (X - m, Y - n) ∈ D_h, (m, n) ∈ D_se }
where D_h and D_se are the domains of the image result h and of the rectangular structuring element se, respectively;
Step C.3.2: the closing of the image result h by the rectangular structuring element se gives the time stationarity target response diagram T:
T = h • se = (h ⊕ se) ⊖ se
The closing operation with the rectangular structuring element se is used to fill gaps. A 3 × 3 rectangle is chosen because it achieves a good effect without excessive computation; a structuring element that is too large would cause excessive adhesion between regions.
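The whole of step C (thresholding at α times the maximum, c × c median filtering, 3 × 3 closing) could be sketched with SciPy as below; this is one possible realization under the embodiment's parameter values, not a definitive implementation.

    import numpy as np
    from scipy import ndimage

    def temporal_response(diff, alpha=0.05, c=3):
        """Step C sketch: threshold Diff, remove impulse noise, then close."""
        g = (diff >= alpha * diff.max()).astype(np.uint8)   # step C.1: threshold segmentation
        h = ndimage.median_filter(g, size=c)                # step C.2: c x c median filter
        se = np.ones((3, 3), dtype=bool)                    # step C.3: 3 x 3 rectangular element
        T = ndimage.binary_closing(h, structure=se)         # closing = dilation then erosion
        return T.astype(np.uint8)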
Step D: forming a plane-gray three-dimensional space by taking the gray value as an intensity dimension, mapping the result of step A to the plane-gray three-dimensional space, and performing average down-sampling to obtain a down-sampling result; the specific steps of step D are as follows:
Step D.1: mapping the current frame image into the plane-gray three-dimensional space to form a three-dimensional homogeneous vector (wi, w), where:
i(X, Y, I) = f_n(X, Y)
w(X, Y, I) = δ(I - f_n(X, Y))
Step D.2: calculating the minimum pixel value in the current frame image, I_min = min(f_n(X, Y));
Step D.3: transforming the coordinate (X, Y, f_n(X, Y)) into its down-sampled form (x, y, ζ):
x = [X / s_s], y = [Y / s_s], ζ = [(f_n(X, Y) - I_min) / s_r]
where [·] is the rounding operator, s_s is the sampling rate of the spatial dimensions, taken as 350, s_r is the sampling rate of the intensity dimension, taken as 0.05, and ζ is the down-sampled intensity coordinate;
Step D.4: the average down-sampled result of the vector (wi, w) is denoted (w_d i_d, w_d); the vector (w_d i_d, w_d) is initialized to the zero vector and then updated by:
(w_d i_d(x, y, ζ), w_d(x, y, ζ)) = (w i(X, Y, I), w(X, Y, I))
where the coordinate relationship between (x, y, ζ) and (X, Y, I) is given by the formula in step D.3.
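Step D is akin to building a coarse grid of homogeneous (weighted intensity, weight) values over the plane-gray space. A NumPy sketch follows; the defaults s_s = 16 and s_r = 0.05 are illustrative choices of this sketch and assume pixel values normalized to [0, 1], whereas the embodiment quotes s_s = 350 and s_r = 0.05.

    import numpy as np

    def build_grid(f_n, s_s=16, s_r=0.05):
        """Step D sketch: average down-sampling into the plane-gray 3-D space."""
        H, W = f_n.shape
        i_min = f_n.min()
        X, Y = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
        x = np.rint(X / s_s).astype(int)                    # spatial down-sampling
        y = np.rint(Y / s_s).astype(int)
        zeta = np.rint((f_n - i_min) / s_r).astype(int)     # intensity down-sampling
        grid_wi = np.zeros((x.max() + 1, y.max() + 1, zeta.max() + 1))
        grid_w = np.zeros_like(grid_wi)
        np.add.at(grid_wi, (x, y, zeta), f_n)               # accumulate w * i
        np.add.at(grid_w, (x, y, zeta), 1.0)                # accumulate w
        return grid_wi, grid_w, i_min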
Step E: performing three-dimensional Gaussian filtering on the down-sampling result of step D, and performing linear up-sampling on the filtered result to obtain an up-sampling result; the concrete steps of step E are as follows:
Step E.1: the vector (w_d i_d, w_d) is convolved with a three-dimensional Gaussian kernel K, with spatial standard deviation σ_s taken as 350 and intensity standard deviation σ_r taken as 0.05, to obtain the vector (w_g i_g, w_g):
(w_g i_g, w_g) = K * (w_d i_d, w_d)
where a coordinate system is established with the geometric center of the Gaussian kernel as the origin, and the value of the kernel element at position (m, n, ε) is:
K(m, n, ε) = exp( -(m^2 + n^2) / (2σ_s^2) - ε^2 / (2σ_r^2) )
Step E.2: linear interpolation is performed on the vector (w_g i_g, w_g) to obtain the up-sampled original-size result vector (W_b I_b, W_b).
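A SciPy sketch of step E, using gaussian_filter for the 3-D Gaussian smoothing and map_coordinates (order 1) for the linear up-sampling; the sigma values below are illustrative defaults expressed in grid cells, whereas the embodiment quotes σ_s = 350 and σ_r = 0.05 in image units.

    import numpy as np
    from scipy import ndimage

    def filter_and_upsample(grid_wi, grid_w, f_n, i_min, s_s=16, s_r=0.05,
                            sigma_s=2.0, sigma_r=1.0):
        """Step E sketch: 3-D Gaussian filtering, then linear up-sampling."""
        sig = (sigma_s, sigma_s, sigma_r)
        wi_g = ndimage.gaussian_filter(grid_wi, sigma=sig)   # smooth w * i
        w_g = ndimage.gaussian_filter(grid_w, sigma=sig)     # smooth w
        H, W = f_n.shape
        X, Y = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
        coords = np.stack([X / s_s, Y / s_s, (f_n - i_min) / s_r])
        # Trilinear (order=1) interpolation back to the original resolution.
        WbIb = ndimage.map_coordinates(wi_g, coords, order=1, mode="nearest")
        Wb = ndimage.map_coordinates(w_g, coords, order=1, mode="nearest")
        return WbIb, Wb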
Step F: carrying out normalization processing on the up-sampling result, and then carrying out threshold segmentation to obtain a space stationarity target response graph; the specific steps of step F are as follows:
Step F.1: the result vector (W_b I_b, W_b) is normalized to obtain I_b, and the result map f_b is then obtained, according to:
I_b(X, Y, I) = W_b I_b(X, Y, I) / W_b(X, Y, I)
f_b(X, Y) = I_b(X, Y, I)
Step F.2: threshold segmentation is performed on the result map f_b:
Step F.2.1: the maximum pixel value in the result map f_b is calculated, μ_b = max(f_b(X, Y));
Step F.2.2: f_b is thresholded to obtain the threshold-segmented image g_b:
g_b(X, Y) = 1 if f_b(X, Y) ≥ β·μ_b, and g_b(X, Y) = 0 otherwise
where β is a constant taken as 0.72, and μ_b represents the largest pixel value in the result map f_b.
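Step F then reduces to a normalization followed by a second threshold; a minimal sketch with β = 0.72 as in the embodiment (the small guard against division by zero is an addition of this sketch) is:

    import numpy as np

    def spatial_response(WbIb, Wb, beta=0.72):
        """Step F sketch: normalize the up-sampled result and threshold it."""
        f_b = WbIb / np.maximum(Wb, 1e-12)                  # I_b = (W_b I_b) / W_b
        g_b = (f_b >= beta * f_b.max()).astype(np.uint8)    # space stationarity response
        return g_b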
Step G: performing AND operation on the time stationarity target response graph and the space stationarity target response graph to obtain a space-time stationarity target response graph; the specific steps of the step G are as follows:
Step G.1: an AND operation is performed on the image g and the image g_b to obtain the space-time stationarity target response graph G:
G(X, Y) = g(X, Y) ∧ g_b(X, Y)
Step H: connected domain analysis is performed on the space-time stationarity target response graph G, and the centroid position of each connected region is taken as the centroid position of a candidate target to obtain the detection result.
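Finally, steps G and H fuse the two response diagrams and extract centroids. The sketch below uses SciPy's connected-component labelling, together with an illustrative driver that chains the hypothetical helper functions introduced in the earlier sketches.

    import numpy as np
    from scipy import ndimage

    def fuse_and_locate(T, g_b):
        """Steps G-H sketch: AND fusion, connected-domain analysis, centroids."""
        G = np.logical_and(T > 0, g_b > 0).astype(float)    # space-time stationarity response
        labels, num = ndimage.label(G)                      # connected domains
        return ndimage.center_of_mass(G, labels, list(range(1, num + 1)))

    def detect_frame(frames, n, k=5, s_s=16, s_r=0.05):
        """Illustrative end-to-end use for frame n of a preprocessed sequence (n > k)."""
        f_n = frames[n]
        T = temporal_response(temporal_difference(frames, n, k))
        grid_wi, grid_w, i_min = build_grid(f_n, s_s, s_r)
        WbIb, Wb = filter_and_upsample(grid_wi, grid_w, f_n, i_min, s_s, s_r)
        return fuse_and_locate(T, spatial_response(WbIb, Wb))

In this sketch, detect_frame returns a list of (row, column) centroid coordinates for frame n, corresponding to the candidate target positions obtained in step H.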
The above are merely representative examples of the many specific applications of the present invention, and do not limit the scope of the invention in any way. All the technical solutions formed by the transformation or the equivalent substitution fall within the protection scope of the present invention.

Claims (8)

1. An infrared small target detection method based on space-time stationarity is characterized by comprising the following steps:
step A: acquiring an image sequence for infrared small target detection, reading the nth frame image, preprocessing the nth frame image, and graying it to obtain a processing result;
step B: reading the k frame images before the nth frame, calculating the time average of the k frame images before the nth frame, and subtracting the calculated time average result from the processing result of step A to obtain a difference result;
step C: performing threshold segmentation on the difference result of step B and filtering out impulse noise to obtain a filtering result, then performing a morphological closing operation on the filtering result, and taking the obtained result as a time stationarity target response diagram;
step D: forming a plane-gray three-dimensional space by taking the gray value as an intensity dimension, mapping the processing result of step A to the plane-gray three-dimensional space, and performing average down-sampling to obtain a down-sampling result;
step E: performing three-dimensional Gaussian filtering on the down-sampling result of step D, and performing linear up-sampling on the filtered result to obtain an up-sampling result;
step F: carrying out normalization processing on the up-sampling result, and then carrying out threshold segmentation to obtain a space stationarity target response diagram;
step G: performing an AND operation on the time stationarity target response diagram obtained in step C and the space stationarity target response diagram obtained in step F to obtain a space-time stationarity target response diagram;
step H: carrying out connected domain analysis on the space-time stationarity target response diagram to obtain the target centroid position.
2. A method for detecting infrared small targets based on space-time stationarity according to claim 1, wherein step B comprises:
step B.1: defining the gray value of the pixel located at (X, Y) in the nth frame as f_n(X, Y), the average gray value at (X, Y) over the k frames before the nth frame is calculated as:
f̄(X, Y) = (1/k) · Σ_{i=n-k}^{n-1} f_i(X, Y)
step B.2: the pixel-wise difference image Diff between the current nth frame and the mean of the previous k frames is then calculated as:
Diff(X, Y) = f_n(X, Y) - f̄(X, Y)
3. A method for detecting small infrared targets based on space-time stationarity according to claim 2, characterized in that the specific steps of step C are:
step C.1: carrying out threshold segmentation on the difference image Diff:
step C.1.1: calculating the maximum pixel value in the difference image Diff, μ = max(Diff(X, Y));
step C.1.2: then carrying out threshold segmentation on the difference image Diff according to the formula:
g(X, Y) = 1 if Diff(X, Y) ≥ α·μ, and g(X, Y) = 0 otherwise
where α is a constant and μ represents the largest pixel value in the difference image Diff; the threshold-segmented image g is thus obtained;
step C.2: filtering impulse noise from the obtained image g:
step C.2.1: a window of size c × c is specified, where c ranges from 3 to 10, and slid over the image g from top to bottom and from left to right; for each pixel value g(X, Y), the window centered on that pixel is taken, the pixel values within the window are sorted, and g(X, Y) is replaced by their median; after every pixel of g has been processed in this way, the image result h is obtained;
step C.3: performing a morphological closing operation on the image result h:
step C.3.1: a rectangular structuring element se of size 3 × 3 is selected; with ⊖ denoting the erosion operator and ⊕ denoting the dilation operator, the erosion and dilation of the image result h by se are:
(h ⊖ se)(X, Y) = min{ h(X + m, Y + n) - se(m, n) | (X + m, Y + n) ∈ D_h, (m, n) ∈ D_se }
(h ⊕ se)(X, Y) = max{ h(X - m, Y - n) + se(m, n) | (X - m, Y - n) ∈ D_h, (m, n) ∈ D_se }
where D_h and D_se are the domains of the image result h and of the rectangular structuring element se, respectively;
step C.3.2: the closing of the image result h by the rectangular structuring element se gives the time stationarity target response diagram T:
T = h • se = (h ⊕ se) ⊖ se
4. A method for detecting a small infrared target based on space-time stationarity as claimed in claim 1, wherein step D comprises the following steps:
step D.1: mapping the current frame image into the plane-gray three-dimensional space to form a three-dimensional homogeneous vector (wi, w), where:
i(X, Y, I) = f_n(X, Y)
w(X, Y, I) = δ(I - f_n(X, Y))
step D.2: calculating the minimum pixel value in the current frame image, I_min = min(f_n(X, Y));
step D.3: transforming the coordinate (X, Y, f_n(X, Y)) into its down-sampled form (x, y, ζ):
x = [X / s_s], y = [Y / s_s], ζ = [(f_n(X, Y) - I_min) / s_r]
where [·] is the rounding operator, s_s is the sampling rate of the spatial dimensions, s_r is the sampling rate of the intensity dimension, and ζ is the down-sampled intensity coordinate;
step D.4: the average down-sampled result of the vector (wi, w) is denoted (w_d i_d, w_d); the vector (w_d i_d, w_d) is initialized to the zero vector and then updated by:
(w_d i_d(x, y, ζ), w_d(x, y, ζ)) = (w i(X, Y, I), w(X, Y, I))
where the coordinate relationship between (x, y, ζ) and (X, Y, I) is given by the formula in step D.3.
5. A space-time stationarity-based infrared small target detection method according to claim 4, characterized in that the specific steps of step E are:
step E.1: the vector (w_d i_d, w_d) is convolved with a three-dimensional Gaussian kernel K, with spatial standard deviation σ_s and intensity standard deviation σ_r, to obtain the vector (w_g i_g, w_g):
(w_g i_g, w_g) = K * (w_d i_d, w_d)
where a coordinate system is established with the geometric center of the Gaussian kernel as the origin, and the value of the kernel element at position (m, n, ε) is:
K(m, n, ε) = exp( -(m^2 + n^2) / (2σ_s^2) - ε^2 / (2σ_r^2) )
step E.2: linear interpolation is performed on the vector (w_g i_g, w_g) to obtain the up-sampled original-size result vector (W_b I_b, W_b).
6. A method for detecting a small infrared target based on space-time stationarity according to claim 5, wherein the specific steps of step F are:
step F.1: the result vector (W_b I_b, W_b) is normalized to obtain I_b, and the result map f_b is then obtained, according to:
I_b(X, Y, I) = W_b I_b(X, Y, I) / W_b(X, Y, I)
f_b(X, Y) = I_b(X, Y, I)
step F.2: threshold segmentation is performed on the result map f_b:
step F.2.1: the maximum pixel value in the result map f_b is calculated, μ_b = max(f_b(X, Y));
step F.2.2: f_b is thresholded to obtain the threshold-segmented image g_b:
g_b(X, Y) = 1 if f_b(X, Y) ≥ β·μ_b, and g_b(X, Y) = 0 otherwise
where β is a constant and μ_b represents the largest pixel value in the result map f_b.
7. A method for detecting small infrared targets based on space-time stationarity according to claim 6, wherein the specific steps of step G are as follows:
step G.1: an AND operation is performed on the image g and the image g_b to obtain the space-time stationarity target response graph G:
G(X, Y) = g(X, Y) ∧ g_b(X, Y)
8. A method for detecting small infrared targets based on space-time stationarity according to claim 7, wherein the specific steps of step H are as follows: connected domain analysis is performed on the space-time stationarity target response graph G, and the centroid position of each connected region is taken as the centroid position of a candidate target to obtain the detection result.
Application CN202110376498.1A, priority date 2021-04-06, filing date 2021-04-06: Infrared small target detection method based on space-time stationarity; status Active; granted as CN113034533B (en).

Priority Applications (1)

Application Number: CN202110376498.1A; Priority Date: 2021-04-06; Filing Date: 2021-04-06; Title: Infrared small target detection method based on space-time stationarity (granted as CN113034533B (en))

Applications Claiming Priority (1)

Application Number: CN202110376498.1A; Priority Date: 2021-04-06; Filing Date: 2021-04-06; Title: Infrared small target detection method based on space-time stationarity (granted as CN113034533B (en))

Publications (2)

Publication Number and Publication Date:
CN113034533A (en): 2021-06-25
CN113034533B (en): 2022-05-20

Family

ID=76454144

Family Applications (1)

Application Number: CN202110376498.1A; Status: Active; granted as CN113034533B (en); Title: Infrared small target detection method based on space-time stationarity

Country Status (1)

Country: CN; Publication: CN113034533B (en)

Citations (1)

* Cited by examiner, † Cited by third party
CN104834915A * (priority 2015-05-15, published 2015-08-12), Wuhan Institute of Physics and Mathematics, Chinese Academy of Sciences: Small infrared object detection method in complex cloud sky background

Family Cites Families (5)

* Cited by examiner, † Cited by third party
CN102819740B * (priority 2012-07-18, published 2016-04-20), Northwestern Polytechnical University: Single infrared image frame dim target detection and localization method
CN106251344B * (priority 2016-07-26, published 2019-02-01), Beijing Institute of Technology: Multi-scale infrared target adaptive detection method based on visual receptive field
CN106570889A * (priority 2016-11-10, published 2017-04-19), Hohai University: Detecting method for weak target in infrared video
CN109461168B * (priority 2018-10-15, published 2021-03-16), Tencent Technology (Shenzhen) Co., Ltd.: Target object identification method and device, storage medium and electronic device
CN111222502B * (priority 2019-12-28, published 2023-05-12), 717th Research Institute of China Shipbuilding Industry Corporation: Infrared small target image labeling method and system


Also Published As

CN113034533A (en), published 2021-06-25


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant