CN118279397B - Infrared dim target rapid detection method based on first-order directional derivative - Google Patents

Infrared dim target rapid detection method based on first-order directional derivative Download PDF

Info

Publication number
CN118279397B
CN118279397B CN202410687371.5A CN202410687371A CN118279397B CN 118279397 B CN118279397 B CN 118279397B CN 202410687371 A CN202410687371 A CN 202410687371A CN 118279397 B CN118279397 B CN 118279397B
Authority
CN
China
Prior art keywords
image
candidate
target
point
pixel value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410687371.5A
Other languages
Chinese (zh)
Other versions
CN118279397A (en
Inventor
贲广利
王永成
肖辉
钱进
刘纪伟
徐东东
胡雪岩
罗佺佺
孙蕴晗
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Original Assignee
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changchun Institute of Optics Fine Mechanics and Physics of CAS filed Critical Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority to CN202410687371.5A priority Critical patent/CN118279397B/en
Publication of CN118279397A publication Critical patent/CN118279397A/en
Application granted granted Critical
Publication of CN118279397B publication Critical patent/CN118279397B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07Target detection
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention belongs to the technical field of infrared image processing and target detection, and particularly relates to a method for rapidly detecting an infrared weak and small target based on a first-order directional derivative, which comprises the following steps: s1: calculating first-order directional derivatives of the original infrared image processed by the Gaussian differential filter along 0-degree and 90-degree directions based on Facet models; s2: performing amplitude normalization processing on the first and second first-order direction derivative images; s3: threshold detection is carried out on the first normalized image and the second normalized image; s4: judging whether each candidate point of the first normalized image and the second normalized image has a target position or not; s5: obtaining first and second candidate target images according to the calculation result of the step S4; s6: performing image fusion on the first candidate target image and the second candidate target image to obtain an output image containing an infrared weak and small target; s7: and performing target detection with a pixel value of 1 on the output image to obtain a final detection result. The invention has the advantages of high detection precision and high detection efficiency.

Description

Infrared dim target rapid detection method based on first-order directional derivative
Technical Field
The invention belongs to the technical field of infrared image processing and target detection, and particularly relates to a method for rapidly detecting an infrared dim target based on a first-order directional derivative.
Background
The infrared dim target detection technology is an important component of an infrared searching and tracking system, and the dim target detection based on infrared images has important application in the fields of missile early warning, missile interception and the like. The infrared imaging system detects targets under the influence of long-distance imaging and random complex background interference, and has the characteristics of small imaging size, weak radiation energy, indistinguishable geometric outline, easiness in complex background interference, low signal to noise ratio and the like, namely, the detection of infrared weak targets is more difficult. At present, the infrared dim target detection can be divided into two types according to input sources, namely the infrared dim target detection based on an image sequence and the infrared dim target detection based on a single frame image, and the infrared dim target detection based on the single frame image gradually becomes the mainstream because the calculation amount of an infrared dim target detection algorithm based on the image sequence is large, the requirements on a processor and hardware resources are high, the real-time performance is poor, and compared with the infrared dim target detection algorithm based on the single frame image, the infrared dim target detection based on the single frame image has lower efficiency. The method for detecting the infrared weak and small target based on the single frame image comprises a filtering method based on a space domain or a frequency domain, a contrast method based on a visual system, a method based on a data structure and the like, wherein the former two methods are easy to realize, but the false alarm under a complex background is higher, the spatial resolution is low, the calculation amount of the method based on the data structure is larger, and the method is not easy to realize engineering.
Disclosure of Invention
In view of the above, the invention aims to provide a method for rapidly detecting an infrared weak and small target based on a first-order directional derivative, so as to solve the problem of mutual restriction among false alarms, spatial resolution and algorithm calculated amount of the existing method for detecting the infrared weak and small target, improve the detection precision and detection efficiency of the infrared weak and small target, and reduce the false alarm probability.
In order to achieve the above purpose, the technical scheme of the invention is realized as follows:
A method for rapidly detecting infrared dim targets based on first-order directional derivatives specifically comprises the following steps:
S1: processing an original infrared image by using a Gaussian differential filter, calculating first-order directional derivative of the original infrared image processed by the Gaussian differential filter along the 0-degree direction based on a Facet model, and calculating first-order directional derivative of the original infrared image processed by the Gaussian differential filter along the 90-degree direction based on a Facet model, so as to correspondingly obtain a first-order directional derivative image and a second first-order directional derivative image;
s2: simultaneously carrying out amplitude normalization processing on the first-order direction derivative image and the second first-order direction derivative image, and correspondingly obtaining a first normalized image and a second normalized image;
S3: simultaneously carrying out threshold detection on the first normalized image and the second normalized image to obtain all candidate points of the first normalized image and all candidate points of the second normalized image;
S4: judging whether each candidate point of the first normalized image has a target position along the 0-degree direction and whether each candidate point of the second normalized image has a target position along the 90-degree direction;
S5: obtaining a first candidate target image and a second candidate target image according to the judgment result of the step S4;
s6: performing image fusion on the first candidate target image and the second candidate target image to obtain an output image containing an infrared weak target;
s7: and detecting the target with the pixel value of 1 on the output image, determining the central position of the infrared weak target, and taking the central position of the infrared weak target as a final detection result.
Further, in step S1, a calculation formula for calculating the first-order directional derivative of the original infrared image processed by the gaussian differential filter along the 0 degree direction and the first-order directional derivative of the original infrared image along the 90 degree direction based on the Facet model is as follows:
wherein, For the first-order directional derivative image,For the second first order directional derivative image,Is the coordinates of the pixel point and,Are interpolation coefficients of Facet models.
Further, the calculation formula of the interpolation coefficient of Facet model is:
wherein, For the ith interpolation coefficient of Facet models,For the kernel coefficient corresponding to the i-th interpolation coefficient,The output of the gaussian differential filter T is the transpose of the matrix.
Further, in step S2, the first normalized image and the second normalized image satisfy the following formula:
wherein, For the first normalized image, the first image is normalized,For the second normalized image, the second image is obtained,Is the coordinates of the pixel point.
Further, in step S3, the threshold is set to beBased on the threshold value, all candidate points of the first normalized image and all candidate points of the second normalized image are obtained by:
wherein, For the i-th candidate point of the first normalized image,For the i-th candidate point of the second normalized image,For any coordinatesIs a pixel of (a) a pixel of (b).
Further, in step S4, the specific step of determining whether the target position along the 0 ° direction exists at each candidate point of the first normalized image includes:
S411: taking the ith candidate point as a candidate point to be processed in the first normalized image, and setting a local area R based on the candidate point to be processed, wherein the local area R is a square taking the candidate point to be processed as a center, i= {1,2,3, …, n }, and n is the total number of the candidate points;
s412: selecting a maximum pixel value point and a minimum pixel value point in the local region R according to the pixel value of each pixel point contained in the local region R, executing step S413 when the maximum pixel value point and the minimum pixel value point meet the following formula, otherwise, regarding the current candidate point as a target position along the 0-degree direction, regarding the (i+1) th candidate point as a candidate point to be processed in the first normalized image, and executing step S411:
wherein, The pixel value that is the maximum pixel value point,A pixel value that is a very small pixel value point,Is the position coordinates of the maximum pixel value point,Is the position coordinates of the very small pixel value point,For the amplitude threshold value,Is a spatial distance threshold;
S413: calculating the target position of the current candidate point by the following formula:
Wherein, the method comprises the following steps of ) Coordinates of a target position of the current candidate point along the 0-degree direction;
s414: steps S411 to S413 are repeated, and coordinates of target positions of all the candidate points of the first normalized image in the 0 ° direction are calculated.
Further, in step S4, the specific step of determining whether the target position along the 90 ° direction exists at each candidate point of the second normalized image includes:
S421: taking the ith candidate point as a candidate point to be processed in the second normalized image, and setting a local area R based on the candidate point to be processed, wherein the local area R is a square taking the candidate point to be processed as a center, i= {1,2,3, …, n }, and n is the total number of the candidate points;
s422: selecting a maximum pixel value point and a minimum pixel value point in the local region R according to the pixel value of each pixel point contained in the local region R, executing step S423 when the maximum pixel value point and the minimum pixel value point meet the following formula, otherwise, judging that the current candidate point does not exist as a target position along the 90-degree direction, and executing step S421 by taking the (i+1) th candidate point as a candidate point to be processed in the second normalized image:
wherein, The pixel value that is the maximum pixel value point,A pixel value that is a very small pixel value point,Is the position coordinates of the maximum pixel value point,Is the position coordinates of the very small pixel value point,For the amplitude threshold value,Is a spatial distance threshold;
S423: calculating the target position of the current candidate point by the following formula:
Wherein, the method comprises the following steps of ) Coordinates of a target position of the current candidate point along the 90-degree direction;
S424: steps S421-S423 are repeated to calculate target positions of all the candidate points of the second normalized image in the 90 ° direction.
Further, in step S5, in the first normalized image, the pixel values of the candidate points having the target position in the 0 ° direction are set to 1, and the pixel values of the candidate points not having the target position in the 0 ° direction are set to 0, to obtain a first candidate target image:
in the second normalized image, the pixel values of the candidate points having the target positions in the 90 ° direction are set to 1, and the pixel values of the candidate points having no target positions in the 90 ° direction are set to 0, to obtain a second candidate target image.
Further, in step S6, a calculation formula for performing image fusion on the first candidate target image and the second candidate target image is as follows:
wherein, In order to output an image of the subject,For the first candidate object-image,Is the second candidate target image.
Compared with the prior art, the invention has the following beneficial effects:
(1) The invention creates the quick detection method of the infrared weak and small targets based on the first-order directional derivative, and utilizes the characteristic that the first-order directional derivative images of the infrared weak and small targets in different directions are distributed differently, the candidate target images in two directions of 0 degree and 90 degrees are subjected to image fusion to obtain an output image, the target detection with the pixel value of 1 is carried out on the output image, the central position of the infrared weak and small target is determined, and the central position of the infrared weak and small target is taken as the final detection result.
(2) The method for quickly detecting the infrared weak and small target based on the first-order directional derivative can realize the accurate positioning of the target position, and the detection processes in two directions can be calculated in parallel, so that the method for quickly detecting the infrared weak and small target has the advantages of high instantaneity and high detection efficiency.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute an undue limitation on the invention. In the drawings:
FIG. 1 is a flow chart of a method for rapidly detecting an infrared dim target based on a first-order directional derivative according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a structure for determining coordinates of a target location in a local area according to an embodiment of the present invention;
FIG. 3 is an original infrared image according to an inventive embodiment of the present invention;
FIG. 4 is a first-order directional derivative image according to an inventive embodiment of the present invention;
FIG. 5 is a second order directional derivative image according to an inventive embodiment of the present invention;
FIG. 6 is a schematic three-dimensional simulation of candidate points for acquiring a first normalized image according to an inventive embodiment of the present invention;
FIG. 7 is a schematic three-dimensional simulation of candidate points for obtaining a second normalized image according to an inventive embodiment of the present invention;
FIG. 8 is a first candidate object-image according to an inventive embodiment of the present invention;
FIG. 9 is a second candidate object-image according to an inventive embodiment of the present invention;
fig. 10 is an image of the center position of an infrared dim target according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described in detail with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not to be construed as limiting the invention.
It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other.
In the description of the invention, it should be understood that the terms "center," "longitudinal," "transverse," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate orientations or positional relationships that are based on the orientation or positional relationships shown in the drawings, merely to facilitate describing the invention and simplify the description, and do not indicate or imply that the device or element being referred to must have a particular orientation, be configured and operate in a particular orientation, and therefore should not be construed as limiting the invention. Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first", "a second", etc. may explicitly or implicitly include one or more such feature. In the description of the invention, unless otherwise indicated, the meaning of "a plurality" is two or more.
In the description of the invention, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the above terms in the creation of the present invention can be understood by those of ordinary skill in the art in a specific case.
The invention will be described in detail below with reference to the drawings in connection with embodiments.
As shown in fig. 1, the method for quickly detecting the infrared dim target based on the first-order directional derivative provided by the embodiment of the invention specifically comprises the following steps:
S1: and processing the original infrared image by using a Gaussian differential filter, calculating the first-order directional derivative of the original infrared image processed by the Gaussian differential filter along the 0-degree direction based on a Facet model, and calculating the first-order directional derivative of the original infrared image processed by the Gaussian differential filter along the 90-degree direction based on a Facet model, so as to correspondingly obtain a first-order directional derivative image and a second first-order directional derivative image.
In step S1, a calculation formula for calculating the first-order directional derivative of the original infrared image processed by the gaussian differential filter along the 0 degree direction and the first-order directional derivative of the original infrared image along the 90 degree direction based on the Facet model is as follows:
wherein, For the first-order directional derivative image,For the second first order directional derivative image,Is the coordinates of the pixel point and,Are interpolation coefficients of Facet models. The calculation formula of the interpolation coefficient of Facet model is:
wherein, For the ith interpolation coefficient of Facet models,For the kernel coefficient corresponding to the i-th interpolation coefficient,The output of the gaussian differential filter T is the transpose of the matrix.
S2: and simultaneously carrying out amplitude normalization processing on the first-order direction derivative image and the second first-order direction derivative image, and correspondingly obtaining a first normalized image and a second normalized image.
In step S2, the first normalized image and the second normalized image satisfy the following equation:
wherein, For the first normalized image, the first image is normalized,For the second normalized image, the second image is obtained,Is the coordinates of the pixel point.
S3: and simultaneously carrying out threshold detection on the first normalized image and the second normalized image to obtain all candidate points of the first normalized image and all candidate points of the second normalized image.
According to practical application, threshold valueAny one value of {0.5,0.6,0.7,0.8,0.9} is taken.
In step S3, the threshold is set toBased on the threshold value, all candidate points of the first normalized image and all candidate points of the second normalized image are obtained by:
wherein, For the i-th candidate point of the first normalized image,For the i-th candidate point of the second normalized image,For any coordinatesIs a pixel of (a) a pixel of (b).
S4: and judging whether each candidate point of the first normalized image has a target position along the 0-degree direction and whether each candidate point of the second normalized image has a target position along the 90-degree direction.
As shown in fig. 2, in step S4, the specific step of determining whether the target position along the 0 ° direction exists at each candidate point of the first normalized image includes:
S411: taking the ith candidate point as a candidate point to be processed in the first normalized image, and setting a local area R based on the candidate point to be processed, wherein the local area R is a square taking the candidate point to be processed as a center, i= {1,2,3, …, n }, and n is the total number of the candidate points;
s412: selecting a maximum pixel value point and a minimum pixel value point in the local region R according to the pixel value of each pixel point contained in the local region R, executing step S413 when the maximum pixel value point and the minimum pixel value point meet the following formula, otherwise, regarding the current candidate point as a target position along the 0-degree direction, regarding the (i+1) th candidate point as a candidate point to be processed in the first normalized image, and executing step S411:
wherein, The pixel value that is the maximum pixel value point,A pixel value that is a very small pixel value point,Is the position coordinates of the maximum pixel value point,Is the position coordinates of the very small pixel value point,For the amplitude threshold value,Is a spatial distance threshold;
S413: calculating the target position of the current candidate point by the following formula:
Wherein, the method comprises the following steps of ) Coordinates of a target position of the current candidate point along the 0-degree direction;
s414: steps S411 to S413 are repeated, and coordinates of target positions of all the candidate points of the first normalized image in the 0 ° direction are calculated.
In step S4, the specific step of determining whether the target position along the 90 ° direction exists at each candidate point of the second normalized image includes:
S421: taking the ith candidate point as a candidate point to be processed in the second normalized image, setting a local area R based on the candidate point to be processed, wherein the local area R is a square with the candidate point to be processed as a center and the side length of the square is m, i= {1,2,3, …, n }, n is the total number of the candidate points, and the value range of m is 15-30, and can be adjusted according to actual conditions;
s422: selecting a maximum pixel value point and a minimum pixel value point in the local region R according to the pixel value of each pixel point contained in the local region R, executing step S423 when the maximum pixel value point and the minimum pixel value point meet the following formula, otherwise, judging that the current candidate point does not exist as a target position along the 90-degree direction, and executing step S421 by taking the (i+1) th candidate point as a candidate point to be processed in the second normalized image:
wherein, The pixel value that is the maximum pixel value point,A pixel value that is a very small pixel value point,Is the position coordinates of the maximum pixel value point,Is the position coordinates of the very small pixel value point,For the amplitude threshold value,Is a spatial distance threshold;
S423: calculating the target position of the current candidate point by the following formula:
Wherein, the method comprises the following steps of ) Coordinates of a target position of the current candidate point along the 90-degree direction;
S424: steps S421-S423 are repeated to calculate target positions of all the candidate points of the second normalized image in the 90 ° direction.
S5: and (4) obtaining a first candidate target image and a second candidate target image according to the judging result of the step S4.
In step S5, in the first normalized image, the pixel values of the candidate points having the target position in the 0 ° direction are set to 1, and the pixel values of the candidate points not having the target position in the 0 ° direction are set to 0, obtaining a first candidate target image:
in the second normalized image, the pixel values of the candidate points having the target positions in the 90 ° direction are set to 1, and the pixel values of the candidate points having no target positions in the 90 ° direction are set to 0, to obtain a second candidate target image.
S6: and performing image fusion on the first candidate target image and the second candidate target image to obtain an output image containing the infrared weak target.
In step S6, a calculation formula for performing image fusion on the first candidate target image and the second candidate target image is as follows:
wherein, In order to output an image of the subject,For the first candidate object-image,Is the second candidate target image.
S7: and detecting the target with the pixel value of 1 (namely, 1 is not carried out on the pixel value of 0) on the output image, determining the central position of the infrared weak target, and taking the central position of the infrared weak target as a final detection result.
Example 1
The embodiment of the invention provides a method for rapidly detecting infrared dim targets based on first-order directional derivatives, which specifically comprises the following steps:
S1: the original infrared image shown in fig. 3 is processed by using a Gaussian differential filter, and the first-order directional derivative of the original infrared image processed by the Gaussian differential filter along the 0-degree direction and the first-order directional derivative of the original infrared image along the 90-degree direction are calculated based on a Facet model, so that a first-order directional derivative image shown in fig. 4 and a second-order directional derivative image shown in fig. 5 are correspondingly obtained.
S2: and simultaneously carrying out amplitude normalization processing on the first-order direction derivative image and the second first-order direction derivative image, and correspondingly obtaining a first normalized image and a second normalized image.
S3: and simultaneously carrying out threshold detection on the first normalized image and the second normalized image to obtain all candidate points of the first normalized image and all candidate points of the second normalized image.
As shown in fig. 6-7, take(The diamond area is ths), the candidate points of the first normalized image and the candidate points of the second normalized image are obtained by taking points greater than ths as candidate pointsAnd
Wherein,For the i-th candidate point of the first normalized image,For the i-th candidate point of the second normalized image,For any coordinatesIs a pixel of (a) a pixel of (b).
S4: and judging whether each candidate point of the first normalized image has a target position along the 0-degree direction and whether each candidate point of the second normalized image has a target position along the 90-degree direction.
S5: and (4) obtaining a first candidate target image and a second candidate target image according to the judging result of the step S4.
Taking parametersThe first candidate target image and the second candidate target image are shown in fig. 8 and 9, respectively.
S6: and performing image fusion on the first candidate target image and the second candidate target image to obtain an output image containing the infrared weak target.
In step S6, a calculation formula for performing image fusion on the first candidate target image and the second candidate target image is as follows:
wherein, In order to output an image of the subject,For the first candidate object-image,Is the second candidate target image.
S7: and detecting the target with the pixel value of 1 (namely, 1 is not carried out on the pixel value of 0) on the output image, determining the central position of the infrared weak target, and taking the central position of the infrared weak target as a final detection result. The detection result is shown as a bright spot in fig. 10.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present disclosure may be performed in parallel, sequentially, or in a different order, provided that the desired results of the technical solutions of the present disclosure are achieved, and are not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (7)

1. A method for rapidly detecting infrared dim targets based on first-order directional derivatives is characterized by comprising the following steps of: the method specifically comprises the following steps:
S1: processing an original infrared image by using a Gaussian differential filter, calculating first-order directional derivative of the original infrared image processed by the Gaussian differential filter along the 0-degree direction based on a Facet model, and calculating first-order directional derivative of the original infrared image processed by the Gaussian differential filter along the 90-degree direction based on a Facet model, so as to correspondingly obtain a first-order directional derivative image and a second first-order directional derivative image;
S2: simultaneously carrying out amplitude normalization processing on the first-order direction derivative image and the second first-order direction derivative image, and correspondingly obtaining a first normalized image and a second normalized image;
S3: simultaneously carrying out threshold detection on the first normalized image and the second normalized image to obtain all candidate points of the first normalized image and all candidate points of the second normalized image;
S4: meanwhile, judging whether each candidate point of the first normalized image has a target position along the 0-degree direction and whether each candidate point of the second normalized image has a target position along the 90-degree direction;
The specific step of judging whether each candidate point of the first normalized image has a target position along the 0-degree direction comprises the following steps:
S411: taking the ith candidate point as a candidate point to be processed in the first normalized image, and setting a local area R based on the candidate point to be processed, wherein the local area R is a square taking the candidate point to be processed as a center, i= {1,2,3, …, n }, and n is the total number of the candidate points;
S412: selecting a maximum pixel value point and a minimum pixel value point in the local region R according to the pixel value of each pixel point included in the local region R, executing step S413 when the maximum pixel value point and the minimum pixel value point satisfy the following formula, otherwise, regarding the current candidate point as a target position along the 0 ° direction, and regarding the i+1th candidate point as a candidate point to be processed in the first normalized image, and executing step S411:
wherein, The pixel value that is the maximum pixel value point,A pixel value that is a very small pixel value point,Is the position coordinates of the maximum pixel value point,Is the position coordinates of the very small pixel value point,For the amplitude threshold value,Is a spatial distance threshold;
S413: calculating the target position of the current candidate point by the following formula:
Wherein, the method comprises the following steps of ) Coordinates of a target position of the current candidate point along the 0-degree direction;
S414: repeating steps S411-S413, and calculating coordinates of target positions of all candidate points of the first normalized image along the 0-degree direction;
The specific step of judging whether the target position along the 90 DEG direction exists at each candidate point of the second normalized image comprises the following steps:
S421: taking the ith candidate point as a candidate point to be processed in the second normalized image, and setting a local area R based on the candidate point to be processed, wherein the local area R is a square taking the candidate point to be processed as a center, i= {1,2,3, …, n }, and n is the total number of the candidate points;
S422: selecting a maximum pixel value point and a minimum pixel value point in the local region R according to the pixel value of each pixel point included in the local region R, executing step S423 when the maximum pixel value point and the minimum pixel value point satisfy the following formula, otherwise, considering the current candidate point as not having a target position along the 90 ° direction, judging, and executing step S421 with the i+1st candidate point as the candidate point to be processed in the second normalized image:
wherein, The pixel value that is the maximum pixel value point,A pixel value that is a very small pixel value point,Is the position coordinates of the maximum pixel value point,Is the position coordinates of the very small pixel value point,For the amplitude threshold value,Is a spatial distance threshold;
S423: calculating the target position of the current candidate point by the following formula:
Wherein, the method comprises the following steps of ) Coordinates of a target position of the current candidate point along the 90-degree direction;
s424: repeating steps S421-S423, and calculating target positions of all candidate points of the second normalized image along the 90-degree direction;
s5: obtaining a first candidate target image and a second candidate target image according to the judging result of the step S4;
s6: performing image fusion on the first candidate target image and the second candidate target image to obtain an output image containing an infrared weak target;
S7: and detecting the target with the pixel value of 1 on the output image, determining the central position of the infrared weak target, and taking the central position of the infrared weak target as a final detection result.
2. The method for rapidly detecting the infrared small target based on the first-order directional derivative according to claim 1, wherein the method comprises the following steps of: in the step S1, a calculation formula for calculating the first-order directional derivative of the original infrared image processed by the gaussian differential filter along the 0 degree direction and the first-order directional derivative of the original infrared image along the 90 degree direction based on the Facet model is as follows:
wherein, For the first-order directional derivative image,For the second first order directional derivative image,Is the coordinates of the pixel point and,Are interpolation coefficients of Facet models.
3. The method for rapidly detecting the infrared small target based on the first-order directional derivative according to claim 2, wherein the method comprises the following steps of: the calculation formula of the interpolation coefficient of the Facet model is as follows:
wherein, For the ith interpolation coefficient of Facet models,For the kernel coefficient corresponding to the i-th interpolation coefficient,T is the transpose of the matrix, which is the output of the Gaussian differential filter.
4. The method for rapidly detecting the infrared small target based on the first-order directional derivative according to claim 1, wherein the method comprises the following steps of: in the step S2, the first normalized image and the second normalized image satisfy the following formula:
wherein, For the first normalized image, the first image is normalized,For the second normalized image, the second image is obtained,Is the coordinates of the pixel point.
5. The method for rapidly detecting the infrared small target based on the first-order directional derivative according to claim 4, wherein the method comprises the following steps of: in the step S3, the threshold is set to beBased on the threshold, all candidate points of the first normalized image and all candidate points of the second normalized image are obtained by:
wherein, For the i-th candidate point of the first normalized image,For the i-th candidate point of the second normalized image,For any coordinatesIs a pixel of (a) a pixel of (b).
6. The method for rapidly detecting the infrared small target based on the first-order directional derivative according to claim 1, wherein the method comprises the following steps of: in the step S5, in the first normalized image, the pixel values of the candidate points having the target position in the 0 ° direction are set to 1, and the pixel values of the candidate points not having the target position in the 0 ° direction are set to 0, to obtain a first candidate target image:
in the second normalized image, the pixel values of the candidate points having the target positions in the 90 ° direction are set to 1, and the pixel values of the candidate points having no target positions in the 90 ° direction are set to 0, to obtain a second candidate target image.
7. The method for rapidly detecting the infrared small target based on the first-order directional derivative according to claim 1, wherein the method comprises the following steps of: in the step S6, a calculation formula for performing image fusion on the first candidate target image and the second candidate target image is as follows:
wherein, In order to output an image of the subject,For the first candidate object-image,Is the second candidate target image.
CN202410687371.5A 2024-05-30 2024-05-30 Infrared dim target rapid detection method based on first-order directional derivative Active CN118279397B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410687371.5A CN118279397B (en) 2024-05-30 2024-05-30 Infrared dim target rapid detection method based on first-order directional derivative

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410687371.5A CN118279397B (en) 2024-05-30 2024-05-30 Infrared dim target rapid detection method based on first-order directional derivative

Publications (2)

Publication Number Publication Date
CN118279397A CN118279397A (en) 2024-07-02
CN118279397B true CN118279397B (en) 2024-08-13

Family

ID=91645443

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410687371.5A Active CN118279397B (en) 2024-05-30 2024-05-30 Infrared dim target rapid detection method based on first-order directional derivative

Country Status (1)

Country Link
CN (1) CN118279397B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104182992A (en) * 2014-08-19 2014-12-03 哈尔滨工程大学 Method for detecting small targets on the sea on the basis of panoramic vision
CN106548457A (en) * 2016-10-14 2017-03-29 北京航空航天大学 A kind of method for detecting infrared puniness target using multi-direction first-order partial derivative

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111861968B (en) * 2019-04-23 2023-04-28 中国科学院长春光学精密机械与物理研究所 Infrared dim target detection method and detection system
CN112418090B (en) * 2020-11-23 2023-05-05 中国科学院西安光学精密机械研究所 Real-time detection method for infrared weak and small target under sky background
CN113673385B (en) * 2021-08-06 2024-06-21 南京理工大学 Sea surface ship detection method based on infrared image
CN117132617A (en) * 2023-08-21 2023-11-28 北京工业大学 Gear overall error acquisition method based on continuous piecewise fitting Facet model

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104182992A (en) * 2014-08-19 2014-12-03 哈尔滨工程大学 Method for detecting small targets on the sea on the basis of panoramic vision
CN106548457A (en) * 2016-10-14 2017-03-29 北京航空航天大学 A kind of method for detecting infrared puniness target using multi-direction first-order partial derivative

Also Published As

Publication number Publication date
CN118279397A (en) 2024-07-02

Similar Documents

Publication Publication Date Title
CN109961401A (en) A kind of method for correcting image and storage medium of binocular camera
CN110992263B (en) Image stitching method and system
CN115761550A (en) Water surface target detection method based on laser radar point cloud and camera image fusion
CN109118544B (en) Synthetic aperture imaging method based on perspective transformation
CN110796694A (en) Fruit three-dimensional point cloud real-time acquisition method based on KinectV2
CN106982312A (en) Many aperture camera systems and its operating method
CN106296811A (en) A kind of object three-dimensional reconstruction method based on single light-field camera
CN110264528A (en) Quick self-calibration method for fisheye lens binocular camera
CN105258673B (en) A kind of target ranging method based on binocular synthetic aperture focusing image, device
CN110874854A (en) Large-distortion wide-angle camera binocular photogrammetry method based on small baseline condition
CN115601406A (en) Local stereo matching method based on fusion cost calculation and weighted guide filtering
CN107610219A (en) The thick densification method of Pixel-level point cloud that geometry clue perceives in a kind of three-dimensional scenic reconstruct
CN115359130B (en) Radar and camera combined calibration method and device, electronic equipment and storage medium
CN110505398A (en) A kind of image processing method, device, electronic equipment and storage medium
CN118279397B (en) Infrared dim target rapid detection method based on first-order directional derivative
CN111047636A (en) Obstacle avoidance system and method based on active infrared binocular vision
WO2020124091A1 (en) Automatic fine-grained radio map construction and adaptation
CN115205354A (en) Phased array laser radar imaging method based on RANSAC and ICP point cloud registration
CN107479052B (en) Ground concealed target detection method based on Generalized Gaussian Distribution Model
CN109448060A (en) A kind of camera calibration parameter optimization method based on bat algorithm
CN110487254A (en) A kind of submarine target size method for fast measuring for ROV
CN113706391B (en) Real-time splicing method, system, equipment and storage medium for aerial images of unmanned aerial vehicle
CN115100382A (en) Nerve surface reconstruction system and method based on mixed characterization
CN110346117B (en) Light spot high-precision positioning method under ultra-wide view field of fisheye lens
CN113592953A (en) Binocular non-cooperative target pose measurement method based on feature point set

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant