CN112581548B - Method and system for filtering pseudo star target of star sensor - Google Patents

Method and system for filtering pseudo star target of star sensor

Info

Publication number
CN112581548B
CN112581548B (application CN202011241781.5A)
Authority
CN
China
Prior art keywords
target
gray
star
pixel
background
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011241781.5A
Other languages
Chinese (zh)
Other versions
CN112581548A (en)
Inventor
王亮
朱生国
黄海
尹伟
李奇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
717th Research Institute of CSIC
Original Assignee
717th Research Institute of CSIC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 717th Research Institute of CSIC
Priority to CN202011241781.5A
Publication of CN112581548A
Application granted
Publication of CN112581548B
Legal status: Active
Anticipated expiration

Classifications

    • G06T 7/90 (Image analysis: determination of colour characteristics)
    • G06F 18/23 (Pattern recognition: clustering techniques)
    • G06T 5/77 (Image enhancement or restoration: retouching, inpainting, scratch removal)
    • G06T 7/11 (Segmentation: region-based segmentation)
    • G06T 7/136 (Segmentation: thresholding)
    • G06T 7/187 (Segmentation: region growing, region merging, connected component labelling)
    • G06T 2207/10032 (Image acquisition modality: satellite or aerial image; remote sensing)

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a method and a system for filtering pseudo star targets of a star sensor. The method comprises the following steps: performing surface fitting on the target background based on the gray levels of the target background area; calculating the gray variance of the background-area pixels; calculating the segmentation threshold of each target-area pixel according to the background-area gray variance and the target-area pixel grays; performing threshold segmentation on the target area; and determining the target imaging pixels through target clustering. Target imaging parameters are then estimated based on a star target two-dimensional Gaussian distribution imaging model; if the sigma value of the target imaging parameters is larger than a set threshold, the target is filtered as a pseudo star target. Otherwise, the fitting residual of the target gray is calculated from the estimated target imaging parameters, the target star energy distribution confidence coefficient is calculated from the fitting residual, and if the confidence value is larger than a threshold, the target is filtered as a pseudo star target. With this scheme, pseudo star targets with non-stellar energy distributions in a complex light environment can be effectively filtered, with a low target omission ratio, high reliability, and a high operation speed; accurate output of real star targets is guaranteed, and the attitude output efficiency and accuracy of the star sensor are improved.

Description

Method and system for filtering pseudo star target of star sensor
Technical Field
The invention relates to the field of astronomical navigation, in particular to a method and a system for filtering a pseudo star target of a star sensor.
Background
In an astronomical photoelectric observation system, star measurement images contain pseudo star targets to varying degrees. A pseudo star target is a spot formed when a small amount of energy, such as sunlight reflected by an engine plume or a non-stellar space object, is gathered by the optical system onto the detector target surface; fig. 1 shows telemetry image data from an actual flight test of one type of star sensor. Star sensors with a large field of view and high sensitivity have greatly improved detection capability and can detect more small space targets. However, pseudo star targets in the star image prevent the star sensor from outputting valid real star targets, reduce its attitude output efficiency, and fail to meet the requirements of the control system.
Disclosure of Invention
In view of this, embodiments of the present invention provide a method and a system for filtering a pseudo star target of a star sensor, so as to solve the problem that the pseudo star target affects accurate output of a real star target and reduces attitude output efficiency of the star sensor.
In a first aspect of the embodiments of the present invention, a method for filtering a pseudo star target of a star sensor is provided, including:
s1, performing surface fitting on a target background based on local background gray information around the target to obtain fitting gray of pixels in a background area and background gray of pixels in the target area, and calculating gray variance of pixels in the background area according to real gray and fitting gray of each pixel in the background area;
s2, calculating a segmentation threshold of each pixel in the target area according to the gray variance of the pixels in the background area and the background gray of the pixels in the target area, performing threshold segmentation on the target area to obtain real gray information of the target, and extracting the target from the background;
s3, performing target clustering on the target area after threshold segmentation, performing eight-connected domain area growth clustering by taking a target center pixel as a starting point, and determining a target imaging pixel;
s4, obtaining gray information of the target, estimating target imaging parameters based on a star target two-dimensional Gaussian distribution imaging model to obtain target fitting gray information, and filtering the target as a pseudo star target if a sigma value of the fitting parameter is larger than a first threshold value;
s5, obtaining target real gray information and target fitting gray information, calculating the difference between the target real gray and the target fitting gray to obtain a fitting residual error of the target gray, and calculating the energy distribution confidence coefficient of a target star according to the fitting residual error of the target gray;
and S6, if the confidence value of the target star energy distribution is larger than a second threshold value, judging that the target does not obey the star energy distribution, and filtering the target as a pseudo star target.
In a second aspect of the embodiments of the present invention, a system for filtering a pseudo star target of a star sensor is provided, including:
the target background estimation module is used for performing surface fitting on a target background based on local background gray information around the target to obtain fitting gray of pixels in a background area and background gray of pixels in the target area, and calculating gray variance of the pixels in the background area according to the real gray and the fitting gray of each pixel in the background area;
the target threshold segmentation module is used for calculating the segmentation threshold of each pixel in the target area according to the gray variance of the pixels in the background area, performing threshold segmentation on the target area to obtain the real gray information of the target, and extracting the target from the background;
the target clustering module is used for carrying out target clustering on the target area after threshold segmentation, carrying out eight-connected domain area growing clustering by taking a target center pixel as a starting point and determining a target imaging pixel;
the target imaging parameter estimation module is used for acquiring gray information of a target, estimating target imaging parameters based on a star target two-dimensional Gaussian distribution imaging model, and filtering the target as a pseudo star target if the sigma value of the target imaging parameters is greater than a first threshold value;
the target confidence coefficient calculation module is used for acquiring target real gray information and target fitting gray information, obtaining a fitting residual error of the target gray by calculating the difference between the target real gray and the target fitting gray, and calculating the target star energy distribution confidence coefficient according to the fitting residual error of the target gray;
and the pseudo star target filtering module is used for judging that the target does not comply with the star energy distribution if the target star energy distribution confidence value is greater than a second threshold value, and filtering the target as the pseudo star target.
In the embodiment of the invention, the target is extracted from the background by threshold segmentation of the target area, the target imaging pixel is determined by target clustering, the target imaging parameter is estimated based on a star target two-dimensional Gaussian distribution imaging model, and the pseudo target which is not distributed by star energy under the complex light environment of the star sensor is effectively filtered by utilizing the difference between the pseudo star target and the real star target.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings described below are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a telemetry star map for an actual flight test of a star sensor of the type provided by one embodiment of the present invention;
fig. 2 is a schematic flow chart of a method for filtering a pseudo star target of a star sensor according to an embodiment of the invention;
FIG. 3 is a diagram illustrating the target segmentation effect generated by a conventional threshold segmentation method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a target area and a background area for target detection according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a target region after threshold segmentation according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a target region growing clustering processing template according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a target region after clustering according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a target pixel participating in parameter fitting according to an embodiment of the present invention;
FIG. 9 is a schematic diagram illustrating the effect of unfiltered false targets according to an embodiment of the present invention;
fig. 10 is a schematic diagram illustrating an effect of filtering out a false target according to an embodiment of the present invention;
fig. 11 is another schematic flow chart illustrating a method for filtering a pseudo star object of a star sensor according to an embodiment of the invention;
fig. 12 is a schematic structural diagram of a pseudo star object filtering system for a star sensor according to an embodiment of the present invention.
Detailed Description
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the embodiments described below are only a part of the embodiments of the present invention, and not all of the embodiments. Based on the embodiments of the present invention, all other embodiments obtained by persons skilled in the art without any inventive work shall fall within the scope of the present invention, and the principle and features of the present invention shall be described below with reference to the accompanying drawings.
The terms "comprises" and "comprising," when used in this specification, claims, and the accompanying drawings, are intended to cover non-exclusive inclusion, such that a process, method, system, or apparatus that comprises a list of steps or elements is not limited to those steps or elements. In addition, "first" and "second" are used to distinguish different objects, not to describe a specific order.
Referring to fig. 2, fig. 2 is a schematic flow chart of a method for filtering a pseudo star target of a star sensor according to an embodiment of the present invention, including:
s201, performing surface fitting on a target background based on local background gray of the periphery of the target to obtain fitting gray of pixels in a background area and background gray of pixels in the target area, and calculating gray variance of pixels in the background area according to real gray and fitting gray of each pixel in the background area;
It can be understood that, before target centroid extraction, a local threshold segmentation method is generally adopted to separate the target from the background, after which the centroid of the segmented target is extracted. However, the traditional threshold segmentation method has disadvantages: in scenes with a fluctuating background it cannot separate the target from the background well, which affects the accuracy of centroid extraction; fig. 3 shows the segmentation result of the traditional local threshold segmentation method. To address these defects, the target background is estimated from the background surrounding the target area, and target segmentation is performed using the background estimate.
Target segmentation is realized by estimating the target background from the gray information surrounding the target. Let the target-region template be n × m pixels (containing more than 95% of the target energy) and let the background region at the target edge be a ring k pixels wide; the whole target extraction frame is then (n + 2k) × (m + 2k) pixels and the number of background pixels is N = 2k(n + 2k) + 2km. The target background estimation layout is shown in fig. 4, wherein part 1 is the target region and part 2 is the background region.
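As a sketch of this geometry (Python with NumPy is an illustrative choice; the function name is hypothetical), the extraction frame can be split into the inner n × m template and the k-pixel background ring, and the ring pixel count checked against N = 2k(n + 2k) + 2km:

```python
import numpy as np

def split_target_background(frame, k):
    """Split an (n+2k) x (m+2k) extraction frame into the inner n x m
    target template and the surrounding background ring of width k pixels."""
    target = frame[k:-k, k:-k]
    mask = np.ones(frame.shape, dtype=bool)
    mask[k:-k, k:-k] = False          # True only on the background ring
    background = frame[mask]
    return target, background

# Example: n = 3, m = 5, k = 2 gives a 7 x 9 frame with 48 background pixels.
frame = np.arange(7 * 9, dtype=float).reshape(7, 9)
tgt, bg = split_target_background(frame, 2)
n, m, k = 3, 5, 2
assert tgt.shape == (n, m)
assert bg.size == 2 * k * (n + 2 * k) + 2 * k * m   # 2k(n+2k) + 2km = 48
```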
The imaging process of the star sensor is affected by varying illumination, so the target background fluctuates; the background information can therefore be fitted by a surface. The fitting equation of the background information is z = a_0 + a_1·x + a_2·xy + a_3·y, where z ≡ p(x, y) and p(x, y) denotes the gray value at pixel position [x, y]. Over the background pixels this is written in matrix form as:

z = H·a

wherein a = [a_0, a_1, a_2, a_3]^T represents the target background gray fitting parameter vector, H represents the target background pixel position coefficient matrix whose i-th row is [1, x_i, x_i·y_i, y_i], and z = [z_1, ..., z_N]^T represents the target background pixel gray vector. The coefficient vector a is calculated by the least squares algorithm,

a = (H^T·H)^(−1)·H^T·z,

and the background gray information of the target area is estimated from the fitted surface.
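The background surface fit above reduces to an ordinary least-squares problem; the following is a minimal sketch (NumPy, hypothetical function names) that recovers a known tilted-plane background exactly:

```python
import numpy as np

def fit_background_surface(coords, grays):
    """Least-squares fit of z = a0 + a1*x + a2*x*y + a3*y to the
    background-ring pixels. coords: (N, 2) array of (x, y); grays: (N,)."""
    x, y = coords[:, 0], coords[:, 1]
    H = np.column_stack([np.ones_like(x), x, x * y, y])   # rows [1, x, xy, y]
    a, *_ = np.linalg.lstsq(H, grays, rcond=None)         # a = (H^T H)^-1 H^T z
    return a

def eval_background(a, coords):
    """Evaluate the fitted surface at given pixel positions."""
    x, y = coords[:, 0], coords[:, 1]
    return a[0] + a[1] * x + a[2] * x * y + a[3] * y

rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(40, 2))
true_a = np.array([100.0, 1.5, 0.02, -0.8])
grays = eval_background(true_a, coords)       # noise-free synthetic background
a = fit_background_surface(coords, grays)
assert np.allclose(a, true_a)
```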
S202, calculating a segmentation threshold of each pixel in the target area according to the gray variance of the pixels in the background area and the background gray value of the pixels in the target area, performing threshold segmentation on the target area to obtain real gray information of the target, and segmenting the target from the background;
In one embodiment, the real gray values of the background-area pixels in fig. 4 are obtained and denoted F_B(i, j), and the real gray values of the target-area pixels are denoted F(i, j).

The fitted gray value of each background-area pixel is calculated from the estimated background fitting coefficients and denoted G_B(i, j); likewise, the fitted gray value of each target-area pixel is denoted G(i, j).

The difference between the real and fitted gray of the background pixels, called the background gray fitting residual, is E(i, j) = F_B(i, j) − G_B(i, j); from E(i, j) the standard deviation of the background gray is calculated and denoted std.

The segmentation threshold of each target-region pixel is then T(i, j) = G(i, j) + σ·std, where σ is the threshold segmentation coefficient, generally taken as 3 to 5.

The target gray f(i, j) after threshold segmentation is calculated as:

f(i, j) = F(i, j) − G(i, j) if F(i, j) > T(i, j), and f(i, j) = 0 otherwise,

where F(i, j) represents the real gray value of the target-area pixel and (i, j) is the pixel's position coordinate.
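A minimal sketch of this per-pixel thresholding step, assuming the fitted background G and the residual standard deviation std are already available (names are illustrative):

```python
import numpy as np

def segment_target(F, G, std, sigma=3.0):
    """Per-pixel threshold T = G + sigma*std; pixels above T keep their
    background-subtracted gray F - G, all others are set to 0."""
    T = G + sigma * std
    return np.where(F > T, F - G, 0.0)

F = np.array([[10., 11., 10.],
              [11., 60., 12.],
              [10., 12., 10.]])
G = np.full((3, 3), 10.0)                      # fitted background surface
f = segment_target(F, G, std=2.0, sigma=3.0)   # T = 16 everywhere
assert f[1, 1] == 50.0                         # only the bright pixel survives
assert np.count_nonzero(f) == 1
```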
After threshold segmentation of the target, the resulting target-area information is shown in fig. 5, where gray pixels are true target pixels and black pixels are isolated points at the edge of the target area. The target-area selection template is generally larger than the actual target, so edge pixels whose gray value exceeds the segmentation threshold may survive as isolated points, which degrades subsequent real-star discrimination and centroid extraction accuracy.
S203, performing target clustering processing on the target area after threshold segmentation, performing eight-connected domain area growth clustering by taking a target central pixel as a starting point, and determining a target imaging pixel;
Because isolated points remaining after threshold segmentation of the target area would otherwise participate in subsequent target discrimination and centroid extraction as target pixels, a connected-domain method is used to cluster the target, filter out the isolated pixels produced by threshold segmentation, and determine the exact target imaging pixels. The specific method is as follows:
For the thresholded target area, eight-connected region growing clustering is performed starting from the target center pixel (the eight-connected template is shown in fig. 6). This determines the exact target imaging area and prevents isolated points at the target-area edge from being misjudged as target pixels, which benefits subsequent target centroid extraction and pseudo-target filtering. The target imaging area after region-growing clustering of the segmented target is shown in fig. 7.
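The eight-connected region growing from the center pixel can be sketched as a breadth-first flood fill over the thresholded window; this is an illustrative implementation, not the patented code:

```python
import numpy as np
from collections import deque

def grow_from_center(f):
    """Eight-connected region growing from the center pixel of the
    thresholded target window; isolated edge pixels are discarded."""
    h, w = f.shape
    seed = (h // 2, w // 2)
    keep = np.zeros(f.shape, dtype=bool)
    if f[seed] <= 0:
        return keep
    keep[seed] = True
    q = deque([seed])
    while q:
        i, j = q.popleft()
        for di in (-1, 0, 1):                 # visit all 8 neighbours
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if 0 <= ni < h and 0 <= nj < w and f[ni, nj] > 0 and not keep[ni, nj]:
                    keep[ni, nj] = True
                    q.append((ni, nj))
    return keep

f = np.array([[5., 0., 0., 0., 0.],
              [0., 0., 9., 0., 0.],
              [0., 8., 20., 7., 0.],
              [0., 0., 6., 0., 0.],
              [0., 0., 0., 0., 4.]])
keep = grow_from_center(f)
assert keep[2, 2] and keep[1, 2] and keep[2, 1]
assert not keep[0, 0] and not keep[4, 4]   # isolated corner pixels removed
assert keep.sum() == 5
```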
S204, obtaining gray information of the target, estimating target imaging parameters based on a star target two-dimensional Gaussian distribution imaging model to obtain target fitting gray information, and filtering the target as a pseudo star target if the sigma value of the target imaging parameters is greater than a first threshold value;
specifically, the star target imaging conforms to a two-dimensional Gaussian distribution model, and the imaging model is as follows:
f(X) = (G / (2π·|R|^(1/2))) · exp(−(1/2)·(X − X_C)^T·R^(−1)·(X − X_C))

wherein X = [x, y]^T is the pixel position coordinate; X_C = [x_c, y_c]^T is the star point centroid position coordinate; G represents the star target energy; and R is the 2 × 2 symmetric covariance matrix of the two-dimensional Gaussian distribution,

R = [ σ_x², ρ·σ_x·σ_y ; ρ·σ_x·σ_y, σ_y² ],

where σ_x, σ_y represent the standard deviations of the star target in the x and y directions and ρ represents the correlation between the x and y directions.

Taking the natural logarithm of the above formula:

ln f(X) = ln(G / (2π·|R|^(1/2))) − (1/2)·(X − X_C)^T·D·(X − X_C), where D = R^(−1).

This simplifies to:

ln f(X) = a_0 + [a_1 a_2]·X − (1/2)·X^T·D·X

where f is the gray of the target imaging pixel, a_0 = ln(G / (2π·|R|^(1/2))) − (1/2)·X_C^T·D·X_C collects the constant terms, and [a_1 a_2] = X_C^T·D is a 1 × 2 matrix. Writing [d_ij], i, j = 1, 2 for the elements of D, and noting that since R is a positive definite symmetric matrix its inverse D is also symmetric, d_12 = d_21 = d_2; with d_11 = d_1 and d_22 = d_3, the expansion becomes:

ln f(x, y) = a_0 + a_1·x + a_2·y − (1/2)·(d_1·x² + 2·d_2·x·y + d_3·y²)

Writing this equation for every target imaging pixel gives the linear system:

F = A·z

wherein F = [ln f_1, ..., ln f_n]^T is the measured value vector, n is the number of pixels of the star target gray distribution, the i-th row of A is [1, x_i, y_i, −x_i²/2, −x_i·y_i, −y_i²/2], x_i, y_i are the position coordinates of the i-th target pixel, i = 1, 2, ..., n, and z = [a_0, a_1, a_2, d_1, d_2, d_3]^T is the unknown 6 × 1 parameter vector.

The least squares solution is:

z = (A^T·A)^(−1)·A^T·F

and the target centroid coordinates are X_C = ([a_1 a_2]·D^(−1))^T.

Because the logarithm amplifies the noise of low-gray pixels, a weight is determined for each equation:

W(i) = f_i² / (Σ_{m=1}^{n} f_m² + Δ)

where W(i) is the weight of the i-th target pixel, f_i and f_m are the measured gray values of the i-th and m-th target imaging pixels, and Δ is a small value that prevents the denominator from approaching 0. With W the diagonal matrix of these weights, the weighted solution is:

z = (A^T·W·A)^(−1)·A^T·W·F.
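The log-linearized fit can be sketched as follows; a minimal illustration, assuming the weighting is the squared measured gray normalized over all target pixels (the Δ guard is omitted here because the synthetic data is noise-free), with all function names hypothetical:

```python
import numpy as np

def fit_gaussian_params(xs, ys, f):
    """Weighted least squares on ln f = a0 + a1*x + a2*y
    - 0.5*(d1*x^2 + 2*d2*x*y + d3*y^2). Returns the parameter vector z,
    the centroid X_C = D^-1 [a1, a2]^T, and sigma_x, sigma_y from R = D^-1."""
    A = np.column_stack([np.ones_like(xs), xs, ys,
                         -0.5 * xs**2, -xs * ys, -0.5 * ys**2])
    F = np.log(f)
    W = np.diag(f**2 / (f**2).sum())        # assumed weighting scheme
    z = np.linalg.solve(A.T @ W @ A, A.T @ W @ F)
    a0, a1, a2, d1, d2, d3 = z
    D = np.array([[d1, d2], [d2, d3]])
    R = np.linalg.inv(D)
    centroid = R @ np.array([a1, a2])
    sigma = np.sqrt(np.diag(R))
    return z, centroid, sigma

# Synthetic star: sigma_x = sigma_y = 1.2, centroid (3.2, 2.8), rho = 0.
ii, jj = np.meshgrid(np.arange(7), np.arange(7), indexing="ij")
xs, ys = ii.ravel().astype(float), jj.ravel().astype(float)
g = 1000 * np.exp(-((xs - 3.2)**2 + (ys - 2.8)**2) / (2 * 1.2**2))
z, c, sig = fit_gaussian_params(xs, ys, g)
assert np.allclose(c, [3.2, 2.8], atol=1e-6)
assert np.allclose(sig, [1.2, 1.2], atol=1e-6)
```

A fitted sigma much larger than the point-spread width of a real star is the cue used in step S204 to reject the target.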
Fig. 8 shows the weight distribution of the target pixels participating in the parameter fitting.
S205, obtaining target real gray information and target fitting gray information, obtaining a fitting residual error of the target gray by calculating the difference between the target real gray and the target fitting gray, and calculating a target star energy distribution confidence coefficient according to the fitting residual error of the target gray;
The real gray f(i, j) of each pixel after target clustering is obtained by subtracting the target background gray value from the gray value of each target pixel; the target area size is set to n × m.

The fitted gray g(i, j) of each target pixel is calculated from the estimated target imaging parameters.

The fitting residual of each target pixel gray, defined as the difference between the target real gray and the target fitted gray, is e(i, j) = f(i, j) − g(i, j), where i = 1, ..., n and j = 1, ..., m.
Further, determining a target star energy distribution confidence coefficient, and calculating the target star energy distribution confidence coefficient according to the target gray fitting residual:
Figure BDA0002768647660000094
/>
Figure BDA0002768647660000101
the target non-sidereal energy distribution confidence is ratio = sumErr/sumEnergy.
The sum of target gray fitting residuals is expressed by sumErr, the sum of target gray is expressed by sumEnergy, the template size of the target area is expressed by n and m, the background gray fitting residuals are expressed by E (i, j), the real gray values of pixels in the target area are expressed by F (i, j), and the target segmentation threshold value is expressed by T (i, j).
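A sketch of the confidence ratio, under the assumption that both sums run over the pixels whose raw gray exceeds the segmentation threshold (names are illustrative):

```python
import numpy as np

def star_confidence_ratio(f_real, g_fit, F, T):
    """ratio = sumErr / sumEnergy over segmented pixels: the summed absolute
    gray-fit residual divided by the summed background-subtracted gray."""
    mask = F > T
    sum_err = np.abs(f_real[mask] - g_fit[mask]).sum()
    sum_energy = f_real[mask].sum()
    return sum_err / sum_energy

F = np.array([[10., 40., 10.], [38., 90., 41.], [10., 39., 10.]])
T = np.full((3, 3), 20.0)
f_real = np.where(F > T, F - 10.0, 0.0)   # background-subtracted gray
g_fit = f_real * 0.9                      # a Gaussian fit that misses by 10%
ratio = star_confidence_ratio(f_real, g_fit, F, T)
assert abs(ratio - 0.1) < 1e-9
```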
Confidence labeling of star targets with unfiltered false targets on the telemetry star map is shown in fig. 9.
And S206, if the confidence value of the target star energy distribution is larger than a second threshold value, judging that the target does not obey the star energy distribution, and filtering the target as a pseudo star target.
Specifically, if the target star energy distribution confidence value (ratio value) is greater than a certain threshold, the target is considered not to be subjected to star energy distribution and is filtered as a pseudo star target. FIG. 10 is a diagram of a telemetry star after filtering out pseudostar targets. It is understood that the second threshold is a preset value, and can be determined according to the practical application.
The method provided by this embodiment can quickly and effectively filter pseudo targets with non-stellar energy distributions in the complex light environment of a star sensor. It has a low pseudo-target omission ratio, high pseudo-target filtering reliability, and a high operation speed; it improves the star sensor's resistance to pseudo targets, reduces the influence of pseudo star targets on the output of real star targets, improves the attitude output efficiency of the star sensor, and can be extended to missile-borne, rocket-borne, and satellite-borne star sensor applications.
In another embodiment provided by the present invention, as shown in fig. 11, after a telemetry star map is obtained, the target background is estimated based on the gray information around the target. After threshold segmentation and connected-domain target clustering, the target imaging parameters are estimated according to the star target imaging model, and it is determined whether the sigma value of the target imaging parameters is too large; if so, the target is filtered. Otherwise the target energy distribution confidence coefficient is calculated and it is determined whether the target follows a star energy distribution; if so, the target is output as a real star target, otherwise it is filtered as a pseudo star target.
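Using the convention of step S206 (a larger ratio means the energy distribution is less star-like), the branch logic of this flow can be sketched as follows; the threshold values are purely illustrative, not from the patent:

```python
def classify_target(sigma_xy, ratio, sigma_max=2.0, ratio_max=0.15):
    """Decision logic of the fig. 11 flow: a target is kept as a real star
    only if both fitted sigma values and the non-star ratio stay below
    their (illustrative) thresholds."""
    if max(sigma_xy) > sigma_max:
        return "filtered: sigma too large"
    if ratio > ratio_max:
        return "filtered: not star energy distribution"
    return "real star target"

assert classify_target((1.1, 1.3), 0.05) == "real star target"
assert classify_target((3.5, 1.0), 0.05).startswith("filtered")
assert classify_target((1.1, 1.3), 0.40).startswith("filtered")
```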
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Fig. 12 is a schematic structural diagram of a system for filtering a pseudo star object of a star sensor according to an embodiment of the present invention, where the system includes:
the target background estimation module 1210 is configured to perform surface fitting on a target background based on local background gray around the target to obtain fitting gray of pixels in a background area and background gray of pixels in the target area, and calculate a gray variance of the pixels in the background area according to the real gray and the fitting gray of each pixel in the background area;
a target threshold segmentation module 1220, configured to calculate a segmentation threshold of each pixel in the target region according to the gray variance of the pixels in the background region and the background gray of the pixels in the target region, perform threshold segmentation on the target region to obtain real gray information of the target, and extract the target from the background;
the target clustering module 1230 is configured to perform target clustering on the target region after the threshold segmentation, perform eight-connected domain region growing clustering with a target center pixel as a starting point, and determine a target imaging pixel;
the target imaging parameter estimation module 1240 is used for acquiring gray information of a target, estimating target imaging parameters based on a star target two-dimensional Gaussian distribution imaging model to obtain target fitting gray information, and filtering the target as a pseudo-star target if the sigma value of the target imaging parameters is greater than a first threshold value;
the target confidence coefficient calculation module 1250 is configured to obtain target real gray information and target fitting gray information, obtain a fitting residual of the target gray by calculating a difference between the target real gray and the target fitting gray, and calculate a target star energy distribution confidence coefficient according to the fitting residual of the target gray;
and the pseudo star target filtering module 1260 is used for judging that the target does not obey the star energy distribution if the target star energy distribution confidence value is greater than the second threshold value, and filtering the target as the pseudo star target.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those skilled in the art will appreciate that all or part of the steps of the methods in the above embodiments may be implemented by a program or instructions controlling the related hardware; the program may be stored in a computer-readable storage medium and, when executed, implements pseudo star target filtering in a telemetry star map; the storage medium includes, for example: ROM/RAM, a magnetic disk, or an optical disk.
The above embodiments are intended only to illustrate the technical solutions of the present invention, not to limit them; although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (6)

1. A method for filtering a pseudo star target of a star sensor is characterized by comprising the following steps:
s1, performing surface fitting on a target background based on local background gray of the periphery of the target to obtain fitting gray of pixels in a background area and background gray of pixels in the target area, and calculating gray variance of the pixels in the background area according to real gray and fitting gray of each pixel in the background area;
s2, calculating a segmentation threshold of each pixel in the target area according to the gray variance of the pixels in the background area and the background gray value of the pixels in the target area, performing threshold segmentation on the target area to obtain real gray information of the target, and segmenting the target from the background;
wherein, the calculating the segmentation threshold of each pixel of the target area according to the gray variance of the pixels of the background area and the background gray value of the pixels of the target area comprises:
calculating the segmentation threshold of each pixel of the target region according to the formula T(i, j) = G(i, j) + σ × std, wherein T(i, j) represents the target segmentation threshold, G(i, j) represents the background fitting gray of the target region pixel, σ is the threshold segmentation coefficient, and std represents the gray variance of the background region pixels;
calculating the target gray f(i, j) after threshold segmentation, wherein the calculation formula is:

f(i, j) = F(i, j) − G(i, j), if F(i, j) > T(i, j); f(i, j) = 0, otherwise

in the formula, F(i, j) represents the real gray value of the target region pixel, and (i, j) are the position coordinates of the target region pixel;
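As an illustrative sketch of step S2 (outside the claim language), the per-pixel thresholding can be written as follows; treating the segmented gray as the background-subtracted value F(i, j) − G(i, j) is an assumption, consistent with the zero-background Gaussian model of step S4, and the function name is illustrative:

```python
import numpy as np

def threshold_segment(F, G, std, sigma=3.0):
    """Per-pixel threshold segmentation of the target region (illustrative).

    F     : real gray values of the target-region pixels (2-D array)
    G     : fitted background gray at the same pixels
    std   : gray "variance" of the background fit (used as a spread value)
    sigma : threshold segmentation coefficient

    T(i, j) = G(i, j) + sigma * std; pixels above T keep their
    background-subtracted gray, the rest become 0 (assumption).
    """
    T = G + sigma * std                 # per-pixel segmentation threshold
    return np.where(F > T, F - G, 0.0)  # extract the target from the background
```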
s3, performing target clustering processing on the target area after threshold segmentation, performing eight-connected domain area growth clustering by taking a target central pixel as a starting point, and determining a target imaging pixel;
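The eight-connected region growing of step S3 can be sketched as a breadth-first traversal from the target centre pixel; this is a minimal illustrative implementation (function name assumed, zero gray taken to mean background after segmentation):

```python
import numpy as np
from collections import deque

def grow_eight_connected(f, seed):
    """Eight-connected region growing from the target centre pixel (sketch).

    f    : segmented gray image (0 = background after thresholding)
    seed : (row, col) of the target centre pixel
    Returns the list of pixel coordinates belonging to the target.
    """
    h, w = f.shape
    visited = np.zeros_like(f, dtype=bool)
    region, queue = [], deque([seed])
    visited[seed] = True
    while queue:
        r, c = queue.popleft()
        if f[r, c] <= 0:
            continue                      # background pixel: do not grow through it
        region.append((r, c))
        for dr in (-1, 0, 1):             # visit all eight neighbours
            for dc in (-1, 0, 1):
                if dr == 0 and dc == 0:
                    continue
                rr, cc = r + dr, c + dc
                if 0 <= rr < h and 0 <= cc < w and not visited[rr, cc]:
                    visited[rr, cc] = True
                    queue.append((rr, cc))
    return region
```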
s4, obtaining gray information of the target, estimating target imaging parameters based on a star target two-dimensional Gaussian distribution imaging model to obtain target fitting gray information, and filtering the target as a pseudo star target if the sigma value of the target imaging parameters is larger than a first threshold value;
the estimating of target imaging parameters based on the star target two-dimensional Gaussian distribution imaging model to obtain target fitting gray level information comprises the following steps:
the star target imaging accords with a two-dimensional Gaussian distribution model, and the imaging model is:

f(X) = ( G / (2π·|R|^(1/2)) ) · exp( −(1/2)·(X − X_C)^T · R^(−1) · (X − X_C) )

wherein X = [x, y]^T is the pixel position coordinate; X_C = [x_c, y_c]^T is the star point centroid position coordinate; G represents the star target energy; R is a 2 × 2 symmetric covariance matrix; σ_x and σ_y are the standard deviations of the star target in the x and y directions; ρ represents the correlation between the x and y directions; and the covariance matrix has the structure:

R = [ σ_x²        ρ·σ_x·σ_y ]
    [ ρ·σ_x·σ_y   σ_y²      ]
taking the natural logarithm of the star target imaging model formula and simplifying gives:

ln f(X) = ln( G / (2π·|R|^(1/2)) ) − (1/2)·(X − X_C)^T · D · (X − X_C)

wherein D = R^(−1); let

a_0 = ln( G / (2π·|R|^(1/2)) ) − (1/2)·X_C^T · D · X_C

where f is the gray value of the target imaging pixel and a_0 represents a single variable; X_C^T · D is a 1 × 2 matrix, which can be written as [a_1 a_2]; [d_ij], i, j = 1, 2 represents the elements of D, and since R is a positive definite symmetric matrix, its inverse matrix D is also a symmetric matrix, so d_12 = d_21 = d_2;
substituting into the above formula and expanding gives:

ln f(x_i, y_i) = a_0 + a_1·x_i + a_2·y_i − (1/2)·( d_11·x_i² + 2·d_2·x_i·y_i + d_22·y_i² )

then, stacking this equation over all target imaging pixels, there is:

H·z = Y

equivalent to the normal equations:

H^T·H·z = H^T·Y

wherein Y = [ ln f(x_1, y_1), ln f(x_2, y_2), …, ln f(x_n, y_n) ]^T represents the vector of measured values, n is the number of pixels of the star target gray distribution, z is an unknown 7 × 1 vector of the fitting parameters, H is the corresponding pixel-position coefficient matrix, and x_i, y_i represent the position coordinates of the i-th target pixel, i = 1, 2, …, n;

the calculation can then obtain:

z = (H^T·H)^(−1) · H^T·Y
the target gray is then weighted to calculate the target imaging parameters, with the weight determined by the formula:

W(i, j) = f(i, j) / ( Σ_{m=1}^{n} f(x_m, y_m) + Δ )

wherein W(i, j) is the weight of the i-th target pixel, Δ is a small value that prevents the denominator from approaching 0, f(x_m, y_m) represents the measured value of the m-th target imaging pixel, and f(i, j) represents the measured value of the i-th pixel;

then there is:

z = (H^T·W·H)^(−1) · H^T·W·Y

wherein W is the diagonal matrix formed from the pixel weights W(i, j);
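The linearised fit above can be illustrated with a simplified sketch that solves for the six independent unknowns [a_0, a_1, a_2, d_11, d_2, d_22] by plain (unweighted) least squares; the 7 × 1 parameterisation and the gray-weighting of the patent are omitted here, so this is an assumption-laden outline rather than the claimed method:

```python
import numpy as np

def fit_gaussian_linear(xs, ys, fs):
    """Linearised least-squares fit of a 2-D Gaussian star image (sketch).

    Taking the natural logarithm turns the Gaussian into a polynomial in
    the pixel coordinates:
        ln f = a0 + a1*x + a2*y - (d11*x**2 + 2*d2*x*y + d22*y**2) / 2
    Solves [a0, a1, a2, d11, d2, d22] by plain least squares and recovers
    (sigma_x, sigma_y, rho) from R = D^{-1}.
    """
    lnf = np.log(fs)                           # measured values Y
    H = np.column_stack([
        np.ones_like(xs), xs, ys,              # columns for a0, a1, a2
        -0.5 * xs**2, -xs * ys, -0.5 * ys**2,  # columns for d11, d2, d22
    ])
    p, *_ = np.linalg.lstsq(H, lnf, rcond=None)
    D = np.array([[p[3], p[4]], [p[4], p[5]]])
    R = np.linalg.inv(D)                       # covariance matrix of the spot
    sigma_x, sigma_y = np.sqrt(R[0, 0]), np.sqrt(R[1, 1])
    rho = R[0, 1] / (sigma_x * sigma_y)
    return sigma_x, sigma_y, rho
```

A target whose recovered sigma exceeded the first threshold would then be filtered as a pseudo star, per step S4.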
s5, acquiring target real gray information and target fitting gray information, obtaining a fitting residual error of the target gray by calculating the difference between the target real gray and the target fitting gray, and calculating a target star energy distribution confidence coefficient according to the fitting residual error of the target gray;
and S6, if the confidence value of the target star energy distribution is larger than a second threshold value, judging that the target does not obey the star energy distribution, and filtering the target as a pseudo star target.
2. The method of claim 1, wherein the performing surface fitting on the target background based on the local background gray level around the target to obtain the fitting gray level of the background area pixel and the background gray level of the target area pixel comprises:
the fitting equation of the target background information is z = a_0 + a_1·x + a_2·x·y + a_3·y; let z ≡ p(x, y), where p(x, y) denotes the background gray at pixel position [x, y]; then:

Z = H·a

H = [ 1  x_1  x_1·y_1  y_1 ]
    [ 1  x_2  x_2·y_2  y_2 ]
    [ …  …    …        …   ]
    [ 1  x_n  x_n·y_n  y_n ]

a = (H^T·H)^(−1) · H^T·Z

wherein a = [a_0, a_1, a_2, a_3]^T represents the target background gray fitting parameter vector, H represents the target background pixel position coefficient matrix, and

Z = [ p(x_1, y_1), p(x_2, y_2), …, p(x_n, y_n) ]^T

represents the target background pixel gray vector; the coefficient vector a is calculated by a least squares algorithm, and the background gray information of the target is obtained by estimation.
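The background surface fit of claim 2 reduces to an ordinary least-squares problem; a minimal sketch follows (function names are illustrative assumptions):

```python
import numpy as np

def fit_background(xs, ys, grays):
    """Least-squares fit of the background model z = a0 + a1*x + a2*x*y + a3*y
    from claim 2 (illustrative sketch).

    xs, ys : coordinates of the background-region pixels
    grays  : their real gray values
    Returns the coefficient vector a = [a0, a1, a2, a3].
    """
    H = np.column_stack([np.ones_like(xs), xs, xs * ys, ys])  # position matrix
    a, *_ = np.linalg.lstsq(H, grays, rcond=None)
    return a

def eval_background(a, xs, ys):
    """Evaluate the fitted background surface at arbitrary pixel positions."""
    return a[0] + a[1] * xs + a[2] * xs * ys + a[3] * ys
```

Evaluating the fitted surface at target-region positions yields the background gray G(i, j) used by the threshold segmentation.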
3. The method of claim 1, wherein calculating the variance of the gray levels of the pixels in the background region according to the true gray levels and the fitted gray levels of the pixels in the background region comprises:
acquiring a real gray value of a pixel in a background area and a real gray value of a pixel in a target area;
calculating the background fitting gray of each pixel in the background area according to the estimated background gray fitting coefficient, and calculating the background fitting gray of each pixel in the target area;
calculating the difference value between the real gray value and the fitting gray value of the pixels in the background area, and taking the difference value as a background gray fitting residual error;
and calculating the gray variance of the pixels in the background area according to the background gray fitting residual error.
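Claim 3 can be illustrated directly; whether the quantity called the "gray variance" enters T = G(i, j) + σ × std as a variance or as a standard deviation is not spelled out here, so the sketch below returns the standard deviation of the background fitting residual as an assumption:

```python
import numpy as np

def background_std(real_gray, fitted_gray):
    """Spread of the background-region pixels about the fitted surface
    (claim 3 sketch; returns the standard deviation by assumption)."""
    residual = real_gray - fitted_gray   # background gray fitting residual
    return float(np.std(residual))
```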
4. The method of claim 1, wherein obtaining the target real gray information and the target fitting gray information, and obtaining the fitting residual of the target gray by calculating the difference between the target real gray and the target fitting gray comprises:
acquiring real gray information f (i, j) of each pixel after target clustering processing and fitting gray information g (i, j) of each target pixel;
calculating the difference between the target real gray and the target fitting gray to obtain the fitting residual error of the target gray:
e(i,j)=f(i,j)-g(i,j)
in the formula, i and j represent the position coordinates of the imaging pixel in the target region, and e(i, j), the difference between the target real gray and the target fitted gray, is called the target gray fitting residual.
5. The method according to claim 1, wherein the calculating the confidence of the target star energy distribution according to the fitting residuals of the target gray levels comprises:
calculate sumErr and sumEnergy respectively:

sumErr = Σ_{i=1}^{n} Σ_{j=1}^{m} | e(i, j) | ,  summed over pixels with F(i, j) > T(i, j)

sumEnergy = Σ_{i=1}^{n} Σ_{j=1}^{m} f(i, j) ,  summed over pixels with F(i, j) > T(i, j)

the confidence of the target non-star energy distribution is ratio = sumErr / sumEnergy;

wherein sumErr represents the sum of the target gray fitting residuals, sumEnergy represents the sum of the target gray, n and m represent the template size of the target region, e(i, j) represents the target gray fitting residual, F(i, j) represents the real gray value of the target region pixel, and T(i, j) represents the target segmentation threshold.
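The confidence computation of claim 5 can be sketched as follows; restricting both sums to pixels with F(i, j) > T(i, j) follows the condition stated in the claim, and the function name is an illustrative assumption:

```python
import numpy as np

def energy_distribution_confidence(F, f, g, T):
    """Non-star energy-distribution confidence (claim 5 sketch).

    F : real gray of target-region pixels, T : per-pixel segmentation threshold
    f : segmented target gray, g : Gaussian-fitted target gray
    Only pixels above threshold contribute to either sum.
    """
    mask = F > T
    sum_err = np.abs(f[mask] - g[mask]).sum()  # sum of gray fitting residuals
    sum_energy = f[mask].sum()                 # sum of target gray
    return sum_err / sum_energy
```

A ratio above the second threshold marks the target as not obeying the star energy distribution, so it is filtered as a pseudo star.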
6. A pseudo star target filtering system of a star sensor is characterized by comprising:
the target background estimation module is used for performing surface fitting on a target background based on local background gray information around the target to obtain fitting gray of pixels in a background area and background gray of pixels in the target area, and calculating gray variance of the pixels in the background area according to the real gray and the fitting gray of each pixel in the background area;
the target threshold segmentation module is used for calculating the segmentation threshold of each pixel in the target area according to the gray variance of the pixels in the background area, performing threshold segmentation on the target area to obtain the real gray information of the target, and extracting the target from the background;
wherein, the calculating the segmentation threshold of each pixel of the target area according to the gray variance of the pixels of the background area comprises:
calculating the segmentation threshold of each pixel of the target region according to the formula T(i, j) = G(i, j) + σ × std, wherein T(i, j) represents the target segmentation threshold, G(i, j) represents the background fitting gray of the target region pixel, σ is the threshold segmentation coefficient, and std represents the gray variance of the background region pixels;
calculating the target gray f(i, j) after threshold segmentation, wherein the calculation formula is:

f(i, j) = F(i, j) − G(i, j), if F(i, j) > T(i, j); f(i, j) = 0, otherwise

in the formula, F(i, j) represents the real gray value of the target region pixel, and (i, j) are the position coordinates of the target region pixel;
the target clustering module is used for carrying out target clustering on the target area after threshold segmentation, carrying out eight-connected domain area growing clustering by taking a target center pixel as a starting point and determining a target imaging pixel;
the target imaging parameter estimation module is used for acquiring gray information of a target, estimating target imaging parameters based on a star target two-dimensional Gaussian distribution imaging model, and filtering the target as a pseudo star target if the sigma value of the target imaging parameters is greater than a first threshold value;
the method for estimating the target imaging parameters based on the star target two-dimensional Gaussian distribution imaging model comprises the following steps:
the star target imaging accords with a two-dimensional Gaussian distribution model, and the imaging model is:

f(X) = ( G / (2π·|R|^(1/2)) ) · exp( −(1/2)·(X − X_C)^T · R^(−1) · (X − X_C) )

wherein X = [x, y]^T is the pixel position coordinate; X_C = [x_c, y_c]^T is the star point centroid position coordinate; G represents the star target energy; R is a 2 × 2 symmetric covariance matrix; σ_x and σ_y are the standard deviations of the star target in the x and y directions; ρ represents the correlation between the x and y directions; and the covariance matrix has the structure:

R = [ σ_x²        ρ·σ_x·σ_y ]
    [ ρ·σ_x·σ_y   σ_y²      ]
taking the natural logarithm of the star target imaging model formula and simplifying gives:

ln f(X) = ln( G / (2π·|R|^(1/2)) ) − (1/2)·(X − X_C)^T · D · (X − X_C)

wherein D = R^(−1); let

a_0 = ln( G / (2π·|R|^(1/2)) ) − (1/2)·X_C^T · D · X_C

where f is the gray value of the target imaging pixel and a_0 represents a single variable; X_C^T · D is a 1 × 2 matrix, which can be written as [a_1 a_2]; [d_ij], i, j = 1, 2 represents the elements of D, and since R is a positive definite symmetric matrix, its inverse matrix D is also a symmetric matrix, so d_12 = d_21 = d_2;
substituting into the above formula and expanding gives:

ln f(x_i, y_i) = a_0 + a_1·x_i + a_2·y_i − (1/2)·( d_11·x_i² + 2·d_2·x_i·y_i + d_22·y_i² )

then, stacking this equation over all target imaging pixels, there is:

H·z = Y

equivalent to the normal equations:

H^T·H·z = H^T·Y

wherein Y = [ ln f(x_1, y_1), ln f(x_2, y_2), …, ln f(x_n, y_n) ]^T represents the vector of measured values, n is the number of pixels of the star target gray distribution, z is an unknown 7 × 1 vector of the fitting parameters, H is the corresponding pixel-position coefficient matrix, and x_i, y_i represent the position coordinates of the i-th target pixel, i = 1, 2, …, n;

the calculation can then obtain:

z = (H^T·H)^(−1) · H^T·Y
the target gray is then weighted to calculate the target imaging parameters, with the weight determined by the formula:

W(i, j) = f(i, j) / ( Σ_{m=1}^{n} f(x_m, y_m) + Δ )

wherein W(i, j) is the weight of the i-th target pixel, Δ is a small value that prevents the denominator from approaching 0, f(x_m, y_m) represents the measured value of the m-th target imaging pixel, and f(i, j) represents the measured value of the i-th pixel;

then there is:

z = (H^T·W·H)^(−1) · H^T·W·Y

wherein W is the diagonal matrix formed from the pixel weights W(i, j);
the target confidence coefficient calculation module is used for acquiring target real gray information and target fitting gray information, obtaining a fitting residual error of the target gray by calculating the difference between the target real gray and the target fitting gray, and calculating the target star energy distribution confidence coefficient according to the fitting residual error of the target gray;
and the pseudo star target filtering module is configured to judge that the target does not obey the star energy distribution if the target star energy distribution confidence value is greater than a second threshold value, and to filter the target as a pseudo star target.
CN202011241781.5A 2020-11-09 2020-11-09 Method and system for filtering pseudo star target of star sensor Active CN112581548B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011241781.5A CN112581548B (en) 2020-11-09 2020-11-09 Method and system for filtering pseudo star target of star sensor


Publications (2)

Publication Number Publication Date
CN112581548A CN112581548A (en) 2021-03-30
CN112581548B true CN112581548B (en) 2023-04-07

Family

ID=75122511

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011241781.5A Active CN112581548B (en) 2020-11-09 2020-11-09 Method and system for filtering pseudo star target of star sensor

Country Status (1)

Country Link
CN (1) CN112581548B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113916382B (en) * 2021-09-14 2023-09-12 中国科学院上海技术物理研究所 Star energy extraction method based on sensitivity model in pixel
CN116935027A (en) * 2022-03-29 2023-10-24 脸萌有限公司 Object identification method and device, electronic equipment and storage medium

Citations (1)

Publication number Priority date Publication date Assignee Title
CN104567865A (en) * 2014-12-29 2015-04-29 北京控制工程研究所 Attitude capture method of star sensor under space particle interference condition

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
FR2929395B1 (en) * 2008-03-27 2010-05-21 Centre Nat Etd Spatiales ATTITUDE ESTIMATING METHOD OF A STELLAR SENSOR
CN102496015B (en) * 2011-11-22 2013-08-21 南京航空航天大学 High-precision method for quickly positioning centers of two-dimensional Gaussian distribution spot images
CN109764893B (en) * 2018-12-31 2022-06-10 华中光电技术研究所(中国船舶重工集团有限公司第七一七研究所) Method for testing stray light suppression angle of star sensor


Also Published As

Publication number Publication date
CN112581548A (en) 2021-03-30

Similar Documents

Publication Publication Date Title
KR102414452B1 (en) Target detection and training of target detection networks
CN109255317B (en) Aerial image difference detection method based on double networks
CN110728658A (en) High-resolution remote sensing image weak target detection method based on deep learning
CN112001958B (en) Virtual point cloud three-dimensional target detection method based on supervised monocular depth estimation
CN110458877B (en) Navigation method based on bionic vision for fusing infrared and visible light information
CN114820465B (en) Point cloud detection model training method and device, electronic equipment and storage medium
CN112581548B (en) Method and system for filtering pseudo star target of star sensor
EP1462994B1 (en) Method and system for identifying objects in an image
CN110428425B (en) Sea-land separation method of SAR image based on coastline vector data
CN109829423B (en) Infrared imaging detection method for frozen lake
Shaoqing et al. The comparative study of three methods of remote sensing image change detection
CN114155501A (en) Target detection method of unmanned vehicle in smoke shielding environment
CN110889399A (en) High-resolution remote sensing image weak and small target detection method based on deep learning
CN109165603B (en) Ship detection method and device
CN113822352A (en) Infrared dim target detection method based on multi-feature fusion
CN114821358A (en) Optical remote sensing image marine ship target extraction and identification method
CN115115601A (en) Remote sensing ship target detection method based on deformation attention pyramid
CN113933828A (en) Unmanned ship environment self-adaptive multi-scale target detection method and system
CN107369163B (en) Rapid SAR image target detection method based on optimal entropy dual-threshold segmentation
CN111027512B (en) Remote sensing image quayside ship detection and positioning method and device
CN111089586B (en) All-day star sensor star point extraction method based on multi-frame accumulation algorithm
CN106204596B (en) Panchromatic waveband remote sensing image cloud detection method based on Gaussian fitting function and fuzzy mixed estimation
CN109948571B (en) Optical remote sensing image ship detection method
CN114565653B (en) Heterologous remote sensing image matching method with rotation change and scale difference
CN107784285B (en) Method for automatically judging civil and military attributes of optical remote sensing image ship target

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant