CN112017156A - Space point target rotation period estimation method based on multispectral video - Google Patents
- Publication number: CN112017156A (application CN202010693034.9A; granted as CN112017156B)
- Authority: CN (China)
- Prior art keywords: spectral, target, frame, spectrum, pixel
- Prior art date: 2020-07-17
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
All classes fall under G — PHYSICS; G06 — COMPUTING, CALCULATING OR COUNTING; G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL:
- G06T7/0002 — Image analysis; inspection of images, e.g. flaw detection
- G06T7/11 — Segmentation; edge detection; region-based segmentation
- G06T7/136 — Segmentation; edge detection involving thresholding
- G06T2207/10016 — Image acquisition modality: video; image sequence
- G06T2207/10032 — Image acquisition modality: satellite or aerial image; remote sensing
- G06T2207/10036 — Image acquisition modality: multispectral image; hyperspectral image
- G06T2207/20056 — Transform domain processing: discrete and fast Fourier transform [DFT, FFT]
Abstract
The invention discloses a space point target rotation period estimation method based on multispectral video, aimed at the technical problem that prior-art photometry-based rotation period estimation methods lack sufficient accuracy under complex illumination conditions. The method first acquires multiple frames of spectral images and computes an average spectral curve for each frame; it then calculates the spectral angle difference between frames, builds a spectral time-varying curve, selects candidate points, obtains the period corresponding to each candidate point, and finally performs period verification to compute the final rotation period. The method overcomes the large errors of traditional photometric estimation in period estimation for point targets, and uses multispectral information to better distinguish different attitudes of a space target.
Description
Technical Field
The invention relates to space targets, and in particular to a space point target rotation period estimation method based on multispectral video.
Background
Space targets include space debris, satellites, spacecraft, and the like. Satellites and spacecraft whose working state is abnormal or failed enter a free rotation state in orbit after losing power, so estimating the rotation period of a space target helps to judge whether the target is working normally. Because most such targets, especially high-orbit targets, are widely distributed and are usually observed at long range (over 100 km) to ensure observation efficiency and target safety, their shape information is unknown and they image as a single point or a small blob against the space background.
Most existing attitude-state estimation methods for space point targets are based on photometric information: the rotation period is estimated from the periodic variation of the illumination intensity reflected by the target. Because several surfaces of targets such as satellites and space debris may have similar photometry, and the measured intensity is easily influenced by environmental illumination, photometry-based rotation period estimation suffers from insufficient accuracy.
Disclosure of Invention
The invention aims to solve the technical problem that prior-art photometry-based space point target rotation period estimation lacks accuracy under complex illumination conditions. Considering that most space targets (including satellites, spacecraft, and some space debris) are made of a variety of materials whose surface regions have different spectral reflectivities, a target rotation period estimation method based on spectral information is provided.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
a space point target rotation period estimation method based on multispectral video is characterized by comprising the following steps:
1) acquiring continuous multi-frame spectral images of a target area by using a video spectrometer, wherein each frame of spectral image comprises at least four spectra of different spectral bands;
2) detecting pixel points occupied by a target in the first frame of spectral image by using a self-adaptive threshold method;
3) taking, for each pixel point, the maximum pixel value across all spectral bands to form a maximum-pixel-value image;
4) in the maximum-pixel-value image, taking m% of the largest of all pixel values as a threshold, performing threshold segmentation, and extracting all pixel points with values larger than the threshold to form a target pixel point sequence, where 20 ≤ m ≤ 40;
5) finding out a plurality of spectrums corresponding to any pixel point in a target pixel point sequence in an original first frame spectrum image, and solving the spectrum average value of the pixel point;
6) repeating the step 5), and solving the average value of the spectrums of the other pixel points in the target pixel point sequence;
7) fitting the spectral average values of all pixel points in the target pixel point sequence to obtain the average spectral curve of the first frame of spectral image, recorded as S_1;
8) repeating steps 2) to 7) to calculate the average spectral curve of each remaining frame of spectral image, recording the average spectral curve of the t-th frame as S_t;
9) calculating the spectral angle difference d_t between each frame of spectral image and the first frame of spectral image;
10) connecting all calculated spectral angle differences d_t into a one-dimensional vector D = [d_1, d_2, d_3, …, d_N], where N is the total number of frames; D is recorded as the target spectral time-varying curve;
11) performing a fast Fourier transform on the spectral time-varying curve D to obtain a frequency-domain vector F, and taking the points of F larger than 0.5·F_m as candidate points, where F_m is the maximum value of F;
12) calculating the period L_n corresponding to each candidate point in the frequency-domain vector F:
where n is the index of each candidate point in the frequency-domain vector F;
13) taking one period as a segment, dividing the spectral time-varying curve D into K segments according to the period L_n;
14) respectively calculating the normalized average difference R_n between each of segments 2 through K and segment 1;
15) taking the period L_n corresponding to the maximum of all R_n as the period of the signal, and dividing it by the frame rate of the video spectrometer to obtain the actual rotation period of the target.
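Steps 3) and 4) above can be sketched as follows. This is a minimal NumPy sketch rather than the patent's implementation; the (H, W, bands) array layout and the function name are assumptions.

```python
import numpy as np

def extract_target_pixels(cube, m=30):
    """Steps 3)-4): form the maximum-pixel-value image by taking each
    pixel's maximum over the spectral bands, then threshold it at m% of
    the global maximum to obtain the target pixel point sequence.

    cube: (H, W, B) spectral image of one frame. Returns a boolean mask.
    """
    max_img = cube.max(axis=2)            # (H, W) maximum-pixel-value image
    thresh = (m / 100.0) * max_img.max()  # m% of the largest pixel value
    return max_img > thresh               # True where a target pixel is kept
```

With m = 30 (the preferred value given below), only pixels whose brightest band exceeds 30% of the brightest pixel in the frame survive the segmentation.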
Further, the value of m in the step 4) is 30.
Further, in step 9), the calculation formula of the spectral angle difference d_t is as follows:
Further, in step 14), the calculation method of the normalized average difference R_n is as follows:
further, in step 5), a weighted average method is adopted to obtain a spectrum average value.
The invention has the beneficial effects that:
1. The method overcomes the large errors of traditional photometric estimation in period estimation for point targets, and uses multispectral information to better distinguish different attitudes of a space target.
2. The method measures the attitude difference of a point target across frames by calculating spectral angles over the video spectral image sequence, and can be applied to video spectrometer data based on various imaging principles.
3. The method can accurately calculate the rotation period of the space target by using a mode of frequency domain estimation and time domain verification.
Drawings
FIG. 1 is a flow chart of a spatial point target rotation period estimation method based on multispectral video according to the present invention;
FIG. 2 is a graph of five spectral curves generated by simulation at the same viewing angle;
FIG. 3 is a pseudo-color image synthesized from three spectral bands, corresponding to a multispectral video frame;
FIG. 4(a) is the spectral time-varying curve calculated for a multispectral video image sequence;
FIG. 4(b) is the curve of the true rotation angle corresponding to FIG. 4(a);
FIG. 5 is a frequency-amplitude diagram of a frequency domain transform;
fig. 6 is a cycle-amplitude diagram of a frequency domain transform.
Detailed Description
To make the objects, advantages, and features of the present invention clearer, the multispectral video-based spatial point target rotation period estimation method is described in detail below with reference to the accompanying drawings and specific embodiments. The advantages and features of the invention will become more apparent from the following detailed description. It should be noted that the drawings are in simplified form and not to precise scale, their purpose being solely the convenient and clear illustration of embodiments of the invention, and that the structures shown in the drawings are often only part of the actual structures.
The invention discloses a space point target rotation period estimation method based on a multispectral video, which comprises the following steps of:
1. acquiring continuous multi-frame spectral images of a target area by using a video spectrometer, wherein each frame of spectral image comprises five spectra of different spectral bands as shown in fig. 2;
2. calculating an average spectrum curve of each frame of spectrum image;
2.1) detecting pixel points occupied by a target in the first frame of spectral image by using a self-adaptive threshold method;
2.2) taking, for each pixel point, the maximum pixel value across all spectral bands to form a maximum-pixel-value image;
2.3) in the maximum pixel value image, taking 30% of the maximum pixel value in all pixel values as a threshold, carrying out threshold segmentation on the maximum pixel value image, and extracting all pixel points with the pixel values larger than the threshold to form a target pixel point sequence;
2.4) finding out a plurality of spectrums corresponding to any pixel point in the target pixel point sequence in the original first frame spectrum image, and calculating the spectrum average value of the pixel point by using a weighted average method;
2.5) repeating the step 2.4) to obtain the spectral average value of the rest pixel points in the target pixel point sequence;
2.6) fitting the spectral average values of all pixel points in the target pixel point sequence to obtain the average spectral curve of the first frame of spectral image, recorded as S_1;
2.7) repeating steps 2.1) to 2.6) to calculate the average spectral curve of each remaining frame of spectral image, recording the average spectral curve of the t-th frame as S_t;
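Steps 2.4) to 2.6) can be sketched as below, assuming an (H, W, bands) cube and a boolean target-pixel mask. The patent does not specify the weighting scheme or the fitting procedure, so weights are left as an optional argument and the "fitting" of per-pixel averages into one curve is simplified to a mean.

```python
import numpy as np

def mean_spectral_curve(cube, mask, weights=None):
    """Steps 2.4)-2.6): collect the spectra of all target pixels and
    average them into a single average spectral curve S_t.

    cube: (H, W, B) spectral image; mask: boolean (H, W) target pixels;
    weights: optional (H, W) per-pixel weights for the weighted average.
    """
    spectra = cube[mask]                   # (P, B): one spectrum per target pixel
    if weights is None:
        return spectra.mean(axis=0)        # plain average over target pixels
    w = weights[mask].astype(float)        # weighted average (cf. step 2.4)
    return (spectra * w[:, None]).sum(axis=0) / w.sum()
```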
3. Calculating the spectral angle differences;
calculating the spectral angle difference d_t between each frame of spectral image and the first frame of spectral image:
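The formula for d_t appears as an image in the original and is not reproduced in the text. The standard spectral angle mapper (SAM) form, assumed here, is θ = arccos(⟨S_t, S_1⟩ / (‖S_t‖·‖S_1‖)):

```python
import numpy as np

def spectral_angle(s_t, s_1):
    """Spectral angle (radians) between average spectral curves S_t and S_1,
    in the standard SAM form (an assumption; the patent's formula is not shown).
    """
    cos_theta = np.dot(s_t, s_1) / (np.linalg.norm(s_t) * np.linalg.norm(s_1))
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))  # clip guards rounding
```

Note that the spectral angle is invariant to overall brightness scaling, which is why it is less sensitive to environmental illumination than photometric intensity.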
4. Acquiring a spectrum time-varying curve;
connecting the spectral angle differences d_t between each frame of spectral image and the first frame into a one-dimensional vector D = [d_1, d_2, d_3, …, d_N], where N is the total number of frames; D is recorded as the target spectral time-varying curve;
5. selecting candidate points;
performing a fast Fourier transform on the spectral time-varying curve D to obtain a frequency-domain vector F, as shown in FIG. 5; the points of F larger than 0.5·F_m are taken as candidate points, where F_m is the maximum value of F; as shown in FIG. 6, the circled points are the candidate points, among which the final real period will be found;
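Candidate selection can be sketched as follows. Suppressing the DC component by mean-subtraction and the 1-based frequency index are assumptions not stated in the text.

```python
import numpy as np

def candidate_points(D, ratio=0.5):
    """Step 5: FFT the spectral time-varying curve D and keep the
    frequency indices whose amplitude exceeds ratio * F_m."""
    D = np.asarray(D, dtype=float)
    F = np.abs(np.fft.rfft(D - D.mean()))  # amplitude spectrum, DC suppressed
    F = F[1:]                              # drop bin 0; index n is 1-based below
    Fm = F.max()                           # F_m, the maximum of F
    return [n + 1 for n in range(len(F)) if F[n] > ratio * Fm]
```

For example, a pure sinusoid with period 20 frames over 200 frames yields the single candidate frequency index n = 10.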
6. acquiring a period corresponding to the candidate point;
calculating the period L_n corresponding to each candidate point in the frequency-domain vector F:
where n is the index of each candidate point in the frequency-domain vector F;
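The period formula is also rendered as an image in the original and not reproduced. For an N-point FFT, frequency index n corresponds to a period of N/n frames, which is presumably the intended relation:

```python
def candidate_period(N, n):
    """Assumed relation L_n = N / n: the period in frames for frequency
    index n of an N-point FFT (the patent's formula image is missing)."""
    return N / n
```

Dividing L_n by the video spectrometer's frame rate then gives the period in seconds, as in step 7.3) below.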
7. cycle verification and calculation of the final rotation cycle;
7.1) taking one period as a segment, dividing the spectral time-varying curve D into K segments according to the period L_n;
7.2) respectively calculating the normalized average difference R_n between each of segments 2 through K and segment 1:
7.3) taking the period L_n corresponding to the maximum of all R_n as the period of the signal, and dividing it by the frame rate of the video spectrometer to obtain the actual rotation period of the target.
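Steps 7.1) to 7.3) can be sketched as below. The patent's R_n formula is not reproduced in the text; a normalized correlation between segment 1 and segments 2 through K, averaged and then maximized over candidates, is assumed here, consistent with taking the maximum over R_n.

```python
import numpy as np

def verify_period(D, periods):
    """Steps 7.1)-7.3): for each candidate period L, cut D into K whole
    segments of length L, score segments 2..K against segment 1, and keep
    the period with the best (maximum) average score R."""
    D = np.asarray(D, dtype=float)
    best_L, best_R = None, -np.inf
    for L in periods:
        L = int(round(L))
        K = len(D) // L                    # number of whole segments
        if L < 2 or K < 2:
            continue                       # need at least two segments to compare
        ref = D[:L]                        # segment 1 is the reference
        R = np.mean([np.corrcoef(ref, D[k * L:(k + 1) * L])[0, 1]
                     for k in range(1, K)])
        if R > best_R:
            best_L, best_R = L, R
    return best_L                          # period in frames
```

On a simulated period-20 sinusoid, the true period scores highest because every segment repeats segment 1 exactly, while wrong candidate periods misalign the segments.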
The core technology of the invention is to use the differences among the average spectral curves of a space point target in a multispectral video to construct a one-dimensional signal reflecting the target's spin period, and then to determine the rotation period by frequency-domain peak selection followed by time-domain verification. The invention thus calculates a point-target spectral difference curve and derives the rotation period through joint time-frequency analysis. Compared with traditional estimation based on photometric differences, the method is more accurate; it can be applied to judging failed space targets by their spin period, and has application value in in-space servicing and on-orbit maintenance of spacecraft.
As shown in fig. 3, the figure contains 8 small images, each a pseudo-color image synthesized from three spectral bands of one frame of spectral data. The corresponding multispectral images differ under different satellite attitudes, and this periodically varying difference is the basis for rotation period estimation.
As shown in fig. 4, fig. 4(a) is the spectral time-varying curve D generated from a simulated multispectral video frame sequence, and fig. 4(b) is the true rotation angle of the satellite used in the simulation. Comparing the two shows that the spectral time-varying curve D has a significant periodicity that is substantially consistent with the true period, so the true period can be estimated from D.
Claims (5)
1. A space point target rotation period estimation method based on multispectral video is characterized by comprising the following steps:
1) acquiring continuous multi-frame spectral images of a target area by using a video spectrometer, wherein each frame of spectral image comprises at least four spectra of different spectral bands;
2) detecting pixel points occupied by a target in the first frame of spectral image by using a self-adaptive threshold method;
3) taking, for each pixel point, the maximum pixel value across all spectral bands to form a maximum-pixel-value image;
4) in the maximum-pixel-value image, taking m% of the largest of all pixel values as a threshold, performing threshold segmentation, and extracting all pixel points with values larger than the threshold to form a target pixel point sequence, where 20 ≤ m ≤ 40;
5) finding out a plurality of spectrums corresponding to any pixel point in a target pixel point sequence in an original first frame spectrum image, and solving the spectrum average value of the pixel point;
6) repeating the step 5), and solving the average value of the spectrums of the other pixel points in the target pixel point sequence;
7) fitting the spectral average values of all pixel points in the target pixel point sequence to obtain the average spectral curve of the first frame of spectral image, recorded as S_1;
8) repeating steps 2) to 7) to calculate the average spectral curve of each remaining frame of spectral image, recording the average spectral curve of the t-th frame as S_t;
9) calculating the spectral angle difference d_t between each frame of spectral image and the first frame of spectral image;
10) connecting all calculated spectral angle differences d_t into a one-dimensional vector D = [d_1, d_2, d_3, …, d_N], where N is the total number of frames; D is recorded as the target spectral time-varying curve;
11) performing a fast Fourier transform on the spectral time-varying curve D to obtain a frequency-domain vector F, and taking the points of F larger than 0.5·F_m as candidate points, where F_m is the maximum value of F;
12) calculating the period L_n corresponding to each candidate point in the frequency-domain vector F:
where n is the index of each candidate point in the frequency-domain vector F;
13) taking one period as a segment, dividing the spectral time-varying curve D into K segments according to the period L_n;
14) respectively calculating the normalized average difference R_n between each of segments 2 through K and segment 1;
15) taking the period L_n corresponding to the maximum of all R_n as the period of the signal, and dividing it by the frame rate of the video spectrometer to obtain the actual rotation period of the target.
2. The multispectral video-based spatial point target rotation period estimation method according to claim 1, characterized in that in step 4), the value of m is 30.
5. The multispectral video-based spatial point target rotation period estimation method according to claim 1, characterized in that in step 5), the spectral average value is calculated using a weighted average method.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010693034.9A CN112017156B (en) | 2020-07-17 | 2020-07-17 | Space point target rotation period estimation method based on multispectral video |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112017156A true CN112017156A (en) | 2020-12-01 |
CN112017156B CN112017156B (en) | 2023-02-14 |
Family
ID=73500023
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010693034.9A Active CN112017156B (en) | 2020-07-17 | 2020-07-17 | Space point target rotation period estimation method based on multispectral video |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112017156B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103646244A (en) * | 2013-12-16 | 2014-03-19 | 北京天诚盛业科技有限公司 | Methods and devices for face characteristic extraction and authentication |
CN103776540A (en) * | 2013-12-30 | 2014-05-07 | 华中科技大学 | Multiband common-optical-path spectrum combined remote sensing measurement system and method thereof |
CN108627667A (en) * | 2018-05-15 | 2018-10-09 | 中国人民解放军战略支援部队航天工程大学 | Based on luminosity sequence while estimation space unstability target precession and spin rate method |
US10361802B1 (en) * | 1999-02-01 | 2019-07-23 | Blanding Hovenweep, Llc | Adaptive pattern recognition based control system and method |
Non-Patent Citations (2)
Title |
---|
D. Pommet et al.: "Imaging using limited-angle backscattered data from real targets", IEEE |
Wang Yunpeng (王云鹏): "Research on Target Classification Based on Laser Detection of the Micro-Doppler Effect", China Masters' Theses Full-text Database, Information Science and Technology series |
Also Published As
Publication number | Publication date |
---|---|
CN112017156B (en) | 2023-02-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10706551B2 (en) | Object motion mapping using panchromatic and multispectral imagery from single pass electro-optical satellite imaging sensors | |
CN108805904B (en) | Moving ship detection and tracking method based on satellite sequence image | |
CN109741356B (en) | Sub-pixel edge detection method and system | |
CN109523506B (en) | Full-reference stereo image quality objective evaluation method based on visual salient image feature enhancement | |
JP6016552B2 (en) | Antireflection system | |
CN111080617B (en) | Railway wagon brake beam pillar round pin loss fault identification method | |
CN103400388A (en) | Method for eliminating Brisk (binary robust invariant scale keypoint) error matching point pair by utilizing RANSAC (random sampling consensus) | |
CN110598613B (en) | Expressway agglomerate fog monitoring method | |
CN105139375A (en) | Satellite image cloud detection method combined with global DEM and stereo vision | |
CN109492525B (en) | Method for measuring engineering parameters of base station antenna | |
EP3359978A1 (en) | Method for processing an sar image and associated target-detecting method | |
AU2007287418A1 (en) | Target orientation | |
Somawirata et al. | Road detection based on the color space and cluster connecting | |
CN114972083A (en) | Image restoration method based on measured data under complex optical imaging condition | |
CA3038176C (en) | Object motion mapping from single-pass electro-optical satellite imaging sensors | |
CN110095774B (en) | Moving target detection method for circular track video SAR | |
CN106895794B (en) | A kind of method and device obtaining laser beam scan path | |
US11915435B2 (en) | Resampled image cross-correlation | |
CN105389582B (en) | ISAR contour extraction of objects method based on CLEAN algorithm scattering centers extraction | |
CN112017156B (en) | Space point target rotation period estimation method based on multispectral video | |
CN112836707B (en) | ISAR image aerial target length feature extraction method | |
CN113223074A (en) | Underwater laser stripe center extraction method | |
CN112907626A (en) | Moving object extraction method based on satellite time-exceeding phase data multi-source information | |
CN114549448B (en) | Complex multi-type defect detection evaluation method based on infrared thermal imaging data analysis | |
CN112686880B (en) | Method for detecting abnormity of railway locomotive component |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||