CN112504264B - Super-resolution imaging method for star sensor attitude measurement - Google Patents


Info

Publication number
CN112504264B
CN112504264B
Authority
CN
China
Prior art keywords
star
image
attitude
current frame
super
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011140096.3A
Other languages
Chinese (zh)
Other versions
CN112504264A (en)
Inventor
王子寒
李玉明
郑然
程会艳
武延鹏
王立
王苗苗
严微
曹哲
刘山山
Current Assignee
Beijing Institute of Control Engineering
Original Assignee
Beijing Institute of Control Engineering
Priority date
Filing date
Publication date
Application filed by Beijing Institute of Control Engineering
Priority to CN202011140096.3A
Publication of CN112504264A
Application granted
Publication of CN112504264B
Legal status: Active
Anticipated expiration

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/02 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by astronomical means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4053 Scaling of whole images or parts thereof, e.g. expanding or contracting, based on super-resolution, i.e. the output image resolution being higher than the sensor resolution

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Astronomy & Astrophysics (AREA)
  • Automation & Control Theory (AREA)
  • Image Processing (AREA)

Abstract

The patent relates to a super-resolution imaging method for star sensor attitude measurement, which comprises the following steps: a star sensor super-resolution imaging model is established on the basis of classical super-resolution imaging theory; image registration is performed on original-resolution images continuously acquired in a time-sharing manner to build a two-dimensional pixel point cloud; and super-resolution image restoration is achieved with an image reconstruction method. Based on this super-resolution imaging model, and in combination with the star sensor's working mode, an implementation of star sensor attitude recognition using the super-resolution imaging method is provided.

Description

Super-resolution imaging method for star sensor attitude measurement
Technical Field
The patent relates to the field of super-resolution imaging and star sensors, in particular to a super-resolution imaging method for star sensor attitude measurement.
Background
As shown in fig. 1, the star sensor is an important attitude-determination device on a satellite platform. Its working principle is to image stars in space to obtain star observation vectors, determine the orientation of the star sensor's three axes in inertial space by star map matching, and then determine the satellite's attitude in inertial space through the transformation between the sensor body coordinate system and the satellite body coordinate system. Star point positioning accuracy is the key factor determining the star sensor's attitude measurement accuracy, and the resolution of the image sensor is a core index of star point positioning accuracy. Compared with a low-resolution image sensor, a high-resolution image sensor can recover more detailed information of the real scene, such as the edge contours of star points.
Currently, the resolution of mature star sensors has reached 2K. Higher-resolution image sensors (4K, 5K, etc.) are still under development and validation, which entails higher material and labor costs and longer development cycles. In addition, the development of high-resolution image sensors faces problems such as the difficulty of fabricating large-area arrays, high cost, and high power consumption.
The super-resolution imaging method fuses multiple frames of low-resolution images with sub-pixel displacements into one frame of high-resolution image. For one class of imaging systems, namely those in which the optical system has strong spatial resolving power (small diffraction limit) while the image sensor has weak resolving power (large pixel size, low spatial sampling rate), the super-resolution imaging effect is equivalent to reducing the image sensor's pixel size and raising its spatial sampling frequency, without affecting the sensor's sensitivity.
An imaging system can be regarded as a low-pass filter on the optical information. For a star sensor, a typical star point image spans tens of pixels, and the strong granularity at the star point's edge means that high-frequency information of the star point image edge is lost. Super-resolution imaging raises the imaging resolution and is an effective image restoration method that can recover the high-frequency information of image edges. Existing research on star sensor star point positioning accuracy is mostly concentrated on software algorithms such as point spread function calibration and pixel calibration; research on super-resolution imaging mainly targets fields such as natural scenes and medicine; and no published result applies super-resolution imaging to star sensor attitude measurement.
Disclosure of Invention
The purpose of this patent is to provide a novel star sensor star map preprocessing method that uses super-resolution imaging to raise the imaging resolution and realize high-resolution image restoration, thereby improving star point positioning accuracy and attitude measurement accuracy. The method mainly improves the existing software algorithm and is applicable to most star sensors.
The technical scheme of this patent is: a super-resolution imaging method of a star sensor comprises the following steps:
(1) calculating the attitude of the current frame image by using the forecasted star point coordinates and a single-frame star map attitude calculation method;
(2) calculating the relative attitude of the previous M-1 frame relative to the current frame image;
(3) mapping the pixels of the star point window image of the previous M-1 frame to the coordinate system of the current frame image, and respectively processing star point images by judging navigation star numbers;
(4) calculating to obtain a reconstructed high-resolution image;
(5) calculating the attitude of the reconstructed high-resolution image by using a star map processing algorithm;
(6) judging whether the current attitude meets an iteration termination condition; if so, proceeding to the next step; otherwise, updating the attitude of the current frame image and returning to step (2);
(7) updating the historical star map and outputting the current attitude.
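As a reading aid (not part of the patent text), the iteration in steps (1)-(7) can be sketched as a generic loop with the concrete star-map routines injected as callables; the function names, signatures, and the scalar quality check are all hypothetical stand-ins:

```python
def sr_attitude_loop(frames, solve, register, reconstruct, quality,
                     max_iter=10, taste_mp=1e-4):
    """Iteratively refine the attitude via super-resolution reconstruction.

    frames      : sequence of star point window images (last one is current)
    solve       : image -> attitude                      (steps 1 and 5)
    register    : (frames, attitude) -> 2-D point cloud  (steps 2-3)
    reconstruct : point cloud -> high-resolution image   (step 4)
    quality     : attitude -> scalar quality parameter   (step 6)
    """
    attitude = solve(frames[-1])                 # step (1): single-frame solution
    for _ in range(max_iter):                    # condition 1: iteration limit
        cloud = register(frames, attitude)       # steps (2)-(3): map into current frame
        hr_image = reconstruct(cloud)            # step (4): grid + interpolation
        attitude = solve(hr_image)               # step (5): re-solve on the HR image
        if quality(attitude) < taste_mp:         # condition 2: quality below check value
            break                                # step (6): terminate iteration
    return attitude                              # step (7): caller updates the history
```

The loop terminates either when the quality metric drops below the check value or when the iteration limit is reached, mirroring the two termination conditions above.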
The specific process of calculating the relative attitude of the previous M-1 frame relative to the current frame image in the step (2) is as follows:
the star sensor rotates with the satellite at a low angular velocity, so the current frame image and the previous M-1 frame images can be considered to share the same field of view and observed stars, with sub-pixel displacement between adjacent images; the relative attitude matrix of the ith frame with respect to the current frame image is calculated as:

A_i0 = A_0 A_i^(-1)    (1)

where A_0 is the attitude matrix of the current frame and A_i is the attitude matrix of the ith frame.
The step (3) realizes the conversion from the previous M-1 frame star point image to the current frame image coordinate system by the following method:
mapping the image of the same star point to the coordinate system of the current frame image by using the calculated relative attitude matrix and the identified observed star numbers; the conversion formula from the pixel coordinates of the ith frame to the coordinate system of the current frame image is:

[u_i0, v_i0] = F(u_i, v_i) = C(x, y, z), with (x, y, z) = A_i0 K(u_i, v_i)    (2)

where K(u, v) is the conversion function from pixel coordinates (u, v) to the incident-light space three-dimensional vector (x, y, z), C(x, y, z) is the conversion function from the incident-light vector (x, y, z) back to pixel coordinates (u, v), and both K and C are determined by the star sensor calibration parameters.
In step (4), the reconstructed high-resolution image is obtained by using a grid method and an interpolation method.
The iteration termination conditions include two conditions, where condition 1 is that the number of iterations reaches a threshold limit number, and condition 2 is that the attitude quality parameter TASTE is smaller than the check value tasmp.
Compared with the prior art, the patent has the advantages that:
the method provides a super-resolution imaging method which has strong universality and high flexibility and is suitable for imaging sensors of various types, sizes and types and a novel star point extraction and attitude calculation method of the star sensor. The super-resolution imaging technology is used for the star sensor for the first time, the space information acquired by imaging the fixed star for multiple times is fused, the star point positioning precision is improved on the imaging layer, and compared with the prior art, the super-resolution imaging technology has substantial characteristics and progress. The super-resolution imaging method provided by the method is flexible in design and strong in expansibility, and algorithm parameters can be flexibly configured according to requirements (such as precision or resolution requirements) of different products and the computing power of the star sensor. The method can be directly embedded into the algorithm of the existing star sensor as a functional module, and is convenient for software upgrading of the existing product and research and test of the product under study. In addition, the traditional super-resolution imaging technology needs image registration, the operation amount is large, the star sensor can realize quick and accurate star map coordinate mapping by using a special star recognition technology, image registration is not needed, and the practicability is high. The method is easy for software simulation and hardware realization, has low requirement on the hardware configuration of the star sensor, has strong practicability, can greatly save the labor, materials and time cost compared with the use of an image sensor with higher resolution, and has wide application prospect.
Drawings
FIG. 1 is a super-resolution imaging schematic;
FIG. 2 is a schematic diagram of sub-pixel displacement imaging;
fig. 3 is a star sensor attitude recognition flow chart, 3a is a traditional star sensor star map processing flow chart, and 3b is a star sensor star map processing flow chart utilizing a super-resolution imaging method.
Detailed Description
This patent is described in detail below with reference to fig. 3 and the specific examples.
Step 1: calculating the attitude of the current frame image by using the forecasted star point coordinates and the single-frame star map attitude calculation method
First, star point window images are extracted using the forecasted star point coordinate information to obtain the star point coordinates; the observed inter-star angular distances are then calculated using the calibration coefficients; star matching and identification are performed by a triangle/quadrilateral matching method; and finally the attitude is solved.
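As an illustration of the angular-distance step, the sketch below assumes an ideal pinhole model; the principal point (u0, v0) and focal length f are hypothetical calibration values, whereas a real star sensor would use its full calibrated conversion functions:

```python
import numpy as np

def pixel_to_vector(u, v, u0=512.0, v0=512.0, f=3000.0):
    # Pixel coordinate -> unit observation vector in the sensor frame,
    # assuming an ideal pinhole model with principal point (u0, v0) and
    # focal length f in pixels (hypothetical calibration values).
    w = np.array([u - u0, v - v0, f])
    return w / np.linalg.norm(w)

def angular_distance(p1, p2):
    # Observed angular separation (radians) between two star points; these
    # angles are compared against catalog angles during triangle/quadrilateral
    # star matching.
    w1 = pixel_to_vector(*p1)
    w2 = pixel_to_vector(*p2)
    return np.arccos(np.clip(np.dot(w1, w2), -1.0, 1.0))
```

For this geometry, angular_distance((512, 512), (3512, 512)) evaluates to pi/4, i.e. a 45-degree separation.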
This embodiment takes the star map processing flow shown in fig. 3 as an example; the patent remains applicable to other star map processing flows based on super-resolution image reconstruction.
Step 2: calculating the relative attitude of the previous M-1 frame with respect to the current frame image
Unlike the traditional super-resolution imaging technique, the star sensor can compute an accurate spatial attitude by matching and identifying fixed stars. This attitude accuracy is better than sub-pixel level and exceeds that of traditional image registration algorithms, so the relative attitude between images at different moments can be computed from the star sensor's attitude information; this amounts to high-accuracy image registration.
The star sensor rotates with the satellite at a low angular velocity, so the current frame image and the previous M-1 frame images can be considered to share the same field of view and observed stars, with sub-pixel displacement between adjacent images, as shown in fig. 2. The relative attitude matrix of the ith frame with respect to the current frame image is calculated as:

A_i0 = A_0 A_i^(-1)    (1)

where A_0 is the attitude matrix of the current frame and A_i is the attitude matrix of the ith frame.
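Since attitude (rotation) matrices are orthogonal, A_i^(-1) equals A_i^T, so formula (1) reduces to a single matrix product. A minimal numpy sketch (the rot_z helper is only for illustration):

```python
import numpy as np

def relative_attitude(A0, Ai):
    # A_i0 = A_0 A_i^(-1): attitude matrices are orthogonal, so the inverse
    # is the transpose.  A_i0 rotates vectors expressed in the ith frame's
    # body axes into the current frame's body axes.
    return A0 @ Ai.T

def rot_z(theta):
    # Illustrative helper: rotation about the boresight (z) axis by theta radians.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])
```

For example, relative_attitude(rot_z(0.3), rot_z(0.1)) equals rot_z(0.2), the rotation separating the two frames.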
Step 3: mapping the pixels of the star point window images of the previous M-1 frames to the coordinate system of the current frame image
The image of the same star point is mapped to the coordinate system of the current frame image using the calculated relative attitude matrix and the identified observed star numbers. The conversion formula from the pixel coordinates of the ith frame to the current frame image coordinate system is:

[u_i0, v_i0] = F(u_i, v_i) = C(x, y, z), with (x, y, z) = A_i0 K(u_i, v_i)    (2)

where K(u, v) is the conversion function from pixel coordinates (u, v) to the incident-light space three-dimensional vector (x, y, z), C(x, y, z) is the conversion function from the incident-light vector (x, y, z) back to pixel coordinates (u, v), and both K and C are determined by the star sensor calibration parameters.
Step 4: obtaining a reconstructed high-resolution image using a grid method and an interpolation method
Taking a single star point image window resolution of 16 x 16 as an example, super-resolution image reconstruction is performed on 4 frames (M = 4): after the two-dimensional point cloud is constructed, a 32 x 32 refined grid is established to subdivide the pixel area, and each grid node is interpolated from the neighbouring pixel values to obtain the final image pixel values, yielding a high-resolution star point window image.
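The grid-and-interpolation reconstruction can be sketched with scipy's scattered-data interpolation; the nearest-neighbour fill for grid nodes outside the point cloud's convex hull is an assumption of this sketch, not specified by the patent:

```python
import numpy as np
from scipy.interpolate import griddata

def reconstruct_window(points, values, window=16, out_size=32):
    # points: (N, 2) sub-pixel coordinates of the fused 2-D point cloud
    # (all M frames mapped into the current frame's window);
    # values: (N,) pixel intensities.  An out_size x out_size refined grid
    # over the window x window area doubles the spatial sampling rate.
    step = window / out_size
    gu, gv = np.meshgrid((np.arange(out_size) + 0.5) * step,
                         (np.arange(out_size) + 0.5) * step)
    hr = griddata(points, values, (gu, gv), method='linear')
    nan = np.isnan(hr)                    # nodes outside the convex hull
    if nan.any():                         # assumption: nearest-neighbour fill
        hr[nan] = griddata(points, values, (gu[nan], gv[nan]), method='nearest')
    return hr
```

Feeding in a point cloud with constant intensity returns a constant 32 x 32 image, which is a quick sanity check that the interpolation preserves flat regions.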
In this example the star point image window resolution is 16 x 16 and the reconstruction uses 4 frames (M = 4); the patent remains applicable to other window sizes and numbers of processed image frames.
Step 5: computing the attitude of the reconstructed high-resolution image using a star map processing algorithm
The processing method is the same as in step 1.
Step 6: judging whether the current attitude meets the iteration termination condition
If the iteration termination condition is met, the next step is carried out; otherwise, the attitude of the current frame image is updated and the process returns to step 2. Termination condition 1 is that the iteration count reaches the threshold limit; condition 2 is that the attitude quality parameter TASTE is smaller than the check value tasmp.
Step 7: updating the historical star map and outputting the current attitude
The current frame star point window image and the corresponding attitude are stored in the historical star map database, and the image and attitude data of the earliest frame are deleted at the same time.
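The sliding history of step 7 can be sketched with a bounded deque; the class and attribute names are illustrative:

```python
from collections import deque

class StarMapHistory:
    # Sliding window over the most recent M star point window images and
    # their attitudes: appending the current frame automatically evicts the
    # earliest frame's image and attitude data.
    def __init__(self, m=4):
        self.frames = deque(maxlen=m)

    def update(self, window_image, attitude):
        self.frames.append((window_image, attitude))
```

With m = 4, pushing a fifth frame silently drops the oldest entry, matching the "delete the earliest frame" behaviour described above.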
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of this patent, not to limit it. Although this patent has been described in detail with reference to the embodiments, those skilled in the art will understand that various changes may be made and equivalents substituted without departing from its spirit and scope.
The non-specified portions of this patent are within the common general knowledge of those skilled in the art.

Claims (3)

1. A super-resolution imaging method of a star sensor is characterized by comprising the following steps:
(1) calculating the attitude of the current frame image by using the forecasted star point coordinates and a single-frame star map attitude calculation method;
(2) calculating the relative attitude of the previous M-1 frame relative to the current frame image;
(3) mapping the pixels of the star point window image of the previous M-1 frame to the coordinate system of the current frame image, and respectively processing the star point image by judging the navigation star number;
(4) calculating to obtain a reconstructed high-resolution image;
(5) calculating the attitude of the reconstructed high-resolution image by using a star map processing algorithm;
(6) judging whether the current attitude meets an iteration termination condition; if so, proceeding to the next step; otherwise, updating the attitude of the current frame image and returning to step (2);
(7) updating the historical star map and outputting the current attitude;
the specific process of calculating the relative attitude of the previous M-1 frame relative to the current frame image in the step (2) is as follows:
the star sensor rotates with the satellite at a low angular velocity, so the current frame image and the previous M-1 frame images can be considered to share the same field of view and observed stars, with sub-pixel displacement between adjacent images; the relative attitude matrix of the ith frame with respect to the current frame image is calculated as:

A_i0 = A_0 A_i^(-1)    (1)

where A_0 is the attitude matrix of the current frame and A_i is the attitude matrix of the ith frame;
the step (3) realizes the conversion from the previous M-1 frame star point image to the current frame image coordinate system by the following method:
mapping the image of the same star point to the coordinate system of the current frame image by using the calculated relative attitude matrix and the identified observed star numbers; the conversion formula from the pixel coordinates of the ith frame to the coordinate system of the current frame image is:

[u_i0, v_i0] = F(u_i, v_i) = C(x, y, z), with (x, y, z) = A_i0 K(u_i, v_i)    (2)

where K(u, v) is the conversion function from pixel coordinates (u, v) to the incident-light space three-dimensional vector (x, y, z), C(x, y, z) is the conversion function from the incident-light vector (x, y, z) back to pixel coordinates (u, v), and both K and C are determined by the star sensor calibration parameters.
2. The method of claim 1, wherein: and (4) obtaining a reconstructed high-resolution image by using a grid method and an interpolation method.
3. The method of claim 1, wherein: the iteration termination conditions include two conditions, where condition 1 is that the number of iterations reaches a threshold limit number, and condition 2 is that the attitude quality parameter TASTE is smaller than the check value tasmp.
CN202011140096.3A 2020-10-22 2020-10-22 Super-resolution imaging method for star sensor attitude measurement Active CN112504264B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011140096.3A CN112504264B (en) 2020-10-22 2020-10-22 Super-resolution imaging method for star sensor attitude measurement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011140096.3A CN112504264B (en) 2020-10-22 2020-10-22 Super-resolution imaging method for star sensor attitude measurement

Publications (2)

Publication Number Publication Date
CN112504264A CN112504264A (en) 2021-03-16
CN112504264B (en) 2021-12-07

Family

ID=74954888

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011140096.3A Active CN112504264B (en) 2020-10-22 2020-10-22 Super-resolution imaging method for star sensor attitude measurement

Country Status (1)

Country Link
CN (1) CN112504264B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1874499A (en) * 2006-05-12 2006-12-06 北京理工大学 High dynamic equipment for reconstructing image in high resolution
CN107590777A (en) * 2017-07-17 2018-01-16 中国人民解放军国防科学技术大学 A kind of star sensor star point image enchancing method
CN110060209A (en) * 2019-04-28 2019-07-26 北京理工大学 A kind of MAP-MRF super-resolution image reconstruction method based on posture information constraint
CN110849354A (en) * 2019-11-28 2020-02-28 上海航天控制技术研究所 Star point extraction and compensation method under condition of last life stage of star sensor

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
KR101282718B1 (en) * 2012-12-28 2013-07-05 한국항공우주연구원 Absolute misalignment calibration method between attitude sensors and linear array image sensor


Non-Patent Citations (4)

Title
Alexandre J. T. S. Mello et al. "Improving centroiding by super-resolution reconstruction of sodium layer density in Shack-Hartmann wavefront sensors." Applied Optics, vol. 55, no. 14, 2016-05-10, pp. 3701-3710. *
Liebe et al. "Star trackers for attitude determination." IEEE AES Systems Magazine, vol. 10, no. 6, 1995-06, pp. 10-16. *
Du Xiaoping et al. "Spacecraft position and attitude measurement method based on super-resolution reconstruction" (in Chinese). Journal of Beijing Institute of Technology, vol. 25, no. 3, 2005-03, pp. 220-224. *
Liu Yanyan et al. "Super-resolution reconstruction technology and its research progress" (in Chinese). Chinese Optics and Applied Optics, vol. 2, no. 2, 2009-04, pp. 102-111. *

Also Published As

Publication number Publication date
CN112504264A (en) 2021-03-16

Similar Documents

Publication Publication Date Title
CN106679648B (en) Visual inertia combination SLAM method based on genetic algorithm
CN111709981A (en) Registration method of laser point cloud and analog image with characteristic line fusion
CN104657981B (en) Dynamic compensation method for three-dimensional laser distance metering data of mobile robot in moving process
WO2019029099A1 (en) Image gradient combined optimization-based binocular visual sense mileage calculating method
CN111553245A (en) Vegetation classification method based on machine learning algorithm and multi-source remote sensing data fusion
CN110570449A (en) positioning and mapping method based on millimeter wave radar and visual SLAM
CN111650579B (en) InSAR mining area three-dimensional deformation estimation method and device for rock migration parameter adaptive acquisition and medium
CN110443881B (en) Bridge deck morphological change recognition bridge structure damage CNN-GRNN method
CN115049945B (en) Unmanned aerial vehicle image-based wheat lodging area extraction method and device
Zhu et al. Robust registration of aerial images and LiDAR data using spatial constraints and Gabor structural features
CN108053445A (en) The RGB-D camera motion methods of estimation of Fusion Features
CN103077559A (en) Cluster three-dimensional rebuilding method based on sequence image
CN112419512A (en) Air three-dimensional model repairing system and method based on semantic information
CN114494371A (en) Optical image and SAR image registration method based on multi-scale phase consistency
CN111104850B (en) Remote sensing image building automatic extraction method and system based on residual error network
CN117788296B (en) Infrared remote sensing image super-resolution reconstruction method based on heterogeneous combined depth network
CN107123128B (en) A kind of state of motion of vehicle estimation method guaranteeing accuracy
Shi et al. Fusion of a panoramic camera and 2D laser scanner data for constrained bundle adjustment in GPS-denied environments
CN112504264B (en) Super-resolution imaging method for star sensor attitude measurement
Seetharaman et al. A piecewise affine model for image registration in nonrigid motion analysis
CN116912645A (en) Three-dimensional target detection method and device integrating texture and geometric features
CN116758419A (en) Multi-scale target detection method, device and equipment for remote sensing image
CN115984592A (en) Point-line fusion feature matching method based on SuperPoint + SuperGlue
CN115546760A (en) Point cloud sequence data processing method and device, computer equipment and storage medium
CN111028178B (en) Remote sensing image data automatic geometric correction method based on deep learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant