CN112926237B - Space target key feature identification method based on photometric signals - Google Patents
- Publication number
- CN112926237B CN112926237B CN202110117950.2A CN202110117950A CN112926237B CN 112926237 B CN112926237 B CN 112926237B CN 202110117950 A CN202110117950 A CN 202110117950A CN 112926237 B CN112926237 B CN 112926237B
- Authority
- CN
- China
- Prior art keywords
- space target
- target
- space
- parameters
- attitude
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/20—Design optimisation, verification or simulation
- G06F30/23—Design optimisation, verification or simulation using finite element methods [FEM] or finite difference methods [FDM]
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D21/00—Measuring or testing not otherwise provided for
- G01D21/02—Measuring two or more variables by means not covered by a single other subclass
Abstract
The invention discloses a space target key feature identification method based on photometric signals, which comprises the following steps. Step 1: set an initial state of the space target, predict its position based on a finite element model and a solar radiation pressure model, and estimate its attitude according to the attitude control mode. Step 2: analyze the visibility of the surface elements of the space target surface. Step 3: predict the photometric signal of the space target based on a bidirectional reflectance distribution function, and correct the space target state predicted in step 1 against the actual observations. Step 4: estimate the observations corresponding to the corrected parameters based on the space target state from step 3, and use the actual observations to correct any over-fitting introduced in step 3. Step 5: repeat steps 1 to 4 based on the space target parameters estimated in step 4, updating the position and attitude of the space target and thereby identifying its key features.
Description
Technical Field
The invention relates to the technical field of space information detection, and in particular to space target key feature identification based on photometric signals.
Background
The orbital period of medium and high Earth orbit satellites, such as geosynchronous satellites, differs little from the Earth's rotation period, so such a satellite can stare at a ground target for a long time without background image shift. Accurately identifying the working type and working state of such space targets is therefore of great significance for space information detection.
A ground-based optical system does not consume space platform resources, offers better optical information acquisition and processing capability than a space-based optical system at lower cost, and is the main means of space target surveillance. However, ground-based observation distances are large, the imaging resolution of the optical system is low, and observations are easily affected by uncontrollable factors such as illumination and the attitude and orbit changes of non-cooperative targets. Under these conditions it is difficult to acquire spatial target features effectively, so research methods based on optical images are limited. The photometric signal is an energy signal that contains key feature information such as the position, attitude and orbit of a space target; it is highly sensitive to changes in this information and is therefore well suited to identifying the key features of space targets.
Disclosure of Invention
Aiming at these technical problems, the invention provides a space target key feature identification method based on photometric signals, which comprises the following steps:
Step 1: setting an initial state of the space target, predicting its position based on a finite element model and a solar radiation pressure model, and estimating its attitude according to the attitude control mode;
Step 2: analyzing the visibility of the surface elements of the space target surface;
Step 3: predicting the photometric signal of the space target based on a bidirectional reflectance distribution function, and correcting the space target state predicted in step 1 against the actual observations;
Step 4: estimating the observations of the corrected parameters based on the space target state from step 3, and correcting any over-fitting from step 3 using the actual observations;
Step 5: repeating steps 1 to 4 based on the space target parameters estimated in step 4, and updating the position and attitude of the space target to identify its key features.
The beneficial effects of the invention are as follows: from the phase angle and photometric curves of a medium or high orbit space target observed by a ground-based electro-optical telescope, the phase angle and photometric data are fused by a nonlinear filtering method, so that motion parameters such as the position, attitude and velocity of the space target and feature parameters such as its mass, shape and albedo are estimated simultaneously.
Drawings
FIG. 1 is a schematic flow chart of the method of the present invention.
Detailed Description
The technical solution of the present invention will now be fully described with reference to FIG. 1. The embodiments described below are some, but not all, embodiments of the invention. All other embodiments obtained by a person skilled in the art from these embodiments without inventive effort fall within the scope of the claims of the present invention.
The method for identifying the key features of a space target based on photometric signals comprises the following steps:
Step 1: setting an initial state of the space target, predicting its position based on a finite element model and a solar radiation pressure model, and estimating its attitude according to the attitude control mode;
Step 2: analyzing the visibility of the surface elements of the space target surface;
Step 3: predicting the photometric signal of the space target based on a bidirectional reflectance distribution function, and correcting the space target state predicted in step 1 against the actual observations;
Step 4: estimating the observations of the corrected parameters based on the space target state from step 3, and correcting any over-fitting from step 3 using the actual observations;
Step 5: repeating steps 1 to 4 based on the space target parameters estimated in step 4, and updating the position and attitude of the space target to identify its key features.
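The five steps above form a predict-correct loop over the observation sequence. A minimal sketch follows; every helper function is a hypothetical placeholder (none of the names come from the patent), standing in for the models described in the rest of the description:

```python
# Hypothetical placeholder models, one per step of the method described above.
def predict_state(x):          return x                      # step 1: orbit/attitude propagation
def visible_facets(x):         return [0, 1, 2]              # step 2: facet visibility analysis
def predict_photometry(x, f):  return float(len(f))          # step 3: BRDF photometric prediction
def correct_state(x, m, obs):  return x + 0.5 * (obs - m)    # step 3: measurement correction
def correct_overfit(x, obs):   return x                      # step 4: over-fitting correction

def identify_key_features(x0, observations):
    """Step 1-5 loop: repeat predict/correct over the observation sequence."""
    x = x0
    for obs in observations:
        xp = predict_state(x)
        fac = visible_facets(xp)
        m_pred = predict_photometry(xp, fac)
        x = correct_overfit(correct_state(xp, m_pred, obs), obs)
    return x  # step 5: updated position and attitude state

state = identify_key_features(0.0, [3.0, 3.0, 3.0])
```

With these trivial stubs the predicted photometry always matches the observation, so the state is unchanged; real models for each step are developed in Example 1 below.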
Preferably, in step 1, the position and attitude of the space target are predicted and estimated as follows:
(1) predicting the nonlinear system state parameters and observations through the unscented (lossless) transformation;
(2) completing the numerical propagation and update by fitting the mean and covariance of the nonlinear system state parameters;
(3) constructing a motion update model for the attitude quaternion;
(4) updating the space target attitude according to the orbital position and velocity.
Preferably, in step 2, the visibility of the surface elements of the space target surface is analyzed as follows:
(1) dividing the geometric model of the space target into finite elements and exporting the surface element parameters;
(2) determining the occlusion relations between surface elements based on their corner positions and the direction of the ray bundle;
(3) determining the visible surface elements and their visibility based on these occlusion relations.
Preferably, in step 3, the photometric signal and predicted state of the space target are corrected as follows:
(1) calculating the irradiance of each visible surface element at the entrance pupil of the ground-based detector from the solar irradiance, giving a single-frame photometric signal;
(2) predicting the single-frame photometric signal of the space target based on its motion parameters, combined with the sun and optical detector position parameters and the target position and attitude parameters;
(3) correcting the space target state based on the photometric signal predicted in (2) and the actual observations.
Preferably, in step 4, the over-fitting in the observation estimation and correction is corrected as follows:
(1) for the position correction process, determining a quantitative expression for the over-fitting correction scale based on a linear transformation assumption;
(2) for the attitude correction process, determining a quantitative expression for the over-fitting correction scale based on the same linear transformation assumption.
Example 1
1. Prediction of the space target position and attitude
If the gravitational influence of other celestial bodies is ignored and only the perturbation of solar radiation pressure on the orbit of the space target is considered, the acceleration of the space target is
d²r/dt² = −(μ/r³) r + a_per
where a_per is the acceleration due to solar radiation pressure, μ is the gravitational constant, and r is the distance from the space target to the Earth's center.
The attitude quaternion is defined as
q = [q0 μ^T]^T
where q0 and μ are defined as
q0 = cos(ν/2), μ = ê sin(ν/2)
with ν the Euler rotation angle and ê the Euler rotation axis.
Most space targets have no attitude adjustment capability and are spin-stabilized. A spin-stabilized space target rotates at a constant rate about a fixed rotation axis, and the motion update model of the attitude quaternion is
dq/dt = (1/2) q ⊗ [0 ω^T]^T
where ω is the angular velocity and ⊗ denotes quaternion multiplication.
For any 3×1 vector a, the cross-product matrix [a×] is expressed as
[a×] = [[0, −a3, a2], [a3, 0, −a1], [−a2, a1, 0]]
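A sketch of the cross-product matrix and of propagating a spin-stabilized attitude quaternion; the scalar-first convention and the single Euler integration step with re-normalization are assumptions beyond what the text states:

```python
import numpy as np

def skew(a):
    """Cross-product matrix [a x] of a 3x1 vector a."""
    return np.array([[0.0, -a[2], a[1]],
                     [a[2], 0.0, -a[0]],
                     [-a[1], a[0], 0.0]])

def quat_mult(q, p):
    """Scalar-first quaternion product q (x) p."""
    q0, qv = q[0], q[1:]
    p0, pv = p[0], p[1:]
    return np.hstack([q0 * p0 - qv @ pv,
                      q0 * pv + p0 * qv + np.cross(qv, pv)])

def propagate_quat(q, omega, dt):
    """One Euler step of dq/dt = 1/2 q (x) [0, omega] for a constant spin rate."""
    dq = 0.5 * quat_mult(q, np.hstack([0.0, omega]))
    q_new = q + dq * dt
    return q_new / np.linalg.norm(q_new)  # re-normalize to unit modulus
```

The re-normalization keeps the unit-modulus constraint on the quaternion that the filtering section below relies on.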
The attitude of a three-axis stabilized space target is related to its position and instantaneous velocity. Taking a satellite staring at its sub-satellite point as an example, let the x-axis of the target body coordinate system point along the motion direction, the z-axis point toward the sub-satellite point, and the y-axis complete the right-handed frame with the x- and z-axes. The attitude model of a nadir-pointing, three-axis stabilized satellite is then
α = v_k / |v_k|
γ = −r_k / |r_k|
β = γ × α
R = [α β γ]^T
where r_k and v_k are the position and instantaneous velocity of the space target.
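The nadir-pointing attitude construction can be sketched as follows (the function name is illustrative; the axis definitions follow the text, with x along the velocity and z toward the sub-satellite point):

```python
import numpy as np

def nadir_attitude(r, v):
    """Body axes of a three-axis stabilized, nadir-pointing target.
    Returns the rotation matrix R = [alpha beta gamma]^T."""
    alpha = v / np.linalg.norm(v)    # x-axis: along the instantaneous velocity
    gamma = -r / np.linalg.norm(r)   # z-axis: toward the sub-satellite point
    beta = np.cross(gamma, alpha)    # y-axis: completes the right-handed frame
    return np.vstack([alpha, beta, gamma])
```

For a circular orbit the velocity is perpendicular to the radius vector, so the three rows form an orthonormal, right-handed triad.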
The orbital position and velocity of the space target are then updated accordingly. The space target state parameters are updated at every instant by this calculation process, and the iteration is completed in combination with the single-frame simulation process to realize continuous-frame photometric signal curve simulation.
The mass and position of the space target at time k correspond to the observations at time k, while the orbital dynamics and the attitude control system affect the observations at the next instant. The target state parameters at time k+1 can therefore be predicted from those at time k, and the prediction is then corrected with the actual observations at time k+1, until the residual between the predicted and actual observations falls below a set threshold.
Methods for inverting the shape feature information of a space target from photometric data mainly include the Gaussian surface density method, geometric model matching, vector methods based on nonlinear filtering, and multiple-model adaptive estimation. Shape inversion based on nonlinear filtering has a wide application range, small error and good robustness.
The nonlinear filtering approach treats the inversion of parameters such as the target attitude as a filtering estimation problem for a nonlinear dynamic system and estimates the target state from observations such as photometric data. It can accurately solve the attitude inversion problem of space targets and is a current research focus of motion information inversion.
By using the nonlinear filtering method, the attitude inversion problem of the space target can be solved accurately. The state parameters and observations of the nonlinear system are predicted through the unscented (lossless) transformation: from the state parameter mean x̄ and covariance P, the sigma point set {χ_i} is obtained. With symmetric sampling, the state parameter points are
χ_0 = x̄
χ_i = x̄ + (√((n+λ)P))_i,  i = 1, …, n
χ_{i+n} = x̄ − (√((n+λ)P))_i,  i = 1, …, n
where n is the state dimension and (·)_i denotes the i-th column of the matrix square root.
Each point of the sampled set has a corresponding weight, used to re-fit the point set to a new state parameter mean and covariance distribution:
W_0^(m) = λ/(n+λ)
W_0^(c) = λ/(n+λ) + (1 − α² + β)
W_i^(m) = W_i^(c) = 1/(2(n+λ)),  i = 1, …, 2n
The parameter α adjusts the spread of the sampling points about the mean: the larger its value, the less the sampling points are concentrated at the mean point. The parameter β fits the higher-order error of the Taylor expansion; for a Gaussian distribution its value is 2. After passing the point set through the nonlinear system to obtain {y_i}, the mean ȳ and covariance P_yy are fitted by
ȳ = Σ_i W_i^(m) y_i
P_yy = Σ_i W_i^(c) (y_i − ȳ)(y_i − ȳ)^T
which completes the propagation and update of the state parameter mean and distribution covariance of the nonlinear system:
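A sketch of the symmetric-sampling unscented (lossless) transformation with the weights above, using the standard parameterization λ = α²(n+κ) − n; the exact parameter choices of the method are not reproduced in the text and are an assumption here:

```python
import numpy as np

def unscented_transform(x_mean, P, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate a mean and covariance through a nonlinear function f
    with symmetric sigma-point sampling."""
    n = len(x_mean)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)      # matrix square root
    chi = [x_mean] + [x_mean + S[:, i] for i in range(n)] \
                   + [x_mean - S[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 0.5 / (n + lam))   # mean weights
    wc = wm.copy()                             # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha**2 + beta)
    y = np.array([f(c) for c in chi])
    y_mean = wm @ y
    d = y - y_mean
    P_yy = (wc[:, None] * d).T @ d
    return y_mean, P_yy
```

For a linear function the transform is exact, which gives a convenient correctness check: the propagated covariance equals A P A^T.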
For attitude estimation, the quaternion is constrained to unit modulus. To satisfy this constraint without destroying the physical meaning of the sampling points, an intermediate variable, the Rodrigues parameter δp, is introduced. The attitude is sampled in terms of this parameter to obtain sampling quaternions, and δp represents the distance between a sampling point and the quaternion mean. The conversion between the error quaternion and the Rodrigues parameter is
δp = f δμ / (a + δq0)
where a is a parameter ranging from 0 to 1, and f = 2(a+1).
The state parameters of the space target are divided into global and local parameters; the global parameters x_k represent the attitude with a quaternion, and the local parameters δx_k represent it with Rodrigues parameters:
x_k = [q0 q1 q2 q3 ωx ωy ωz]^T
δx_k = [p1 p2 p3 ωx ωy ωz]^T
When the local error Rodrigues parameter is estimated at every frame, its mean is set to [0 0 0]^T, and the local error δp is obtained through the unscented (lossless) transformation. The vector part of the local error quaternion can then be calculated by
δμ = f⁻¹(a + δq0)δp
The local error quaternion represents the sampling distance between the sampling point and the mean, and the global parameters are obtained by composing each local error quaternion with the mean quaternion.
After this calculation process, the state update and state observation are carried out through the global quaternions, and the predicted local error quaternions are then recomputed from the updated global quaternions.
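The quaternion-to-Rodrigues-parameter conversion and its inverse can be sketched as follows; the scalar-part recovery formula is the standard generalized-Rodrigues-parameter result implied by the unit-modulus constraint, and is an assumption beyond what the text states explicitly:

```python
import numpy as np

def quat_to_grp(dq, a=1.0):
    """Scalar-first error quaternion -> generalized Rodrigues parameters,
    dp = f * dmu / (a + dq0) with f = 2(a+1)."""
    f = 2.0 * (a + 1.0)
    return f * dq[1:] / (a + dq[0])

def grp_to_quat(dp, a=1.0):
    """Generalized Rodrigues parameters -> unit error quaternion.
    The scalar part follows from the unit-modulus constraint."""
    f = 2.0 * (a + 1.0)
    n2 = dp @ dp
    dq0 = (-a * n2 + f * np.sqrt(f**2 + (1.0 - a**2) * n2)) / (f**2 + n2)
    return np.hstack([dq0, (a + dq0) * dp / f])
```

The two functions are exact inverses on unit quaternions, which is what lets the filter sample in the unconstrained δp space and map back without violating the constraint.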
2. Surface element visibility analysis of the space target surface
Certain occlusion relations exist between the geometric surface elements and can be computed by a ray-triangle intersection method: first compute the intersection point of a spatial ray with the plane of the triangular surface element, then judge whether the intersection point lies inside the element. To judge the position of the intersection point, select any side of the triangular element and verify whether the intersection point and the element vertex opposite that side lie on the same side of it; each side of the triangle is checked in turn, and if the intersection point always lies on the same side as the opposite vertex, the point lies inside the element.
Solar radiation arrives from far away and can be regarded as parallel light, so the ray bundle is simplified into a number of rays with different origins and a common direction. The following relations hold for these parallel rays:
O_m = C_m − t D_m
D_m = u_Sun
where O_m is the origin of ray m, D_m is its unit direction vector, C_m is the center coordinate of the surface element in the geodetic coordinate system, t is the ray parameter, and u_Sun is the unit direction vector from the sun toward the target center; the intersection of ray m with surface element n is then found along this ray.
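The ray-triangle same-side occlusion test described above can be sketched as:

```python
import numpy as np

def ray_hits_triangle(origin, direction, tri, eps=1e-9):
    """Same-side ray-triangle test: intersect the ray with the facet plane,
    then check the hit point against each edge of the triangle."""
    v0, v1, v2 = tri
    n = np.cross(v1 - v0, v2 - v0)     # facet plane normal
    denom = n @ direction
    if abs(denom) < eps:               # ray parallel to the facet plane
        return False
    t = (n @ (v0 - origin)) / denom
    if t < 0:                          # plane lies behind the ray origin
        return False
    p = origin + t * direction         # intersection with the plane
    for a, b, c in ((v0, v1, v2), (v1, v2, v0), (v2, v0, v1)):
        edge = b - a
        # p and the opposite vertex c must lie on the same side of edge (a, b)
        if np.cross(edge, p - a) @ np.cross(edge, c - a) < -eps:
            return False
    return True
```

Running this test for every facet center against every other facet, with the common sun direction, yields the occlusion relations used to select the visible surface elements.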
3. Correction of the photometric signal and predicted state of the space target
The intensity of the radiation reflected by the space target can be calculated by the Phong-type relation
I = k_a + Σ_m [ R_d (L̂_m · N) + R_s (R̂_m · V̂)^α ]
where k_a is the ambient light term, R_d is the diffuse reflectance, L̂_m is the unit vector pointing to the m-th light source, N is the normal vector of the surface element, R_s is the specular reflectance, V̂ is the unit vector from the element center toward the detector, and α is the specular reflection coefficient. R̂_m, the specular reflection direction of L̂_m, is calculated as
R̂_m = 2 (L̂_m · N) N − L̂_m
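A sketch of this Phong-type reflected-intensity relation; clamping negative dot products to zero (back-facing light or view directions contribute nothing) is an assumption beyond what the text states:

```python
import numpy as np

def phong_intensity(ka, rd, rs, alpha, normal, light_dirs, view_dir):
    """Ambient term plus diffuse and specular sums over the light sources."""
    n = normal / np.linalg.norm(normal)
    v = view_dir / np.linalg.norm(view_dir)
    intensity = ka
    for l in light_dirs:
        l = l / np.linalg.norm(l)
        ndotl = max(n @ l, 0.0)
        r = 2.0 * ndotl * n - l          # specular reflection direction of l
        intensity += rd * ndotl + rs * max(r @ v, 0.0) ** alpha
    return intensity
```

With the light source and detector both along the facet normal, the result reduces to the sum of the ambient, diffuse and specular coefficients.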
The photometric signal of the space target is simulated with a finite element approach. First, a geometric model of the space target is established as the basis for finite element division; the model is then divided into finite elements, and the parameters of each surface element of the target are exported; finally, the finite element expression model of the space target in the J2000 coordinate system is described through the quaternion.
The quaternion and the rotation matrix are interconvertible. The rotation matrix is obtained from the quaternion, and the corner coordinates of the finite element model are converted from the body coordinate system to the J2000 coordinate system by multiplying by the inverse of the rotation matrix:
x_J2000 = R(q)⁻¹ x_body
Then, for each visible surface element, the solar energy in the visible band is integrated to obtain the irradiance E0 at the solar surface, and the irradiance E produced by the sun at the space target follows from the light energy propagation law as
E = E0 (R0 / r0)²
where r0 is the distance from the space target to the sun and R0 is the solar radius. The irradiance produced by each visible surface element at the entrance pupil of the ground-based detector is then calculated from the following quantities:
n, the normal vector of the surface element; k1, the unit vector from the element center toward the light source; k2, the unit vector from the element center toward the detector; r_Obs,i, the vector from the space target to the detector; A_i, the area of element i; and the total reflectance ρ_Total,i, computed from an anisotropic reflection model in which h is the half vector of k1 and k2, u and v are mutually orthogonal unit vectors in the plane of the element, and n_u and n_v characterize the degree of reflection in the u and v directions, respectively.
Based on the orbital motion model, phase angle observation data can be calculated from the position coordinates of the space target. The phase angle observations comprise the altitude and azimuth of the target in the ground-based observation station coordinate system. The Cartesian coordinates of the target in the Earth inertial coordinate system are first converted into the station coordinate system, and the phase angle observations are then calculated from the converted coordinate parameters. The conversion between the target position in the station frame and in the Earth inertial frame is parameterized by ρ, the vector from the ground-based observation station to the space target in the Earth inertial coordinate system, and by θ and λ, the longitude and latitude angles of the observation station relative to the Earth inertial coordinate system.
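One common way to realize the station-frame conversion and the altitude/azimuth computation is a rotation into a topocentric south-east-zenith (SEZ) frame; the SEZ convention itself is an assumption here, since the text only names the longitude and latitude angles θ and λ:

```python
import numpy as np

def altaz_from_eci(rho_eci, theta, lam):
    """Rotate the station-to-target vector rho from the inertial frame into a
    topocentric SEZ frame and read off altitude and azimuth."""
    st, ct = np.sin(theta), np.cos(theta)
    sl, cl = np.sin(lam), np.cos(lam)
    rz = np.array([[ct, st, 0.0], [-st, ct, 0.0], [0.0, 0.0, 1.0]])  # rotate by longitude
    ry = np.array([[sl, 0.0, -cl], [0.0, 1.0, 0.0], [cl, 0.0, sl]])  # rotate by co-latitude
    s, e, z = ry @ rz @ rho_eci
    alt = np.arcsin(z / np.linalg.norm(rho_eci))   # altitude above the horizon
    az = np.arctan2(e, -s) % (2.0 * np.pi)         # azimuth from north, eastward
    return alt, az
```

A target directly overhead gives altitude π/2; a target on the northern horizon gives altitude 0 and azimuth 0.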
The actual observation m̃k+1 is used to correct the estimated parameter mean μk+1 and distribution covariance Pk+1:
μk+1 = μ−k+1 + K (m̃k+1 − m̂k+1)
Pk+1 = P−k+1 − K Pyy K^T
where K is the Kalman gain
K = Pxy Pyy⁻¹
and the predicted state-observation cross covariance Pxy and observation covariance Pyy are computed from the weighted deviations of the sigma points, as in the unscented transformation above.
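The measurement update with sigma-point covariances can be sketched as:

```python
import numpy as np

def kalman_update(mu_pred, P_pred, y_pred, P_yy, P_xy, y_obs):
    """Gain K = P_xy P_yy^-1; corrected mean mu + K * innovation;
    corrected covariance P - K P_yy K^T."""
    K = P_xy @ np.linalg.inv(P_yy)
    mu = mu_pred + K @ (y_obs - y_pred)
    P = P_pred - K @ P_yy @ K.T
    return mu, P, K
```

In the scalar case with P_xy = P_yy = 1 the gain is 1, so the corrected mean moves fully to the observation and the covariance shrinks by P_yy.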
The observation estimate may jump because of the occlusion relations of the geometric model or detector noise, which can strongly affect the attitude estimation, so it must be determined whether a jump is present. Jumps fall into two cases: jumps caused by external factors such as the detector, which affect the stability of the estimation algorithm, and abrupt changes caused by specular reflection off a large surface of the space target, which do not affect the stability of the algorithm. Whether a jump affects the system can therefore be judged with two thresholds.
The point jump coefficient is first defined from the change in the observed apparent magnitude between frames, where m_k is the apparent magnitude observed at time k; if jump suppression is enabled, the estimated state parameters lag the actual observations by one frame.
If the jump coefficient exceeds its threshold, the system is considered to contain a jump, and it is further judged whether the jump is a reasonable one: the observation at time k+1 is predicted, and if the residual between the predicted and actual observations at time k+1 exceeds its threshold, the jump is considered unreasonable.
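A sketch of the two-threshold jump test; the jump-coefficient definition used here (a frame-to-frame magnitude difference) is a placeholder, since the patent's formula is not reproduced in the text:

```python
def is_system_jump(m_prev, m_curr, m_next_pred, m_next_obs,
                   jump_thresh=0.5, resid_thresh=0.3):
    """Flag a harmful (system) jump: a frame-to-frame magnitude change above
    jump_thresh is a candidate jump; it is treated as harmful only if the
    next-frame prediction residual also exceeds resid_thresh."""
    jump_coeff = abs(m_curr - m_prev)   # hypothetical jump coefficient
    if jump_coeff <= jump_thresh:
        return False                    # no jump at all
    return abs(m_next_obs - m_next_pred) > resid_thresh
```

A glint from a large specular surface produces a jump that the model predicts at the next frame (small residual), so only unpredicted jumps are flagged as harmful.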
4. Correction of overfitting phenomenon
If the parameters of the unscented (lossless) transformation, in particular the observation model error covariance, are chosen improperly, the estimation of the state parameters will over-fit. The quaternion can represent a simple state parameter difference, but the scale that needs to be corrected is difficult to express quantitatively with it, so the Rodrigues parameters are used to quantitatively characterize the correction scale for the over-fitting phenomenon.
The correction is accomplished by a linear transformation, so the over-fitting can be regarded as a linear process whose correction scale is determined by three quantities: the predicted observation, the observation corresponding to the corrected state parameters, and m_{k+1}, the actually observed value.
Based on the linear transformation assumption, the magnitude of the Kalman gain is rescaled according to the required correction scale, where K⁻ is the Kalman gain calculated from the sampling points and K⁺ is the corrected Kalman gain.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (4)
1. A space target key feature identification method based on photometric signals, characterized by comprising the following steps:
step 1: setting an initial state of the space target, predicting its position based on a finite element model and a solar radiation pressure model, treating the inversion of the target attitude parameters as a filtering estimation problem of a nonlinear dynamic system solved by a nonlinear filtering method, and estimating the attitude of the space target according to its attitude control mode;
step 2: analyzing the visibility of the surface elements of the space target surface;
step 3: predicting the photometric signal of the space target based on a bidirectional reflectance distribution function, and correcting the space target state predicted in step 1 against the actual observations;
step 4: estimating the observations of the corrected parameters based on the space target state from step 3, and correcting any over-fitting from step 3 using the actual observations;
step 5: repeating steps 1 to 4 based on the space target parameters estimated in step 4, and updating the position and attitude of the space target to identify its key features;
wherein in step 1, the prediction and estimation of the space target position and attitude comprises the following steps:
step 1.1: predicting the nonlinear system state parameters and observations through the unscented (lossless) transformation;
step 1.2: completing the numerical propagation and update by fitting the mean and covariance of the nonlinear system state parameters;
step 1.3: constructing a motion update model for the attitude quaternion of the space target;
step 1.4: updating the space target attitude according to the orbital position and velocity.
2. The method for identifying key features of a space target based on photometric signals according to claim 1, wherein in step 2 the surface element visibility analysis of the space target surface comprises the following steps:
step 2.1: dividing the geometric model of the space target into finite elements and extracting the surface element parameters;
step 2.2: determining the occlusion relations between surface elements based on their corner positions and the direction of the ray bundle;
step 2.3: determining the visible surface elements and their visibility based on these occlusion relations.
3. The method for identifying key features of a space target based on photometric signals according to claim 1, wherein in step 3 the photometric signal and predicted state correction of the space target comprises the following steps:
step 3.1: calculating the irradiance of each visible surface element at the entrance pupil of the ground-based detector from the solar irradiance to obtain a single-frame photometric signal;
step 3.2: predicting the single-frame photometric signal of the space target based on its motion parameters, combined with the sun and optical detector position parameters and the target position and attitude parameters;
step 3.3: correcting the space target state based on the photometric signal predicted in step 3.2 and the actual observations.
4. The method for identifying key features of a space target based on photometric signals according to claim 1, wherein in step 4 the over-fitting correction for the observation estimation and correction comprises the following steps:
step 4.1: for the position correction process, determining a quantitative expression for the over-fitting correction scale based on a linear transformation assumption;
step 4.2: for the attitude correction process, determining a quantitative expression for the over-fitting correction scale based on the same linear transformation assumption.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202110117950.2A | 2021-01-28 | 2021-01-28 | Space target key feature identification method based on photometric signals |
Publications (2)
| Publication Number | Publication Date |
| --- | --- |
| CN112926237A | 2021-06-08 |
| CN112926237B | 2024-05-24 |
Family
ID=76167784
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202110117950.2A (CN112926237B, Active) | Space target key feature identification method based on photometric signals | 2021-01-28 | 2021-01-28 |
Country Status (1)
| Country | Link |
| --- | --- |
| CN | CN112926237B (en) |
Citations (4)
| Publication number | Priority date | Publication date | Title |
| --- | --- | --- | --- |
| CN103646175A * | 2013-12-06 | 2014-03-19 | Computing method for the spectral radiance of a target |
| CN108415098A * | 2018-02-28 | 2018-08-17 | Feature identification method for small high-orbit space targets based on photometric curves |
| CN109492347A * | 2019-01-22 | 2019-03-19 | Method for describing the optical scattering characteristics of space targets with a three-element model |
| CN112179355A * | 2020-09-02 | 2021-01-05 | Attitude estimation method for typical features of photometric curves |
Non-Patent Citations (1)
Title |
---|
Research progress on ground-based light-curve inversion of space target characteristics; Wang Yang et al.; Scientia Sinica; Vol. 62, No. 15; pp. 1578-1590 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Giorgini et al. | Predicting the Earth encounters of (99942) Apophis | |
Sciré et al. | Analysis of orbit determination for space based optical space surveillance system | |
Linares et al. | Astrometric and photometric data fusion for resident space object orbit, attitude, and shape determination via multiple-model adaptive estimation | |
Kai et al. | Autonomous navigation for a group of satellites with star sensors and inter-satellite links | |
CN111102981B (en) | High-precision satellite relative navigation method based on UKF | |
Mazarico et al. | Advanced illumination modeling for data analysis and calibration. Application to the Moon | |
CN113091731A (en) | Spacecraft autonomous navigation method based on star sight relativistic effect | |
Linares et al. | Particle filtering light curve based attitude estimation for non-resolved space objects | |
Du et al. | The attitude inversion method of geostationary satellites based on unscented particle filter | |
Linares et al. | Photometric data from non-resolved objects for space object characterization and improved atmospheric modeling | |
CN112179355B (en) | Attitude estimation method aiming at typical characteristics of luminosity curve | |
CN111125874B (en) | High-precision rail measurement forecasting method for movable platform | |
CN112926237B (en) | Space target key feature identification method based on photometric signals | |
Kessler et al. | Filtering methods for the orbit determination of a tethered satellite | |
CN112394381B (en) | Full-autonomous lunar navigation and data communication method based on spherical satellite | |
CN115422699A (en) | Interactive ground space target monitoring sensor analog simulation system | |
CN114485620A (en) | Orbital dynamics fused asteroid detector autonomous visual positioning system and method | |
Chen et al. | A feature selection model to filter periodic variable stars with data-sensitive light-variable characteristics | |
Bae et al. | Precision attitude determination (PAD) | |
Shuang et al. | Autonomous optical navigation for landing on asteroids | |
Shi et al. | Research on Starlink constellation simulation and target area visibility algorithm | |
Shin et al. | Determination of geostationary orbits (GEO) satellite orbits using optical wide-field patrol network (OWL-Net) data | |
Tan et al. | Quantifying uncertainties of space objects centroid position based on optical observation | |
Liu et al. | Autonomous navigation technology of whole space | |
Guo et al. | Frontiers of Moon-Based Earth Observation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||