CN114577208B - Navigation system error unified modeling method based on rotating reference coordinate system - Google Patents

Navigation system error unified modeling method based on rotating reference coordinate system

Info

Publication number
CN114577208B
CN114577208B (application CN202210107583.2A)
Authority
CN
China
Prior art keywords
error
camera
image plane
equivalent rotation
plane translation
Prior art date
Legal status
Active
Application number
CN202210107583.2A
Other languages
Chinese (zh)
Other versions
CN114577208A (en)
Inventor
王大轶
孙博文
李茂登
邓润然
朱卫红
Current Assignee
Beijing Institute of Spacecraft System Engineering
Original Assignee
Beijing Institute of Spacecraft System Engineering
Priority date
Filing date
Publication date
Application filed by Beijing Institute of Spacecraft System Engineering filed Critical Beijing Institute of Spacecraft System Engineering
Priority to CN202210107583.2A
Publication of CN114577208A
Application granted
Publication of CN114577208B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/24Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for cosmonautical navigation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Astronomy & Astrophysics (AREA)
  • Navigation (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a unified modeling method for navigation system errors based on a rotating reference coordinate system, comprising the following steps: (1) represent the installation error of the camera with an equivalent rotation vector; (2) obtain the camera image plane translation error from the camera measurement principle; (3) convert the camera image plane translation error into equivalent-rotation-vector form using the rotating reference coordinate system; (4) obtain the systematic error after unified modeling from the equivalent-rotation-vector-form image plane translation error and the camera installation error. The method overcomes the inability of existing methods to estimate the installation error and the image plane translation of the satellite-borne optical camera simultaneously, reduces the dimension of the systematic errors by characterizing them jointly in the rotating reference coordinate system, and thereby realizes unified modeling of spacecraft autonomous navigation system errors.

Description

Navigation system error unified modeling method based on rotating reference coordinate system
Technical Field
The invention belongs to the technical field of space navigation, and particularly relates to a navigation system error unified modeling method based on a rotating reference coordinate system.
Background
During small-celestial-body exploration, a spacecraft operates in orbit for long periods, which easily causes the parameters of its measurement sensors to drift; the satellite-borne optical camera may therefore exhibit systematic errors such as installation errors and lens zero offset, which are a main factor limiting the accuracy of autonomous relative navigation. Because the spacecraft payload is limited, redundant measurement sensors cannot be carried, and components are difficult to replace on orbit, so on-orbit estimation of the camera's systematic errors is necessary. Such on-orbit estimation has accordingly become a hot research topic.
At present, augmented Kalman filtering is mostly used to estimate the systematic errors, i.e., the systematic error variables to be estimated are appended to the state vector. Because the number of systematic errors is large, this greatly increases the computational complexity of the algorithm; meanwhile, on-board computing resources are severely limited and cannot support such large-volume computation, so the traditional augmented Kalman filter cannot estimate the systematic errors on orbit.
Disclosure of Invention
The technical problem solved by the invention is as follows: it overcomes the inability of existing methods to estimate the installation error and the image plane translation of the satellite-borne optical camera simultaneously, reduces the dimension of the systematic errors through a unified dimension-reduced characterization in the rotating reference coordinate system, and realizes unified modeling of spacecraft autonomous navigation system errors.
The aim of the invention is achieved by the following technical scheme: a unified modeling method for navigation system errors based on a rotating reference coordinate system, comprising the following steps: (1) represent the installation error of the camera with an equivalent rotation vector; (2) obtain the camera image plane translation error from the camera measurement principle; (3) convert the camera image plane translation error into equivalent-rotation-vector form using the rotating reference coordinate system; (4) obtain the systematic error after unified modeling from the equivalent-rotation-vector-form image plane translation error and the camera installation error.
In the above unified modeling method for navigation system errors based on the rotating reference coordinate system, in step (1), the installation error matrix C_ins of the camera is:
C_ins = I_3 + [θ×];
where I_3 is the 3-dimensional identity matrix and θ is the installation error of the camera.
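As a concrete illustration, the first-order installation error matrix can be formed directly from the rotation vector θ. A minimal sketch in Python (the function names are illustrative, not from the patent):

```python
import numpy as np

def skew(v):
    """Cross-product (skew-symmetric) matrix [v x], so that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0,  -v[2],  v[1]],
                     [v[2],  0.0,  -v[0]],
                     [-v[1], v[0],  0.0]])

def installation_error_matrix(theta):
    """First-order installation error matrix C_ins = I_3 + [theta x] for a small rotation vector theta (rad)."""
    theta = np.asarray(theta, dtype=float)
    return np.eye(3) + skew(theta)
```

For small θ this matrix is close to the identity, which is why the linearization is admissible.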
In the above unified modeling method for navigation system errors based on the rotating reference coordinate system, in step (3), converting the camera image plane translation error into the equivalent-rotation-vector-form camera image plane translation error using the rotating reference coordinate system comprises the following steps: (31) obtain the unified equivalent rotation axis; (32) obtain the rotation angle about the unified equivalent rotation axis from the camera image plane translation error and the camera focal length; (33) obtain the equivalent-rotation-vector-form camera image plane translation error from the unified equivalent rotation axis and the rotation angle about it.
In the above unified modeling method for navigation system errors based on the rotating reference coordinate system, in step (31), the unified equivalent rotation axis is l = [δ_v, δ_u, 0]^T, where l is the unified equivalent rotation axis, δ_v is the translation of the camera optical axis along the longitudinal direction of the image plane, and δ_u is the translation of the camera optical axis along the transverse direction of the image plane.
In the above unified modeling method for navigation system errors based on the rotating reference coordinate system, in step (32), the rotation angle about the unified equivalent rotation axis is:
φ = arctan(√(δ_u² + δ_v²)/f);
where φ is the rotation angle about the unified equivalent rotation axis, f is the focal length of the camera, δ_v is the translation of the camera optical axis along the longitudinal direction of the image plane, and δ_u is the translation of the camera optical axis along the transverse direction of the image plane.
In the above unified modeling method for navigation system errors based on the rotating reference coordinate system, in step (33), the camera image plane translation error in equivalent-rotation-vector form is:
β = φ · l/‖l‖;
where β is the camera image plane translation error in equivalent-rotation-vector form, l is the unified equivalent rotation axis, and φ is the rotation angle about it.
In the above unified modeling method for navigation system errors based on the rotating reference coordinate system, in step (4), the systematic error after unified modeling is:
α = θ + β;
where α is the systematic error after unified modeling, θ is the installation error of the camera, and β is the camera image plane translation error in equivalent-rotation-vector form.
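The whole pipeline from an image-plane shift to the unified error α can be sketched as follows. This is a hedged illustration: the axis ordering l = [δ_v, δ_u, 0]^T and the arctan form of the angle follow the text above, but the exact sign conventions are assumptions, since the published equations are not reproduced here, and the function names are illustrative:

```python
import numpy as np

def image_shift_to_rotation_vector(delta_u, delta_v, f):
    """Image-plane shift -> equivalent rotation vector beta = phi * l / ||l||.
    Uses the unified axis l = [delta_v, delta_u, 0]^T and the angle
    phi = arctan(sqrt(du^2 + dv^2) / f)."""
    l = np.array([delta_v, delta_u, 0.0])
    n = np.linalg.norm(l)
    if n == 0.0:
        return np.zeros(3)          # no shift, no equivalent rotation
    phi = np.arctan2(n, f)          # rotation angle about the unified axis
    return phi * l / n

def unified_systematic_error(theta, delta_u, delta_v, f):
    """Unified systematic error alpha = theta + beta."""
    return np.asarray(theta, dtype=float) + image_shift_to_rotation_vector(delta_u, delta_v, f)
```

With zero image-plane shift the unified error reduces to the installation error alone, as the formula α = θ + β requires.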
A unified modeling system for navigation system errors based on a rotating reference coordinate system, comprising: a first module for representing the installation error of the camera with an equivalent rotation vector; a second module for obtaining the camera image plane translation error from the camera measurement principle; a third module for converting the camera image plane translation error into equivalent-rotation-vector form using the rotating reference coordinate system; and a fourth module for obtaining the systematic error after unified modeling from the equivalent-rotation-vector-form image plane translation error and the camera installation error.
In the above unified modeling system for navigation system errors based on the rotating reference coordinate system, the installation error matrix C_ins of the camera is:
C_ins = I_3 + [θ×];
where I_3 is the 3-dimensional identity matrix and θ is the installation error of the camera.
In the above unified modeling system for navigation system errors based on the rotating reference coordinate system, converting the camera image plane translation error into the equivalent-rotation-vector-form camera image plane translation error using the rotating reference coordinate system comprises the following steps: (31) obtain the unified equivalent rotation axis; (32) obtain the rotation angle about the unified equivalent rotation axis from the camera image plane translation error and the camera focal length; (33) obtain the equivalent-rotation-vector-form camera image plane translation error from the unified equivalent rotation axis and the rotation angle about it.
Compared with the prior art, the invention has the following beneficial effects:
(1) Through the unified modeling of the systematic errors, the invention achieves a unified dimension-reduced characterization, reduces the computational complexity, and fits the limited computing capability on board the spacecraft;
(2) Through observability analysis of the systematic errors after unified modeling, the invention gives the condition under which the system is observable, estimates the errors only when that condition is met, and thus guarantees filter convergence;
(3) After unified modeling and observability analysis of the systematic errors, the invention provides the corresponding filtering steps, enabling self-correction of spacecraft navigation system errors.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
FIG. 1 is a flow chart of a unified modeling method for navigation system errors based on a rotating reference frame provided by an embodiment of the present invention;
FIG. 2 is a schematic diagram of a filtered estimation error after unified modeling of systematic errors according to an embodiment of the present invention;
FIG. 3 is a graph illustrating filtered estimation errors of the relative navigation state variable r/|r| provided by an embodiment of the present invention;
FIG. 4 is a graph of the filtered estimation error of the relative navigation state variable ṙ/|r| provided by an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other. The invention will be described in detail below with reference to the drawings in connection with embodiments.
Fig. 1 is a flowchart of a unified modeling method for error of a navigation system based on a rotating reference frame according to an embodiment of the present invention. As shown in fig. 1, the method comprises the steps of:
(1) The installation error of the camera is approximately represented by an equivalent rotation vector, i.e., the installation error is represented in linearized form;
(2) on the basis of step (1), a parameterized representation of the camera image plane translation is given from the perspective of the camera measurement principle;
(3) on the basis of step (2), the image plane translation is approximately converted into equivalent-rotation-vector form using the rotating reference coordinate system, which converts the additive image plane translation error into an equivalent rotation vector through a unified rotation;
(4) using the transformation of the image plane translation error from step (3) together with the equivalent-rotation-vector representation of the installation error from step (1), the two systematic errors are given a unified dimension-reduced representation, i.e., unified modeling, which reduces the number of parameters to be estimated and the computational complexity.
In the installation error representation of step (1), since the installation error is small, the second-order term is neglected and the installation error can be represented by a linearized equivalent rotation vector: the rotation matrix of the installation error is written as I_3 + [θ×], the sum of the identity matrix and the cross-product matrix of the equivalent rotation vector θ.
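The size of the neglected second-order term can be checked numerically against the exact Rodrigues formula. A small sketch under that assumption (illustrative, not from the patent):

```python
import numpy as np

def skew(v):
    """Cross-product matrix [v x]."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def rodrigues(theta):
    """Exact rotation matrix exp([theta x]) for a rotation vector theta (Rodrigues formula)."""
    a = np.linalg.norm(theta)
    if a < 1e-12:
        return np.eye(3)
    K = skew(theta / a)
    return np.eye(3) + np.sin(a) * K + (1.0 - np.cos(a)) * (K @ K)

theta = np.array([1e-3, -2e-3, 5e-4])           # small installation error (rad), illustrative
C_lin = np.eye(3) + skew(theta)                 # linearized I_3 + [theta x]
err = np.linalg.norm(rodrigues(theta) - C_lin)  # difference is second order in |theta|
```

For milliradian-level installation errors the discrepancy is on the order of |θ|²/2, which justifies dropping the second-order term.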
In the image plane translation representation of step (2), the parameterized representation is given from the perspective of the camera imaging principle, combined with the physical background of the image plane translation.
The approximate conversion based on the rotating reference coordinate system in step (3) is as follows:
(4.1) from the characteristics of the camera image plane translation error, find the unified equivalent rotation axis l = [δ_v, δ_u, 0]^T;
(4.2) from the magnitude of the image plane translation error and the camera focal length, compute the rotation angle φ about the axis found in step (4.1);
(4.3) combine the rotation axis of (4.1) with the rotation angle of (4.2) to obtain the equivalent rotation vector β of the image plane translation.
In the unified systematic error modeling of step (4), the equivalent rotation vectors of the installation error and of the image plane translation error obtained in steps (1) and (3) are added, giving the systematic error after unified modeling, α = θ + β.
The method comprises the following specific steps:
The measurement equation is given first. Assuming the camera focal length is f and the direction vector of the target in the camera frame is p = [p_x, p_y, p_z]^T, the camera measurement [u, v]^T satisfies:
[u, v]^T = (f/p_z) · [p_x, p_y]^T;
where p = r_M/‖r_M‖ and r_M is the relative position vector of the non-cooperative target in the camera frame.
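The measurement principle above is the standard pinhole projection. A minimal sketch (the additive shift argument anticipates the image-plane translation introduced below; the function name is illustrative):

```python
import numpy as np

def camera_measurement(p, f, delta_u=0.0, delta_v=0.0):
    """Pinhole measurement [u, v]^T = (f / p_z) [p_x, p_y]^T, optionally
    with an additive image-plane shift [delta_u, delta_v]^T."""
    px, py, pz = p
    return np.array([f * px / pz + delta_u, f * py / pz + delta_v])
```

With zero shift this reproduces the ideal measurement; a nonzero shift adds directly to the image coordinates, which is exactly the additive error the rotating reference frame later absorbs.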
If the optical camera has an installation error, the actual rotation matrix from the orbit frame to the camera frame is the product of the installation error matrix C_ins and the rotation matrix from the orbit frame to the camera frame under ideal conditions.
Since the installation error is small, it can be represented by the equivalent rotation vector θ = [θ_x, θ_y, θ_z]^T, and the installation error matrix can be expressed as:
C_ins = I_3 + [θ×].
If there is an image plane shift δ = [δ_u, δ_v]^T, the measurement becomes:
[u, v]^T = (f/p_z) · [p_x, p_y]^T + [δ_u, δ_v]^T.
The unified modeling method is then given. Converting the image plane translation into rotation matrix form, any rotation axis l taking a vector l_1 to a vector l_2 must satisfy:
⟨l, l_1⟩ = ⟨l, l_2⟩.
From the geometric relationship, there is a common rotation axis:
l = [δ_v, δ_u, 0]^T.
Assuming that attitude control keeps the imaging point in the vicinity of the optical axis, the rotation angle is approximately:
φ ≈ √(δ_u² + δ_v²)/f.
The equivalent rotation vector is then approximated as:
β ≈ φ · l/‖l‖.
This yields the unified modeling result of the optical camera systematic errors:
α = θ + β.
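A quick numerical sanity check of this conversion: rotating the boresight by the equivalent rotation vector should displace the projected point by the same amount as the original image-plane shift, to first order. Sketch under stated assumptions (the sign conventions of the axis are inferred, so only the magnitude of the induced shift is checked; all numbers are illustrative):

```python
import numpy as np

def skew(v):
    """Cross-product matrix [v x]."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

f, du, dv = 0.05, 2e-4, -1e-4              # focal length and image-plane shifts (illustrative)
l = np.array([dv, du, 0.0])                # unified equivalent rotation axis
phi = np.arctan2(np.linalg.norm(l), f)     # rotation angle about the axis
beta = phi * l / np.linalg.norm(l)         # equivalent rotation vector

p = np.array([0.0, 0.0, 1.0])              # boresight direction
p_rot = (np.eye(3) + skew(beta)) @ p       # first-order rotation applied to the boresight
shift = f * np.array([p_rot[0], p_rot[1]]) / p_rot[2]
# |shift| matches |[du, dv]| up to higher-order terms in the small angle
```

The agreement in magnitude is what allows the additive image-plane error to be traded for a rotational one and summed with the installation error.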
If the direction vector of the target in the camera frame is p, the observation equation is:
[u, v]^T = (f/p_z) · [p_x, p_y]^T;
where u is the coordinate of the target on the horizontal axis of the image plane, v is the coordinate of the target on the vertical axis of the image plane, f is the focal length of the camera, and p is the direction vector of the target in the camera frame.
The method further comprises the following specific steps: constructing a relative motion model without distance information and, on that basis, performing observability analysis of the navigation systematic errors after unified modeling:
(2.1) The linearized relative orbital dynamics equation is given. Without considering noise, the evolution of the position vector r, the velocity vector ṙ, and the systematic error α can be expressed in state-space form, where ω is the orbital angular velocity of the spacecraft, F_1 is the first-order Jacobian of the relative acceleration with respect to r, and F_2 is the first-order Jacobian of the relative acceleration with respect to ṙ.
For simplicity of the observability analysis, the normalized [−u, −v, f]^T is taken as the measured value; the measurement equation without measurement noise then follows, where C_α = I_3 + [α×], ζ is the measured value of the target relative direction vector, C_α is the equivalent rotation matrix of the systematic error after unified modeling, and the remaining factors are the rotation matrix from the orbit frame to the camera frame under ideal conditions and the unit direction vector of the target in the camera frame.
(2.2) Since the camera measures only the direction vector of the target, the distance information is removed from the relative motion so that the observability of the systematic error can be analyzed intuitively, giving the relative motion equation without distance information.
(2.3) The state equation and the measurement equation without distance scale are discretized.
Discretizing the distance-free state equation: the discrete state comprises the ratios of the position vector and of its first- and second-order derivatives to the distance at times k and k+1, together with the systematic error α_k after unified modeling and its derivative; Φ_{k+1,k} denotes the state transition matrix from time k to time k+1.
Discretizing the measurement equation: where ζ_k is the measured value of the target relative direction vector at time k and the rotation matrix from the orbit frame to the camera frame under ideal conditions is evaluated at time k.
The measurement matrix is:
(2.4) The observability matrix O is constructed and the observability of the system is analyzed:
where H_k, H_{k+1}, and H_{k+2} are the measurement matrices at times k, k+1, and k+2; Φ_{k+1,k} and Φ_{k+2,k+1} are the state transition matrices from time k to k+1 and from k+1 to k+2; O_{11} through O_{33} are the 3×3 block submatrices of the observability matrix O; the remaining quantities are the rotation matrices from the orbit frame to the camera frame at times k, k+1, and k+2, the ratios of the target position vector to the distance at times k+1 and k+2, and the unit direction vectors of the target in the camera frame at times k, k+1, and k+2.
It can be seen that the system is not observable if the direction vector of the target in the camera frame remains unchanged. In that case, the direction vector of the target in the camera frame must be changed through an attitude maneuver so that the system becomes observable.
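The rank condition behind this conclusion can be probed numerically by stacking H_k, H_{k+1}Φ_{k+1,k}, H_{k+2}Φ_{k+2,k+1}Φ_{k+1,k}, and so on, then computing the numerical rank. A generic sketch with toy matrices (the actual block structure of O is described above; these matrices are illustrative stand-ins):

```python
import numpy as np

def observability_matrix(H_list, Phi_list):
    """Stack O = [H_k; H_{k+1} Phi_{k+1,k}; H_{k+2} Phi_{k+2,k+1} Phi_{k+1,k}; ...]
    and return it with its numerical rank. Phi_list[i] maps time k+i to time k+i+1."""
    n = H_list[0].shape[1]
    rows, T = [], np.eye(n)
    for i, H in enumerate(H_list):
        if i > 0:
            T = Phi_list[i - 1] @ T     # accumulated transition back to time k
        rows.append(H @ T)
    O = np.vstack(rows)
    return O, np.linalg.matrix_rank(O)

# Toy illustration of the conclusion: an unchanging geometry (Phi = I) loses rank,
# while a changing one (here a 90-degree rotation) restores full rank.
H = np.array([[1.0, 0.0]])
rot = np.array([[0.0, -1.0], [1.0, 0.0]])
_, rank_static = observability_matrix([H, H], [np.eye(2)])
_, rank_moving = observability_matrix([H, H], [rot])
```

The toy example mirrors the attitude-maneuver requirement: only when the measurement geometry changes between epochs does the stacked matrix reach full rank.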
The method further comprises: giving the filtering steps according to the relative kinematic equation without distance information, the observation equation after unified modeling, and the observability analysis result.
① State variable one-step prediction:
X_{k+1,k} = Φ_{k+1,k} · X_k;
where X_{k+1,k} is the one-step prediction of the state variable from time k to the next time, X_k is the estimate of the state variable at time k, and α_k is the systematic error at time k contained in the state.
② State covariance one-step prediction:
P_{k+1,k} = Φ_{k+1,k} · P_k · Φ_{k+1,k}^T + Q_k;
where Φ_{k+1,k} is the state transition matrix from time k to time k+1, Q_k is the state noise covariance at time k, P_{k+1,k} is the one-step prediction of the state covariance matrix, and P_k is the state covariance matrix at time k.
③ Gain matrix computation:
K_k = P_{k+1,k} · H_k^T · (H_k · P_{k+1,k} · H_k^T + R_k)^{-1};
where H_k is the measurement matrix at time k and R_k is the measurement noise covariance at time k;
④ The actual measurement is normalized to obtain ζ_{k+1}:
ζ_{k+1} = [−u_{k+1}, −v_{k+1}, f]^T / √(u_{k+1}² + v_{k+1}² + f²);
where ζ_{k+1} is the measured value of the target relative direction vector at time k+1, u_{k+1} is the coordinate of the target on the horizontal axis of the image plane at time k+1, and v_{k+1} is the coordinate of the target on the vertical axis of the image plane at time k+1.
⑤ State variable update:
X_{k+1} = X_{k+1,k} + K_k · (ζ_{k+1} − H_k · X_{k+1,k}).
⑥ State covariance update:
P_{k+1} = (I_9 − K_k · H_k) · P_{k+1,k};
where P_{k+1} is the state covariance matrix at time k+1 and I_9 is the 9-dimensional identity matrix.
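Steps ① through ⑥ can be sketched as one filter cycle. Standard linear Kalman forms are assumed where the published formulas are not fully reproduced; the normalization of step ④ follows [−u, −v, f]^T/√(u² + v² + f²) as in the text, and the function names are illustrative:

```python
import numpy as np

def normalize_measurement(u, v, f):
    """Step 4: normalized line-of-sight measurement [-u, -v, f]^T / sqrt(u^2 + v^2 + f^2)."""
    z = np.array([-u, -v, f])
    return z / np.linalg.norm(z)

def filter_cycle(x, P, Phi, H, Q, R, zeta):
    """One filter cycle: predict (steps 1-2), gain (step 3), update (steps 5-6)."""
    x_pred = Phi @ x                              # step 1: state one-step prediction
    P_pred = Phi @ P @ Phi.T + Q                  # step 2: covariance one-step prediction
    S = H @ P_pred @ H.T + R                      # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)           # step 3: gain matrix
    x_new = x_pred + K @ (zeta - H @ x_pred)      # step 5: state update
    P_new = (np.eye(len(x)) - K @ H) @ P_pred     # step 6: covariance update
    return x_new, P_new
```

In the patent's setting the state would be 9-dimensional (distance-normalized position and velocity plus the unified systematic error α), so I_9 appears in step ⑥; the sketch keeps the dimensions generic.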
Test results
FIG. 2 is a schematic diagram of the filtered estimation error after unified modeling of the systematic errors according to an embodiment of the present invention; FIG. 3 is a graph of the filtered estimation error of the relative navigation state variable r/|r| provided by an embodiment of the present invention; FIG. 4 is a graph of the filtered estimation error of the relative navigation state variable ṙ/|r| provided by an embodiment of the present invention. As shown in FIGS. 2 to 4:
The six orbital elements of the spacecraft and the non-cooperative target are shown in Table 1, and the parameters required for the experiment are shown in Table 2.
TABLE 1: Six orbital elements
TABLE 2: Experimental parameters
According to the unified modeling method above, the equivalent rotation vector after unified modeling is α = [5.20, 8.69, 3.49]^T (unit: rad); applying the filtering method then gives the statistical results of the autonomous relative navigation estimation errors based on unified modeling, as shown in Table 3.
TABLE 3: Estimation error statistics
This embodiment also provides a unified modeling system for navigation system errors based on a rotating reference coordinate system, comprising: a first module for representing the installation error of the camera with an equivalent rotation vector; a second module for obtaining the camera image plane translation error from the camera measurement principle; a third module for converting the camera image plane translation error into equivalent-rotation-vector form using the rotating reference coordinate system; and a fourth module for obtaining the systematic error after unified modeling from the equivalent-rotation-vector-form image plane translation error and the camera installation error.
Through the unified modeling of the systematic errors, the invention achieves a unified dimension-reduced characterization, reduces the computational complexity, and fits the limited computing capability on board the spacecraft. Through observability analysis of the systematic errors after unified modeling, the invention gives the condition under which the system is observable, estimates the errors only when that condition is met, and thus guarantees filter convergence. After unified modeling and observability analysis of the systematic errors, the invention provides the corresponding filtering steps, enabling self-correction of spacecraft navigation system errors.
Although the present invention has been described in terms of the preferred embodiments, it is not limited to them. Any person skilled in the art can make possible variations and modifications to the technical solution of the invention using the methods and technical content disclosed above without departing from its spirit and scope; therefore any simple modifications, equivalent variations, and modifications of the above embodiments made according to the technical substance of the invention fall within the scope of protection of its technical solution.

Claims (8)

1. A unified modeling method for navigation system errors based on a rotating reference coordinate system, characterized by comprising the following steps:
(1) Representing the mounting error of the camera by using the equivalent rotation vector;
(2) Obtaining a camera image plane translation error according to a camera measurement principle;
(3) Converting the camera image plane translation error into an equivalent rotation vector form camera image plane translation error by using a rotation reference coordinate system;
(4) Obtaining a unified modeling system error according to the camera image plane translation error in the form of an equivalent rotation vector and the camera installation error;
in step (3), converting the camera image plane translation error into a camera image plane translation error in the form of an equivalent rotation vector using the rotation reference frame comprises the steps of:
(31) obtaining the unified equivalent rotation axis;
(32) obtaining the rotation angle about the unified equivalent rotation axis according to the camera image plane translation error and the camera focal length;
(33) obtaining the camera image plane translation error in equivalent-rotation-vector form according to the unified equivalent rotation axis and the rotation angle about it.
2. The unified modeling method for navigation system errors based on a rotating reference frame according to claim 1, wherein: in step (1), the camera mounting error matrix C ins is:
Cins=I3+[θ×];
Wherein, I 3 is a 3-dimensional identity matrix, and θ is the installation error of the camera.
3. The unified modeling method for navigation system errors based on a rotating reference frame according to claim 1, wherein: in step (31), the unified equivalent rotation axis is l = [δ_v, δ_u, 0]^T, where l is the unified equivalent rotation axis, δ_v is the translation of the camera optical axis along the longitudinal direction of the image plane, and δ_u is the translation of the camera optical axis along the transverse direction of the image plane.
4. The unified modeling method for navigation system errors based on a rotating reference frame according to claim 1, wherein: in step (32), the rotation angle about the unified equivalent rotation axis is:
φ = arctan(√(δ_u² + δ_v²)/f);
where φ is the rotation angle about the unified equivalent rotation axis, f is the focal length of the camera, δ_v is the translation of the camera optical axis along the longitudinal direction of the image plane, and δ_u is the translation of the camera optical axis along the transverse direction of the image plane.
5. The unified modeling method for navigation system errors based on a rotating reference frame according to claim 1, wherein: in step (33), the camera image plane translation error in equivalent-rotation-vector form is:
β = φ · l/‖l‖;
where β is the camera image plane translation error in equivalent-rotation-vector form, l is the unified equivalent rotation axis, and φ is the rotation angle about it.
6. The unified modeling method for navigation system errors based on a rotating reference coordinate system according to claim 1, wherein in step (4) the system error after unified modeling is:
α = θ + β;
where α is the system error after unified modeling, θ is the installation error of the camera, and β is the camera image plane translation error in the form of an equivalent rotation vector.
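Steps (31)-(33) and the summation of claim 6 can be sketched as follows. This is an illustrative reading only, not the patent's reference implementation: the arctan form of φ and the normalization of l are assumptions based on standard pinhole-camera geometry, chosen so that β is an angle-times-unit-axis rotation vector.

```python
import numpy as np

def image_plane_error_as_rotation(delta_u, delta_v, f):
    """Convert a camera image-plane translation error (delta_u, delta_v)
    into an equivalent rotation vector beta, per steps (31)-(33).
    f is the camera focal length in the same units as delta_u, delta_v."""
    # (31) unified equivalent rotation axis l = [delta_v, delta_u, 0]^T
    l = np.array([delta_v, delta_u, 0.0])
    # (32) rotation angle from the translation magnitude and focal length
    # (assumed form: boresight deflection angle of a pinhole camera)
    phi = np.arctan2(np.hypot(delta_u, delta_v), f)
    # (33) equivalent rotation vector: angle times unit axis
    norm = np.linalg.norm(l)
    return phi * l / norm if norm > 0 else np.zeros(3)

def unified_system_error(theta, delta_u, delta_v, f):
    """Claim 6: alpha = theta + beta, the system error after unified modeling."""
    beta = image_plane_error_as_rotation(delta_u, delta_v, f)
    return np.asarray(theta, dtype=float) + beta
```

With zero image-plane translation, β vanishes and α reduces to the installation error θ, which is consistent with claim 6's additive structure: both error sources are expressed as rotation vectors in the same frame and can be summed.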
7. A navigation system error unified modeling system based on a rotating reference coordinate system, characterized by comprising:
a first module for representing the installation error of the camera with an equivalent rotation vector;
a second module for obtaining the camera image plane translation error according to the camera measurement principle;
a third module for converting the camera image plane translation error into a camera image plane translation error in the form of an equivalent rotation vector by using the rotating reference coordinate system;
a fourth module for obtaining the system error after unified modeling according to the camera image plane translation error in the form of an equivalent rotation vector and the installation error of the camera;
wherein converting the camera image plane translation error into a camera image plane translation error in the form of an equivalent rotation vector by using the rotating reference coordinate system comprises the following steps:
(31) obtaining a unified equivalent rotation axis;
(32) obtaining a rotation angle about the unified equivalent rotation axis according to the camera image plane translation error and the camera focal length;
(33) obtaining the camera image plane translation error in the form of an equivalent rotation vector according to the unified equivalent rotation axis and the rotation angle about it.
8. The unified modeling system for navigation system errors based on a rotating reference coordinate system according to claim 7, wherein the camera installation error matrix C_ins is:
C_ins = I_3 + [θ×];
where I_3 is the 3×3 identity matrix, [θ×] is the skew-symmetric (cross-product) matrix of θ, and θ is the installation error of the camera.
CN202210107583.2A 2022-01-28 2022-01-28 Navigation system error unified modeling method based on rotating reference coordinate system Active CN114577208B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210107583.2A CN114577208B (en) 2022-01-28 2022-01-28 Navigation system error unified modeling method based on rotating reference coordinate system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210107583.2A CN114577208B (en) 2022-01-28 2022-01-28 Navigation system error unified modeling method based on rotating reference coordinate system

Publications (2)

Publication Number Publication Date
CN114577208A CN114577208A (en) 2022-06-03
CN114577208B true CN114577208B (en) 2024-07-12

Family

ID=81771964

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210107583.2A Active CN114577208B (en) 2022-01-28 2022-01-28 Navigation system error unified modeling method based on rotating reference coordinate system

Country Status (1)

Country Link
CN (1) CN114577208B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103487032A (en) * 2013-08-08 2014-01-01 上海卫星工程研究所 Low earth orbit space camera free-pointing image motion vector calculation method
CN108645426A (en) * 2018-04-09 2018-10-12 北京空间飞行器总体设计部 A kind of in-orbit self-calibrating method of extraterrestrial target Relative Navigation vision measurement system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3428539B2 (en) * 1999-12-10 2003-07-22 日本電気株式会社 Satellite attitude sensor calibration device
CN109696182A (en) * 2019-01-23 2019-04-30 张过 A kind of spaceborne push-broom type optical sensor elements of interior orientation calibrating method
CN109631952B (en) * 2019-01-31 2020-07-03 中国人民解放军国防科技大学 Method for calibrating installation error of attitude reference mirror of optical gyro component for spacecraft
CN111561935B (en) * 2020-05-19 2022-03-15 中国科学院微小卫星创新研究院 In-orbit geometric calibration method and system for rotary large-breadth optical satellite

Also Published As

Publication number Publication date
CN114577208A (en) 2022-06-03

Similar Documents

Publication Publication Date Title
CN113945206B (en) Positioning method and device based on multi-sensor fusion
CN112268559B (en) Mobile measurement method for fusing SLAM technology in complex environment
CN110174899B (en) High-precision imaging attitude pointing control method based on agile satellite
CN112083725B (en) Structure-shared multi-sensor fusion positioning system for automatic driving vehicle
CN112505737B (en) GNSS/INS integrated navigation method
CN110018691B (en) Flight state estimation system and method for small multi-rotor unmanned aerial vehicle
CN111380518B (en) SINS/USBL tight combination navigation positioning method introducing radial velocity
CN107702709B (en) Time-frequency domain hybrid identification method for non-cooperative target motion and inertia parameters
CN113175929B (en) UPF-based spatial non-cooperative target relative pose estimation method
CN109143223B (en) Bistatic radar space target tracking filtering device and method
CN113129377B (en) Three-dimensional laser radar rapid robust SLAM method and device
CN113238072B (en) Moving target resolving method suitable for vehicle-mounted photoelectric platform
Wang et al. Variational Bayesian cubature RTS smoothing for transfer alignment of DPOS
CN112083457B (en) Neural network optimized IMM satellite positioning navigation method
CN110969643A (en) On-satellite autonomous prediction method for ground target moving track
CN111766397A (en) Meteorological wind measurement method based on inertia/satellite/atmosphere combination
CN115902930A (en) Unmanned aerial vehicle room built-in map and positioning method for ship detection
CN114047766B (en) Mobile robot data acquisition system and method for long-term application of indoor and outdoor scenes
CN114577208B (en) Navigation system error unified modeling method based on rotating reference coordinate system
CN114419109A (en) Aircraft positioning method based on visual and barometric information fusion
Liu et al. Global estimation method based on spatial–temporal Kalman filter for DPOS
CN118111430A (en) Interactive multi-model AUV integrated navigation method based on minimum error entropy Kalman
CN114674345B (en) Inertial navigation/camera/laser velocimeter online joint calibration method
Sun et al. A Motion Information Acquisition Algorithm of Multiantenna SAR Installed on Flexible and Discontinuous Structure Based on Distributed POS
CN112697075B (en) Projection area analysis method for rendezvous and docking laser radar cooperative target

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant