CN111060078A - Positioning method based on satellite observation angle error estimation

Positioning method based on satellite observation angle error estimation

Info

Publication number
CN111060078A
Authority
CN
China
Prior art keywords
observation angle
coordinate system
coordinates
satellite
scene image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911333635.2A
Other languages
Chinese (zh)
Inventor
彭耿
杨中书
商旭升
王吉心
李涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201911333635.2A priority Critical patent/CN111060078A/en
Publication of CN111060078A publication Critical patent/CN111060078A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 Geographic models

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computational Mathematics (AREA)
  • Algebra (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Remote Sensing (AREA)
  • Pure & Applied Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The application discloses a positioning method based on satellite observation angle error estimation. A control point is selected on a first scene image, and its accurate geographic coordinates and elevation information are acquired; rough coordinates of the control point in a geocentric rotation coordinate system are obtained from the elevation information; the accurate geographic coordinates of the control point are converted into coordinates in the geocentric rotation coordinate system, and the rough coordinates are subtracted to obtain the coordinate error of the control point in the geocentric rotation coordinate system; the observation angle error is calculated from the coordinate error; the observation angle error is compensated onto a second scene image to obtain an accurate observation angle value; and geographic coordinates of the pixel points in the second scene image, including at least longitude and latitude, are calculated from the accurate observation angle value. The method thereby achieves high-precision direct positioning based on satellite observation angle error estimation and solves the technical problem that high-precision direct positioning of remote sensing satellite images cannot otherwise be realized.

Description

Positioning method based on satellite observation angle error estimation
Technical Field
The application relates to the field of remote sensing satellite image processing, in particular to a positioning method based on satellite observation angle error estimation.
Background
High-precision geometric positioning of satellite remote sensing data is one of the key supporting technologies for the quantitative application of such data, and its precision directly affects the depth of quantitative and value-added processing.
As can be seen from the direct positioning model of a remote sensing satellite, positioning errors can be introduced by many factors, such as the observation angle of the sensor and the attitude, position and velocity of the satellite platform. Among these, the observation angle of the satellite sensor is a particularly important factor.
To meet the high-precision geometric positioning requirements of many remote sensing applications, the observation angle, which is one of the most important contributors to the direct positioning error, needs to be corrected.
No effective solution has yet been proposed in the related art for the problem that high-precision direct positioning of remote sensing satellite images cannot be realized.
Disclosure of Invention
The main purpose of the present application is to provide a positioning method based on satellite observation angle error estimation, so as to solve the problem that high-precision direct positioning of remote sensing satellite images cannot be achieved.
To achieve the above object, according to one aspect of the present application, there is provided a positioning method based on satellite observation angle error estimation.
The positioning method based on satellite observation angle error estimation comprises the following steps: selecting a control point on a first scene image, and acquiring accurate geographic coordinates and elevation information of the control point; obtaining a rough coordinate of the control point in a geocentric rotation coordinate system according to the elevation information; converting the precise geographic coordinates of the control points into coordinates under the geocentric rotation coordinate system, and subtracting the rough coordinates from the coordinates to obtain coordinate errors of the control points in the geocentric rotation coordinate system; calculating an observation angle error according to the coordinate error; compensating the observation angle error to a second scene image to obtain an accurate observation angle value; and calculating to obtain the geographic coordinates at least comprising longitude and latitude of the pixel points in the second scene image according to the accurate observation angle value.
Further, selecting a control point on the first scene image, and acquiring accurate geographic coordinates and elevation information of the control point comprises:
selecting a small number of control points on the first scene image and obtaining the accurate geographic coordinates of the corresponding control points from a digital map, wherein n is the number of control points; and obtaining the elevation information (L_i, B_i, H_i), i = 1, ..., n, of the control points using a DEM model.
Further, obtaining a rough coordinate of the control point in the geocentric rotation coordinate system according to the elevation information includes:
using the obtained elevation information (L_i, B_i, H_i), i = 1, ..., n, calculating the rough coordinates (X_i, Y_i, Z_i), i = 1, ..., n, of the control points on the first scene image in the geocentric rotation coordinate system by a direct positioning method.
Further, converting the precise geographic coordinates of the control point into coordinates in the geocentric rotation coordinate system and subtracting the rough coordinates to obtain the coordinate error of the control point in the geocentric rotation coordinate system comprises:
converting the precise geographic coordinates into (L_i^0, B_i^0, H_i^0), i = 1, ..., n;
converting (L_i^0, B_i^0, H_i^0), i = 1, ..., n, into the corresponding coordinates (X_i^0, Y_i^0, Z_i^0), i = 1, ..., n, in the geocentric rotation coordinate system; and
subtracting the rough coordinates (X_i, Y_i, Z_i), i = 1, ..., n, from (X_i^0, Y_i^0, Z_i^0) to obtain the coordinate errors (ΔX, ΔY, ΔZ) of the control points in the geocentric rotation coordinate system.
Further, calculating the observation angle error from the coordinate error comprises:
if the observation angle of the satellite sensor contains systematic errors Δψ_x and Δψ_y,
obtaining the observation vector error propagated from the satellite body coordinate system into the geocentric rotation coordinate system according to the transformation relations from the satellite body coordinate system to the orbit coordinate system and from the orbit coordinate system to the conventional inertial coordinate system.
Further, compensating the observation angle error onto the second scene image on the same orbit to obtain an accurate observation angle value comprises:
compensating the obtained observation angle error onto the second scene image on the same orbit to obtain the accurate observation angle (ψ_x + Δψ_x, ψ_y + Δψ_y), wherein (Δψ_x, Δψ_y) is the systematic error of the observation angle of the satellite sensor.
Further, the coordinates of any point on the ground in the geographic coordinate system can be represented by (L, B, H) or (X, Y, Z), where L is the dihedral angle between the geodetic meridian plane of the ground point and the prime meridian plane, B is the angle between the ellipsoid normal at the ground point and the equatorial plane, and H is the ellipsoidal height, i.e., the distance from the ground point to the ellipsoid surface along the normal.
Furthermore, the geocentric rotation coordinate system is an earth-fixed coordinate system referenced to the conventional terrestrial pole: the earth's center of mass is the origin, the Z axis points to the north pole of the earth, the X axis points to the intersection of the Greenwich meridian and the equator, and the Y axis is determined by the right-hand rule.
Furthermore, the first scene image is a satellite remote sensing image used as a sample, and the second scene image is a satellite remote sensing image to be positioned.
Further, the second scene image is an image acquired on the same orbit as the first scene image.
In the positioning method based on satellite observation angle error estimation of the embodiments of the present application, a control point is selected on a first scene image and its accurate geographic coordinates and elevation information are acquired; rough coordinates of the control point in a geocentric rotation coordinate system are obtained from the elevation information; the accurate geographic coordinates of the control point are converted into coordinates in the geocentric rotation coordinate system, and the rough coordinates are subtracted to obtain the coordinate error of the control point in the geocentric rotation coordinate system; the observation angle error is calculated from the coordinate error; the observation angle error is compensated onto a second scene image to obtain an accurate observation angle value; and the geographic coordinates, including at least longitude and latitude, of the pixel points in the second scene image are calculated from the accurate observation angle value. This achieves high-precision direct positioning based on satellite observation angle error estimation and thus solves the technical problem that high-precision direct positioning of remote sensing satellite images cannot otherwise be realized.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, serve to provide a further understanding of the application and to enable other features, objects, and advantages of the application to be more apparent. The drawings and their description illustrate the embodiments of the invention and do not limit it. In the drawings:
fig. 1 is a schematic flowchart of a positioning method based on satellite observation angle error estimation according to an embodiment of the present application;
fig. 2 is a flowchart illustrating a positioning method based on satellite observation angle error estimation according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first", "second", and the like in the description, claims and drawings of this application are used to distinguish between similar elements and are not necessarily used to describe a particular sequence or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, so that the embodiments of the application described herein can be implemented in orders other than those illustrated or described herein. Furthermore, the terms "comprises", "comprising" and "having", and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article or apparatus.
In this application, the terms "upper", "lower", "left", "right", "front", "rear", "top", "bottom", "inner", "outer", "middle", "vertical", "horizontal", "lateral", "longitudinal", and the like indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings. These terms are used primarily to better describe the present application and its embodiments, and are not used to limit the indicated devices, elements or components to a particular orientation or to be constructed and operated in a particular orientation.
Moreover, some of the above terms may be used to indicate other meanings besides the orientation or positional relationship, for example, the term "on" may also be used to indicate some kind of attachment or connection relationship in some cases. The specific meaning of these terms in this application will be understood by those of ordinary skill in the art as appropriate.
Furthermore, the terms "mounted," "disposed," "provided," "connected," and "sleeved" are to be construed broadly. For example, it may be a fixed connection, a removable connection, or a unitary construction; can be a mechanical connection, or an electrical connection; may be directly connected, or indirectly connected through intervening media, or may be in internal communication between two devices, elements or components. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
As shown in fig. 1, the method includes steps S101 to S106 as follows:
step S101, selecting a control point on a first scene image, and acquiring accurate geographic coordinates and elevation information of the control point;
step S102, obtaining rough coordinates of the control point in a geocentric rotation coordinate system according to the elevation information;
step S103, converting the precise geographic coordinates of the control points into coordinates under the geocentric rotation coordinate system, and subtracting the rough coordinates from the coordinates to obtain coordinate errors of the control points in the geocentric rotation coordinate system;
step S104, calculating an observation angle error according to the coordinate error;
s105, compensating the observation angle error to a second scene image to obtain an accurate observation angle value;
and step S106, calculating to obtain the geographic coordinates at least including longitude and latitude of the pixel points in the second scene image according to the accurate observation angle value.
Specifically, a small number of control points are selected on the first scene image, and their accurate geographic positions and elevation information are obtained; the rough coordinates of the control points in the geocentric rotation coordinate system are calculated from the acquired elevation information by the direct positioning method; the rough coordinates are subtracted from the precise coordinates to obtain the coordinate errors in the geocentric rotation coordinate system; the observation angle error is accurately calculated from the coordinate errors; the observation angle error is compensated onto a second scene image on the same orbit to obtain the accurate observation angle; and the geographic coordinates of the second scene image are calculated with the accurate observation angle by the direct positioning method.
From the above description, it can be seen that the following technical effects are achieved by the present application:
through the steps, high-precision direct positioning based on satellite observation angle error estimation is realized, positioning errors caused by factors such as satellite observation angle errors are overcome, the positioning precision of remote sensing satellite images can be effectively improved, and the deep application of remote sensing satellite image data is promoted.
According to the embodiment of the present application, as shown in fig. 2, as a preferable option in the embodiment, selecting a control point on the first scene image, and acquiring the precise geographic coordinate and the elevation information of the control point includes:
selecting a small number of control points on the first scene image and obtaining the accurate geographic coordinates of the corresponding control points from a digital map, wherein n is the number of control points; and obtaining the elevation information (L_i, B_i, H_i), i = 1, ..., n, of the control points using a DEM model.
According to the embodiment of the present application, as a preferable preference in the embodiment, as shown in fig. 2, obtaining the rough coordinates of the control point in the geocentric rotation coordinate system according to the elevation information includes:
using the obtained elevation information (L_i, B_i, H_i), i = 1, ..., n, calculating the rough coordinates (X_i, Y_i, Z_i), i = 1, ..., n, of the control points on the first scene image in the geocentric rotation coordinate system by a direct positioning method.
According to the embodiment of the present application, as a preferable example in the embodiment, as shown in fig. 2, the step of converting the precise geographic coordinates of the control point into coordinates in the geocentric rotation coordinate system and subtracting the rough coordinates to obtain the coordinate error of the control point in the geocentric rotation coordinate system includes:
converting the precise geographic coordinates into (L_i^0, B_i^0, H_i^0), i = 1, ..., n;
converting (L_i^0, B_i^0, H_i^0), i = 1, ..., n, into the corresponding coordinates (X_i^0, Y_i^0, Z_i^0), i = 1, ..., n, in the geocentric rotation coordinate system; and
subtracting the rough coordinates (X_i, Y_i, Z_i), i = 1, ..., n, from (X_i^0, Y_i^0, Z_i^0) to obtain the coordinate errors (ΔX, ΔY, ΔZ) of the control points in the geocentric rotation coordinate system.
According to the embodiment of the present application, as a preferable example in the embodiment, as shown in fig. 2, calculating the observation angle error from the coordinate error includes:
if the observation angle of the satellite sensor contains systematic errors Δψ_x and Δψ_y, the observation vector error in the satellite body coordinate system is propagated into the geocentric rotation coordinate system according to the transformation relations from the satellite body coordinate system to the orbit coordinate system and from the orbit coordinate system to the conventional inertial coordinate system.
According to the embodiment of the present application, as a preferable example in the embodiment, as shown in fig. 2, the compensating the observation angle error to the on-orbit second scene image to obtain an accurate observation angle value includes:
compensating the obtained observation angle error onto the second scene image on the same orbit to obtain the accurate observation angle (ψ_x + Δψ_x, ψ_y + Δψ_y).
According to the embodiment of the present application, as a preference in the present embodiment, the coordinates of any point on the ground in the geographic coordinate system may be represented by (L, B, H) or (X, Y, Z), where L is the dihedral angle between the geodetic meridian plane of the ground point and the prime meridian plane, B is the angle between the ellipsoid normal at the ground point and the equatorial plane, and H is the ellipsoidal height, that is, the distance from the ground point to the ellipsoid surface along the normal.
According to the embodiment of the application, as a preferable example in the embodiment, the geocentric rotation coordinate system is an earth-fixed coordinate system referenced to the conventional terrestrial pole, with the earth's center of mass as the origin, the Z axis pointing to the north pole of the earth, the X axis pointing to the intersection of the Greenwich meridian and the equator, and the Y axis determined by the right-hand rule.
According to the embodiment of the present application, as a preferable preference in the embodiment, the first scene image is a satellite remote sensing image as a sample, and the second scene image is a satellite remote sensing image to be positioned.
According to the embodiment of the present application, as a preference in this embodiment, the second scene image is an image acquired on the same orbit as the first scene image.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowcharts, in some cases, the steps illustrated or described may be performed in an order different than presented herein.
According to an embodiment of the present application, an explanation of the implementation principle of the above high-precision direct positioning method based on satellite observation angle error estimation is also provided, as shown in fig. 2, comprising:
the method comprises the following steps: selecting a small number of control points of the first scene image, acquiring accurate geographic coordinates of the corresponding control points on a Digital map, and acquiring Elevation information of the control points by using a Digital Elevation Model (DEM);
step two: calculating to obtain a rough coordinate of a control point on the first scene image in the geocentric rotation coordinate system by using the obtained elevation information through a direct positioning method;
step three: converting the precise geographic coordinates of the control points into coordinates under the geocentric rotation coordinate system, and subtracting the rough coordinates from the coordinates to obtain coordinate errors of the control points in the geocentric rotation coordinate system;
step four: accurately calculating the error of the observation angle by using a formula according to the coordinate error;
step five: compensating the obtained observation angle error to a second scene image on the same orbit to obtain an accurate observation angle value;
step six: and (4) obtaining the geographic coordinates (longitude and latitude) of the pixel points in the second scene image by using the accurate observation angle through direct positioning calculation.
Specifically, the high-precision direct positioning method based on satellite observation angle error estimation comprises the following steps:
the method comprises the following steps: selecting a small number of control points of the first scene image, and obtaining accurate geographic coordinates of the corresponding control points on the digital map
Figure BDA0002329248760000081
And obtaining elevation information (L) of the control point using the DEM modeli,Bi,Hi) N, n is the number of control points;
step two: calculating rough coordinates (X) of the control points on the first scene image in the geocentric rotation coordinate system by a direct positioning method by using the obtained elevation informationi,Yi,Zi),i=1,...,n;
Step three: converting the precise geographic coordinates of the control points into coordinates in a geocentric rotating coordinate system
Figure BDA0002329248760000082
Subtracting the rough coordinates to obtain coordinate errors (delta X, delta Y and delta Z) of the control points in the geocentric rotation coordinate system;
step four: from the coordinate error, the error of the observation angle is accurately calculated as (delta psi) using a formulax,Δψy);
Step five: compensating the obtained observation angle error to the second scene image on the same orbit to obtain an accurate value of the observation angle of (psi)x+Δψxy+Δψy);
Step six: and calculating the geographic position coordinates (longitude and latitude) of the pixel points in the second scene image by using the accurate observation angle through a direct positioning method.
Detailed Description
Step one: a small number of control points are selected on the first scene image, and the accurate geographic coordinates of the corresponding control points are obtained from the digital map, where n is the number of control points; the elevation information (L_i, B_i, H_i), i = 1, ..., n, of the control points is obtained using a DEM model.
The geographic coordinate system, also called the geodetic coordinate system and commonly known as the longitude-latitude coordinate system, generally takes a reference ellipsoid as the basic reference surface and a selected reference point as the starting point of geodetic surveying (the geodetic origin); the position and orientation of the reference ellipsoid within the earth can be determined from astronomical observations at the geodetic origin. Notably, the reference ellipsoid center determined in this way does not, in general, coincide with the earth's center of mass. The coordinate system takes the center O of the reference ellipsoid as the origin, the Z axis is parallel to the rotation axis of the reference ellipsoid, the X axis points to the intersection of the initial geodetic meridian plane and the equator of the reference ellipsoid, and the Y axis is determined by the right-hand rule. The coordinates of any point on the ground can be represented by (L, B, H) or (X, Y, Z), where L is the dihedral angle between the geodetic meridian plane of the ground point and the prime meridian plane, B is the angle between the ellipsoid normal at the ground point and the equatorial plane, and H is the ellipsoidal height, i.e., the distance from the ground point to the ellipsoid surface along the normal.
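Step one reads the control-point elevations from a DEM. The patent does not say how the DEM is sampled, so the following is only a minimal sketch, assuming a regular latitude/longitude height grid and bilinear interpolation; the function name dem_elevation, the grid layout and all numeric values are illustrative assumptions, not taken from the patent.

    import numpy as np

    def dem_elevation(dem, lat0, lon0, dlat, dlon, B, L):
        """Bilinearly interpolate a regular lat/lon DEM grid at geodetic latitude B, longitude L (degrees).

        dem        : 2-D array of heights in metres, dem[row, col]
        lat0, lon0 : latitude/longitude of grid cell (0, 0) in degrees
        dlat, dlon : grid spacing in degrees
        """
        r = (B - lat0) / dlat                      # fractional row index
        c = (L - lon0) / dlon                      # fractional column index
        r0, c0 = int(np.floor(r)), int(np.floor(c))
        fr, fc = r - r0, c - c0
        return ((1 - fr) * (1 - fc) * dem[r0, c0] + (1 - fr) * fc * dem[r0, c0 + 1]
                + fr * (1 - fc) * dem[r0 + 1, c0] + fr * fc * dem[r0 + 1, c0 + 1])

    # Synthetic 3 x 3 grid; interpolate the elevation H of one control point
    dem = np.array([[100.0, 110.0, 120.0],
                    [105.0, 115.0, 125.0],
                    [110.0, 120.0, 130.0]])
    H = dem_elevation(dem, lat0=30.0, lon0=110.0, dlat=0.01, dlon=0.01, B=30.005, L=110.015)
    print(H)   # 117.5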
Step two: using the obtained elevation information (L_i, B_i, H_i), i = 1, ..., n, the rough coordinates (X_i, Y_i, Z_i), i = 1, ..., n, of the control points on the first scene image in the geocentric rotation coordinate system are calculated by the direct positioning method.
The earth-centered rotating coordinate system (ECR), also called the conventional terrestrial system (CTS), is an earth-fixed coordinate system that takes the conventional terrestrial pole as its reference: the earth's center of mass is the origin, the Z axis points to the north pole of the earth, the X axis points to the intersection of the Greenwich meridian and the equator, and the Y axis is determined by the right-hand rule.
(L_i, B_i, H_i), i = 1, ..., n, are converted into the corresponding rough coordinates (X_i, Y_i, Z_i), i = 1, ..., n, in the geocentric rotation coordinate system as:

    X_i = (N_i + H_i) cos(B_i) cos(L_i)
    Y_i = (N_i + H_i) cos(B_i) sin(L_i)                                  (1)
    Z_i = [N_i (1 - e^2) + H_i] sin(B_i)

where N_i = a / sqrt(1 - e^2 sin^2(B_i)) is the radius of curvature of the prime vertical at the point, a is the semi-major axis of the earth ellipsoid, and e is the eccentricity of the earth ellipsoid.
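A minimal numerical sketch of formula (1). The patent does not name a particular reference ellipsoid, so WGS-84 values of a and e^2 are assumed here for illustration; the function name geodetic_to_ecr is likewise only illustrative.

    import numpy as np

    A_ELL = 6378137.0              # assumed semi-major axis a (WGS-84) [m]
    E2 = 6.69437999014e-3          # assumed first eccentricity squared e^2 (WGS-84)

    def geodetic_to_ecr(L, B, H):
        """Formula (1): geodetic (L, B, H) [rad, rad, m] -> geocentric rotation frame (X, Y, Z) [m]."""
        N = A_ELL / np.sqrt(1.0 - E2 * np.sin(B) ** 2)   # radius of curvature of the prime vertical
        X = (N + H) * np.cos(B) * np.cos(L)
        Y = (N + H) * np.cos(B) * np.sin(L)
        Z = (N * (1.0 - E2) + H) * np.sin(B)
        return np.array([X, Y, Z])

    # Example: rough coordinates of a control point with L = 116.4 deg, B = 39.9 deg, H = 50 m
    print(geodetic_to_ecr(np.radians(116.4), np.radians(39.9), 50.0))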
Step three: the accurate geographic coordinates of the control points are converted into the coordinates (X_i^0, Y_i^0, Z_i^0), i = 1, ..., n, in the geocentric rotation coordinate system, and the rough coordinates are subtracted to obtain the coordinate errors (ΔX, ΔY, ΔZ) of the control points in the geocentric rotation coordinate system.
First, the accurate geographic coordinates of the control points are converted into (L_i^0, B_i^0, H_i^0), i = 1, ..., n, by conversion formula (2) [not reproduced here], in which b is the semi-minor axis of the earth ellipsoid and the other symbols have the same meanings as in formula (1).
Then, (L_i^0, B_i^0, H_i^0), i = 1, ..., n, are converted according to formula (1) into the corresponding coordinates (X_i^0, Y_i^0, Z_i^0), i = 1, ..., n, in the geocentric rotation coordinate system.

Finally, the rough coordinates (X_i, Y_i, Z_i), i = 1, ..., n, are subtracted from (X_i^0, Y_i^0, Z_i^0) to obtain the coordinate errors of the control points in the geocentric rotation coordinate system:

    ΔX = X_i^0 - X_i,   ΔY = Y_i^0 - Y_i,   ΔZ = Z_i^0 - Z_i                                  (3)
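A minimal sketch of steps two and three for n control points. It assumes that the rough geodetic coordinates (L_i, B_i, H_i) from direct positioning plus the DEM and the precise geodetic coordinates (L_i^0, B_i^0, H_i^0) of the same control points are already available as arrays (the digital-map conversion of formula (2) is not reproduced here); the ellipsoid values are assumed WGS-84 and all coordinate values are synthetic examples.

    import numpy as np

    A_ELL, E2 = 6378137.0, 6.69437999014e-3      # assumed WGS-84 a and e^2

    def geodetic_to_ecr(L, B, H):
        """Formula (1), vectorised over control points."""
        N = A_ELL / np.sqrt(1.0 - E2 * np.sin(B) ** 2)
        return np.stack([(N + H) * np.cos(B) * np.cos(L),
                         (N + H) * np.cos(B) * np.sin(L),
                         (N * (1.0 - E2) + H) * np.sin(B)], axis=-1)

    # Rough geodetic coordinates (L_i, B_i, H_i) of two control points (synthetic example)
    L_r = np.radians([116.4000, 116.4100]); B_r = np.radians([39.9000, 39.9100]); H_r = np.array([50.0, 60.0])
    # Precise geodetic coordinates (L_i^0, B_i^0, H_i^0) of the same points (synthetic example)
    L_p = np.radians([116.4002, 116.4101]); B_p = np.radians([39.9001, 39.9102]); H_p = np.array([52.0, 61.0])

    XYZ_rough = geodetic_to_ecr(L_r, B_r, H_r)      # (X_i, Y_i, Z_i)
    XYZ_precise = geodetic_to_ecr(L_p, B_p, H_p)    # (X_i^0, Y_i^0, Z_i^0)

    coord_err = XYZ_precise - XYZ_rough             # formula (3): coordinate errors per control point
    print(coord_err)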
Step four: from the coordinate errors (ΔX, ΔY, ΔZ), the error (Δψ_x, Δψ_y) of the observation angle of the satellite sensor is accurately calculated using formula (14).
If the observation angle of the satellite sensor contains a systematic error (Δψ_x, Δψ_y), the observation vector in the satellite body coordinate system becomes u_1 = [-tan(ψ_y + Δψ_y), tan(ψ_x + Δψ_x), -1]^T. Expanding with a first-order Taylor approximation gives

    u_1 ≈ [-tan(ψ_y) - Δψ_y / cos^2(ψ_y), tan(ψ_x) + Δψ_x / cos^2(ψ_x), -1]^T,

so that the observation vector error in the satellite body coordinate system is Δu_1 = [-Δψ_y / cos^2(ψ_y), Δψ_x / cos^2(ψ_x), 0]^T.

According to the transformation relations from the satellite body coordinate system to the orbit coordinate system and from the orbit coordinate system to the conventional inertial coordinate system, the observation vector error Δu_1 in the satellite body coordinate system is propagated into the geocentric rotation coordinate system as the observation vector error Δu_3. The intermediate formulas expressing this propagation through the chain of rotation matrices, and relating Δu_3 to the coordinate errors (ΔX, ΔY, ΔZ) of the control points, are given only as images in the source; the remaining parameters are calculated with reference to the forms of equations (8)-(10).

Combining equations (6) and (7) and collecting the contributions of all n control points yields equation (13), a linear system in the two unknowns Δψ_x and Δψ_y. Equation (13) is over-determined and is solved by the least squares method; the final formula (14) for the accurate estimation of the sensor observation angle error is therefore the least squares solution of equation (13).
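Equation (13) relates the stacked coordinate errors of the n control points linearly to the two unknowns (Δψ_x, Δψ_y), and formula (14) is the least squares solution of that over-determined system. The exact design matrix is not reproduced in the text, so the sketch below simply assumes that a 3 x 2 matrix A_i, built for control point i from the body-to-ECR rotation chain and the observation geometry, is already available; it illustrates only the stacking and the least squares step, with synthetic values.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 6                                         # number of control points

    # Placeholder 3x2 design matrices A_i (in the patent these follow from the rotation-chain formulas)
    A_blocks = rng.normal(size=(n, 3, 2))
    true_dpsi = np.array([1e-4, -2e-4])           # "true" angle errors [rad], used only to build the synthetic test

    # Synthetic coordinate errors (dX_i, dY_i, dZ_i) = A_i @ dpsi + small noise
    d_blocks = A_blocks @ true_dpsi + 1e-7 * rng.normal(size=(n, 3))

    # Stack all control points into one over-determined linear system: A (3n x 2) * dpsi = d (3n)
    A = A_blocks.reshape(-1, 2)
    d = d_blocks.reshape(-1)

    # Formula (14): least squares estimate dpsi = (A^T A)^-1 A^T d
    dpsi_hat, *_ = np.linalg.lstsq(A, d, rcond=None)
    print(dpsi_hat)                               # close to true_dpsi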
step five: compensating the obtained observation angle error to the second scene image on the same orbit to obtain an accurate value of the observation angle of (psi)x+Δψxy+Δψy);
Step six: the geographic coordinates (longitude and latitude) of the pixel points in the second scene image are calculated by the direct positioning method using the accurate observation angle.
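Step six feeds the corrected observation angle back into the direct positioning model, which the patent does not reproduce. The following is therefore only a generic sketch of the final geometric step, assuming the satellite position in the ECR frame and the body-to-ECR rotation matrix of the pixel are known: the corrected body-frame look vector [-tan(ψ_y + Δψ_y), tan(ψ_x + Δψ_x), -1] is rotated into the ECR frame, intersected with the reference ellipsoid (H = 0, WGS-84 assumed) rather than iterated against a DEM, and the intersection point is converted to longitude and latitude. All function names and numeric values are illustrative.

    import numpy as np

    A_ELL, E2 = 6378137.0, 6.69437999014e-3        # assumed WGS-84 parameters
    B_ELL = A_ELL * np.sqrt(1.0 - E2)              # semi-minor axis

    def ellipsoid_intersection(sat_pos, u_ecr):
        """Nearest intersection of the ray sat_pos + t * u_ecr (t > 0) with the reference ellipsoid."""
        d = u_ecr / np.linalg.norm(u_ecr)
        S = np.array([1.0 / A_ELL**2, 1.0 / A_ELL**2, 1.0 / B_ELL**2])   # x^2/a^2 + y^2/a^2 + z^2/b^2 = 1
        qa = np.sum(S * d * d)
        qb = 2.0 * np.sum(S * sat_pos * d)
        qc = np.sum(S * sat_pos * sat_pos) - 1.0
        disc = qb * qb - 4.0 * qa * qc
        if disc < 0.0:
            raise ValueError("line of sight does not intersect the ellipsoid")
        t = (-qb - np.sqrt(disc)) / (2.0 * qa)     # smaller positive root = first intersection
        return sat_pos + t * d

    def direct_position(psi_x, psi_y, dpsi_x, dpsi_y, R_body_to_ecr, sat_pos):
        """Longitude/latitude (degrees) of the ground point seen by one pixel, using the corrected angles."""
        u1 = np.array([-np.tan(psi_y + dpsi_y), np.tan(psi_x + dpsi_x), -1.0])   # corrected body-frame look vector
        P = ellipsoid_intersection(sat_pos, R_body_to_ecr @ u1)
        lon = np.degrees(np.arctan2(P[1], P[0]))
        lat = np.degrees(np.arctan2(P[2], (1.0 - E2) * np.hypot(P[0], P[1])))    # geodetic latitude for H = 0
        return lon, lat

    # Synthetic example: sensor 700 km above the equator, toy rotation whose body -Z axis looks to nadir
    sat_pos = np.array([A_ELL + 700e3, 0.0, 0.0])
    R = np.array([[0.0, 0.0, 1.0],
                  [0.0, 1.0, 0.0],
                  [1.0, 0.0, 0.0]])
    print(direct_position(psi_x=0.001, psi_y=0.0005, dpsi_x=1e-4, dpsi_y=-2e-4,
                          R_body_to_ecr=R, sat_pos=sat_pos))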
It will be apparent to those skilled in the art that the modules or steps of the present application described above may be implemented by a general-purpose computing device; they may be centralized on a single computing device or distributed across a network of computing devices; and they may be implemented by program code executable by a computing device, so that they may be stored in a storage device and executed by the computing device, or they may be fabricated separately as individual integrated circuit modules, or multiple modules or steps among them may be fabricated as a single integrated circuit module. Thus, the present application is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A positioning method based on satellite observation angle error estimation is characterized by comprising the following steps:
selecting a control point on a first scene image, and acquiring accurate geographic coordinates and elevation information of the control point;
obtaining a rough coordinate of the control point in a geocentric rotation coordinate system according to the elevation information;
converting the precise geographic coordinates of the control points into coordinates under the geocentric rotation coordinate system, and subtracting the rough coordinates from the coordinates to obtain coordinate errors of the control points in the geocentric rotation coordinate system;
calculating an observation angle error according to the coordinate error;
compensating the observation angle error to a second scene image to obtain an accurate observation angle value;
and calculating to obtain the geographic coordinates at least comprising longitude and latitude of the pixel points in the second scene image according to the accurate observation angle value.
2. The positioning method based on the satellite observation angle error estimation according to claim 1, wherein a control point is selected on the first scene image, and obtaining the precise geographic coordinates and elevation information of the control point comprises:
selecting a small number of control points on the first scene image and obtaining the accurate geographic coordinates of the corresponding control points from a digital map, wherein n is the number of control points; and obtaining the elevation information (L_i, B_i, H_i), i = 1, ..., n, of the control points using a DEM model.
3. The positioning method based on satellite observation angle error estimation according to claim 1, wherein obtaining the rough coordinates of the control point in the geocentric rotation coordinate system according to the elevation information includes:
using the obtained elevation information (L_i, B_i, H_i), i = 1, ..., n, calculating the rough coordinates (X_i, Y_i, Z_i), i = 1, ..., n, of the control points on the first scene image in the geocentric rotation coordinate system by a direct positioning method.
4. The positioning method based on satellite observation angle error estimation according to claim 1, wherein converting the precise geographic coordinates of the control point into coordinates in the geocentric rotation coordinate system and subtracting the rough coordinates to obtain the coordinate error of the control point in the geocentric rotation coordinate system comprises:
converting the precise geographic coordinates into (L_i^0, B_i^0, H_i^0), i = 1, ..., n;
converting (L_i^0, B_i^0, H_i^0), i = 1, ..., n, into the corresponding coordinates (X_i^0, Y_i^0, Z_i^0), i = 1, ..., n, in the geocentric rotation coordinate system; and
subtracting the rough coordinates (X_i, Y_i, Z_i), i = 1, ..., n, from (X_i^0, Y_i^0, Z_i^0) to obtain the coordinate errors (ΔX, ΔY, ΔZ) of the control points in the geocentric rotation coordinate system.
5. The satellite observation angle error estimation-based positioning method according to claim 1, wherein calculating an observation angle error from the coordinate error comprises:
if the observation angle of the satellite sensor contains systematic errors Δψ_x and Δψ_y,
obtaining the observation vector error propagated from the satellite body coordinate system into the geocentric rotation coordinate system according to the transformation relations from the satellite body coordinate system to the orbit coordinate system and from the orbit coordinate system to the conventional inertial coordinate system.
6. The positioning method based on satellite observation angle error estimation according to claim 1, wherein compensating the observation angle error onto the second scene image on the same orbit to obtain an accurate observation angle value comprises:
compensating the obtained observation angle error onto the second scene image on the same orbit to obtain the accurate observation angle (ψ_x + Δψ_x, ψ_y + Δψ_y), wherein (Δψ_x, Δψ_y) is the systematic error of the observation angle of the satellite sensor.
7. The positioning method according to claim 1, wherein the coordinates of any point on the ground in the geographic coordinate system are represented by (L, B, H) or (X, Y, Z), where L is the dihedral angle between the geodetic meridian plane of the ground point and the prime meridian plane, B is the angle between the ellipsoid normal at the ground point and the equatorial plane, and H is the ellipsoidal height, i.e., the distance from the ground point to the ellipsoid surface along the normal.
8. The positioning method based on the satellite observation angle error estimation according to claim 1, characterized in that the geocentric rotation coordinate system is an earth-fixed coordinate system referenced to the conventional terrestrial pole, with the earth's center of mass as the origin, the Z axis pointing to the north pole of the earth, the X axis pointing to the intersection of the Greenwich meridian and the equator, and the Y axis determined by the right-hand rule.
9. The positioning method based on satellite observation angle error estimation according to claim 1, characterized in that the first scene image is a satellite remote sensing image as a sample, and the second scene image is a satellite remote sensing image to be positioned.
10. The satellite observation angle error estimation-based positioning method of claim 1, wherein the second scene image is an image acquired on the same orbit as the first scene image.
CN201911333635.2A 2019-12-20 2019-12-20 Positioning method based on satellite observation angle error estimation Pending CN111060078A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911333635.2A CN111060078A (en) 2019-12-20 2019-12-20 Positioning method based on satellite observation angle error estimation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911333635.2A CN111060078A (en) 2019-12-20 2019-12-20 Positioning method based on satellite observation angle error estimation

Publications (1)

Publication Number Publication Date
CN111060078A 2020-04-24

Family

ID=70301483

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911333635.2A Pending CN111060078A (en) 2019-12-20 2019-12-20 Positioning method based on satellite observation angle error estimation

Country Status (1)

Country Link
CN (1) CN111060078A (en)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100201568A1 (en) * 2009-02-09 2010-08-12 National Taiwan University Method for implementing gps surveying field work planning using 3d topographic informaiton and method for analyzing 3d topographic information
CN102346033A (en) * 2010-08-06 2012-02-08 清华大学 Direct positioning method and system based on satellite observation angle error estimation
CN104121884A (en) * 2014-07-24 2014-10-29 中国科学院遥感与数字地球研究所 Method for calculating observation zenith angle and azimuth angle of pixel of satellite image

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
张华海 et al.: "A simple formula for calculating geodetic coordinates from space rectangular coordinates", 《全球定位系统》 (Global Positioning System) *
李岳: "Design and implementation of a coordinate transformation system", China Master's Theses Full-text Database (Electronic Journal) *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112762939A (en) * 2020-12-24 2021-05-07 北京航天飞腾装备技术有限责任公司 Target coordinate acquisition system and method based on digital map
CN112762939B (en) * 2020-12-24 2022-10-04 北京航天飞腾装备技术有限责任公司 Target coordinate acquisition system and method based on digital map
CN114187514A (en) * 2021-12-13 2022-03-15 北京环境特性研究所 Pixel space expression method of earth limb interface
CN114383632A (en) * 2021-12-23 2022-04-22 北京市遥感信息研究所 Optical satellite processing target positioning precision evaluation method based on mean square error
CN114383632B (en) * 2021-12-23 2023-09-29 北京市遥感信息研究所 Method for evaluating positioning accuracy of optical on-satellite processing target based on root mean square error
CN116796522A (en) * 2023-06-05 2023-09-22 中国人民解放军战略支援部队航天工程大学 Satellite data processing method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN111060078A (en) Positioning method based on satellite observation angle error estimation
EP3411725B1 (en) A method and device for calibration of a three-axis magnetometer
CN107504981B (en) Satellite attitude error correction method and device based on laser height measurement data
Wang et al. Geometric accuracy validation for ZY-3 satellite imagery
CN106871932B (en) Satellite-borne laser on-orbit pointing calibration method based on pyramid search terrain matching
CN109708649B (en) Attitude determination method and system for remote sensing satellite
CN104764443B (en) A kind of tight imaging geometry model building method of Optical remote satellite
CN110017849A (en) A kind of tilt measuring method of the mapping all-in-one machine based on GNSS receiver and IMU sensor
CN110503687B (en) Target positioning method for aerial photoelectric measurement platform
CN102346033A (en) Direct positioning method and system based on satellite observation angle error estimation
CN108225307A (en) A kind of star pattern matching method of inertia measurement information auxiliary
CN107917699A (en) A kind of method for being used to improve empty three mass of mountain area landforms oblique photograph measurement
CN112862966B (en) Method, device, equipment and storage medium for constructing surface three-dimensional model
CN113739767A (en) Method for producing orthoimage aiming at image acquired by domestic area array swinging imaging system
CN106441297B (en) The gravity error vector acquisition methods and device of inertial navigation system
CN108253942B (en) Method for improving oblique photography measurement space-three quality
CN106643726B (en) Unified inertial navigation resolving method
CN107705272A (en) A kind of high-precision geometric correction method of aerial image
CN108710145A (en) A kind of unmanned plane positioning system and method
CN112833878A (en) Near-ground multi-source astronomical autonomous navigation method
CN110514201A (en) A kind of inertial navigation system and the air navigation aid suitable for high revolving speed rotary body
CN114509071B (en) Attitude measurement method for wind tunnel test model
CN115950419A (en) Combined navigation method, device and system for subminiature unmanned aerial vehicle
CN108801260B (en) Data processing method and device based on underwater robot
CN110887475B (en) Static base rough alignment method based on north polarization pole and polarized solar vector

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200424)