CN113705734B - Remote sensing image characteristic point elevation obtaining method based on multiple sensors and geocentric - Google Patents


Info

Publication number
CN113705734B
CN113705734B (application CN202111163428.4A)
Authority
CN
China
Prior art keywords
remote sensing
point
slam
unmanned aerial
aerial vehicle
Prior art date
Legal status
Active
Application number
CN202111163428.4A
Other languages
Chinese (zh)
Other versions
CN113705734A (en)
Inventor
李晨阳
耿虎军
高峰
关俊志
张泽勇
柴兴华
陈彦桥
王雅涵
蔡迎哲
牛韶源
Current Assignee
CETC 54 Research Institute
Original Assignee
CETC 54 Research Institute
Priority date
Filing date
Publication date
Application filed by CETC 54 Research Institute
Priority to CN202111163428.4A
Publication of CN113705734A
Application granted
Publication of CN113705734B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/11Complex mathematical operations for solving equations, e.g. nonlinear equations, general mathematical optimization problems
    • G06F17/12Simultaneous equations, e.g. systems of linear equations

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Operations Research (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Algebra (AREA)
  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a remote sensing image characteristic point elevation obtaining method based on multiple sensors and the geocenter, belonging to the field of autonomous navigation and remote sensing imaging. The method controls an unmanned aerial vehicle to cruise in the remote sensing image area, adopts an SLAM method combining an onboard camera and an IMU to establish an SLAM map containing pose information of the unmanned aerial vehicle and sparse point cloud information of its flight scene, obtains the coordinates of the geocenter in the SLAM coordinate system, obtains the elevation of each feature point, and finally adds the elevation information to the remote sensing image. The method can obtain the coordinates of the geocenter in the SLAM coordinate system, enriches the remote sensing map with feature point elevation information, and can provide more references for remote sensing map users.

Description

Remote sensing image feature point elevation obtaining method based on multiple sensors and geocenter
Technical Field
The invention relates to the field of autonomous navigation and remote sensing images, in particular to a remote sensing image feature point elevation obtaining method based on multiple sensors and geocentric.
Background
The remote sensing map is a visual map. Each pixel point in the remote sensing map has accurate geographical position information, and position reference can be provided for a user. However, the remote sensing map only contains two-dimensional information of each pixel point, and a user cannot obtain elevation information of each point in the remote sensing map. Therefore, after elevation information is added to the feature points in the map by using the multiple sensors and the geocentric, the information in the remote sensing map is richer, and more references are provided for a user.
The multiple sensors comprise an airborne camera, an IMU (Inertial Measurement Unit), a barometric altimeter and the like. By fusing camera and IMU information, an unmanned aerial vehicle SLAM (Simultaneous Localization and Mapping) method can autonomously position the unmanned aerial vehicle and reconstruct a sparse point cloud map. However, the three-dimensional point information in that map is expressed relative to the position and orientation of the unmanned aerial vehicle SLAM at initialization, so the three-dimensional information of points cannot be added directly into a remote sensing map from the individual sensor measurements.
Disclosure of Invention
In order to solve the problems, the invention provides a remote sensing image feature point elevation acquisition method based on multiple sensors and geocentric, which is based on airborne camera image information and IMU information, realizes autonomous positioning of an unmanned aerial vehicle and sparse point cloud map reconstruction of an unmanned aerial vehicle flight scene through an unmanned aerial vehicle SLAM technology, recovers height information of feature points in a remote sensing image by using an air pressure altimeter and geocentric information, and provides more reference information for a remote sensing map user.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
a remote sensing image feature point elevation obtaining method based on multiple sensors and geocentric comprises the following steps:
loading a remote sensing image and calculating to obtain position information and descriptor information of feature points in the remote sensing map, setting a flight area of the unmanned aerial vehicle according to a shooting range of the remote sensing image, controlling the unmanned aerial vehicle to cruise and fly in the flight area, and calculating the average radius of the earth in the flight area according to the latitude and longitude range of the remote sensing image;
estimating the pose of the unmanned aerial vehicle by using a SLAM method combining an airborne camera and an IMU, recovering feature points in a camera image into three-dimensional point cloud, and establishing an SLAM map containing the pose of the unmanned aerial vehicle and scene three-dimensional point cloud information;
acquiring altitude information of the unmanned aerial vehicle at each moment by using a barometric altimeter, obtaining the distance from the unmanned aerial vehicle to the geocenter at each moment by adding according to the average radius of the earth in a flight area, establishing an equation set of geocentric coordinates in an SLAM coordinate system according to a point-to-point distance formula, and solving to obtain the coordinates of the geocenter in the SLAM coordinate system;
and performing feature matching on the airborne image and the remote sensing image to obtain a three-dimensional coordinate of a matching point in the SLAM map, calculating the distance between the matching point and the center of the earth in the SLAM coordinate system, subtracting the average radius of the earth in a flight area from the obtained distance to obtain elevation information of the feature point, and adding the elevation information of the feature point into the remote sensing map.
Further, estimating the pose of the unmanned aerial vehicle by using a SLAM method combining an airborne camera and an IMU, recovering feature points in a camera image into three-dimensional point cloud, and establishing an SLAM map containing the pose of the unmanned aerial vehicle and scene three-dimensional point cloud information, wherein the SLAM method comprises the following steps:
setting an origin and a direction of an unmanned aerial vehicle SLAM coordinate system, and determining an external parameter matrix of a camera and an IMU;
carrying out feature detection on the airborne camera image sequence to obtain position information and descriptor information of the feature points, and obtaining the positions of the same feature points in different camera images in a feature tracking mode;
calculating pose transformation among different camera images by a multi-view geometric method, and recovering feature points in the camera images into three-dimensional point clouds by a triangulation method;
optimizing the pose of the unmanned aerial vehicle and the three-dimensional point cloud coordinate by using a light beam adjustment method;
calculating and optimizing various parameters of the IMU according to the calculated pose information of the unmanned aerial vehicle and data output by the IMU, and calculating to obtain a pre-integral quantity of the IMU;
and integrating visual and IMU information to establish the unmanned aerial vehicle SLAM map with scale information.
Further, the method includes the steps of acquiring altitude information of the unmanned aerial vehicle at each moment by using a barometric altimeter, obtaining the distance from the unmanned aerial vehicle to the geocenter at each moment by adding according to the average radius of the earth in a flight area, establishing an equation set of geocentric coordinates in a SLAM coordinate system according to a point-to-point distance formula, and solving to obtain the coordinates of the geocenter in the SLAM coordinate system, and comprises the following steps:
acquiring and outputting the flight altitude information of the unmanned aerial vehicle with the time stamp in real time through a barometric altimeter, aligning the altitude information with SLAM information according to the time stamp to obtain the flight altitude of the unmanned aerial vehicle at each moment in the SLAM, and adding the flight altitude of the unmanned aerial vehicle with the average radius of the earth in a flight area to obtain the distance from the unmanned aerial vehicle to the geocenter at each moment;
and establishing an equation set of the geocentric coordinate in the SLAM coordinate system according to a point-to-point distance formula, and solving to obtain the coordinate of the geocentric in the SLAM coordinate system.
Further, the method comprises the steps of carrying out feature matching on the airborne image and the remote sensing image to obtain a three-dimensional coordinate of a matching point in the SLAM map, calculating the distance between the matching point and the center of earth in the SLAM coordinate system, subtracting the average radius of the earth in a flight area from the obtained distance to obtain elevation information of the feature point, and adding the elevation information of the feature point into the remote sensing map, wherein the method comprises the following steps:
matching the airborne image characteristic points and the remote sensing image characteristic points according to the descriptor information of the characteristic points, and establishing a characteristic matching point pair relation;
finding a corresponding three-dimensional coordinate point of the feature point in the remote sensing image in the SLAM map according to the mapping relation between the feature point of the airborne image and the three-dimensional point in the SLAM coordinate system;
obtaining the distance between the characteristic point of the remote sensing image in the SLAM map and the center of the earth by using a point-to-point distance formula of a coordinate system, and subtracting the average radius of the earth in a flight area from the obtained distance to obtain elevation information of the characteristic point in the remote sensing image;
when a certain feature point in the remote sensing map is matched with the feature points in the plurality of airborne images, respectively calculating elevation information for each matching condition, calculating an average value, and adding the average value serving as final elevation information into information of the feature point position of the remote sensing map.
Compared with the prior art, the method has the following beneficial effects:
the method controls the unmanned aerial vehicle to cruise in a remote sensing image area, the average earth radius of the area can be obtained according to longitude and latitude, an SLAM map containing pose information of the unmanned aerial vehicle and sparse point cloud information of a flight scene of the unmanned aerial vehicle is established by adopting an SLAM method combining an airborne camera and an IMU, the altitude of the unmanned aerial vehicle at each moment is obtained according to altimeter information, the distance from the unmanned aerial vehicle to the center of the earth at each moment can be obtained by adding the altitude to the average earth radius, the distance is equal to the Euclidean distance between the position of the unmanned aerial vehicle and the center of the earth in an SLAM coordinate system, an equation set is established according to the constraint, the coordinate of the center of the earth in an SLAM coordinate system can be obtained, feature matching is carried out on the remote sensing image and the airborne camera image, the mapped three-dimensional coordinates of the feature points in the sparse point cloud map which are successfully matched are found, the distance between the three-dimensional point and the center of the earth is calculated, the local average earth radius is subtracted from the distance, the elevation of the feature points can be obtained, and finally the elevation information is added to the remote sensing image. Compared with the existing remote sensing map, the method can obtain the coordinates of the geocentric in the SLAM coordinate system, increases the elevation information of the feature points in the remote sensing map, and can provide more references for a remote sensing map user.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the structures shown in the drawings without creative efforts.
FIG. 1 is a flow chart of a method according to an embodiment of the present invention.
Detailed description of the invention
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, a method for obtaining elevation of feature points of remote sensing images based on multiple sensors and geocentric includes:
s1, loading a remote sensing image, calculating to obtain position information and descriptor information of feature points in the remote sensing map, setting an unmanned aerial vehicle flight area according to a shooting range of the remote sensing image, controlling the unmanned aerial vehicle to cruise and fly in the area, and calculating an average earth radius in the area according to a longitude and latitude range of the remote sensing image;
s2, estimating the pose of the unmanned aerial vehicle by using a SLAM method combining an airborne camera and an IMU, recovering feature points in a camera image into three-dimensional point cloud, and establishing a SLAM map (a map generated based on the SLAM method) containing the pose of the unmanned aerial vehicle and scene three-dimensional point cloud information;
s3, acquiring altitude information of the unmanned aerial vehicle at each moment by using a barometric altimeter; since the average earth radius of the area is known, adding the altitude to the average earth radius gives the distance from the unmanned aerial vehicle to the geocenter at each moment; establishing an equation set for the geocenter coordinates in the SLAM coordinate system (the coordinate system established based on the SLAM method) according to a point-to-point distance formula, and solving to obtain the coordinates of the geocenter in the SLAM coordinate system;
and S4, carrying out feature matching between the airborne image and the remote sensing image to obtain the three-dimensional coordinates of the matching points in the SLAM map, calculating the distance between each matching point and the geocenter in the SLAM coordinate system, subtracting the average earth radius of the area from that distance to obtain the elevation information of the feature point, and finally adding the elevation information of the feature points into the remote sensing map.
In the step S1, loading the remote sensing image and calculating to obtain the position information and the descriptor information of the feature points in the remote sensing map, setting the flight area of the unmanned aerial vehicle according to the shooting range of the remote sensing image, controlling the unmanned aerial vehicle to cruise in the area, and calculating the average earth radius in the area according to the longitude and latitude of each point of the remote sensing image, wherein the method comprises the following steps:
s11: loading the remote sensing image and calculating by using a characteristic detection algorithm to obtain position information and descriptor information of characteristic points in the remote sensing image;
the feature detection algorithm is a feature extraction algorithm based on ORB features, a feature extraction algorithm based on SIFT features, or a feature extraction algorithm based on SURF features, or the like.
The characteristic points in the remote sensing image are recorded as $\{p^y_i\}_{i=1}^{N}$, where y denotes the y-th remote sensing image and N denotes the total number of visual features in that remote sensing image.
The descriptor information of the corresponding feature points, obtained by SIFT, SURF, ORB or similar methods from the pixels around each feature point, is recorded as $\{d^y_i\}_{i=1}^{N}$.
S12: setting a flight area of the unmanned aerial vehicle according to the shooting range of the remote sensing image, and controlling the unmanned aerial vehicle to cruise and fly after taking off in the area;
s13: and calculating the average earth radius in the area according to the latitude and longitude range of the remote sensing image.
In step S13, calculating the distance from each point in the area to the geocenter according to the latitude and longitude range of the remote sensing image, including:
s131: in the geographic information, the altitude is based on a geodetic horizontal plane, but the radius of the earth of each point on the plane is not easy to obtain, and the distance from each point in the area to the geocenter can be calculated by the radius of the standard ellipsoid of the earth of the point in consideration that the error (maximum 106 m) between the geodetic horizontal plane and the standard ellipsoid can be ignored compared with the radius of the earth (about 6356-6378 km);
s132: the longitude and latitude of each point in the remote sensing map are known; in the standard earth ellipsoid, the semi-major axis is 6378 km and the semi-minor axis is 6356 km, and all points at the same latitude are at an equal distance to the geocenter, independent of longitude; therefore the distance from each point to the geocenter can be calculated from its latitude alone, on the ellipse with semi-major axis 6378 km and semi-minor axis 6356 km;
s133: within the cruising range of the unmanned aerial vehicle, the variation of the earth radius is very small, so the average of the per-point radii is taken as the earth radius value for every point in the area.
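As a sketch of step S13 under stated assumptions (the patent's 6378 km / 6356 km semi-axes, latitude treated as geocentric latitude; the function names are illustrative, not from the patent):

```python
import math

# Semi-axes of the patent's standard earth ellipsoid, in km.
A_KM = 6378.0  # equatorial (semi-major) axis
B_KM = 6356.0  # polar (semi-minor) axis

def geocentric_distance_km(lat_deg):
    # Point (r*cos(phi), r*sin(phi)) lies on the ellipse x^2/a^2 + y^2/b^2 = 1,
    # so r = a*b / sqrt(b^2*cos^2(phi) + a^2*sin^2(phi)); longitude drops out.
    phi = math.radians(lat_deg)
    return (A_KM * B_KM) / math.sqrt((B_KM * math.cos(phi)) ** 2
                                     + (A_KM * math.sin(phi)) ** 2)

def mean_area_radius_km(lat_min_deg, lat_max_deg, steps=100):
    # Step S13: average the per-point radius over the image's latitude span.
    lats = [lat_min_deg + (lat_max_deg - lat_min_deg) * i / (steps - 1)
            for i in range(steps)]
    return sum(geocentric_distance_km(lat) for lat in lats) / len(lats)
```

The distance varies by only about 22 km between equator and pole, which is why a single area-average radius suffices over a drone's cruising range.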
In the step S2, estimating the pose of the unmanned aerial vehicle by using a SLAM method combining an airborne camera and an IMU, recovering feature points in a camera image into three-dimensional point cloud, and establishing an SLAM map containing the pose of the unmanned aerial vehicle and scene three-dimensional point cloud information, wherein the SLAM map comprises the following steps:
s21: setting an origin and a direction of an unmanned aerial vehicle SLAM coordinate system, and determining an external parameter matrix of a camera and an IMU;
and setting the initialized position of the SLAM of the unmanned aerial vehicle as the original point of the SLAM coordinate system, wherein the XYZ coordinate axes point to the front, the left and the upper of the unmanned aerial vehicle respectively. And determining an external parameter matrix of the camera and the IMU according to the position relation of the camera, the IMU and the unmanned aerial vehicle.
S22: carrying out feature detection on an airborne camera image sequence to obtain position information and descriptor information of feature points, and obtaining the positions of the same feature points in different camera images in a feature tracking mode;
the feature detection algorithm is a feature extraction algorithm based on ORB features, a feature extraction algorithm based on SIFT features, or a feature extraction algorithm based on SURF features, or the like.
The feature points in an airborne camera image are recorded as $\{q^y_i\}_{i=1}^{N}$, where y denotes the y-th airborne camera image and N denotes the total number of visual features in that image.
The descriptor information of the corresponding feature points, obtained by SIFT, SURF, ORB or similar methods from the pixels around each feature point, is recorded as $\{e^y_i\}_{i=1}^{N}$.
The feature tracking adopts the existing methods, such as KLT sparse optical flow tracking method, dense optical flow tracking method, feature matching method and the like.
S23: calculating pose transformation among different camera images by a multi-view geometric method, and recovering feature points in the camera images into three-dimensional point clouds by a triangulation method;
the multi-view geometry method includes a PNP method, a method of solving the inter-image basis matrix F, and a method of solving the inter-image homography matrix H, and the like.
S24: optimizing the pose of the unmanned aerial vehicle and the three-dimensional point cloud coordinate by using a light beam adjustment method;
s25: calculating and optimizing various parameters of the IMU according to the calculated pose information of the unmanned aerial vehicle and data output by the IMU, and calculating to obtain a pre-integral quantity of the IMU;
s26: and integrating visual and IMU information to establish the unmanned aerial vehicle SLAM map with scale information.
In step S3, acquiring altitude information of the unmanned aerial vehicle at each moment by using a barometric altimeter; since the average earth radius of the area is known, adding the altitude to the average earth radius gives the distance from the unmanned aerial vehicle to the geocenter at each moment; establishing an equation set of the geocenter coordinates in the SLAM coordinate system according to a point-to-point distance formula, and solving to obtain the coordinates of the geocenter in the SLAM coordinate system, the method comprises the following steps:
s31: the air pressure altimeter can obtain and output the flying height of the unmanned aerial vehicle in real time, the altimeter information is aligned with the SLAM information through the timestamp information to obtain the flying height of the unmanned aerial vehicle at each moment in the SLAM, and the flying height of the unmanned aerial vehicle at each moment is added with the average earth radius of the area to obtain the distance from the unmanned aerial vehicle to the center of the earth at each moment;
s32: and establishing an equation set of the geocentric coordinates in the SLAM coordinate system by using a point-to-point distance formula according to the distance from each point in the SLAM coordinate system to the geocenter, and solving to obtain the coordinates of the geocenter in the SLAM coordinate system.
In step S31, the barometric altimeter may obtain and output the flying height of the unmanned aerial vehicle in real time, align altimeter information with SLAM information through timestamp information, obtain the flying height of the unmanned aerial vehicle at each time in SLAM, add up with the average earth radius in the area, and then obtain the distance from the unmanned aerial vehicle to the geocenter at each time, including:
s311: the barometric altimeter obtains and outputs the flying altitude of the unmanned aerial vehicle in real time; by aligning the timestamp information of the barometric altimeter data frames and the SLAM image frames, the altitude of the unmanned aerial vehicle at each position in the SLAM coordinate system is obtained as $\{h_1, h_2, h_3, \dots, h_n\}$, where n is the number of unmanned aerial vehicle position points in the SLAM;
s312: adding the altitude at each position of the unmanned aerial vehicle to the average earth radius of the area gives the distance from each position of the unmanned aerial vehicle to the geocenter, $\{l_1, l_2, l_3, \dots, l_n\}$, where n is the number of unmanned aerial vehicle position points in the SLAM.
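A minimal sketch of the alignment and addition in S311-S312, assuming nearest-timestamp association (the patent does not specify the association rule) and illustrative numbers:

```python
import numpy as np

def align_by_timestamp(slam_ts, baro_ts, baro_h):
    # Associate each SLAM pose timestamp with the barometer sample whose
    # timestamp is nearest; baro_ts is assumed sorted.
    idx = np.clip(np.searchsorted(baro_ts, slam_ts), 1, len(baro_ts) - 1)
    left = idx - 1
    use_left = (slam_ts - baro_ts[left]) <= (baro_ts[idx] - slam_ts)
    return baro_h[np.where(use_left, left, idx)]

# Toy data: barometer altitudes (m) at 10 Hz, two SLAM keyframe times.
baro_ts = np.array([0.0, 0.1, 0.2, 0.3])
baro_h = np.array([100.0, 110.0, 120.0, 130.0])
slam_ts = np.array([0.09, 0.26])

R_MEAN_KM = 6371.0  # hypothetical area-average earth radius (step S13)
h = align_by_timestamp(slam_ts, baro_ts, baro_h)  # {h_i}, metres
l = R_MEAN_KM + h / 1000.0                        # {l_i} = R + h_i, km
```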
In step S32, according to the distance from each point in the SLAM coordinate system to the geocenter, an equation set of geocenter coordinates in the SLAM coordinate system is established by using a point-to-point distance formula, and the coordinate of the geocenter in the SLAM coordinate system is obtained by solving, including:
s321: in the SLAM coordinate system, the SLAM initialization point of the unmanned aerial vehicle is the origin, recorded as $(0, 0, 0)$, and the coordinates of each position of the unmanned aerial vehicle are recorded as $(x_i, y_i, z_i)$; the serial numbers of the positions correspond one-to-one with the distances from the unmanned aerial vehicle to the geocenter;
s322: the position of the geocenter in the SLAM coordinate system is set as $(x_o, y_o, z_o)$;
S323: according to the distance from each position to the geocenter, the point-to-point distance formula gives:

$$(x_i - x_o)^2 + (y_i - y_o)^2 + (z_i - z_o)^2 = l_i^2, \quad i = 1, 2, \dots, n$$
Expanding and rearranging gives:

$$x_i^2 - 2 x_i x_o + y_i^2 - 2 y_i y_o + z_i^2 - 2 z_i z_o + (x_o^2 + y_o^2 + z_o^2) = l_i^2$$
Subtracting the expanded equation for index i from the one for index i+1 cancels the quadratic terms $x_o^2 + y_o^2 + z_o^2$, leaving, for $i = 1, \dots, n-1$:

$$2(x_{i+1} - x_i)x_o + 2(y_{i+1} - y_i)y_o + 2(z_{i+1} - z_i)z_o = (x_{i+1}^2 + y_{i+1}^2 + z_{i+1}^2) - (x_i^2 + y_i^2 + z_i^2) - (l_{i+1}^2 - l_i^2)$$
converting the above equation into a matrix form:
A·X=b
wherein A is the $(n-1) \times 3$ coefficient matrix, X the unknown vector, and b the constant vector:

$$A = 2\begin{bmatrix} x_2 - x_1 & y_2 - y_1 & z_2 - z_1 \\ \vdots & \vdots & \vdots \\ x_n - x_{n-1} & y_n - y_{n-1} & z_n - z_{n-1} \end{bmatrix}, \quad X = \begin{bmatrix} x_o \\ y_o \\ z_o \end{bmatrix}, \quad b = \begin{bmatrix} (x_2^2 + y_2^2 + z_2^2) - (x_1^2 + y_1^2 + z_1^2) - (l_2^2 - l_1^2) \\ \vdots \\ (x_n^2 + y_n^2 + z_n^2) - (x_{n-1}^2 + y_{n-1}^2 + z_{n-1}^2) - (l_n^2 - l_{n-1}^2) \end{bmatrix}$$
Since A has $n-1$ rows and only 3 columns, the system is solved in the least-squares sense by multiplying both sides by the pseudo-inverse of the coefficient matrix, i.e.

$$X = (A^\top A)^{-1} A^\top b$$
When the coefficient matrix has full column rank, the equation has a unique least-squares solution. When it does not, the equation has multiple least-squares solutions, and the one with minimum norm is taken as the final solution. This yields the coordinates of the geocenter in the SLAM coordinate system.
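The construction of A and b and the least-squares solve can be checked numerically on synthetic data; the positions, geocenter location, and random seed below are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic UAV positions in the SLAM frame and a toy geocenter location
# (in practice the distances come from altimeter altitude + mean radius).
positions = rng.uniform(-50.0, 50.0, size=(20, 3))
center_true = np.array([12.0, -7.0, -6371.0])
dists = np.linalg.norm(positions - center_true, axis=1)  # the l_i

# Subtract consecutive expanded sphere equations -> linear system A X = b.
p, q = positions[:-1], positions[1:]
l, m = dists[:-1], dists[1:]
A = 2.0 * (q - p)
b = (np.sum(q ** 2, axis=1) - np.sum(p ** 2, axis=1)) - (m ** 2 - l ** 2)

# Least-squares solve; lstsq also returns the minimum-norm solution
# when A is rank-deficient.
center_est, *_ = np.linalg.lstsq(A, b, rcond=None)
```

With noise-free distances the estimate matches the true geocenter to numerical precision; real altimeter noise would make a larger n (more cruise poses) important for conditioning.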
In step S4, carrying out feature matching between the airborne image and the remote sensing image to obtain the three-dimensional coordinates of the matching points in the SLAM map, calculating the distance between each matching point and the geocenter in the SLAM coordinate system, subtracting the average earth radius of the area from that distance to obtain the elevation information of the feature point, and finally adding the elevation information of the feature points into the remote sensing map, the method comprises the following steps:
s41: according to the descriptor information of the characteristic points, the characteristic points of the airborne images are processed
Figure BDA0003290638670000075
And remote sensing image feature points
Figure BDA0003290638670000076
Matching is carried out, and a characteristic matching point pair relation is established;
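Assuming binary ORB-style descriptors (the patent equally allows SIFT or SURF, which use floating-point descriptors and L2 distance), the matching in S41 can be sketched as brute-force Hamming matching with a mutual-nearest-neighbour check:

```python
import numpy as np

def match_descriptors(desc_a, desc_b, max_dist=64):
    # Hamming distance matrix between binary descriptors via XOR + bit count.
    xor = desc_a[:, None, :] ^ desc_b[None, :, :]
    dist = np.unpackbits(xor, axis=2).sum(axis=2)
    # Mutual nearest-neighbour check rejects ambiguous pairings.
    ab = dist.argmin(axis=1)
    ba = dist.argmin(axis=0)
    return [(i, int(j)) for i, j in enumerate(ab)
            if ba[j] == i and dist[i, j] <= max_dist]

# Toy 256-bit (32-byte) descriptors: desc_a reuses two rows of desc_b,
# so exactly those two rows should match.
rng = np.random.default_rng(1)
desc_b = rng.integers(0, 256, size=(5, 32), dtype=np.uint8)
desc_a = desc_b[[2, 0]].copy()
pairs = match_descriptors(desc_a, desc_b)
```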
s42: finding the corresponding three-dimensional coordinate point in the SLAM map for each feature point in the remote sensing image, according to the mapping relation between the airborne image feature points and the three-dimensional points in the SLAM coordinate system, and recording its coordinates as $(X^y_i, Y^y_i, Z^y_i)$, where y denotes the y-th remote sensing image and i is the serial number of the feature point in the remote sensing image;
s43: by using a point-to-point distance formula in a coordinate system, the distance between the remote sensing image feature point in the SLAM map and the center of the earth can be obtained, and the following steps are obtained:
Figure BDA0003290638670000081
wherein d is the distance from the characteristic point to the geocenter in the SLAM coordinate system.
S44: subtracting the average earth radius of the area from the distance d between the feature point and the geocenter to obtain elevation information h of the feature point in the remote sensing image;
s45: when a certain feature point in the remote sensing map matches feature points in multiple airborne images, the above calculation is repeated for each match, the average elevation $\bar{h}$ is computed, and $\bar{h}$ is added as the final elevation information to the information at that feature point's position in the remote sensing map.
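Steps S44-S45 amount to a subtraction and a per-feature average; a minimal sketch with illustrative feature identifiers and distances:

```python
from collections import defaultdict

R_MEAN_KM = 6371.000  # hypothetical area-average earth radius (step S13)

# (remote-sensing feature id, distance d of its matched SLAM 3D point to
# the geocenter, in km) -- all values are illustrative.
matches = [
    ("pt_7", 6371.120),
    ("pt_7", 6371.140),  # same feature matched in a second airborne image
    ("pt_9", 6371.080),
]

elev = defaultdict(list)
for feat_id, d in matches:
    elev[feat_id].append(d - R_MEAN_KM)  # S44: h = d - R

# S45: average over all matches, then attach to the remote sensing map.
final_h = {fid: sum(hs) / len(hs) for fid, hs in elev.items()}
```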
In a word, the method controls the unmanned aerial vehicle to cruise in the remote sensing image area, and the average earth radius of the area is obtained from the longitude and latitude. An SLAM method combining an airborne camera and an IMU is adopted to establish an SLAM map containing pose information of the unmanned aerial vehicle and sparse point cloud information of the flight scene. The altitude of the unmanned aerial vehicle at each moment is obtained from the altimeter, and adding it to the average earth radius gives the distance from the unmanned aerial vehicle to the geocenter at each moment. This distance equals the Euclidean distance between the position of the unmanned aerial vehicle and the geocenter in the SLAM coordinate system; an equation set is established from this constraint, and the coordinates of the geocenter in the SLAM coordinate system are solved. Feature matching is then performed between the remote sensing image and the airborne camera image, the mapped three-dimensional coordinates of the successfully matched feature points in the sparse point cloud map are found, the distance between each three-dimensional point and the geocenter is calculated, and the local average earth radius is subtracted from that distance to obtain the elevation of the feature point. Finally, the elevation information is added to the remote sensing image. Compared with the existing remote sensing map, the method can obtain the coordinates of the geocenter in the SLAM coordinate system, enriches the remote sensing map with feature point elevation information, and can provide more references for remote sensing map users.

Claims (4)

1. A remote sensing image feature point elevation obtaining method based on multiple sensors and geocentric is characterized by comprising the following steps:
loading a remote sensing image and computing the position information and descriptor information of the feature points in the remote sensing map, setting a flight area of the unmanned aerial vehicle according to the shooting range of the remote sensing image, controlling the unmanned aerial vehicle to cruise in the flight area, and calculating the average earth radius of the flight area according to the latitude and longitude range of the remote sensing image;
estimating the pose of the unmanned aerial vehicle by a SLAM method combining an airborne camera and an IMU, recovering the feature points in the camera images into a three-dimensional point cloud, and establishing a SLAM map containing the pose of the unmanned aerial vehicle and the scene three-dimensional point cloud information;
acquiring altitude information of the unmanned aerial vehicle at each moment with a barometric altimeter, adding the average earth radius of the flight area to obtain the distance from the unmanned aerial vehicle to the geocenter at each moment, establishing a system of equations for the geocenter coordinates in the SLAM coordinate system according to the point-to-point distance formula, and solving to obtain the coordinates of the geocenter in the SLAM coordinate system;
and performing feature matching on the airborne image and the remote sensing image to obtain a three-dimensional coordinate of the matching point in the SLAM map, calculating the distance between the matching point and the geocenter in the SLAM coordinate system, subtracting the average radius of the earth in the flight area from the obtained distance to obtain elevation information of the feature point, and adding the elevation information of the feature point into the remote sensing map.
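A minimal sketch of the final elevation step of claim 1. The patent does not fix a specific model for the "average earth radius of the flight area"; the WGS84 geocentric-radius formula below is one illustrative way to derive it from latitude, and all names are hypothetical:

```python
import math

# WGS84 semi-major and semi-minor axes in metres (illustrative choice of
# ellipsoid; the claim only requires some local mean earth radius).
WGS84_A = 6378137.0
WGS84_B = 6356752.314245

def geocentric_radius(lat_deg):
    """Geocentric radius of the WGS84 ellipsoid at the given latitude."""
    phi = math.radians(lat_deg)
    a2c, b2s = WGS84_A ** 2 * math.cos(phi), WGS84_B ** 2 * math.sin(phi)
    ac, bs = WGS84_A * math.cos(phi), WGS84_B * math.sin(phi)
    return math.sqrt((a2c ** 2 + b2s ** 2) / (ac ** 2 + bs ** 2))

def feature_elevation(point_slam, geocenter_slam, mean_radius):
    """Elevation of a matched feature point: its Euclidean distance to the
    earth center in the SLAM frame minus the local mean earth radius."""
    return math.dist(point_slam, geocenter_slam) - mean_radius
```

At the equator the geocentric radius reduces to the semi-major axis, and at the poles to the semi-minor axis, which is a quick sanity check on the formula.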
2. The method for obtaining elevations of remote sensing image feature points based on multiple sensors and the geocenter as claimed in claim 1, wherein estimating the pose of the unmanned aerial vehicle by a SLAM method combining an airborne camera and an IMU, recovering the feature points in the camera images into a three-dimensional point cloud, and establishing a SLAM map containing the pose of the unmanned aerial vehicle and the scene three-dimensional point cloud information comprises the following steps:
setting the origin and orientation of the unmanned aerial vehicle SLAM coordinate system, and determining the extrinsic parameter matrix between the camera and the IMU;
carrying out feature detection on an airborne camera image sequence to obtain position information and descriptor information of feature points, and obtaining the positions of the same feature points in different camera images in a feature tracking mode;
calculating pose transformation among different camera images by a multi-view geometric method, and recovering feature points in the camera images into three-dimensional point clouds by a triangulation method;
optimizing the pose of the unmanned aerial vehicle and the three-dimensional point cloud coordinates by bundle adjustment;
estimating and refining the IMU parameters according to the computed pose information of the unmanned aerial vehicle and the data output by the IMU, and computing the IMU pre-integration terms;
and integrating visual and IMU information to establish the unmanned aerial vehicle SLAM map with scale information.
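A minimal sketch of the triangulation step in claim 2, using the midpoint method, which is one common choice; the claim does not fix a specific triangulation algorithm, and all names are illustrative:

```python
def triangulate_midpoint(o1, d1, o2, d2):
    """Recover a 3D point from two viewing rays (origin o, direction d) of
    the same feature in two camera images: find the closest pair of points
    on the two rays and return their midpoint."""
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    w = [o1[k] - o2[k] for k in range(3)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b          # zero only for parallel rays
    s = (b * e - c * d) / denom    # parameter along ray 1
    t = (a * e - b * d) / denom    # parameter along ray 2
    p1 = [o1[k] + s * d1[k] for k in range(3)]
    p2 = [o2[k] + t * d2[k] for k in range(3)]
    return [(p1[k] + p2[k]) / 2.0 for k in range(3)]
```

For rays that intersect exactly, both closest points coincide with the true 3D point; with noisy directions, the midpoint averages the two rays' disagreement.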
3. The method for obtaining elevations of remote sensing image feature points based on multiple sensors and the geocenter as claimed in claim 1, wherein acquiring altitude information of the unmanned aerial vehicle at each moment with a barometric altimeter, adding the average earth radius of the flight area to obtain the distance from the unmanned aerial vehicle to the geocenter at each moment, establishing a system of equations for the geocenter coordinates in the SLAM coordinate system according to the point-to-point distance formula, and solving to obtain the coordinates of the geocenter in the SLAM coordinate system comprises the following steps:
acquiring and outputting the timestamped flight altitude of the unmanned aerial vehicle in real time through the barometric altimeter, aligning the altitude information with the SLAM information by timestamp to obtain the flight altitude of the unmanned aerial vehicle at each SLAM moment, and adding the average earth radius of the flight area to this altitude to obtain the distance from the unmanned aerial vehicle to the geocenter at each moment;
and establishing an equation set of the geocentric coordinates in the SLAM coordinate system according to a point-to-point distance formula, and solving to obtain the coordinates of the geocentric in the SLAM coordinate system.
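A minimal sketch of the timestamp alignment in claim 3, using nearest-sample matching; the claim does not specify the alignment rule, and the `max_dt` tolerance and all names are assumptions:

```python
import bisect

def align_altitudes(slam_times, alt_times, alt_values, max_dt=0.05):
    """For each SLAM keyframe timestamp, pick the barometric altitude
    sample whose timestamp is nearest, discarding keyframes with no
    sample within max_dt seconds. alt_times must be sorted ascending."""
    out = {}
    for t in slam_times:
        i = bisect.bisect_left(alt_times, t)
        # The nearest sample is either just before or just after t.
        cands = [j for j in (i - 1, i) if 0 <= j < len(alt_times)]
        j = min(cands, key=lambda k: abs(alt_times[k] - t))
        if abs(alt_times[j] - t) <= max_dt:
            out[t] = alt_values[j]
    return out
```

Interpolating between the two neighbouring samples instead of picking the nearest one is an equally valid reading of "aligning by timestamp".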
4. The method for obtaining elevations of remote sensing image feature points based on multiple sensors and the geocenter as claimed in claim 1, wherein performing feature matching between the airborne images and the remote sensing image to obtain the three-dimensional coordinates of the matching points in the SLAM map, calculating the distance between each matching point and the geocenter in the SLAM coordinate system, subtracting the average earth radius of the flight area from this distance to obtain the elevation information of the feature point, and adding the elevation information of the feature point to the remote sensing map comprises the following steps:
matching the airborne image characteristic points and the remote sensing image characteristic points according to the descriptor information of the characteristic points, and establishing a characteristic matching point pair relation;
finding out a three-dimensional coordinate point corresponding to the feature point in the remote sensing image in the SLAM map according to the mapping relation between the feature point of the airborne image and the three-dimensional point in the SLAM coordinate system;
obtaining the distance between the feature point of the remote sensing image in the SLAM map and the geocenter by the point-to-point distance formula, and subtracting the average earth radius of the flight area from this distance to obtain the elevation information of the feature point in the remote sensing image;
when a certain feature point in the remote sensing map is matched with feature points in a plurality of airborne images, calculating elevation information separately for each match, computing the average value, and adding the average as the final elevation information to the position information of that feature point in the remote sensing map.
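A minimal sketch of the averaging rule in the last step of claim 4 (function and key names are illustrative):

```python
def fuse_elevations(matches):
    """matches maps each remote-sensing feature id to the list of
    elevations computed from its matched airborne-image points; the
    final elevation attached to the map is their mean. Features with
    no matches are omitted."""
    return {fid: sum(hs) / len(hs) for fid, hs in matches.items() if hs}
```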
CN202111163428.4A 2021-09-30 2021-09-30 Remote sensing image characteristic point elevation obtaining method based on multiple sensors and geocentric Active CN113705734B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111163428.4A CN113705734B (en) 2021-09-30 2021-09-30 Remote sensing image characteristic point elevation obtaining method based on multiple sensors and geocentric

Publications (2)

Publication Number Publication Date
CN113705734A CN113705734A (en) 2021-11-26
CN113705734B true CN113705734B (en) 2022-12-09

Family

ID=78662566

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111163428.4A Active CN113705734B (en) 2021-09-30 2021-09-30 Remote sensing image characteristic point elevation obtaining method based on multiple sensors and geocentric

Country Status (1)

Country Link
CN (1) CN113705734B (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109029417B (en) * 2018-05-21 2021-08-10 南京航空航天大学 Unmanned aerial vehicle SLAM method based on mixed visual odometer and multi-scale map
CN112577493B (en) * 2021-03-01 2021-05-04 中国人民解放军国防科技大学 Unmanned aerial vehicle autonomous positioning method and system based on remote sensing map assistance
CN113222820B (en) * 2021-05-20 2024-05-07 北京航空航天大学 Pose information-assisted aerial remote sensing image stitching method

Similar Documents

Publication Publication Date Title
CN109029417B (en) Unmanned aerial vehicle SLAM method based on mixed visual odometer and multi-scale map
CN110068335B (en) Unmanned aerial vehicle cluster real-time positioning method and system under GPS rejection environment
CN110926474B (en) Satellite/vision/laser combined urban canyon environment UAV positioning and navigation method
KR100912715B1 (en) Method and apparatus of digital photogrammetry by integrated modeling for different types of sensors
Nagai et al. UAV-borne 3-D mapping system by multisensor integration
Chiabrando et al. UAV and RPV systems for photogrammetric surveys in archaelogical areas: two tests in the Piedmont region (Italy)
CN112419374B (en) Unmanned aerial vehicle positioning method based on image registration
CN110859044A (en) Integrated sensor calibration in natural scenes
CN109443359B (en) Geographical positioning method of ground panoramic image
CN111024072B (en) Satellite map aided navigation positioning method based on deep learning
CN112184786B (en) Target positioning method based on synthetic vision
JPH11230745A (en) Altitude measurement device
KR102239562B1 (en) Fusion system between airborne and terrestrial observation data
CN115272596A (en) Multi-sensor fusion SLAM method oriented to monotonous texture-free large scene
CN112862966B (en) Method, device, equipment and storage medium for constructing surface three-dimensional model
CN113723568A (en) Remote sensing image characteristic point elevation obtaining method based on multiple sensors and sea level
CN108253942B (en) Method for improving oblique photography measurement space-three quality
CN113807435A (en) Remote sensing image characteristic point elevation acquisition method based on multiple sensors
CN115371673A (en) Binocular camera target positioning method based on Bundle Adjustment in unknown environment
CN113705734B (en) Remote sensing image characteristic point elevation obtaining method based on multiple sensors and geocentric
KR102130687B1 (en) System for information fusion among multiple sensor platforms
CN107784666B (en) Three-dimensional change detection and updating method for terrain and ground features based on three-dimensional images
CN116295340A (en) Unmanned aerial vehicle binocular vision SLAM method based on panoramic camera
CN115950435A (en) Real-time positioning method for unmanned aerial vehicle inspection image
CN114429515A (en) Point cloud map construction method, device and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant