CN109725340B - Direct geographic positioning method and device - Google Patents


Info

Publication number
CN109725340B
Authority
CN
China
Prior art keywords
coordinate system
image
rotation matrix
image space
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811651475.1A
Other languages
Chinese (zh)
Other versions
CN109725340A (en)
Inventor
刘夯
饶丹
王进
王陈
Current Assignee
Chengdu Zongheng Dapeng Unmanned Plane Technology Co ltd
Original Assignee
Chengdu Zongheng Dapeng Unmanned Plane Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Zongheng Dapeng Unmanned Plane Technology Co ltd filed Critical Chengdu Zongheng Dapeng Unmanned Plane Technology Co ltd
Priority to CN201811651475.1A
Publication of CN109725340A
Application granted
Publication of CN109725340B

Landscapes

  • Image Processing (AREA)
  • Navigation (AREA)

Abstract

The invention relates to the technical field of photogrammetry and remote sensing, and provides a direct geographic positioning method and device. The method comprises the following steps: acquiring an image to be processed, wherein the image to be processed is captured by a camera on a carrier; obtaining an image space-photogrammetry rotation matrix from an image space coordinate system to a photogrammetry coordinate system; predefining an image space auxiliary coordinate system, making the image space-photogrammetry rotation matrix and an image space-image auxiliary rotation matrix containing the exterior orientation angle elements satisfy an equivalence relation, and calculating the exterior orientation angle elements of the image to be processed; and transforming a target image point in the image to be processed into the photogrammetry coordinate system according to the exterior orientation line elements, the exterior orientation angle elements of the image to be processed, and preset interior orientation elements, to obtain a first geographic position of the target image point. Compared with the prior art, the direct geographic positioning method and device provided by the invention can realize a remote sensing system with a high-precision real-time geographic positioning function.

Description

Direct geographic positioning method and device
Technical Field
The invention relates to the technical field of photogrammetry and remote sensing, in particular to a direct geographical positioning method and a direct geographical positioning device.
Background
Direct Geo-Positioning is a technique in which the satellite-inertial integrated positioning system (GPS and INS Integrated Positioning System) carried by a mobile carrier or its task payload is used to geolocate a photo, or the photographed target, directly during operation.
In the prior art, the accuracy of direct geo-location of photographs is not high.
Disclosure of Invention
The present invention aims to provide a direct geographic positioning method and device to address the problem of low accuracy of direct geographic positioning of images in the prior art.
In order to achieve the above purpose, the embodiment of the present invention adopts the following technical solutions:
In a first aspect, an embodiment of the present invention provides a direct geolocation method, the method including: acquiring an image to be processed, wherein the image to be processed is captured by a camera on a carrier; obtaining an image space-photogrammetry rotation matrix from an image space coordinate system to a photogrammetry coordinate system; predefining an image space auxiliary coordinate system, making the image space-photogrammetry rotation matrix and an image space-image auxiliary rotation matrix containing the exterior orientation angle elements satisfy an equivalence relation, and calculating the exterior orientation angle elements of the image to be processed, wherein the image space-image auxiliary rotation matrix represents the rotation transformation relation from the image space coordinate system to the image space auxiliary coordinate system; and transforming a target image point in the image to be processed into the photogrammetry coordinate system according to the exterior orientation line elements, the exterior orientation angle elements of the image to be processed, and preset interior orientation elements, to obtain a first geographic position of the target image point.
In a second aspect, an embodiment of the present invention provides a direct geolocation apparatus including an acquisition module and a processing module. The acquisition module is used for acquiring an image to be processed, wherein the image to be processed is captured by a camera on a carrier;
the processing module is used for obtaining an image space-photogrammetry rotation matrix from an image space coordinate system to a photogrammetry coordinate system; predefining an image space auxiliary coordinate system, making the image space-photogrammetry rotation matrix and an image space-image auxiliary rotation matrix containing the exterior orientation angle elements satisfy an equivalence relation, and calculating the exterior orientation angle elements of the image to be processed, wherein the image space-image auxiliary rotation matrix represents the rotation transformation relation from the image space coordinate system to the image space auxiliary coordinate system; and transforming a target image point in the image to be processed into the photogrammetry coordinate system according to the exterior orientation line elements, the exterior orientation angle elements of the image to be processed, and preset interior orientation elements, to obtain a first geographic position of the target image point.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
According to the direct geographic positioning method and device provided by the embodiments of the invention, the image space-photogrammetry rotation matrix is equated with a predefined image space-image auxiliary rotation matrix containing the exterior orientation angle elements, and this equation is solved to obtain the exterior orientation angle elements of the image to be processed; the first position coordinates of the target image point in the image to be processed are then obtained from the exterior orientation angle elements, the exterior orientation line elements, and the interior orientation elements. High-precision direct geographic positioning of the image to be processed is thereby realized, and in turn a remote sensing system with a high-precision real-time geographic positioning function.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and should therefore not be regarded as limiting the scope; a person of ordinary skill in the art can derive other related drawings from these drawings without creative effort.
Fig. 1 shows a block schematic diagram of an electronic device provided by an embodiment of the present invention.
Fig. 2 shows a flowchart of a direct geolocation method provided by an embodiment of the present invention.
Fig. 3 is a flowchart illustrating the sub-steps of step S2 in fig. 2.
Fig. 4 shows a schematic diagram of a camera coordinate system provided by an embodiment of the invention.
Fig. 5 shows a schematic diagram of an image plane coordinate system provided by an embodiment of the invention.
Fig. 6 shows a block schematic diagram of a direct geolocation device provided by an embodiment of the present invention.
Icon: 100-an electronic device; 101-a processor; 102-a memory; 103-a bus; 104-a communication interface; 105-a display screen; 200-direct geolocation device; 201-an acquisition module; 202-processing module.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. The components of the embodiments of the present invention, as generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present invention presented in the figures is not intended to limit the scope of the claimed invention, but merely represents selected embodiments of the invention. All other embodiments obtained by a person skilled in the art without inventive work, based on the embodiments of the present invention, fall within the scope of protection of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present invention, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
In recent years, with the rise of applications such as robots and unmanned aerial vehicles, engineers engaged in research on robot and UAV positioning and navigation have paid increasing attention to, and carried out research on, direct geographic positioning technology.
Direct geographic positioning originated in aerial photogrammetry. Its currently mature application is in professional remote sensing software (such as Smart3D and Pix4Dmapper), where direct geographic positioning is used to roughly calculate the exterior orientation elements of a photo, providing initial iteration values for subsequent processing steps such as bundle adjustment.
In theory, direct geolocation techniques can generate map products in real time and near real time; because they do not rely on dense tie-point matching to estimate the exterior orientation elements of the photo, they are the only feasible approach for doing so. However, limited by the data acquisition and processing accuracy of the sensors, real-time and near-real-time map product generation is still not mature.
The current direct geographic positioning technology still has the following technical problems:
First, in the field of robot and UAV positioning and navigation, the research of engineers outside the photogrammetry and remote sensing profession is incompatible with established photogrammetric definitions, theories, and methods in matters such as coordinate system definition and the description of technical steps, which makes it difficult to output direct positioning results as surveying and mapping products.
Second, in the field of photogrammetry and remote sensing, the research object of direct geographic positioning technology has been a fixed camera pointed at the ground, with only 6 degrees of freedom (position coordinates and attitude angles); this is not suitable for pods with more degrees of freedom, motorized gimbals, or obliquely mounted fixed-gimbal cameras.
Third, because image space is two-dimensional, visual positioning generally requires at least a binocular vision system to establish stereo vision, calculating the three-dimensional spatial coordinates of the photographed target by space forward intersection so as to realize direct geographic positioning. This is not applicable to monocular remote sensing systems.
The technical problem to be solved by the invention is to provide a direct geographic positioning method that realizes direct geographic positioning for a monocular remote sensing system. The method differs from traditional direct geographic positioning methods in three respects. First, the exterior orientation angle elements of the image to be processed are solved by equating the image space-photogrammetry rotation matrix with a predefined image space-image auxiliary rotation matrix containing the exterior orientation angle elements; the first position coordinates of the target image point in the image to be processed are then obtained from the exterior orientation angle elements, the exterior orientation line elements, and the interior orientation elements, realizing high-precision direct geographic positioning of the image to be processed and, in turn, a remote sensing system with a high-precision real-time geographic positioning function. Second, a camera-carrier rotation matrix from the camera coordinate system to the carrier coordinate system is calculated by introducing three additional attitude angles of the camera, namely a first pitch angle, a first roll angle, and a spin-yaw angle, so that the method can be applied to pods with more degrees of freedom, motorized gimbals, or obliquely mounted fixed-gimbal cameras. Finally, the image to be processed is captured by a camera, the target image point on the image is directly geolocated, and the geographic position coordinates of the target object point are obtained from the central projection relationship between image point and object point together with a surface model, thereby realizing direct geographic positioning for a monocular remote sensing system.
On this basis, an image space-camera rotation matrix from the image space coordinate system to the camera coordinate system is obtained; a camera-carrier rotation matrix from the camera coordinate system to the carrier coordinate system; a carrier-local navigation rotation matrix from the carrier coordinate system to the local navigation coordinate system; and a local navigation-photogrammetry rotation matrix from the local navigation coordinate system to the photogrammetry coordinate system. The image space-photogrammetry rotation matrix is then calculated from these four matrices and equated with the image space-image auxiliary rotation matrix containing the exterior orientation angle elements to obtain the exterior orientation angle elements of the image to be processed, in a manner compatible with established photogrammetric definitions, theories, and methods.
A possible application scenario of the direct geolocation method is provided below; the method may be used in this scenario or in other possible application scenarios, which is not limited by the embodiment of the present invention.
The direct geographical positioning method provided by the embodiment of the present invention may be applied to the electronic device 100 on a carrier, or may be applied to the electronic device 100 on the ground.
The carrier may be, but is not limited to, an airplane or an unmanned aerial vehicle. The carrier is provided with a satellite-inertial integrated positioning system and carries a pod or gimbal on which a camera is mounted. The gimbal may be a two-axis or three-axis gimbal, and the camera may be, but is not limited to, a Charge-Coupled Device (CCD) camera or a Complementary Metal-Oxide-Semiconductor (CMOS) camera.
The electronic device 100 is connected to the pod/pan/tilt head, to the combined satellite-inertial navigation positioning system, and to the camera, where the connection may be an electrical connection or a communication connection, which is not limited in the embodiment of the present invention.
Referring to fig. 1, fig. 1 is a block diagram illustrating an electronic device 100 according to an embodiment of the invention, where the electronic device includes a processor 101, a memory 102, a bus 103, and a communication interface 104. The processor 101, the memory 102 and the communication interface 104 are connected by a bus 103, and the processor 101 is configured to execute an executable module, such as a computer program, stored in the memory 102.
The processor 101 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the direct geolocation method may be performed by hardware integrated logic circuits or software instructions in the processor 101. The processor 101 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
The memory 102 may comprise high-speed Random Access Memory (RAM) and may also include non-volatile memory, such as at least one disk memory. The memory 102 may be, but is not limited to, Random Access Memory (RAM), Read-Only Memory (ROM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
The bus 103 may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. Only one bidirectional arrow is shown in fig. 1, but this does not mean there is only one bus 103 or one type of bus 103.
The electronic device 100 implements a communication connection between the electronic device 100 and an external device (e.g., a camera, a pan-tilt, a combined satellite-inertial navigation positioning system, etc.) through at least one communication interface 104 (which may be wired or wireless). The memory 102 is used to store programs such as the direct geolocation device 200. The direct geolocation means 200 comprise at least one software functional module that may be stored in said memory 102 in the form of software or firmware (firmware) or solidified in the Operating System (OS) of the electronic device 100. The processor 101, upon receiving the execution instruction, executes the program to implement the direct geolocation method.
The display screen 105 is used to display an image, which may be the result of some processing by the processor 101. The display screen 105 may be a touch display screen, a display screen 105 without interactive functionality, or the like. The display screen 105 may display the image to be processed, the first geographical location, or the second geographical location.
It should be understood that the configuration shown in fig. 1 is merely a schematic application of the configuration of the electronic device 100, and that the electronic device 100 may also include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
Based on the electronic device 100, a possible implementation manner of the direct geographic positioning method is given below, an execution subject of the method may be the electronic device 100, please refer to fig. 2, and fig. 2 shows a flowchart of the direct geographic positioning method according to an embodiment of the present invention. The direct geolocation method includes the steps of:
S1, acquiring the image to be processed, wherein the image to be processed is captured by a camera on the carrier.
In an embodiment of the invention, the image to be processed may be captured by a camera on the carrier. The step of acquiring the image to be processed may be understood as follows: the electronic device 100 sends a control instruction to the camera, the camera captures the image to be processed, and the image to be processed is sent to the electronic device 100.
S2, an image space-photogrammetry rotation matrix from the image space coordinate system to the photogrammetry coordinate system is obtained.
In the embodiment of the invention, the image space coordinate system may take the photographing center as the coordinate origin; the first coordinate axis X_i and the second coordinate axis Y_i of the image space coordinate system are parallel to the x-axis and y-axis of the image plane coordinate system, respectively, and the third coordinate axis Z_i of the image space coordinate system is perpendicular to X_i and Y_i and satisfies the right-hand rule.
The coordinate origin of the photogrammetric coordinate system may be the intersection point of the photographing center projected to the ground along the inner surface normal direction, which is also the tangent point of the local tangent plane of the ground surface. The first coordinate axis X_p of the photogrammetric coordinate system points due east, the second coordinate axis Y_p points due north, and the third coordinate axis Z_p is along the outward surface normal; X_p, Y_p, and Z_p are mutually perpendicular and satisfy the right-hand rule.
The image space-photogrammetry rotation matrix may characterize the rotation transformation from the image space coordinate system to the photogrammetry coordinate system; specifically, it may be the product of an image space-camera rotation matrix, a camera-carrier rotation matrix, a carrier-local navigation rotation matrix, and a local navigation-photogrammetry rotation matrix.
Referring to fig. 3, step S2 may include the following sub-steps:
S21, an image space-camera rotation matrix from the image space coordinate system to the camera coordinate system is obtained.
Referring to fig. 4, the camera coordinate system may take the camera center as the coordinate origin. The first coordinate axis X_c of the camera coordinate system is along the main optical axis; the second coordinate axis Y_c is parallel to the transverse screen coordinate axis of the camera (the u-axis of the screen pixel coordinate system) and perpendicular to X_c; the third coordinate axis Z_c is perpendicular to both X_c and Y_c and satisfies the right-hand rule. The image space-camera rotation matrix may represent the rotation transformation from the image space coordinate system to the camera coordinate system.
Referring to fig. 5, the corresponding image space coordinate systems are different under different image plane coordinate systems, and three different image plane coordinate systems are described below:
First, the image plane coordinate system BLUH, defined by the University of Hannover, Germany: the origin is located at the geometric center of the photo, the x-axis is parallel and opposite to the pixel ordinate v-axis, the y-axis is parallel and opposite to the pixel abscissa u-axis, and the z-axis is perpendicular to the x-axis and y-axis and satisfies the right-hand rule;
Second, the image plane coordinate system PATB, defined by the University of Stuttgart, Germany: the origin is located at the geometric center of the photo, the x-axis is parallel and codirectional with the pixel ordinate v-axis, the y-axis is parallel and codirectional with the pixel abscissa u-axis, and the z-axis is perpendicular to the x-axis and y-axis and satisfies the right-hand rule;
Third, the image plane coordinate system CCHZ, defined in China's "Low-Altitude Digital Photogrammetry Specification" (CH/Z 3005-2010): the origin is located at the geometric center of the photo, the x-axis is parallel and codirectional with the pixel abscissa u-axis, the y-axis is parallel and codirectional with the pixel ordinate v-axis, and the z-axis is perpendicular to the x-axis and y-axis and satisfies the right-hand rule.
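To make the sign conventions above concrete, the following sketch converts a pixel coordinate (u to the right, v downward, an assumed screen convention) into image plane coordinates with the origin at the photo's geometric center, under each of the three definitions. The helper name and the image-size and pixel-size parameters are hypothetical, not from the patent:

```python
def pixel_to_image_plane(u, v, width, height, pixel_size, convention):
    """Convert pixel coordinates (u right, v down) to image plane
    coordinates (x, y) with the origin at the photo's geometric center.
    `convention` is one of "BLUH", "PATB", "CCHZ" as defined above."""
    du = (u - width / 2.0) * pixel_size   # metric offset along the u-axis
    dv = (v - height / 2.0) * pixel_size  # metric offset along the v-axis
    if convention == "BLUH":   # x anti-parallel to v, y anti-parallel to u
        return -dv, -du
    if convention == "PATB":   # x parallel to v, y parallel to u
        return dv, du
    if convention == "CCHZ":   # x parallel to u, y parallel to v
        return du, dv
    raise ValueError("unknown convention: %s" % convention)
```

The same pixel thus maps to three different image plane coordinates, which is why the image space-camera rotation matrix differs between the conventions.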
According to the definition of photogrammetry, the image plane coordinate system is translated along the negative z-axis direction to the photographing center S, and the image space coordinate system is established with its axes parallel to those of the image plane coordinate system.
Under the three different definitions, the corresponding image space coordinate systems also differ, and hence so does the image space-camera rotation matrix from the image space coordinate system to the camera coordinate system. For the different image plane conventions, the image space-camera rotation matrices may be:
[Equation images in the original: the three image space-camera rotation matrices corresponding to the BLUH, PATB, and CCHZ image plane conventions.]
S22, obtaining a camera-carrier rotation matrix from the camera coordinate system to the carrier coordinate system.
In the embodiment of the invention, the carrier coordinate system may take the position sensor of the satellite-inertial integrated positioning system on the carrier as the coordinate origin. The first coordinate axis X_b of the carrier coordinate system may be along the axial direction of the carrier (e.g., the fuselage axis of the drone); the second coordinate axis Y_b may be perpendicular to X_b and point to the right of the carrier's direction of travel; the third coordinate axis Z_b is perpendicular to X_b and Y_b and satisfies the right-hand rule. The camera-carrier rotation matrix may represent the rotation transformation relationship from the camera coordinate system to the carrier coordinate system.
The step of obtaining the camera-carrier rotation matrix from the camera coordinate system to the carrier coordinate system may be understood as follows. First, a first pitch angle, a first roll angle, and a spin-yaw angle of the camera are obtained, being the pitch, roll, and spin-yaw angles of the camera with respect to the carrier coordinate system. Since the camera is connected to the carrier via the gimbal or pod, and the rotation angles of the gimbal or pod are controllable, the first pitch angle, first roll angle, and spin-yaw angle of the camera with respect to the carrier coordinate system are known. Then, the camera-carrier rotation matrix from the camera coordinate system to the carrier coordinate system is calculated from the first pitch angle, the first roll angle, and the spin-yaw angle:
[Equation image in the original: the camera-carrier rotation matrix expressed in terms of the first pitch angle θ, the first roll angle φ, and the spin-yaw angle ψ.]
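Since the explicit matrix appears only as an image in the source, the sketch below shows one common way such a camera-carrier rotation can be assembled from the three attitude angles. The Z-Y-X (spin-yaw, then pitch, then roll) composition order and the function names are assumptions for illustration, not the patent's stated convention:

```python
import math

def rot_x(a):
    """Rotation about the X axis by angle a (radians)."""
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    """Rotation about the Y axis."""
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):
    """Rotation about the Z axis."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    """3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def camera_to_carrier(theta, phi, psi):
    """Camera->carrier rotation from the first pitch angle theta, the
    first roll angle phi, and the spin-yaw angle psi, assuming a
    Z-Y-X (yaw, then pitch, then roll) composition order."""
    return matmul(rot_z(psi), matmul(rot_y(theta), rot_x(phi)))
```

Whatever the actual order, the result is an orthogonal matrix, so chaining it with the other frame transformations below preserves lengths and angles.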
And S23, obtaining a carrier-local navigation rotation matrix from the carrier coordinate system to the local navigation coordinate system.
In the embodiment of the present invention, the local navigation coordinate system may take the position sensor of the satellite-inertial integrated positioning system as the coordinate origin. The first coordinate axis X_n of the local navigation coordinate system is along the due-north direction; the second coordinate axis Y_n may be along the due-east direction; the third coordinate axis Z_n may be along the inner surface normal; X_n, Y_n, and Z_n are mutually perpendicular and satisfy the right-hand rule. It can be understood that the coordinate axes of the local navigation coordinate system are parallel to those of the north-east-down (NED) coordinate system. The carrier-local navigation rotation matrix may represent the rotation transformation relationship from the carrier coordinate system to the local navigation coordinate system.
The step of obtaining the carrier-local navigation rotation matrix from the carrier coordinate system to the local navigation coordinate system may be understood as follows. First, a second pitch angle, a second roll angle, and an azimuth angle are obtained, being the pitch, roll, and azimuth angles of the carrier with respect to the local navigation coordinate system. The carrier is equipped with a satellite-inertial integrated positioning system, and the second pitch angle, second roll angle, and azimuth angle of the carrier can be obtained from the inertial navigation system within it. Then, the carrier-local navigation rotation matrix from the carrier coordinate system to the local navigation coordinate system is calculated from the second pitch angle, the second roll angle, and the azimuth angle:
[Equation image in the original: the carrier-local navigation rotation matrix expressed in terms of the second pitch angle Θ, the second roll angle Φ, and the azimuth angle Ψ of the carrier.]
S24, a local navigation-photogrammetry rotation matrix from the local navigation coordinate system to the photogrammetry coordinate system is obtained.
In the embodiment of the invention, the coordinate origin of the photogrammetric coordinate system may be the intersection point of the photographing center projected to the ground along the inverse direction of the surface normal, which is also the tangent point of the local tangent plane of the ground surface. The first coordinate axis X_p of the photogrammetric coordinate system points due east, the second coordinate axis Y_p points due north, and the third coordinate axis Z_p is along the outward surface normal; X_p, Y_p, and Z_p are mutually perpendicular and satisfy the right-hand rule. The local navigation-photogrammetry rotation matrix may characterize the rotation transformation relationship from the local navigation coordinate system to the photogrammetry coordinate system.
As an embodiment, the local navigation-photogrammetry rotation matrix may be:
Figure GDA0001963307930000121
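As an illustration only (the matrix image above is not reproduced): if the local navigation frame is taken as North-East-Down and the photogrammetric frame as East-North-Up, the local navigation-photogrammetry rotation reduces to a constant axis permutation with one sign flip:

```python
import numpy as np

# Assumed frames: local navigation = North-East-Down (NED),
# photogrammetric = East-North-Up (ENU). Under that assumption the
# rotation is constant: swap the first two axes and negate the third.
C_N_P = np.array([[0.0, 1.0,  0.0],   # photogrammetric X (east)  <- nav Y (east)
                  [1.0, 0.0,  0.0],   # photogrammetric Y (north) <- nav X (north)
                  [0.0, 0.0, -1.0]])  # photogrammetric Z (up)    <- nav Z (down), flipped
```

Note the matrix is its own inverse and has determinant +1, so it is a proper rotation despite looking like a reflection of individual axes.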
and S25, calculating an image space-photogrammetry rotation matrix according to the image space-camera rotation matrix, the camera-carrier rotation matrix, the carrier-local navigation rotation matrix and the local navigation-photogrammetry rotation matrix.
In the embodiment of the present invention, the image space-photogrammetry rotation matrix may represent the rotation transformation relationship from the image space coordinate system to the photogrammetry coordinate system, and its expression is:
Figure GDA0001963307930000122
wherein the content of the first and second substances,
Figure GDA0001963307930000123
is the image space-photogrammetry rotation matrix,
Figure GDA0001963307930000124
is the local navigation-photogrammetry rotation matrix,
Figure GDA0001963307930000125
is the carrier-local navigation rotation matrix,
Figure GDA0001963307930000126
is the camera-carrier rotation matrix,
Figure GDA0001963307930000127
is the image space-camera rotation matrix.
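The composition in step S25 is a plain chain of matrix products. A minimal sketch (argument names are illustrative):

```python
import numpy as np

def image_to_photogrammetry(C_i_c, C_c_b, C_b_n, C_n_p):
    """Chain the four rotations of step S25: image space -> camera ->
    carrier -> local navigation -> photogrammetric frame.

    Each argument is a 3x3 rotation matrix; the composed matrix is
    C_i^p = C_n^p @ C_b^n @ C_c^b @ C_i^c (rightmost applied first).
    """
    return C_n_p @ C_b_n @ C_c_b @ C_i_c
```

Because each factor is a rotation, the composed matrix is also a rotation, and any of the four stages can be replaced independently (e.g. a different gimbal model) without touching the rest of the chain.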
And S3, predefining an image space auxiliary coordinate system, enabling the image space-photogrammetry rotation matrix and the image space-image auxiliary rotation matrix containing the external orientation angle element to satisfy an equivalence relation, and calculating the external orientation angle element of the image to be processed, wherein the image space-image auxiliary rotation matrix represents the rotation transformation relation from the image space coordinate system to the image space auxiliary coordinate system.
In the embodiment of the present invention, the image space auxiliary coordinate system may take the photographing center as the coordinate origin, with the first coordinate axis Xa of the image space auxiliary coordinate system pointing east, the second coordinate axis Ya pointing true north, and the third coordinate axis Za lying along the outward normal of the earth's surface; the first, second and third coordinate axes are mutually perpendicular and satisfy the right-hand rule. The image space-image auxiliary rotation matrix represents the rotation transformation relationship from the image space coordinate system to the image space auxiliary coordinate system.
The derivation of the image spatio-image assisted rotation matrix is as follows:
First, derivation according to the image plane coordinate system BLUH: with the Z-axis as the principal axis, for the
Figure GDA0001963307930000134
angle system, the rotation matrix of the image space coordinate system relative to the image space auxiliary coordinate system is:
Figure GDA0001963307930000131
Second, derivation according to the image plane coordinate system PATB: with the Z-axis as the principal axis, for the
Figure GDA0001963307930000135
angle system, the rotation matrix of the image space coordinate system relative to the image space auxiliary coordinate system is:
Figure GDA0001963307930000132
Third, derivation according to the image plane coordinate system CCHZ: usually with the Y-axis as the principal axis, for the
Figure GDA0001963307930000133
angle system, the rotation matrix of the image space coordinate system relative to the image space auxiliary coordinate system is:
Figure GDA0001963307930000141
Since the coordinate axes of the photogrammetric coordinate system are parallel to those of the image space auxiliary coordinate system (the photogrammetric coordinate system is the image space auxiliary coordinate system translated along the Z-axis, in the opposite direction, down to the ground), the image space-image auxiliary rotation matrix from the image space coordinate system to the image space auxiliary coordinate system should be equivalent to the image space-photogrammetry rotation matrix from the image space coordinate system to the photogrammetric coordinate system.
Namely, the image space-photogrammetry rotation matrix and the image space-image auxiliary rotation matrix containing the external orientation angle elements satisfy the equivalence relation:
Figure GDA0001963307930000142
wherein,
Figure GDA0001963307930000143
is the image space-photogrammetry rotation matrix,
Figure GDA0001963307930000144
is the image space-image auxiliary rotation matrix.
That is, the following relation holds:
Figure GDA0001963307930000145
from this it is deduced:
First, the calculation method of the external orientation angle elements corresponding to the image plane coordinate system BLUH is as follows:
Figure GDA0001963307930000146
Second, the calculation method of the external orientation angle elements corresponding to the image plane coordinate system PATB is as follows:
Figure GDA0001963307930000151
Third, the calculation method of the external orientation angle elements corresponding to the image plane coordinate system CCHZ is as follows:
Figure GDA0001963307930000152
Setting the image space-image auxiliary rotation matrix equal to the image space-photogrammetry rotation matrix, the external orientation angle elements of the image to be processed can then be solved: κ, ω and
Figure GDA0001963307930000153
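A sketch of how these angle elements can be recovered from the equivalence relation: build the image space-image auxiliary matrix from angles, then invert it with arctan/arcsin. The patent's exact BLUH/PATB/CCHZ formulas are in equation images not reproduced here, so a single φ-ω-κ convention with the Y-axis as primary (CCHZ-like) is assumed below:

```python
import numpy as np

def rotation_from_angles(phi, omega, kappa):
    """Build R = Ry(phi) @ Rx(omega) @ Rz(kappa): one common phi-omega-kappa
    convention with Y as the primary axis (assumed, CCHZ-like)."""
    c, s = np.cos, np.sin
    Ry = np.array([[c(phi), 0.0, s(phi)], [0.0, 1.0, 0.0], [-s(phi), 0.0, c(phi)]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, c(omega), -s(omega)], [0.0, s(omega), c(omega)]])
    Rz = np.array([[c(kappa), -s(kappa), 0.0], [s(kappa), c(kappa), 0.0], [0.0, 0.0, 1.0]])
    return Ry @ Rx @ Rz

def angles_from_rotation(R):
    """Recover (phi, omega, kappa) from R under the same convention.

    For this factorization: R[0,2] = sin(phi)cos(omega), R[2,2] = cos(phi)cos(omega),
    R[1,2] = -sin(omega), R[1,0] = cos(omega)sin(kappa), R[1,1] = cos(omega)cos(kappa).
    """
    phi = np.arctan2(R[0, 2], R[2, 2])
    omega = np.arcsin(-R[1, 2])
    kappa = np.arctan2(R[1, 0], R[1, 1])
    return phi, omega, kappa
```

In the method of the patent, `R` would be the composed image space-photogrammetry rotation matrix, and the extraction formulas change per angle system (BLUH, PATB, CCHZ) via algorithm branches.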
And S4, converting the target image point in the image to be processed into the photogrammetric coordinate system according to the external orientation angle elements, the external orientation line elements of the image to be processed, and the preset internal orientation elements, to obtain a first geographical position of the target image point.
In the embodiment of the present invention, the external orientation line elements of the image to be processed may be the coordinates of the photographing center in an object coordinate system, and may be obtained by GPS measurement in the satellite-inertial combined positioning system (a simple conversion is required between different object coordinate systems). The preset internal orientation elements may be the parameters of the positional relationship between the photographing center and the photo (i.e., the image to be processed), namely the perpendicular distance from the photographing center to the photo and the coordinates of the image principal point in the image plane coordinate system. The internal orientation elements may be preset in the electronic device 100, or preset in the camera and obtained by the electronic device 100; this is not limited herein.
The first geographical position may be a position coordinate of a target image point in the image to be processed in the photogrammetric coordinate system. The target image point may be any image point in the image to be processed, and the target image point may be selected by a user in a self-defined manner or may be an image point at a preset position in the image to be processed.
The target image point in the image to be processed is converted into the photogrammetric coordinate system according to the external orientation angle elements, the external orientation line elements of the image to be processed and the preset internal orientation elements, to obtain the first geographical position of the target image point.
And S5, iteratively calculating a second geographical position of the target object point in the photogrammetric coordinate system according to the first geographical position, the central projection relation between the image point and the object point and the preset surface model, wherein the target image point corresponds to the target object point.
In the embodiment of the present invention, the preset Surface Model may be a Digital Surface Model (DSM), a Digital Elevation Model (DEM), or a Digital Terrain Model (DTM), which is not limited herein.
The second geographic location may be a location coordinate of the target object point in the photogrammetric coordinate system corresponding to the target image point.
The imaging geometry of central projection is restored in the photogrammetric coordinate system through the transformation from the image space coordinate system to the photogrammetric coordinate system, giving a one-to-many mapping from the target image point to object points. To further obtain a one-to-one mapping from the target image point to the target object point, the mathematical surface on which the object point lies needs to be introduced as a constraint. Generally, a preset surface model of the object point is intersected with the centrally projected ray to determine this one-to-one mapping.
Let the coordinate vector of the target image point in the image space coordinate system be Vi = [xi yi −f]T, where f is the camera principal distance; let the coordinate vector of the target image point in the image space auxiliary coordinate system be Va = [Xa Ya Za]T; let the coordinate vector of the projection point mapped from the target image point in the photogrammetric coordinate system be VP = [XP YP ZP]T; and let the coordinate vector of the photographing center in the photogrammetric coordinate system be VS = [XS YS ZS]T. According to the imaging geometry, the following relation is satisfied:
Figure GDA0001963307930000161
The introduced digital surface model DSM is expressed as a binary function of XP and YP:

ZP = DSM(XP, YP)
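The binary function DSM(XP, YP) can be realized, for example, by bilinear interpolation on a regular elevation grid; the grid layout below (row = Y index, column = X index, uniform cell size) is an illustrative assumption, not the patent's data format:

```python
import numpy as np

def dsm_query(dsm, x0, y0, cell, Xp, Yp):
    """Evaluate Z_P = DSM(X_P, Y_P) by bilinear interpolation.

    dsm:   2-D array of elevations, dsm[row, col] with row along Y, col along X
    x0,y0: planimetric coordinates of grid node [0, 0]
    cell:  grid spacing (same units as Xp, Yp)
    """
    cx = (Xp - x0) / cell          # fractional column index
    ry = (Yp - y0) / cell          # fractional row index
    i0, j0 = int(np.floor(ry)), int(np.floor(cx))
    ty, tx = ry - i0, cx - j0      # fractional parts within the cell
    z00 = dsm[i0, j0]
    z01 = dsm[i0, j0 + 1]
    z10 = dsm[i0 + 1, j0]
    z11 = dsm[i0 + 1, j0 + 1]
    # Blend the four surrounding grid elevations.
    return ((1 - ty) * ((1 - tx) * z00 + tx * z01)
            + ty * ((1 - tx) * z10 + tx * z11))
```

Any callable with this signature (DEM, DTM, or a constant plane) can stand in for the surface model in the iteration that follows.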
In practice, to avoid introducing the binary function DSM(XP, YP) directly into the equations when solving for VP, the patent "A method for locating a target by combining off-line elevation with an onboard electro-optic pod" (application No. CN201810409853.9) may be referenced, and VP is calculated by iterative approximation. The basic steps are as follows:
(1) Input the current carrier position parameters, generally the WGS84 longitude, latitude and height coordinates in the geodetic coordinate system, and convert them by a known algorithm (such as the geodetic2enu function of Matlab) into the coordinate vector VS = [XS YS ZS]T of the photographing center S in the photogrammetric coordinate system. The photogrammetric coordinate system is established with the intersection point of the photographing center projected to the ground along the direction opposite to the surface normal as the coordinate origin, the first coordinate axis Xp pointing east, the second coordinate axis Yp pointing true north, and the third coordinate axis Zp along the outward normal of the earth's surface.
(2) According to the input carrier attitude and camera relative attitude parameters, obtain the rotation matrix from the image space coordinate system to the photogrammetric coordinate system
Figure GDA0001963307930000171
(3) Given the target image point coordinates (xi, yi): if the current focal length F is calculated from the input field-of-view angle, the principal distance f is approximately equal to F; more precisely, f can be calculated by f = u × F/(u − F), where u is the object distance, i.e. the distance from the photographing center S to the model projection point. The corresponding image space coordinate vector is then Vi = [xi yi −f]T.
(4) Calculate the position coordinates of the target image point in the image space auxiliary coordinate system
Figure GDA0001963307930000172
thereby obtaining the Za coordinate value of the target image point;
(5) Give the model projection point of the target image point an initial ZP coordinate value: if the current point is the first point calculated, take a default value (generally 0); if not, the ZP coordinate value of the last projection point whose iterative calculation has finished may be taken as the initial value;
(6) From the ZP coordinate value of the current model projection point, calculate the initial scale factor λ, then calculate VP, obtaining the (XP, YP) coordinates;
(7) According to (XP, YP), obtain ZP' by querying the digital surface model, and determine whether |ZP − ZP'| satisfies the desired accuracy: if so, take ZP = (ZP + ZP')/2, return VP = [XP YP ZP]T, and exit the iteration; if not, take ZP = ZP' and repeat steps (6) to (7).
The finally obtained VP = [XP YP ZP]T is the second geographical position of the target object point.
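Steps (1) to (7) above can be condensed into a short iteration loop. Variable names, the convergence tolerance and the iteration cap below are a sketch, not the patent's reference implementation; `dsm_func` is any callable returning terrain elevation at planimetric coordinates:

```python
import numpy as np

def locate_object_point(V_s, C_i_p, V_i, dsm_func, z_init=0.0, tol=0.1, max_iter=50):
    """Iteratively intersect the central-projection ray with a surface model.

    V_s:      photographing center in the photogrammetric frame, [Xs, Ys, Zs]
    C_i_p:    image space -> photogrammetric rotation matrix (3x3)
    V_i:      image space vector of the target point, [xi, yi, -f]
    dsm_func: callable (X, Y) -> terrain elevation Z
    Returns V_P = [XP, YP, ZP], the object point (second geographical position).
    """
    V_a = C_i_p @ V_i              # ray direction in the rotated (auxiliary) frame
    Zp = z_init                    # step (5): initial model-point elevation
    for _ in range(max_iter):
        lam = (Zp - V_s[2]) / V_a[2]   # step (6): scale factor to reach height Zp
        V_p = V_s + lam * V_a
        Zp_new = dsm_func(V_p[0], V_p[1])  # step (7): query the surface model
        if abs(V_p[2] - Zp_new) < tol:
            V_p[2] = 0.5 * (V_p[2] + Zp_new)
            return V_p
        Zp = Zp_new
    return V_p                     # best estimate if the cap is reached
```

For flat terrain the loop converges in one pass; over rugged terrain each pass re-scales the ray to the elevation found at the previous planimetric position, which is the iterative approximation the patent describes.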
To further improve accuracy, the coordinate systems may be compensated, which requires measuring the displacement or angular deviations between them. Displacement deviation compensation may take into account the offset vector Tdb from the GNSS receiver (the position sensor in the satellite-inertial combined positioning system) to the center of the pod or gimbal axis system, the offset vector Tcd from the center of the pod or gimbal axis system to the camera center, and the deviation T0 = [−x0 −y0 0]T of the image principal point (x0, y0) from the geometric center of the picture. Angular deviation compensation may take into account the installation angle compensation matrix of the pod or gimbal relative to the carrier
Figure GDA0001963307930000181
Taking the above-mentioned deviations into account, the camera-carrier rotation matrix
Figure GDA0001963307930000182
is equivalent to
Figure GDA0001963307930000183
and the following relationship holds:
Figure GDA0001963307930000184
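The displacement (lever-arm) part of the compensation can be sketched as follows. The frames in which Tdb and Tcd are expressed, and the function name, are assumptions for illustration; the patent only names the offset vectors:

```python
import numpy as np

def compensated_camera_center(V_gnss, C_b_p, T_db, C_c_b, T_cd):
    """Shift the GNSS-measured position to the camera's photographic center.

    V_gnss: GNSS receiver position in the photogrammetric frame
    C_b_p:  carrier -> photogrammetric rotation matrix
    T_db:   offset GNSS receiver -> pod/gimbal axis center (carrier frame, assumed)
    C_c_b:  camera -> carrier rotation matrix
    T_cd:   offset pod/gimbal axis center -> camera center (camera frame, assumed)
    """
    # Rotate each lever arm into the photogrammetric frame, then add.
    return V_gnss + C_b_p @ (T_db + C_c_b @ T_cd)
```

With zero offsets the GNSS position is returned unchanged, so the compensation degrades gracefully when the lever arms are not measured.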
Compared with the prior art, the embodiments of the invention have the following advantages:
First, the external orientation angle elements of the image to be processed are solved by equating the image space-photogrammetry rotation matrix with a predefined image space-image auxiliary rotation matrix containing the external orientation angle elements; the first position coordinates of the target image point in the image to be processed are then obtained from the external orientation angle elements, the external orientation line elements and the internal orientation elements. High-precision direct geographical positioning of the image to be processed is thus realized, enabling a remote sensing system with a high-precision real-time geographical positioning function.
Second, considering the three image plane coordinate system definitions BLUH, PATB and CCHZ, one can configure, according to the specific coordinate system type in use, the
Figure GDA0001963307930000185
in the equation
Figure GDA0001963307930000186
and
Figure GDA0001963307930000187
matrices. Output under the different coordinate system definitions is realized through algorithm branches; the input degree-of-freedom parameters need not be matched to a coordinate system definition in advance, which unifies the external interface and remains compatible with existing photogrammetric definitions, theories and methods.
Then, the camera-carrier rotation matrix from the camera coordinate system to the carrier coordinate system is calculated from three attitude angles of the camera, namely the first pitch angle, the first roll angle and the rotation deflection angle, so the method can be applied to pods with more degrees of freedom, motorized gimbals, or cameras on inclined fixed mounts.
Finally, the image to be processed is obtained by camera shooting, the target image point on the image to be processed is geographically positioned, and the target object point corresponding to the target image point is geographically positioned to obtain the second geographical position, thereby realizing direct geographical positioning of a monocular remote sensing system.
With reference to the method flows of fig. 2 to fig. 3, a possible implementation of the direct geographic positioning apparatus 200 is given below. The direct geographic positioning apparatus 200 may be implemented by the device structure of the electronic device 100 in the foregoing embodiment, or by the processor 101 in the electronic device 100. Please refer to fig. 6, which shows a block diagram of the direct geographic positioning apparatus provided in the embodiment of the present invention. The direct geographic positioning apparatus 200 includes an obtaining module 201 and a processing module 202.
An obtaining module 201, configured to obtain an image to be processed, where the image to be processed is obtained by shooting with a camera on a carrier;
A processing module 202, configured to obtain the image space-photogrammetry rotation matrix from the image space coordinate system to the photogrammetry coordinate system; predefine an image space auxiliary coordinate system such that the image space-photogrammetry rotation matrix and the image space-image auxiliary rotation matrix containing the external orientation angle elements satisfy an equivalence relation, and calculate the external orientation angle elements of the image to be processed, wherein the image space-image auxiliary rotation matrix represents the rotation transformation relationship from the image space coordinate system to the image space auxiliary coordinate system; and convert the target image point in the image to be processed into the photogrammetric coordinate system according to the external orientation angle elements, the external orientation line elements of the image to be processed and the preset internal orientation elements, to obtain the first geographical position of the target image point.
In this embodiment of the present invention, the processing module 202 is specifically configured to: taking a photographing center as a coordinate origin, wherein a first coordinate axis of an image space auxiliary coordinate system points to the east, a second coordinate axis of the image space auxiliary coordinate system points to the north, and a third coordinate axis of the image space auxiliary coordinate system points to the outer normal direction of the earth surface, and an image space auxiliary coordinate system is established, wherein the first coordinate axis, the second coordinate axis and the third coordinate axis of the image space auxiliary coordinate system are mutually vertical and meet the right-hand rule; the image space-photogrammetry rotation matrix and the image space-image auxiliary rotation matrix containing the external orientation angle elements satisfy an equivalence relation:
Figure GDA0001963307930000201
wherein,
Figure GDA0001963307930000202
is the image space-photogrammetry rotation matrix,
Figure GDA0001963307930000203
is the image space-image auxiliary rotation matrix.
The processing module 202 may be further specifically configured to: obtaining an image space-camera rotation matrix from an image space coordinate system to a camera coordinate system; obtaining a camera-carrier rotation matrix from a camera coordinate system to a carrier coordinate system; obtaining a carrier-local navigation rotation matrix from a carrier coordinate system to a local navigation coordinate system; obtaining a local navigation-photogrammetry rotation matrix from a local navigation coordinate system to a photogrammetry coordinate system; and calculating the image space-photogrammetry rotation matrix according to the image space-camera rotation matrix, the camera-carrier rotation matrix, the carrier-local navigation rotation matrix and the local navigation-photogrammetry rotation matrix.
In the embodiment of the present invention, the camera coordinate system uses the photographing center as the origin of coordinates, the first coordinate axis of the camera coordinate system is along the direction of the main optical axis, the second coordinate axis of the camera coordinate system is parallel to the screen lateral coordinate axis of the camera, the third coordinate axis of the camera coordinate system is perpendicular to the first coordinate axis and the second coordinate axis, and satisfies the right-hand rule, and the image space-camera rotation matrix includes:
Figure GDA0001963307930000204
Figure GDA0001963307930000205
Figure GDA0001963307930000206
i.e., one of the above.
The processing module 202 may be further specifically configured to: acquire a first pitch angle, a first roll angle and a rotation deflection angle of the camera; and calculate the camera-carrier rotation matrix from the camera coordinate system to the carrier coordinate system according to the first pitch angle, the first roll angle and the rotation deflection angle:
Figure GDA0001963307930000211
wherein,
Figure GDA0001963307930000212
is the camera-carrier rotation matrix, θ is the first pitch angle,
Figure GDA0001963307930000213
is the first roll angle, and ψ is the rotation deflection angle.
The processing module 202 may be further specifically configured to calculate using the image space-photogrammetry rotation matrix expression:
Figure GDA0001963307930000214
wherein,
Figure GDA0001963307930000215
is the image space-photogrammetry rotation matrix,
Figure GDA0001963307930000216
is the local navigation-photogrammetry rotation matrix,
Figure GDA0001963307930000217
is the carrier-local navigation rotation matrix,
Figure GDA0001963307930000218
is the camera-carrier rotation matrix,
Figure GDA0001963307930000219
is the image space-camera rotation matrix.
The processing module 202 may also be configured to: and iteratively calculating a second geographical position of the target object point in the photogrammetric coordinate system according to the first geographical position, the central projection relation of the image point and the object point and the preset surface model, wherein the target image point corresponds to the target object point.
In summary, embodiments of the present invention provide a direct geographic positioning method and apparatus. The method comprises: acquiring an image to be processed, the image to be processed being obtained by shooting with a camera on a carrier; obtaining an image space-photogrammetry rotation matrix from the image space coordinate system to the photogrammetry coordinate system; predefining an image space auxiliary coordinate system such that the image space-photogrammetry rotation matrix and an image space-image auxiliary rotation matrix containing the external orientation angle elements satisfy an equivalence relation, and calculating the external orientation angle elements of the image to be processed, wherein the image space-image auxiliary rotation matrix represents the rotation transformation relationship from the image space coordinate system to the image space auxiliary coordinate system; and converting the target image point in the image to be processed into the photogrammetric coordinate system according to the external orientation angle elements, the external orientation line elements of the image to be processed and the preset internal orientation elements, to obtain the first geographical position of the target image point. Compared with the prior art, the external orientation angle elements of the image to be processed are solved by equating the image space-photogrammetry rotation matrix with the predefined image space-image auxiliary rotation matrix containing them; the first position coordinates of the target image point are then obtained from the external orientation angle elements, line elements and internal orientation elements, realizing high-precision direct geographical positioning of the image to be processed and enabling a remote sensing system with a high-precision real-time geographical positioning function.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present invention may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes. It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.

Claims (6)

1. A direct geolocation method characterized in that said method comprises:
acquiring an image to be processed, wherein the image to be processed is obtained by shooting through a camera on a carrier;
obtaining an image space-photogrammetry rotation matrix from an image space coordinate system to a photogrammetry coordinate system;
the step of obtaining an image space-photogrammetry rotation matrix from an image space coordinate system to a photogrammetry coordinate system comprises: obtaining an image space-camera rotation matrix from an image space coordinate system to a camera coordinate system; obtaining a camera-carrier rotation matrix from the camera coordinate system to a carrier coordinate system; obtaining a carrier-local navigation rotation matrix from the carrier coordinate system to a local navigation coordinate system; obtaining a local navigation-photogrammetry rotation matrix from the local navigation coordinate system to a photogrammetry coordinate system; calculating an image space-photogrammetry rotation matrix from the image space-camera rotation matrix, the camera-carrier rotation matrix, the carrier-local navigation rotation matrix and the local navigation-photogrammetry rotation matrix; predefining an image space auxiliary coordinate system, enabling the image space-photogrammetry rotation matrix and an image space-image auxiliary rotation matrix containing an external orientation angle element to meet an equivalence relation, and calculating the external orientation angle element of the image to be processed, wherein the image space-image auxiliary rotation matrix represents a rotation transformation relation from the image space coordinate system to the image space auxiliary coordinate system;
The step of predefining an image space auxiliary coordinate system such that the image space-photogrammetry rotation matrix and an image space-image auxiliary rotation matrix containing exterior orientation angle elements satisfy an equivalence relation, comprises:
taking a photographing center as a coordinate origin, wherein a first coordinate axis of an image space auxiliary coordinate system points to the east, a second coordinate axis of the image space auxiliary coordinate system points to the north, and a third coordinate axis of the image space auxiliary coordinate system points to the outer normal direction of the earth surface, and an image space auxiliary coordinate system is established, wherein the first coordinate axis, the second coordinate axis and the third coordinate axis of the image space auxiliary coordinate system are mutually vertical and meet the right-hand rule;
the image space-photogrammetry rotation matrix and the image space-image auxiliary rotation matrix containing the exterior orientation angle elements satisfy an equivalence relation:
Figure FDA0003017975360000021
wherein,
Figure FDA0003017975360000022
is the image space-photogrammetry rotation matrix,
Figure FDA0003017975360000023
is the image space-image auxiliary rotation matrix;
converting a target image point in the image to be processed into the photogrammetric coordinate system according to the external orientation angle elements, the external orientation line elements of the image to be processed and preset internal orientation elements, to obtain a first geographical position of the target image point;
The camera coordinate system takes a photographing center as a coordinate origin, a first coordinate axis of the camera coordinate system is along the direction of a main optical axis, a second coordinate axis of the camera coordinate system is parallel to a screen transverse coordinate axis of the camera, a third coordinate axis of the camera coordinate system is perpendicular to the first coordinate axis and the second coordinate axis, and right-hand rules are met, and the image space-camera rotation matrix comprises:
Figure FDA0003017975360000024
Figure FDA0003017975360000025
Figure FDA0003017975360000026
i.e., one of the above.
2. The method of claim 1, wherein the step of obtaining a camera-carrier rotation matrix from the camera coordinate system to a carrier coordinate system comprises:
acquiring a first pitch angle, a first roll angle and a spin angle of the camera;
calculating a camera-carrier rotation matrix from the camera coordinate system to the carrier coordinate system according to the first pitch angle, the first roll angle and the spin angle:

R_camera^carrier = R(θ, φ, ψ) (formula given as image FDA0003017975360000031 in the original)

wherein R_camera^carrier is the camera-carrier rotation matrix, θ is the first pitch angle, φ is the first roll angle, and ψ is the spin angle.
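Claim 2's camera-carrier matrix is disclosed only as a formula image. One plausible sketch composes elementary rotations for the pitch, roll and spin angles; the Z-Y-X composition order below is an illustrative assumption, not the patent's stated convention:

```python
from math import sin, cos

def rot_x(a):
    return [[1.0, 0.0, 0.0], [0.0, cos(a), -sin(a)], [0.0, sin(a), cos(a)]]

def rot_y(a):
    return [[cos(a), 0.0, sin(a)], [0.0, 1.0, 0.0], [-sin(a), 0.0, cos(a)]]

def rot_z(a):
    return [[cos(a), -sin(a), 0.0], [sin(a), cos(a), 0.0], [0.0, 0.0, 1.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def camera_to_carrier(pitch, roll, spin):
    # Illustrative composition: spin about Z, then pitch about Y, then roll about X.
    return matmul(rot_x(roll), matmul(rot_y(pitch), rot_z(spin)))

R = camera_to_carrier(0.1, 0.2, 0.3)
```

Whatever axis order the patent actually uses, the result must be a proper rotation (orthonormal, determinant +1), which is a useful sanity check on any implementation.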
3. The method of claim 1, wherein said step of computing an image space-photogrammetry rotation matrix from said image space-camera rotation matrix, camera-carrier rotation matrix, carrier-local navigation rotation matrix, and said local navigation-photogrammetry rotation matrix comprises:
the image space-photogrammetry rotation matrix expression is:

R(image space→photogrammetry) = R(local navigation→photogrammetry) · R(carrier→local navigation) · R(camera→carrier) · R(image space→camera)

wherein R(image space→photogrammetry) is the image space-photogrammetry rotation matrix, R(local navigation→photogrammetry) is the local navigation-photogrammetry rotation matrix, R(carrier→local navigation) is the carrier-local navigation rotation matrix, R(camera→carrier) is the camera-carrier rotation matrix, and R(image space→camera) is the image space-camera rotation matrix.
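The chain in claim 3 is a plain composition of frame-to-frame rotations, applied right to left. A minimal sketch, using hypothetical axis-permutation matrices as stand-ins for the four real rotations:

```python
from functools import reduce

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def apply(R, v):
    return [sum(R[i][k] * v[k] for k in range(3)) for i in range(3)]

def image_to_photogrammetry(R_nav_photo, R_body_nav, R_cam_body, R_img_cam):
    """R(image->photo) = R(nav->photo) . R(body->nav) . R(cam->body) . R(image->cam)"""
    return reduce(matmul, [R_nav_photo, R_body_nav, R_cam_body, R_img_cam])

# Hypothetical rotations for illustration only.
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
P = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]  # 90-degree rotation about Z
R = image_to_photogrammetry(I, P, I, P)  # two successive 90-degree turns
v = apply(R, [1.0, 0.0, 0.0])            # net effect: 180 degrees about Z
```

The composed matrix acting on a vector gives the same result as applying the four rotations one after another, which is why the single precomputed chain matrix can replace per-point frame conversions.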
4. The method of claim 1, wherein the method further comprises:
iteratively calculating a second geographical position of the target object point in the photogrammetric coordinate system according to the first geographical position, the central projection relation between the image point and the object point, and a preset surface model, wherein the target image point corresponds to the target object point.
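The iteration of claim 4 can be sketched as alternately intersecting the viewing ray with a horizontal plane at the current height estimate and re-sampling the surface model at the resulting footprint. The ray and the toy surface models below are hypothetical stand-ins for the patent's central projection relation and preset surface model:

```python
def geolocate(cam_xyz, ray, dem, h0=0.0, tol=1e-6, max_iter=100):
    """Iteratively intersect a downward-looking ray with a surface model.

    cam_xyz: camera (projection center) position (X, Y, Z)
    ray:     ray direction (dx, dy, dz) with dz < 0
    dem:     callable (X, Y) -> terrain height
    """
    X0, Y0, Z0 = cam_xyz
    dx, dy, dz = ray
    h = h0
    for _ in range(max_iter):
        t = (h - Z0) / dz            # intersect the ray with the plane Z = h
        X, Y = X0 + t * dx, Y0 + t * dy
        h_new = dem(X, Y)            # re-sample the surface model there
        if abs(h_new - h) < tol:
            return X, Y, h_new
        h = h_new
    return X, Y, h                   # best estimate if not converged

# Flat terrain at 100 m: one refinement step suffices.
pt_flat = geolocate((0.0, 0.0, 1100.0), (0.1, 0.0, -1.0), lambda x, y: 100.0)

# Gently sloped terrain h = 0.1 * X: the iteration converges geometrically.
pt_slope = geolocate((0.0, 0.0, 1000.0), (1.0, 0.0, -1.0), lambda x, y: 0.1 * x)
```

For gentle slopes the height error shrinks by roughly the slope magnitude per iteration, which is why a handful of iterations typically suffices in practice.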
5. A direct geolocation device characterized in that said device comprises:
the device comprises an acquisition module and a processing module, wherein the acquisition module is used for acquiring an image to be processed, and the image to be processed is obtained by shooting by a camera on a carrier;
the processing module is used for obtaining an image space-photogrammetry rotation matrix from an image space coordinate system to a photogrammetry coordinate system; predefining an image space auxiliary coordinate system, enabling the image space-photogrammetry rotation matrix and an image space-image auxiliary rotation matrix containing exterior orientation angle elements to satisfy an equivalence relation, and calculating the exterior orientation angle elements of the image to be processed, wherein the image space-image auxiliary rotation matrix represents the rotation transformation from the image space coordinate system to the image space auxiliary coordinate system; and converting a target image point in the image to be processed into the photogrammetric coordinate system according to the exterior orientation line elements, the exterior orientation angle elements of the image to be processed and preset interior orientation elements, to obtain a first geographical position of the target image point;
The processing module is specifically configured to: obtaining an image space-camera rotation matrix from an image space coordinate system to a camera coordinate system; obtaining a camera-carrier rotation matrix from the camera coordinate system to a carrier coordinate system; obtaining a carrier-local navigation rotation matrix from the carrier coordinate system to a local navigation coordinate system; obtaining a local navigation-photogrammetry rotation matrix from the local navigation coordinate system to a photogrammetry coordinate system; calculating an image space-photogrammetry rotation matrix from the image space-camera rotation matrix, the camera-carrier rotation matrix, the carrier-local navigation rotation matrix and the local navigation-photogrammetry rotation matrix;
the step of predefining an image space auxiliary coordinate system such that the image space-photogrammetry rotation matrix and an image space-image auxiliary rotation matrix containing exterior orientation angle elements satisfy an equivalence relation, comprises:
establishing an image space auxiliary coordinate system with the photographing center as the coordinate origin, wherein a first coordinate axis of the image space auxiliary coordinate system points east, a second coordinate axis of the image space auxiliary coordinate system points north, and a third coordinate axis of the image space auxiliary coordinate system points along the outward normal of the earth surface, the first, second and third coordinate axes being mutually perpendicular and satisfying the right-hand rule;
the image space-photogrammetry rotation matrix and the image space-image auxiliary rotation matrix containing the exterior orientation angle elements satisfy the equivalence relation:

R_image^photo = R_image^aux(φ, ω, κ)

wherein R_image^photo is the image space-photogrammetry rotation matrix and R_image^aux(φ, ω, κ) is the image space-image auxiliary rotation matrix;
the camera coordinate system takes the photographing center as the coordinate origin, a first coordinate axis of the camera coordinate system is along the direction of the principal optical axis, a second coordinate axis of the camera coordinate system is parallel to the horizontal screen coordinate axis of the camera, and a third coordinate axis of the camera coordinate system is perpendicular to the first and second coordinate axes and satisfies the right-hand rule; and the image space-camera rotation matrix is one of three constant rotation matrices (given as formula images FDA0003017975360000051, FDA0003017975360000052 and FDA0003017975360000053 in the original).
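The conversion step performed by the processing module is, in essence, the central projection (collinearity) relation: the ground point lies on the ray from the projection center through the image point. A minimal sketch, with a hypothetical identity attitude and interior orientation values chosen purely for illustration:

```python
def image_point_to_ground(image_xy, interior, exterior_R, cam_xyz, ground_z):
    """Project an image point (x, y) onto the plane Z = ground_z.

    interior:   (x0, y0, f) principal point offsets and focal length
    exterior_R: 3x3 rotation, image space -> photogrammetric (ground) frame
    cam_xyz:    projection center (Xs, Ys, Zs), i.e. exterior orientation line elements
    """
    x0, y0, f = interior
    x, y = image_xy
    # Direction of the imaging ray in the image space frame: (x - x0, y - y0, -f).
    d_img = [x - x0, y - y0, -f]
    # Rotate the ray into the photogrammetric frame.
    d = [sum(exterior_R[i][k] * d_img[k] for k in range(3)) for i in range(3)]
    Xs, Ys, Zs = cam_xyz
    lam = (ground_z - Zs) / d[2]   # scale factor along the ray
    return Xs + lam * d[0], Ys + lam * d[1], ground_z

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
# Hypothetical 50 mm lens, nadir-looking camera 1000 m above the Z = 0 plane.
pt = image_point_to_ground((0.01, 0.0), (0.0, 0.0, 0.05), I3, (0.0, 0.0, 1000.0), 0.0)
```

This planar intersection yields the "first geographical position"; the iterative surface-model refinement of claim 6 then replaces the fixed plane with the sampled terrain height.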
6. The apparatus of claim 5, wherein the processing module is further to:
iteratively calculating a second geographical position of the target object point in the photogrammetric coordinate system according to the first geographical position, the central projection relation between the image point and the object point, and a preset surface model, wherein the target image point corresponds to the target object point.
CN201811651475.1A 2018-12-31 2018-12-31 Direct geographic positioning method and device Active CN109725340B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811651475.1A CN109725340B (en) 2018-12-31 2018-12-31 Direct geographic positioning method and device


Publications (2)

Publication Number Publication Date
CN109725340A CN109725340A (en) 2019-05-07
CN109725340B true CN109725340B (en) 2021-08-20

Family

ID=66299398

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811651475.1A Active CN109725340B (en) 2018-12-31 2018-12-31 Direct geographic positioning method and device

Country Status (1)

Country Link
CN (1) CN109725340B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110967021B (en) * 2019-12-16 2021-07-16 中国兵器科学研究院 Active/passive ranging independent target geographic positioning method for airborne photoelectric system
CN111784622B (en) * 2020-09-07 2021-01-26 成都纵横自动化技术股份有限公司 Image splicing method based on monocular inclination of unmanned aerial vehicle and related device
CN115540876B (en) * 2022-11-28 2023-04-07 济南和普威视光电技术有限公司 Target positioning method combining offline DEM and photoelectric video data

Citations (3)

Publication number Priority date Publication date Assignee Title
CN103558619A (en) * 2013-11-06 2014-02-05 中测新图(北京)遥感技术有限责任公司 Method for obtaining exterior orientation elements of aerial photograph
CN107967712A (en) * 2017-11-21 2018-04-27 海南电网有限责任公司电力科学研究院 Accurate mountain fire positioning and algorithm for the vertical distance from the mountain fire edge to overhead transmission lines
CN108627142A (en) * 2018-05-02 2018-10-09 成都纵横自动化技术有限公司 Object localization method combining offline elevation data with an airborne photoelectric pod

Family Cites Families (13)

Publication number Priority date Publication date Assignee Title
CN101794449A (en) * 2010-04-13 2010-08-04 公安部物证鉴定中心 Method and device for calibrating camera parameters
CN101975588B (en) * 2010-08-20 2012-07-11 北京航空航天大学 Global calibration method and device of rigid rod of multisensor vision measurement system
US9160980B2 (en) * 2011-01-11 2015-10-13 Qualcomm Incorporated Camera-based inertial sensor alignment for PND
CN102298818B (en) * 2011-08-18 2013-07-10 中国科学技术大学 Binocular shooting fire detecting and positioning device and fire positioning method thereof
US9323325B2 (en) * 2011-08-30 2016-04-26 Microsoft Technology Licensing, Llc Enhancing an object of interest in a see-through, mixed reality display device
CN104613941B (en) * 2015-01-30 2017-02-22 北京林业大学 Analysis method of terrestrial photograph Kappa and Omega angle with vertical base line
US9964941B2 (en) * 2015-05-06 2018-05-08 Aleader Vision Technology Co., Ltd. Method, device and system for improving system accuracy of X-Y motion platform
CN105091867B (en) * 2015-08-14 2017-03-08 北京林业大学 A kind of measuring method to absolute exterior orientation parameter for aeroplane photography picture
CN105716583B (en) * 2016-01-26 2018-03-30 河海大学 A kind of exploration adit geological record base map generation method based on parallel photography
CN106910238A (en) * 2017-01-18 2017-06-30 北京建筑大学 Color texture method for reconstructing based on high inclination-angle close-range image
CN107063190B (en) * 2017-03-02 2019-07-30 辽宁工程技术大学 Pose high-precision direct method estimating towards calibration area array cameras image
CN106885571B (en) * 2017-03-07 2019-10-25 辽宁工程技术大学 A kind of lunar surface rover method for rapidly positioning of combination IMU and navigation image
CN106979787B (en) * 2017-05-23 2019-10-22 辽宁工程技术大学 A kind of rover localization method based on stereo navigation image




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant