CN113624231A - Inertial vision integrated navigation positioning method based on heterogeneous image matching and aircraft - Google Patents

Inertial vision integrated navigation positioning method based on heterogeneous image matching and aircraft

Info

Publication number
CN113624231A
Authority
CN
China
Prior art keywords
image
real
projection
coordinate system
gradient direction
Prior art date
Legal status
Granted
Application number
CN202110783533.1A
Other languages
Chinese (zh)
Other versions
CN113624231B (en)
Inventor
尚克军
扈光锋
裴新凯
段昊雨
明丽
王大元
庄广琛
刘崇亮
王海军
Current Assignee
Beijing Automation Control Equipment Institute BACEI
Original Assignee
Beijing Automation Control Equipment Institute BACEI
Priority date
Filing date
Publication date
Application filed by Beijing Automation Control Equipment Institute BACEI
Priority to CN202110783533.1A
Publication of CN113624231A
Application granted
Publication of CN113624231B
Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/005 Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/16 Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652 Inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G01C21/1656 Inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides an inertial-vision integrated navigation and positioning method based on heterogeneous image matching, and an aircraft. The method comprises: performing orthophoto rectification on the real-time image and the reference image, and unifying the scales of the two images; obtaining a first projected-and-quantized gradient orientation histogram of the real-time image and a second projected-and-quantized gradient orientation histogram of the reference image; measuring the similarity between the first and second histograms, the position with the maximal Hamming similarity score being the matching position of the real-time image within the reference image; and converting the matching position into the position of the unmanned aerial vehicle, which serves as the observation of a Kalman filter constructed to realize inertial-vision integrated navigation and positioning of the UAV. The technical scheme of the invention solves the technical problem in the prior art that accurate and stable matching between heterogeneous images is difficult to achieve.

Description

Inertial Vision Integrated Navigation and Positioning Method Based on Heterogeneous Image Matching, and Aircraft

Technical Field

The invention relates to the technical field of computer vision, and in particular to an inertial-vision integrated navigation and positioning method based on heterogeneous image matching, and to an aircraft using the method.

Background

In image matching for navigation, the reference image is a visible-light satellite image, while the real-time image is typically an infrared image that can be acquired around the clock; this is a typical heterogeneous image matching problem. Unlike matching between images of the same modality, heterogeneous images of the same scene can differ greatly because of their different imaging mechanisms, so heterogeneous matching requires dedicated processing. Moreover, as the flight attitude of the aircraft changes, the camera does not look straight down, so geometric transformations such as scale, rotation, affine and perspective distortions exist between the real-time image and the orthophoto reference image preloaded on the aircraft. The images therefore need preprocessing and rectification before matching, and the matching algorithm must additionally be invariant to the geometric deformation that remains after rectification. In existing visual matching methods, however, accurate and stable matching between heterogeneous images is difficult to achieve, so integrated navigation algorithms have poor autonomy.

Summary of the Invention

The invention provides an inertial-vision integrated navigation and positioning method based on heterogeneous image matching, and an aircraft, which can solve the technical problem in the prior art that accurate and stable matching between heterogeneous images is difficult to achieve.

According to one aspect of the invention, an inertial-vision integrated navigation and positioning method based on heterogeneous image matching is provided. The method comprises: using the inertial attitude information and the laser-ranging height information to perform orthophoto rectification on the real-time image and the reference image, and unifying the scales of the two images; obtaining a first projected-and-quantized gradient orientation histogram of the real-time image and a second projected-and-quantized gradient orientation histogram of the reference image; measuring the similarity between the first and second histograms based on the similarity-measurement principle, and searching the second histogram for the position with the maximal Hamming similarity to the first histogram, which is the matching position of the real-time image within the reference image; and converting the matching position into the position of the UAV, which serves as the observation of a Kalman filter constructed to realize inertial-vision integrated navigation and positioning of the UAV.

Further, obtaining the first projected-and-quantized gradient orientation histogram of the real-time image specifically comprises: applying Gaussian filtering to the real-time image; extracting the gray-level gradient features of the filtered image with the Sobel operator to obtain the gradient image; accumulating the gradient orientation histogram from the gradient image; and projecting and quantizing the histogram to obtain the first projected-and-quantized gradient orientation histogram. Obtaining the second projected-and-quantized gradient orientation histogram of the reference image comprises the same steps applied to the reference image.

进一步地, both the real-time image and the reference image can be orthorectified according to

$$z_{\bar c}\,\tilde p_{\bar c} = K\,R^{\bar c}_{c}\,K^{-1}\,z_c\,\tilde p_c,\qquad K=\begin{bmatrix} f/s_c & 0 & c_c\\ 0 & f/s_r & c_r\\ 0 & 0 & 1\end{bmatrix}$$

where $\tilde p_c$ is the non-orthophoto image point, $\tilde p_{\bar c}$ is the orthophoto image point, $\tilde P_w$ is the homogeneous coordinate of the space point P in the world coordinate system, $z_c$ is the z-component of the coordinate of P in the camera coordinate system, f is the principal distance of the optical system, $s_c$ is the distance between adjacent pixels in the column direction of the image sensor, $s_r$ is the distance between adjacent pixels in the row direction of the image sensor, $[c_c, c_r]^T$ is the principal point of the image, $R^{\bar c}_c$ is the rotation matrix between two cameras with identical intrinsic parameters, $R^c_w$ is the rotation matrix from the world coordinate system to the camera coordinate system, and $t^c$ is the translation vector from the world coordinate system to the camera coordinate system.

Further, the scale factor k of the real-time image and the reference image can be obtained from

$$k = \frac{\mu\, l}{f}$$

where μ is the pixel size, f is the focal length of the camera, and l is the distance from the optical center of the camera to the imaged point; μl/f is the ground distance covered by one pixel, so k is the factor that brings the image to the common 1 m/pixel resolution.

Further, the similarity S(x, y) between the first and the second projected-and-quantized gradient orientation histograms can be obtained from

$$S(x,y) = f\left(\sum_{(u,v)\in R_T} d\big(T(u,v),\; I(x+u,\,y+v)\big)\right)$$

where d(a, b) is a difference measure, f is a monotonic function, $R_T$ is the set of pixel coordinates (u, v) of the real-time image, T is the real-time image, I is the reference image, and S(x, y) is the similarity measure of the real-time image at position (x, y) of the reference image.

Further, the conversion between the image matching position and the UAV position is

$$r^n = C^n_d\, r^d = \begin{bmatrix} x & -h & y \end{bmatrix}^T$$

where h is the flight height of the aircraft, x and y describe the horizontal offset between the point directly below the camera and the center of the captured image, $r^d$ is the projection, in the camera body frame, of the vector from the camera to the center of the imaged area, $r^n$ is the projection of $r^d$ in the geographic frame, and $C^n_d$ is the transformation matrix from the camera body frame to the geographic frame (the component layout follows the North-Up-East geographic frame defined below).

Further, in the Kalman filter, the state vector comprises the north velocity error, up velocity error, east velocity error, latitude error, height error, longitude error, north misalignment angle, up misalignment angle, east misalignment angle, x-axis installation error, y-axis installation error, z-axis installation error, and laser-ranging scale factor error.

Further, in the Kalman filter, the observation matrix is $H(k) = [H_1\ H_2\ H_3\ H_4\ H_5]$, where the sub-blocks $H_1$ to $H_5$ are given in the original equation image, $r_N$ is the north distance, $r_U$ is the vertical distance, and $r_E$ is the east distance.

According to another aspect of the invention, an aircraft is provided that performs integrated navigation and positioning using the heterogeneous-image-matching-based inertial-vision integrated navigation and positioning method described above.

By applying the technical solution of the invention, an inertial-vision integrated navigation and positioning method based on heterogeneous image matching is provided. Aiming at all-day navigation and positioning of UAVs under GPS-denied conditions, the method develops heterogeneous image matching and positioning along three lines. First, a laser-ranging sensor is used to capture flight-height information, which solves the problem that the scale cannot be initialized under the small-maneuver conditions of the cruise phase; by fusing inertial, laser-ranging and image information, the scales of the real-time image and the reference image are unified. Second, structural features are extracted and binary-encoded with a projected-and-quantized gradient orientation histogram, which suppresses the mismatches caused by inconsistent image information in heterogeneous matching. Finally, a geometric positioning method based on inertial navigation and a laser rangefinder is proposed, which avoids the large attitude error of PnP pose estimation from coplanar control points during high-altitude flight. In this way, stable matching between heterogeneous images is achieved and the autonomy of the integrated navigation algorithm is improved.

Brief Description of the Drawings

The accompanying drawings, which are included to provide a further understanding of the embodiments of the invention, constitute a part of the specification; they illustrate embodiments of the invention and, together with the description, serve to explain its principles. Obviously, the drawings described below show only some embodiments of the invention, and those of ordinary skill in the art can obtain other drawings from them without creative effort.

FIG. 1 shows a side view of the coordinate frames according to an embodiment of the invention;

FIG. 2 shows a top view of the coordinate frames according to an embodiment of the invention;

FIG. 3 shows a schematic diagram of a captured image according to an embodiment of the invention;

FIG. 4 shows a schematic diagram of an image rectification result according to an embodiment of the invention;

FIG. 5 shows a schematic comparison of infrared and visible-light structural features according to an embodiment of the invention;

FIG. 6 shows a flow chart of the generation of the projected-and-quantized gradient orientation histogram according to an embodiment of the invention;

FIG. 7 shows a schematic diagram of a two-dimensional Gaussian function according to an embodiment of the invention;

FIG. 8 shows a schematic diagram of a cell unit according to an embodiment of the invention;

FIG. 9 shows a schematic diagram of the position of the real-time image within the reference image according to an embodiment of the invention;

FIG. 10 shows a schematic diagram of the positional relationship between the UAV positioning point and the image matching point according to an embodiment of the invention.

Detailed Description

It should be noted that the embodiments of the application and the features of the embodiments may be combined with one another where no conflict arises. The technical solutions in the embodiments of the invention will now be described clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. The following description of at least one exemplary embodiment is merely illustrative and in no way limits the invention, its application or uses. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the invention without creative effort fall within the protection scope of the invention.

It should also be noted that the terminology used herein is for describing specific embodiments only and is not intended to limit the exemplary embodiments of the application. As used herein, the singular forms are intended to include the plural forms as well, unless the context clearly indicates otherwise; furthermore, the terms "comprise" and/or "include" indicate the presence of features, steps, operations, devices, components and/or combinations thereof.

Unless specifically stated otherwise, the relative arrangement of the components and steps, the numerical expressions and the numerical values set forth in these embodiments do not limit the scope of the invention. For convenience of description, the dimensions of the parts shown in the drawings are not drawn to scale. Techniques, methods and devices known to those of ordinary skill in the relevant art may not be discussed in detail but, where appropriate, should be regarded as part of the specification. In all examples shown and discussed herein, any specific value should be interpreted as merely exemplary rather than limiting; other examples of the exemplary embodiments may therefore have different values. Similar reference numerals and letters denote similar items in the figures, so once an item is defined in one figure, it need not be discussed further in subsequent figures.

As shown in FIGS. 1 to 10, a specific embodiment of the invention provides an inertial-vision integrated navigation and positioning method based on heterogeneous image matching, comprising: using the inertial attitude information and the laser-ranging height information to orthorectify the real-time image and the reference image and to unify their scales; obtaining the first projected-and-quantized gradient orientation histogram of the real-time image and the second projected-and-quantized gradient orientation histogram of the reference image; measuring the similarity between the first and second histograms based on the similarity-measurement principle, and searching the second histogram for the position with the maximal Hamming similarity to the first histogram, which is the matching position of the real-time image within the reference image; and converting the matching position into the UAV position, which serves as the observation of a Kalman filter constructed to realize inertial-vision integrated navigation and positioning of the UAV.

With this configuration, an inertial-vision integrated navigation and positioning method based on heterogeneous image matching is provided. Aiming at all-day navigation and positioning of UAVs under GPS-denied conditions, the method first uses a laser-ranging sensor to capture flight height, solving the scale-initialization problem under the small-maneuver conditions of the cruise phase, and unifies the scales of the real-time and reference images by fusing inertial, laser-ranging and image information; second, it extracts and binary-encodes structural features with a projected-and-quantized gradient orientation histogram, suppressing mismatches caused by inconsistent image information between heterogeneous images; finally, it adopts a geometric positioning method based on inertial navigation and a laser rangefinder, avoiding the large attitude error of PnP pose estimation from coplanar control points during high-altitude flight. Stable matching between heterogeneous images is thus achieved, and the autonomy of the integrated navigation algorithm is improved.

In the invention, to realize inertial-vision integrated navigation and positioning based on heterogeneous image matching, the real-time image and the reference image are first orthorectified using the inertial attitude information and the laser-ranging height information, and their scales are unified.

Specifically, four coordinate frames are involved: the camera frame (c-frame), the ortho-camera frame ($\bar c$-frame), the inertial body frame (b-frame) and the geographic frame (n-frame), defined as follows.

Camera frame (c-frame): the origin $o_c$ is the image-side principal point of the optical system; when facing the optical system, the $x_c$ axis is parallel to the horizontal axis (long side) of the imaging plane, positive to the left; the $y_c$ axis is parallel to the vertical axis (short side) of the imaging plane, positive downward; the $z_c$ axis points toward the observer and forms a right-handed frame with the $x_c$ and $y_c$ axes.

Ortho-camera frame ($\bar c$-frame): suppose there is an ortho camera in the air whose images are orthophotos without any rectification; the three axes of the $\bar c$-frame then point east, south and down, respectively.

Inertial body frame (b-frame): the inertial navigation system is mounted on the aircraft carrier; the origin $O_b$ is the center of mass of the INS; $X_b$ is along the longitudinal axis of the carrier, positive forward; $Y_b$ is along the vertical axis of the carrier, positive upward; $Z_b$ is along the lateral axis of the carrier, positive to the right.

Geographic frame (n-frame): the origin $O_n$ is the center of mass of the aircraft; the $X_n$ axis points north, the $Y_n$ axis points up, and the $Z_n$ axis points east.

According to the pinhole camera model, the projection of a space point P from its homogeneous coordinates $\tilde P_w$ in the world frame to its homogeneous coordinates $\tilde p = [c,\ r,\ 1]^T$ in the image frame can be described as

$$z_c\,\tilde p = K\begin{bmatrix} R^c_w & t^c\end{bmatrix}\tilde P_w,\qquad K=\begin{bmatrix} f/s_c & 0 & c_c\\ 0 & f/s_r & c_r\\ 0 & 0 & 1\end{bmatrix}$$

In the formula, c and r are the column and row coordinates of P in the image frame, $z_c$ is the z-component of the coordinate of P in the camera frame, f is the principal distance of the optical system, $s_c$ is the distance between adjacent pixels in the column direction of the image sensor, $s_r$ is the distance between adjacent pixels in the row direction, and $[c_c, c_r]^T$ is the principal point of the image. The optical system is designed on the pinhole camera model, and the image sensor is part of the optical system: the sensor carries CMOS photosensitive elements, each image pixel corresponds to one sensor cell, and the distance between two cells, i.e., between adjacent pixels, is called the pixel size. $R^c_w$ is a rotation matrix describing the rotation from the world frame to the camera frame, $t^c$ is the translation vector from the world frame to the camera frame, i.e., the coordinates of the world origin in the camera frame, and $\tilde P_w$ is the homogeneous coordinate of P in the world frame.

Suppose there are two cameras with identical intrinsic parameters, denoted c and $\bar c$, which image the ground at the same location but from different angles, the image formed by $\bar c$ being an orthophoto. According to the pinhole imaging model, the image coordinates of the space point P in the two cameras are

$$z_c\,\tilde p_c = K\begin{bmatrix} R^c_w & t^c\end{bmatrix}\tilde P_w,\qquad z_{\bar c}\,\tilde p_{\bar c} = K\begin{bmatrix} R^{\bar c}_w & t^{\bar c}\end{bmatrix}\tilde P_w$$

The pose matrix can be transformed as

$$\begin{bmatrix} R^{\bar c}_w & t^{\bar c}\end{bmatrix} = R^{\bar c}_c\begin{bmatrix} R^c_w & t^c\end{bmatrix}$$

where $R^{\bar c}_c$ is the rotation matrix between the two cameras (they share the same optical center, so the two poses differ only by this rotation). Substituting into the imaging equation of $\bar c$ gives

$$z_{\bar c}\,\tilde p_{\bar c} = K\,R^{\bar c}_c\,K^{-1}\,z_c\,\tilde p_c$$

That is, once the camera intrinsics K and the rotation matrix $R^{\bar c}_c$ are known, the non-orthophoto $\tilde p_c$ can be converted into the orthophoto $\tilde p_{\bar c}$.

Therefore, both the real-time image and the reference image can be orthorectified according to

$$z_{\bar c}\,\tilde p_{\bar c} = K\,R^{\bar c}_c\,K^{-1}\,z_c\,\tilde p_c$$

where $\tilde p_c$ is the non-orthophoto image point, $\tilde p_{\bar c}$ is the orthophoto image point, $\tilde P_w$ is the homogeneous coordinate of the space point P in the world coordinate system, $z_c$ is the z-component of the coordinate of P in the camera coordinate system, f is the principal distance of the optical system, $s_c$ and $s_r$ are the distances between adjacent pixels in the column and row directions of the image sensor, $[c_c, c_r]^T$ is the principal point of the image, $R^{\bar c}_c$ is the rotation matrix between the two cameras with identical intrinsic parameters, $R^c_w$ is the rotation matrix from the world coordinate system to the camera coordinate system, and $t^c$ is the translation vector from the world coordinate system to the camera coordinate system.
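As an illustration of this rectification: because only K and the rotation are required, the correction amounts to applying the pure-rotation homography $H = K R^{\bar c}_c K^{-1}$ to the image. The sketch below (Python with OpenCV/NumPy; the helper name and every numeric value are illustrative placeholders, not values from the patent) shows one plausible implementation:

```python
import cv2
import numpy as np

def orthorectify(image, K, R_cbar_c):
    """Warp a non-ortho image with the pure-rotation homography H = K R K^-1."""
    H = K @ R_cbar_c @ np.linalg.inv(K)
    h, w = image.shape[:2]
    return cv2.warpPerspective(image, H, (w, h))

# Hypothetical intrinsics: principal distance f, pixel pitches s_c, s_r,
# principal point (c_c, c_r). All values are placeholders for illustration.
f, s_c, s_r = 0.05, 1e-5, 1e-5          # 50 mm lens, 10 um pixels
c_c, c_r = 640.0, 512.0
K = np.array([[f / s_c, 0.0, c_c],
              [0.0, f / s_r, c_r],
              [0.0, 0.0, 1.0]])

# Rotation from the current camera frame to the ortho-camera frame,
# in practice built from the INS attitude (here: a small 5-degree pitch).
theta = np.deg2rad(5.0)
R = np.array([[1.0, 0.0, 0.0],
              [0.0, np.cos(theta), -np.sin(theta)],
              [0.0, np.sin(theta), np.cos(theta)]])

img = np.zeros((1024, 1280), dtype=np.uint8)  # stand-in for the real-time image
ortho = orthorectify(img, K, R)
```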

After the real-time image and the reference image have been orthorectified, their scales must be unified: to give the two images the same pixel resolution, both are resampled to 1 m/pixel. The reference image is obtained directly from commercial map software; the real-time image is processed online using the camera-to-scene distance measured by the laser rangefinder. The scale factor of the real-time image and the reference image can be obtained from

$$k = \frac{\mu\, l}{f}$$

where μ is the pixel size, f is the focal length of the camera, and l is the distance from the optical center of the camera to the imaged point.
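A minimal sketch of the scale unification, assuming k = μl/f meters per pixel and the 1 m/pixel target stated above (the function and all numeric values are hypothetical):

```python
import cv2
import numpy as np

def unify_scale(image, mu, f, l, target_m_per_px=1.0):
    """Resample the image so one pixel covers target_m_per_px meters on the ground."""
    k = mu * l / f                  # current ground distance covered by one pixel [m/px]
    scale = k / target_m_per_px     # resize factor toward the target resolution
    h, w = image.shape[:2]
    return cv2.resize(image, (int(round(w * scale)), int(round(h * scale))))

img = np.zeros((1024, 1280), dtype=np.uint8)
resampled = unify_scale(img, mu=1e-5, f=0.05, l=3000.0)  # 10 um, 50 mm, 3 km slant range
```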

Further, after the scales of the real-time image and the reference image have been unified, the projected-and-quantized gradient orientation histograms of the two images can be obtained. Although the gray levels of heterogeneous images differ greatly, their structural features remain close, as shown in FIG. 5(a) and FIG. 5(b). The image gradient reflects these structural features well, as shown in FIG. 5(c) and FIG. 5(d); describing the real-time image and the reference image with HOG features therefore allows a meaningful similarity comparison between the two.

Specifically, in the invention, obtaining the first projected-and-quantized gradient orientation histogram of the real-time image comprises: applying Gaussian filtering to the real-time image; extracting the gray-level gradient features of the filtered image with the Sobel operator to obtain the gradient image; accumulating the gradient orientation histogram from the gradient image; and projecting and quantizing the histogram. The second projected-and-quantized gradient orientation histogram of the reference image is obtained with the same steps applied to the reference image. The individual steps are detailed below for either image.

First, the image is denoised with Gaussian filtering. In image processing, a two-dimensional Gaussian function is used to construct the two-dimensional convolution kernel that processes the pixel values; it can be described as

$$K(\mu_1,\mu_2)=\frac{1}{2\pi\sigma^2}\exp\!\left(-\frac{\mu_1^2+\mu_2^2}{2\sigma^2}\right)$$

where $P(\mu_1,\mu_2)$ denotes the pixel at $(\mu_1,\mu_2)$ and σ is the Gaussian coefficient.

The local neighborhood of a pixel is written as the 3×3 block

$$\begin{bmatrix} P(\mu_1-1,\mu_2-1) & P(\mu_1,\mu_2-1) & P(\mu_1+1,\mu_2-1)\\ P(\mu_1-1,\mu_2) & P(\mu_1,\mu_2) & P(\mu_1+1,\mu_2)\\ P(\mu_1-1,\mu_2+1) & P(\mu_1,\mu_2+1) & P(\mu_1+1,\mu_2+1)\end{bmatrix}$$

Filtering the image is a convolution performed at every pixel: for the pixel $P(\mu_1,\mu_2)$, the kernel computed from the Gaussian function is

$$\begin{bmatrix} K(\mu_1-1,\mu_2-1) & K(\mu_1,\mu_2-1) & K(\mu_1+1,\mu_2-1)\\ K(\mu_1-1,\mu_2) & K(\mu_1,\mu_2) & K(\mu_1+1,\mu_2)\\ K(\mu_1-1,\mu_2+1) & K(\mu_1,\mu_2+1) & K(\mu_1+1,\mu_2+1)\end{bmatrix}$$

where K denotes the Gaussian weights after normalization (they are divided by their sum so that they add up to one). After the convolution, the value of the corresponding pixel is

$$P'(\mu_1,\mu_2)=\sum_{i=-1}^{1}\sum_{j=-1}^{1}K(\mu_1+i,\,\mu_2+j)\,P(\mu_1+i,\,\mu_2+j)$$

Applying this operation to all pixel values yields the Gaussian-filtered image. Since the convolution is normally performed sequentially, from left to right and top to bottom, the process amounts to sliding the kernel over the image. When the kernel reaches the image boundary, some of its positions have no corresponding pixel value, so the edges of the original image are usually padded before the convolution; the padded part is either discarded or zero-filled during the convolution.
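In practice this smoothing step is a single library call; a short sketch (kernel size and σ are illustrative choices, not values from the patent):

```python
import cv2
import numpy as np

img = np.zeros((1024, 1280), dtype=np.uint8)   # stand-in for the input image
# 5x5 Gaussian kernel, sigma = 1.0; borders are padded by reflection by default.
smoothed = cv2.GaussianBlur(img, (5, 5), 1.0)
```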

The Gaussian function has five important properties that make it particularly useful in image processing. They show that the Gaussian smoothing filter is a very effective low-pass filter in both the spatial and the frequency domain, and it is used effectively in practical image processing:

(1) The two-dimensional Gaussian function is rotationally symmetric, i.e., the filter smooths equally in all directions. Since the edge directions of an image are generally unknown beforehand, one cannot decide before filtering that one direction needs more smoothing than another; rotational symmetry means the Gaussian filter will not bias subsequent edge detection toward any direction.

(2) The Gaussian function is single-valued (unimodal). The Gaussian filter replaces each pixel value by a weighted mean of its neighborhood, with weights that decrease monotonically with distance from the center. This matters because an edge is a local image feature: if the smoothing operation still had a strong effect on pixels far from the operator center, it would distort the image.

(3) The Fourier spectrum of the Gaussian function has a single lobe, a direct consequence of the fact that the Fourier transform of a Gaussian is itself a Gaussian. Images are often contaminated by undesirable high-frequency signals (noise and fine texture), whereas the desired features, such as edges, contain both low- and high-frequency components; the single lobe means the smoothed image is not contaminated by unwanted high-frequency signals while most of the desired signal is preserved.

(4) The width of the Gaussian filter, which determines the degree of smoothing, is characterized by the parameter σ, and the relationship is very simple: the larger σ, the wider the filter band and the stronger the smoothing. By adjusting σ, a compromise can be reached between over-blurring of image features (over-smoothing) and excessive undesired variation caused by noise and fine texture (under-smoothing).

(5) Because the Gaussian function is separable, large Gaussian filters can be implemented efficiently: the two-dimensional convolution is performed in two steps, first convolving the image with a one-dimensional Gaussian and then convolving the result with the same one-dimensional Gaussian in the perpendicular direction. The computational cost of two-dimensional Gaussian filtering therefore grows linearly rather than quadratically with the filter width.

Next, after the Gaussian filtering, the Sobel operator is used to extract the gradient features of the gray-level image:

$$\begin{aligned} G_x &= \big[f(x+1,y-1) + 2f(x+1,y) + f(x+1,y+1)\big] - \big[f(x-1,y-1) + 2f(x-1,y) + f(x-1,y+1)\big]\\ G_y &= \big[f(x-1,y-1) + 2f(x,y-1) + f(x+1,y-1)\big] - \big[f(x-1,y+1) + 2f(x,y+1) + f(x+1,y+1)\big] \end{aligned}$$

where f(x, y) is the gray value of the image at (x, y). The gradient magnitude is $G=\sqrt{G_x^2+G_y^2}$ and the gradient direction is $\theta=\arctan(G_y/G_x)$. The actual range of the gradient direction is $[-\pi,\pi]$, and the computed θ is mapped into $[0,\pi)$: gradients caused by light-to-dark and dark-to-light transitions may point in opposite (supplementary) directions, and this mapping ensures that the infrared image is unaffected by such contrast reversals. After the gradient image is obtained, the gradient histogram is accumulated, projected and quantized.
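A sketch of the gradient extraction with the orientation folded into [0, π) as described above (Python/OpenCV; the helper name is hypothetical):

```python
import cv2
import numpy as np

def gradient_features(img):
    """Return gradient magnitude and orientation folded into [0, pi)."""
    gx = cv2.Sobel(img, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(img, cv2.CV_32F, 0, 1, ksize=3)
    mag = np.hypot(gx, gy)
    theta = np.arctan2(gy, gx)          # range [-pi, pi]
    theta = np.mod(theta, np.pi)        # fold opposite directions together -> [0, pi)
    return mag, theta
```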

The image is divided into small connected regions called cell units; here each cell is 8×8 pixels. A gradient orientation histogram is accumulated for each cell as follows: the gradient direction range is divided into 8 bins, as shown in FIG. 8; for each pixel of the cell, the bin into which its gradient direction falls is determined and the corresponding bin count is incremented by one, i.e., every pixel of the cell casts a vote in the histogram according to its gradient direction (mapped to a fixed angular range). This yields the histogram of oriented gradients (HOG) of the cell. Collecting the statistics gives a w×h×8-dimensional feature vector denoted $T_{HOG}$, where w and h are the column and row dimensions of the image in cell units, and $T_{HOG}$ is arranged as a matrix of w×h rows and 8 columns.
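A sketch of the per-cell histogram accumulation under the stated assumptions (8×8-pixel cells, 8 orientation bins over [0, π), simple count voting):

```python
import numpy as np

def cell_histograms(theta, cell=8, nbins=8):
    """Accumulate an orientation histogram for every cell x cell block of theta."""
    h, w = theta.shape
    hc, wc = h // cell, w // cell
    bins = np.minimum((theta[:hc * cell, :wc * cell] / np.pi * nbins).astype(int),
                      nbins - 1)                       # bin index in [0, nbins-1]
    T_hog = np.zeros((hc * wc, nbins))
    for i in range(hc):
        for j in range(wc):
            block = bins[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            T_hog[i * wc + j] = np.bincount(block.ravel(), minlength=nbins)
    return T_hog                                       # (w*h cells) x 8 matrix
```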

Finally, after the image feature description is obtained, the w×h×8-dimensional description is projected with a projection matrix and quantized into a 24-bit binary code. The projection matrix is a fixed matrix, given in the original equation images, that maps each 8-dimensional cell histogram to 24 values: each row $t_i$ of $T_{HOG}$ is projected with the projection matrix, yielding the projected gradient orientation histogram matrix $T'_{HOG}$. Setting every entry of this matrix greater than 0 to 1 and every entry less than 0 to 0, each row of the matrix becomes a quantized 24-bit binary code, which gives the projected-and-quantized gradient orientation histogram statistics.
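A sketch of the projection and sign quantization; since the patent's actual projection matrix is only given as an equation image, a random 24×8 placeholder matrix is used here:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.choice([-1.0, 1.0], size=(24, 8))   # placeholder for the patent's projection matrix

def project_and_quantize(T_hog, W):
    """Project each 8-bin cell histogram to 24 values and binarize by sign."""
    T_proj = T_hog @ W.T                    # (cells, 24)
    return (T_proj > 0).astype(np.uint8)    # 24-bit binary code per cell

codes = project_and_quantize(np.abs(rng.normal(size=(16, 8))), W)
```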

Further, in the invention, once the projected-and-quantized gradient orientation histograms of the real-time image and the reference image have been obtained, their similarity can be measured based on the similarity-measurement principle: the second histogram is searched for the position with the maximal Hamming similarity to the first histogram, and that position is the matching position of the real-time image within the reference image.

Specifically, let the pixel coordinate set of the real-time image template be $R_T=\{(u_i,v_i)\mid i\in[1,N]\}$, where N is the number of pixels in the template and $(u_i,v_i)$ are pixel coordinates, and let the reference image be $R_I=\{(x,y)\mid x\in[0,N_x-1],\ y\in[0,N_y-1]\}$, where $N_x$ and $N_y$ are the numbers of columns and rows of the reference image. Denote the template by T and the reference image by I; the degree of similarity between T and I is expressed by S(x, y):

$$S(x,y) = f\left(\sum_{(u,v)\in R_T} d\big(T(u,v),\; I(x+u,\,y+v)\big)\right)$$

where d(a, b) is a difference measure, f is a monotonic function, (u, v) ranges over the pixel coordinates of the real-time image, and S(x, y) is the similarity measure of the real-time image at position (x, y) of the reference image. By varying (x, y), the real-time image slides pixel by pixel over the reference image and the similarity matrix S is computed.

When the similarity measure is computed, the gray values of the images are not used directly; instead, the Hamming distance between the 24-bit binary features generated by the projection is used:

$$d(a,b)=\mathrm{BitCount}(a\oplus b)$$

where BitCount counts the number of 1 bits in the XOR of the binary strings a and b. When d(a, b) = 0, a and b are identical; when d(a, b) = 24, a and b differ maximally. Clearly d(a, b) is a difference measure.

The binary strings a and b can also be compared by similarity:

$$S(a,b)=24-d(a,b)$$

When S(a, b) = 24, a and b are identical; when S(a, b) = 0, they differ maximally. If all binary strings of the two images are compared and the results summed, a larger sum indicates a higher similarity between the two images.
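A sketch of the pixel-by-pixel sliding similarity search over the binary code maps (pure NumPy; a brute-force loop for clarity, not an optimized implementation):

```python
import numpy as np

def hamming_similarity_map(codes_T, codes_I):
    """Slide the template's code map over the reference's and score 24 - Hamming.

    codes_T: (ht, wt, 24) binary template codes; codes_I: (hi, wi, 24) reference codes.
    Returns the similarity matrix S; the argmax of S is the matching position.
    """
    ht, wt, _ = codes_T.shape
    hi, wi, _ = codes_I.shape
    S = np.zeros((hi - ht + 1, wi - wt + 1), dtype=np.int64)
    for y in range(hi - ht + 1):
        for x in range(wi - wt + 1):
            window = codes_I[y:y + ht, x:x + wt]
            d = np.count_nonzero(window != codes_T)       # total Hamming distance
            S[y, x] = 24 * ht * wt - d                    # total similarity score
    return S

# The matching position maximizes the similarity (equivalently, minimizes Hamming distance):
# y0, x0 = np.unravel_index(np.argmax(S), S.shape)
```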

Further, once the matching position of the real-time image within the reference image has been obtained, the matching position can be converted into the UAV position and used as the observation. The conversion between the image matching position and the UAV position is

$$r^n = C^n_d\, r^d = \begin{bmatrix} x & -h & y \end{bmatrix}^T$$

where h is the flight height of the aircraft, x and y describe the horizontal offset between the point directly below the camera and the center of the captured image, $r^d$ is the projection, in the camera body frame, of the center point of the imaged area, $r^n$ is the projection of $r^d$ in the geographic frame, and $C^n_d$ is the transformation matrix from the camera body frame to the geographic frame. A Kalman filter is then constructed from this observation to realize inertial-vision integrated navigation and positioning of the UAV.

Specifically, in the present invention, the positional relationship between the UAV positioning point and the image matching point is shown in FIG. 10. Taking the camera center point as the origin, a camera body frame d (front-up-right) is defined and rigidly attached to the inertial navigation carrier frame (the b frame), with the d-frame x, y, and z axes pointing in the same directions as the b-frame x, y, and z axes, respectively. The rotation between the d frame and the b frame is the installation error between the camera and the inertial navigation system. The geographic frame n is north-up-east. The projection of the center point of the camera shooting position in the camera frame is

r^d = [0, −l, 0]^T,

where l is the distance from the camera optical center along the optical axis to the center of the shooting position. The projection of r^d in the geographic frame is

r^n = C_d^n r^d = [x, −h, y]^T,

where h is the flight height of the aircraft, x and y give the horizontal offset between the point directly below the camera and the captured scene, and

C_d^n = C_b^n C_d^b

is the matrix converting from the camera body frame (d) to the geographic frame (n), with C_b^n the matrix converting from the inertial navigation carrier frame (b) to the geographic frame (n) and C_d^b the matrix converting from the camera body frame (d) to the inertial navigation carrier frame (b). Laser ranging can therefore be used to obtain l, and together with the inertial attitude C_b^n and the pre-calibrated installation error angles that determine C_d^b, the image matching position can be converted to the camera position, which is the UAV positioning point.

Next, a Kalman filter is constructed on the observation to realize the inertial visual integrated navigation and positioning of the UAV. In the Kalman filter model, the continuous-time system state equation is

Ẋ(t) = F(t) X(t) + W(t),

where F(t) is the state transition matrix of the continuous state equation at time t and W(t) is the system random noise vector at time t. The filter state vector comprises the north, up, and east velocity errors (m/s), the latitude error (rad), the height error (m), the longitude error (rad), the north, up, and east misalignment angles (rad), the camera-to-INS installation errors (rad), and the laser-ranging scale factor error, 13 dimensions in total. It is defined as:

X = [δV_N  δV_U  δV_E  δL  δH  δλ  φ_N  φ_U  φ_E  α_x  α_y  α_z  δk]^T

where δV_N is the north velocity error, δV_U the up velocity error, δV_E the east velocity error, δL the latitude error, δH the height error, δλ the longitude error, φ_N the north misalignment angle, φ_U the up misalignment angle, φ_E the east misalignment angle, α_x the x-axis installation error, α_y the y-axis installation error, α_z the z-axis installation error, and δk the laser-ranging scale factor error.

The system state transition matrix is

F(t) = [equation image BDA0003157886820000177]

whose constituent blocks are given by the equation images BDA0003157886820000178, BDA0003157886820000179, BDA00031578868200001710, BDA00031578868200001711, BDA0003157886820000181, BDA0003157886820000182, BDA0003157886820000183, and BDA0003157886820000184,

where V_N is the north velocity, V_U the up velocity, V_E the east velocity, R_m and R_n the meridian and prime-vertical radii of curvature of the Earth, L the latitude, H the height, and ω_ie the Earth rotation rate; the three quantities given by equation images BDA0003157886820000185, BDA0003157886820000186, and BDA0003157886820000187 are the projections onto the geographic frame (n) of the acceleration of the carrier frame (b) relative to the inertial frame (i) along the x, y, and z axes, respectively.

The observation equation is defined as

Z(k) = H(k) X(k) + V(k),

where H(k) is the observation matrix, X(k) is the state vector, and V(k) is the observation noise vector.

The image matching position and the laser ranging are taken as the observation information (3-dimensional):

Z(k) = [δL, δH, δλ]^T = [L_INS − L_REF, H_INS − H_REF, λ_INS − λ_REF]^T = [δL_INS − δL_REF, δH_INS − δH_REF, δλ_INS − δλ_REF]^T,

where δL is the latitude error, δH the height error, and δλ the longitude error; L_INS, H_INS, and λ_INS are the latitude, height, and longitude measured by the inertial navigation system; L_REF, H_REF, and λ_REF are the latitude, height, and longitude from image matching; δL_INS, δH_INS, and δλ_INS are the latitude, height, and longitude errors of the inertial navigation system; and δL_REF, δH_REF, and δλ_REF are the latitude, height, and longitude errors of image matching.
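A sketch of how such a 3-dimensional observation drives a textbook Kalman measurement update (H and R are supplied by the caller; this is the generic update, not necessarily the patent's exact implementation):

```python
import numpy as np

def measurement_update(x, P, ins_llh, ref_llh, H, R):
    """Kalman update with Z = [L_INS-L_REF, H_INS-H_REF, lambda_INS-lambda_REF]."""
    z = np.array([ins_llh[0] - ref_llh[0],   # latitude difference (rad)
                  ins_llh[1] - ref_llh[1],   # height difference (m)
                  ins_llh[2] - ref_llh[2]])  # longitude difference (rad)
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x + K @ (z - H @ x)              # corrected state estimate
    P_new = (np.eye(len(x)) - K @ H) @ P     # updated covariance
    return x_new, P_new
```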

where

[equation image BDA0003157886820000191] (Formula 1)

and δr_l^n is the position error introduced when converting the image matching position to the point directly below the aircraft using the laser-ranging and inertial attitude information.

r_l^n = C_b^n C_c^b r_l^c,   (Formula 2)

r̃_l^n = C̃_b^n C̃_c^b r_l^c,   (Formula 3)

where φ is the attitude error angle, η the installation error angle, r_l^n the projection of the image matching position in the geographic frame (n), r̃_l^n that projection computed with errors, C_b^n the matrix converting from the inertial navigation carrier frame (b) to the geographic frame (n), C̃_b^n that matrix with errors, C_c^b the matrix converting from the camera frame (c) to the inertial navigation carrier frame (b), C̃_c^b that matrix with errors, and r_l^c the projection of the image matching position in the camera frame (c).

From Formula 2 and Formula 3 one obtains

[equation image BDA0003157886820000198] (Formula 4)

and substituting Formula 4 into Formula 1 yields

[equation image BDA0003157886820000199]

where φ_N, φ_U, and φ_E are the north, up, and east attitude error angles, r_N, r_U, and r_E the north, up, and east distances, and η_N, η_U, and η_E the north, up, and east installation error angles.

The observation matrix H can therefore be written as

H(k) = [H1 H2 H3 H4 H5],

where the blocks are given by equation image BDA0003157886820000201, and r_N, r_U, and r_E are the north, up, and east distances.

According to another aspect of the present invention, an aircraft is provided that performs integrated navigation and positioning using the heterogeneous-image-matching-based inertial visual integrated navigation and positioning method described above. Because the method solves the problem of large attitude errors in the PnP pose solution of two-dimensional planar control points during high-altitude flight, achieves stable matching between heterogeneous images, and improves the autonomy of the integrated navigation algorithm, applying it to an aircraft greatly improves the aircraft's performance.

For a further understanding of the present invention, the heterogeneous-image-matching-based inertial visual integrated navigation and positioning method provided by the present invention is described in detail below with reference to FIG. 1 to FIG. 10.

As shown in FIG. 1 to FIG. 10, a specific embodiment of the present invention provides an inertial visual integrated navigation and positioning method based on heterogeneous image matching, which comprises the following steps.

Step 1: use the inertial attitude information and the laser-ranging height information to orthorectify the real-time image and the reference image, and unify their scale. In this embodiment, the real-time image and the orthophoto differ in viewing angle and scale and cannot be used directly for matching and positioning; fusing the inertial attitude information with the laser-ranging height information achieves the effect of a scale-invariant feature transform while greatly improving matching accuracy. A rotation-only warp is sketched below.
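As a sketch of the rectification idea (assuming the attitude correction can be treated as a pure camera rotation, which ignores terrain relief; the rotation and intrinsics are supplied by the caller):

```python
import numpy as np
import cv2

def rectify_to_nadir(img, K, R_cam_to_nadir):
    """Warp a tilted aerial image toward a nadir (orthophoto-like) view.

    For a pure rotation the induced homography is H = K R K^-1;
    this sketch ignores terrain relief and translation effects.
    """
    H = K @ R_cam_to_nadir @ np.linalg.inv(K)   # rotation-induced homography
    h, w = img.shape[:2]
    return cv2.warpPerspective(img, H / H[2, 2], (w, h))
```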

Step 2: obtain the first projected and quantized gradient orientation histogram of the real-time image and the second projected and quantized gradient orientation histogram of the reference image, respectively. In computer vision and digital image processing, the histogram of oriented gradients (HOG) is a shape-edge-based descriptor for object detection: gradient information reflects the edge information of image targets well, and local gradient magnitudes characterize the local appearance and shape of the image. Because the descriptor generation process is lengthy and cannot meet real-time matching requirements, the gradient features are projected, quantized, and encoded into binary descriptions to increase matching speed, as sketched below.
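A minimal sketch of such a projected-and-quantized gradient descriptor for one image cell (the 8-bin histogram and mean-threshold binarization are illustrative assumptions; the patent's own scheme produces 24-bit codes):

```python
import numpy as np
import cv2

def binary_hog_cell(cell: np.ndarray, bins: int = 8) -> int:
    """Gradient orientation histogram of one cell, quantized to a binary code."""
    smoothed = cv2.GaussianBlur(cell, (5, 5), 0)           # Gaussian filtering
    gx = cv2.Sobel(smoothed, cv2.CV_32F, 1, 0, ksize=3)    # Sobel gradients
    gy = cv2.Sobel(smoothed, cv2.CV_32F, 0, 1, ksize=3)
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)                # orientation in [0, pi)
    hist, _ = np.histogram(ang, bins=bins, range=(0.0, np.pi), weights=mag)
    code = 0
    for i, v in enumerate(hist):                           # one bit per bin:
        if v > hist.mean():                                # above-average bins -> 1
            code |= 1 << i
    return code
```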

Step 3: detect the similarity between the first and second projected and quantized gradient orientation histograms based on the similarity measurement principle, and search the second histogram for the position with the maximum Hamming-distance-based similarity to the first; that position is the image matching position of the real-time image in the reference image. In this embodiment, after the HOG features of the real-time image and the reference image are computed, Hamming-distance retrieval is used to measure the similarity of the two images, and the position in the reference image with the highest similarity score to the real-time image is the position of the real-time image in the reference image; a search sketch follows.
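A sketch of the exhaustive similarity search over per-cell binary codes (grids of codes such as those produced by the sketch above; 24-bit codes assumed):

```python
import numpy as np

def search_match(rt_codes: np.ndarray, ref_codes: np.ndarray):
    """Exhaustively slide the real-time code grid over the reference code grid.

    Both inputs are 2-D integer arrays of per-cell binary descriptors.
    Returns the offset (row, col) of the window with the highest summed
    similarity, i.e. 24 minus the Hamming distance per cell.
    """
    popcount = np.vectorize(lambda v: bin(int(v)).count("1"))
    rh, rw = rt_codes.shape
    best_score, best_pos = -1, (0, 0)
    for row in range(ref_codes.shape[0] - rh + 1):
        for col in range(ref_codes.shape[1] - rw + 1):
            window = ref_codes[row:row + rh, col:col + rw]
            dist = popcount(np.bitwise_xor(window, rt_codes))
            score = int((24 - dist).sum())
            if score > best_score:
                best_score, best_pos = score, (row, col)
    return best_pos, best_score
```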

Step 4: convert the image matching position to the UAV position and use it as the observation, then construct a Kalman filter on the observation to realize the inertial visual integrated navigation and positioning of the UAV. In this embodiment, the position obtained by image matching is the position imaged by the camera; when the aircraft pitches or rolls, that position is not the UAV position, so a coordinate transformation is required to deduce the point directly below the UAV from the downward-looking image point. The visual position information is then taken as the observation to construct a Kalman filter, realizing a continuous, autonomous navigation and positioning function.

In summary, the present invention provides an inertial visual integrated navigation and positioning method based on heterogeneous image matching, developed for the all-day navigation and positioning of UAVs under GPS-denied conditions. First, a laser-ranging sensor is used to capture flight height information, solving the problem that scale information cannot be initialized under the small-maneuver conditions of the cruise segment; through the fusion of inertial, laser-ranging, and image information, the scales of the real-time image and the reference image are unified. Second, a method based on projected and quantized gradient orientation histograms extracts and binary-encodes structural features, solving the mismatching caused by inconsistent image information in heterogeneous image matching. Finally, a geometric positioning method based on inertial navigation and the laser rangefinder is proposed, solving the problem of large attitude errors in the PnP pose solution of two-dimensional planar control points during high-altitude flight. In this way, stable matching between heterogeneous images is achieved and the autonomy of the integrated navigation algorithm is improved.

For ease of description, spatially relative terms such as "over", "above", "on the upper surface of", and "upper" may be used herein to describe the spatial relationship of one device or feature to other devices or features as shown in the figures. It should be understood that such terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is turned over, elements described as "above" or "over" other devices or structures would then be oriented "below" or "beneath" those other devices or structures. Thus, the exemplary term "above" can encompass both the "above" and "below" orientations. The device may also be oriented otherwise (rotated 90 degrees or in other orientations), and the spatially relative descriptions used herein are interpreted accordingly.

In addition, it should be noted that words such as "first" and "second" are used to define components only for the convenience of distinguishing the corresponding components; unless otherwise stated, these words have no special meaning and cannot be construed as limiting the scope of protection of the present invention.

The above descriptions are only preferred embodiments of the present invention and are not intended to limit it; for those skilled in the art, the present invention may have various modifications and changes. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall be included within its scope of protection.

Claims (9)

1. An inertial visual integrated navigation and positioning method based on heterogeneous image matching, characterized in that the method comprises:

orthorectifying a real-time image and a reference image, respectively, using inertial attitude information and laser-ranging height information, and unifying the scale of the real-time image and the reference image;

obtaining a first projected and quantized gradient orientation histogram of the real-time image and a second projected and quantized gradient orientation histogram of the reference image, respectively;

detecting the similarity between the first projected and quantized gradient orientation histogram and the second projected and quantized gradient orientation histogram based on the similarity measurement principle, and searching the second projected and quantized gradient orientation histogram for the position of the maximum Hamming-distance similarity with the first projected and quantized gradient orientation histogram, the position of that maximum being the image matching position of the real-time image in the reference image; and

converting the image matching position to the position of the UAV and using it as the observation, and constructing a Kalman filter based on the observation to realize the inertial visual integrated navigation and positioning of the UAV.

2. The method according to claim 1, characterized in that obtaining the first projected and quantized gradient orientation histogram of the real-time image comprises: performing Gaussian filtering on the real-time image; extracting the grayscale-image gradient features of the Gaussian-filtered real-time image with the Sobel operator to obtain a gradient image of the real-time image; computing the gradient histogram of the real-time image based on its gradient image; and projecting and quantizing the gradient histogram of the real-time image to obtain the first projected and quantized gradient orientation histogram; and that obtaining the second projected and quantized gradient orientation histogram of the reference image comprises: performing Gaussian filtering on the reference image; extracting the grayscale-image gradient features of the Gaussian-filtered reference image with the Sobel operator to obtain a gradient image of the reference image; computing the gradient histogram of the reference image based on its gradient image; and projecting and quantizing the gradient histogram of the reference image to obtain the second projected and quantized gradient orientation histogram.

3. The method according to claim 1, characterized in that both the real-time image and the reference image can be orthorectified according to

[equation image FDA0003157886810000021]

where [equation images FDA0003157886810000022, FDA0003157886810000023] is the non-orthophoto, [FDA0003157886810000024] the orthophoto, [FDA0003157886810000025] the homogeneous coordinates of a spatial point P in the world coordinate system, z_c the z component of the coordinates of P in the camera coordinate system, f the principal distance of the optical system, s_c the distance between adjacent pixels in the column direction of the image sensor, s_r the distance between adjacent pixels in the row direction of the image sensor, [c_c, c_r]^T the principal point of the image, [FDA0003157886810000026] the rotation matrix between two cameras with identical intrinsic parameters, [FDA0003157886810000027] the rotation matrix from the world coordinate system to the camera coordinate system, and [FDA0003157886810000028] the translation vector from the world coordinate system to the camera coordinate system.

4. The method according to claim 3, characterized in that the scaling coefficient k of the real-time image and the reference image can be obtained according to [equation image FDA0003157886810000029], where μ is the pixel size, f the camera focal length, and l the distance from the camera optical center to the shooting point.

5. The method according to any one of claims 1 to 4, characterized in that the similarity S(x,y) between the first projected and quantized gradient orientation histogram and the second projected and quantized gradient orientation histogram can be obtained according to

[equation image FDA00031578868100000210]

where d(a,b) is the difference measure, f a monotonic function, (u,v) the set of pixel coordinates in the real-time image, I(x+u, y+v) denotes the similarity measurement value at (x,y) of the reference image, T is the real-time image, and I is the reference image.

6. The method according to claim 1, characterized in that the conversion relationship between the image matching position and the position of the UAV is

[equation image FDA0003157886810000031]

where h is the flight height of the aircraft, x and y give the horizontal offset between the point directly below the camera and the captured image, r_d is the projection of the center point of the camera shooting position in the camera body coordinate system, r_n is the projection of r_d in the geographic coordinate system, and [FDA0003157886810000032] is the matrix converting from the camera body coordinate system to the geographic coordinate system.

7. The method according to any one of claims 1 to 6, characterized in that in the Kalman filter the state vector comprises the north velocity error, the up velocity error, the east velocity error, the latitude error, the height error, the longitude error, the north misalignment angle, the up misalignment angle, the east misalignment angle, the x-axis installation error, the y-axis installation error, the z-axis installation error, and the laser-ranging scale factor error.

8. The method according to claim 7, characterized in that in the Kalman filter the observation matrix is H(k) = [H1 H2 H3 H4 H5], where [equation image FDA0003157886810000033], r_N is the north distance, r_U the up distance, and r_E the east distance.

9. An aircraft, characterized in that the aircraft performs integrated navigation and positioning using the inertial visual integrated navigation and positioning method based on heterogeneous image matching according to any one of claims 1 to 8.
CN202110783533.1A 2021-07-12 2021-07-12 Inertial vision integrated navigation positioning method based on heterogeneous image matching and aircraft Active CN113624231B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110783533.1A CN113624231B (en) 2021-07-12 2021-07-12 Inertial vision integrated navigation positioning method based on heterogeneous image matching and aircraft

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110783533.1A CN113624231B (en) 2021-07-12 2021-07-12 Inertial vision integrated navigation positioning method based on heterogeneous image matching and aircraft

Publications (2)

Publication Number Publication Date
CN113624231A true CN113624231A (en) 2021-11-09
CN113624231B CN113624231B (en) 2023-09-12

Family

ID=78379514

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110783533.1A Active CN113624231B (en) 2021-07-12 2021-07-12 Inertial vision integrated navigation positioning method based on heterogeneous image matching and aircraft

Country Status (1)

Country Link
CN (1) CN113624231B (en)


Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110150324A1 (en) * 2009-12-22 2011-06-23 The Chinese University Of Hong Kong Method and apparatus for recognizing and localizing landmarks from an image onto a map
CN102788579A (en) * 2012-06-20 2012-11-21 天津工业大学 Unmanned aerial vehicle visual navigation method based on SIFT algorithm
KR20150005253A (en) * 2013-07-05 2015-01-14 충남대학교산학협력단 Camera Data Generator for Landmark-based Vision Navigation System and Computer-readable Media Recording Program for Executing the Same
CN103644904A (en) * 2013-12-17 2014-03-19 上海电机学院 Visual navigation method based on SIFT (scale invariant feature transform) algorithm
CN104966281A (en) * 2015-04-14 2015-10-07 中测新图(北京)遥感技术有限责任公司 IMU/GNSS guiding matching method of multi-view images
CN107527328A (en) * 2017-09-01 2017-12-29 扆冰蕾 A kind of unmanned plane image geometry processing method for taking into account precision and speed
CN108320304A (en) * 2017-12-18 2018-07-24 广州亿航智能技术有限公司 A kind of automatic edit methods and system of unmanned plane video media
CN108763263A (en) * 2018-04-03 2018-11-06 南昌奇眸科技有限公司 A kind of trade-mark searching method
CN108805906A (en) * 2018-05-25 2018-11-13 哈尔滨工业大学 A kind of moving obstacle detection and localization method based on depth map
WO2020059220A1 (en) * 2018-09-21 2020-03-26 日立建機株式会社 Coordinate conversion system and work machine
CN111238488A (en) * 2020-03-18 2020-06-05 湖南云顶智能科技有限公司 Aircraft accurate positioning method based on heterogeneous image matching
CN111504323A (en) * 2020-04-23 2020-08-07 湖南云顶智能科技有限公司 Unmanned aerial vehicle autonomous positioning method based on heterogeneous image matching and inertial navigation fusion
CN112837353A (en) * 2020-12-29 2021-05-25 北京市遥感信息研究所 A Heterogeneous Image Matching Method Based on Multi-Order Feature Point-Line Matching

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG Mingang; SUN Chuanxin: "Aircraft Navigation and Positioning Algorithm Based on Image Matching and Its Simulation", Computer Simulation, No. 05, pp. 86-89 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114111795A (en) * 2021-11-24 2022-03-01 航天神舟飞行器有限公司 Autonomous Navigation of Small UAV Based on Terrain Matching
CN115127554A (en) * 2022-08-31 2022-09-30 中国人民解放军国防科技大学 Unmanned aerial vehicle autonomous navigation method and system based on multi-source vision assistance
CN115932823A (en) * 2023-01-09 2023-04-07 中国人民解放军国防科技大学 Location method of aircraft to ground target based on heterogeneous region feature matching
CN116518981B (en) * 2023-06-29 2023-09-22 中国人民解放军国防科技大学 Aircraft visual navigation method based on deep learning matching and Kalman filtering
CN117191018A (en) * 2023-08-02 2023-12-08 北京中科导控科技有限公司 Inertial-assisted large-viewing-angle fast scene matching absolute navigation method
CN118521764A (en) * 2024-07-23 2024-08-20 西北工业大学 Unmanned aerial vehicle to ground target combined positioning method, device and system under refusing environment

Also Published As

Publication number Publication date
CN113624231B (en) 2023-09-12

Similar Documents

Publication Publication Date Title
CN113624231A (en) Inertial vision integrated navigation positioning method based on heterogeneous image matching and aircraft
CN105865454B (en) A kind of Navigation of Pilotless Aircraft method generated based on real-time online map
CN109708649B (en) A method and system for determining the attitude of a remote sensing satellite
CN114936971A (en) A water-oriented UAV remote sensing multispectral image stitching method and system
CN111815765B (en) An Image 3D Reconstruction Method Based on Heterogeneous Data Fusion
CN112419374A (en) A UAV Localization Method Based on Image Registration
CN109596121B (en) A method for automatic target detection and spatial positioning of a mobile station
Gao et al. Ground and aerial meta-data integration for localization and reconstruction: A review
CN104268935A (en) Feature-based airborne laser point cloud and image data fusion system and method
Chen et al. Real-time geo-localization using satellite imagery and topography for unmanned aerial vehicles
Granshaw Photogrammetric terminology
CN113793270A (en) Aerial image geometric correction method based on unmanned aerial vehicle attitude information
CN116468621A (en) One-key digital aviation image data processing method
CN115597592B (en) A comprehensive positioning method applied to UAV inspection
CN108801225B (en) Unmanned aerial vehicle oblique image positioning method, system, medium and equipment
CN112927294B (en) Satellite orbit and attitude determination method based on single sensor
CN113436313B (en) A method for active correction of 3D reconstruction errors based on UAV
CN115127554A (en) Unmanned aerial vehicle autonomous navigation method and system based on multi-source vision assistance
CN119090716A (en) Marine remote sensing surveying and mapping method and surveying and mapping system
CN118351184A (en) Scene matching navigation method and system based on deep learning
CN112132029A (en) Unmanned aerial vehicle remote sensing image rapid positioning method for earthquake emergency response
CN116309821A (en) A UAV localization method based on heterogeneous image registration
CN117496114A (en) A target positioning method, system and computer storage medium
CN117073669A (en) Aircraft positioning method
US12000703B2 (en) Method, software product, and system for determining a position and orientation in a 3D reconstruction of the earth's surface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant