CN114322943B - Target distance measuring method and device based on forward-looking image of unmanned aerial vehicle - Google Patents

Target distance measuring method and device based on forward-looking image of unmanned aerial vehicle

Info

Publication number
CN114322943B
CN114322943B (application CN202111532928.0A)
Authority
CN
China
Prior art keywords
distance
target distance
characteristic line
line segments
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111532928.0A
Other languages
Chinese (zh)
Other versions
CN114322943A (en)
Inventor
Cao Yunfeng (曹云峰)
Zhang Chuanqi (张传奇)
Ma Ning (马宁)
Ding Meng (丁萌)
Zhuang Likui (庄丽葵)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN202111532928.0A priority Critical patent/CN114322943B/en
Publication of CN114322943A publication Critical patent/CN114322943A/en
Application granted granted Critical
Publication of CN114322943B publication Critical patent/CN114322943B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a target distance measuring method and device based on a forward-looking image of an unmanned aerial vehicle. The method comprises the following steps: selecting a reference frame, extracting feature points, matching feature points, calculating distances based on feature line segments, calculating the target distance, and filtering the target distance sequence. The device comprises: a reference frame selection module, a feature point extraction module, a feature point matching module, a feature line segment-based distance calculation module, a target distance calculation module, and a target distance sequence filtering module. The method has a simple flow, the device is easy to deploy, the target distance measurement result is highly accurate, the approach suits unmanned aerial vehicle flight scenes, and the computation load is small and fast; the implementation uses pose data provided by the airborne navigation equipment and requires neither camera calibration nor prior information about the target.

Description

Target distance measuring method and device based on forward-looking image of unmanned aerial vehicle
Technical Field
The invention relates to the technical field of optical ranging, in particular to a target distance measuring method and device based on a forward-looking image of an unmanned aerial vehicle.
Background
In recent years, unmanned aerial vehicles have been widely used in military and civilian fields such as intelligence, surveillance and reconnaissance, as well as power line inspection, resource investigation, precision agriculture, and the like. However, in both variable battlefield environments and common field environments, obstacles that threaten flight safety exist, such as buildings, line towers, and trees. These tasks therefore require the unmanned aerial vehicle to have a strong autonomous obstacle avoidance capability.
The general process of unmanned aerial vehicle realizing autonomous obstacle avoidance is as follows: firstly, sensing a scene and a surrounding environment where an unmanned aerial vehicle is located through an airborne sensor, and determining an obstacle target which has potential threat to flight safety; secondly, acquiring specific information of the obstacle target in the scene, wherein the specific information comprises size, direction, motion track, distance and the like; and finally, based on the information, finishing track planning and avoiding maneuver by means of a guidance and control system of the aircraft. The key for realizing autonomous obstacle avoidance of the unmanned aerial vehicle is that the airborne sensor is used for acquiring the target information of the obstacle.
Currently, the sensors with which a drone can be equipped for scene and object perception fall broadly into two categories: active and passive. Active sensors, such as ultrasonic probes, synthetic aperture radar, and lidar, sense the surrounding environment by transmitting electromagnetic waves and receiving echoes. Such sensors have the disadvantages of being bulky and energy-hungry, of being affected by outdoor sunlight, and of being easily tracked by hostile detection equipment. By comparison, passive vision sensors such as airborne optical cameras have clear advantages in size, weight, cost, and response speed, and are suitable for deployment on unmanned aerial vehicles of different sizes and types. In addition, thanks to the development of computer vision technology, vision sensors are already capable of most two-dimensional vision tasks: detecting, identifying, and tracking a target through airborne optical images, and acquiring information such as the target's size, type, direction, and motion track. However, for the guidance and control system of an unmanned aerial vehicle to realize effective obstacle avoidance, distance information about the obstacle target is indispensable, and with the current state of the art it remains extremely challenging to measure the target distance from the information of the image itself.
In this respect, researchers have resorted to the motion parallax information contained between frames of a forward-looking image sequence. When two or more images of the same static object are taken with a monocular camera from different views, the parallax generated by the relative motion between the camera and the scene can be used to solve for the target distance, an idea similar to triangulation in stereo vision. In addition, some artificially designed features are used to extract target information from an image. Thus, by means of the multi-view geometry principle, the three-dimensional coordinates of an image point in the real world can be solved explicitly. Such target distance measurement methods based on artificial features and multi-view geometry have been applied to many three-dimensional vision tasks, such as structure from motion (SfM) and simultaneous localization and mapping (SLAM).
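The parallax idea can be sketched with the standard pinhole triangulation relation z = f·b/d. This is the generic stereo analogy the text invokes, not the patent's own line-segment model; all function and variable names are illustrative:

```python
# Depth of a static point from the parallax between two views,
# analogous to stereo triangulation under a pinhole camera model.
def depth_from_parallax(f_px, baseline_m, x_ref, x_cur):
    """f_px: focal length in pixels; baseline_m: camera translation
    between the two views; x_ref, x_cur: pixel abscissae of the same
    scene point in the two images."""
    disparity = x_ref - x_cur
    if disparity == 0:
        raise ValueError("zero parallax: depth is unobservable")
    return f_px * baseline_m / disparity
```

For instance, an 800 px focal length, a 2 m baseline, and a 40 px disparity give a depth of 800·2/40 = 40 m.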
However, the following difficulties remain when such methods are applied to target distance measurement during autonomous obstacle avoidance of an unmanned aerial vehicle: (1) feature matching is the key step in establishing correspondence between the feature points of two images, so feature matching errors can adversely affect the accuracy of the target measurement result; (2) the height and attitude of the unmanned aerial vehicle change constantly during flight, and existing measuring methods and devices can hardly meet the application requirements of this scenario; (3) the sequence of target distance measurement results may oscillate and jump, and cannot be used directly for obstacle avoidance guidance and flight control.
Disclosure of Invention
The invention aims to solve the technical problem of providing a target distance measuring method and device based on a forward-looking image of an unmanned aerial vehicle, the method is simple and clear, the device is easy to deploy, the result accuracy is high, and the method and device are suitable for outdoor flying scenes.
In order to solve the technical problem, the invention provides a target distance measuring method based on a forward-looking image of an unmanned aerial vehicle, which comprises the following steps:
(1) Selecting an image at a certain specific moment from an image sequence of the current flight phase as a reference frame;
(2) For the images of the reference frame and the current frame, SIFT feature points are extracted from a given target region, and corresponding SIFT descriptors are given;
(3) Based on the feature extraction result, the Euclidean distance between descriptors corresponding to the feature points is used as similarity measurement to carry out feature matching, and mismatching is eliminated according to the measurement;
(4) Calculating the distance corresponding to each pair of characteristic line segments according to the designed distance calculation model based on the characteristic line segments by means of pose information provided by airborne navigation equipment;
(5) Acquiring a set containing distance calculation results of all the characteristic line segment pairs, and calculating a target distance in a median mode;
(6) And filtering a target distance calculation result sequence corresponding to the forward-looking image sequence by adopting a discrete extended Kalman filtering algorithm, and providing a smooth target distance measurement result with higher accuracy.
Preferably, in step (1), the reference frame is selected by means of position information recorded by the airborne navigation equipment at each moment during the flight, and the selection criterion is to keep the ratio of the horizontal distance between the reference frame and the current frame to the target distance measurement result at the previous moment constant.
Preferably, in step (3), the similarity measure selected for feature matching is the Euclidean distance between the descriptors corresponding to the feature points; mismatches are rejected by screening according to this similarity measure, eliminating both weak matches and fuzzy matches.
Preferably, in the step (4), the distance corresponding to each pair of feature line segments is calculated according to the designed distance calculation model based on the feature line segments, specifically: constructing a characteristic line segment by connecting any two matched characteristic points in a reference frame and a current frame; the characteristic line segment is transformed and the length is calculated, and the characteristic line segment is transformed in a longitudinal plane of the unmanned aerial vehicle in the flying process so as to meet the requirement of distance calculation; and calculating the distance corresponding to the characteristic line segments, and calculating the distance under the current frame corresponding to each pair of characteristic line segments according to the geometric imaging relationship.
Preferably, in step (5), in order to accurately and robustly characterize the target distance, a median is taken from a set including all feature line segment pair distance calculation results, and the set is used as the target distance calculation result.
Preferably, in step (6), filtering the target distance calculation results with the discrete extended Kalman filtering algorithm specifically comprises: extended Kalman filter model construction, in which appropriate state and observation quantities are selected according to the feature line segment-based distance calculation model and the system state equation and observation equation are constructed; and extended Kalman filter design and application, in which the prediction and update equations of the filter are derived according to the extended Kalman filtering principle and then applied.
Correspondingly, a target distance measuring device based on an unmanned aerial vehicle forward-looking image includes: a reference frame selection module, which selects an image at a specific moment from the image sequence of the current flight phase as the reference frame; a feature point extraction module, which extracts SIFT feature points from a given target region in the reference frame and current frame images and gives the corresponding SIFT descriptors; a feature point matching module, which performs feature matching and eliminates mismatches using the Euclidean distance between descriptors corresponding to the feature points as the similarity measure; a feature line segment-based distance calculation module, which calculates the distance corresponding to each pair of feature line segments according to the designed feature line segment-based distance calculation model, using pose information provided by the airborne navigation equipment; a target distance calculation module, which takes the median of the set containing all feature line segment pair distance calculation results as the target distance calculation result; and a target distance sequence filtering module, which filters the target distance calculation result sequence corresponding to the forward-looking image sequence with a discrete extended Kalman filtering algorithm and provides a smooth target distance measurement result with higher accuracy.
Preferably, the feature point matching module includes: a feature matching metric calculation unit, which calculates the sum of squared differences between the 128-dimensional descriptors of the feature points in the reference frame and the current frame as the feature matching metric; a weak matching elimination unit, which eliminates weak matches among the feature point mismatches; and a fuzzy matching elimination unit, which eliminates fuzzy matches among the feature point mismatches.
Preferably, the feature line segment-based distance calculation module includes: the characteristic line segment construction unit is used for constructing characteristic line segments in the reference frame and the current frame according to the characteristic matching result; the characteristic line segment transformation and length calculation unit is used for transforming the characteristic line segments in a longitudinal plane of the unmanned aerial vehicle in the flight process and calculating the lengths of the transformed characteristic line segments so as to meet the requirement of distance calculation; and the distance calculation unit corresponding to the characteristic line segments is used for calculating the distance values corresponding to each pair of characteristic line segments and forming a set.
Preferably, the target distance sequence filtering module includes: the extended Kalman filtering model building unit is used for determining the state quantity and the observed quantity of the extended Kalman filtering model and building a system state equation and an observation equation; and the extended Kalman filter design and application unit is used for filtering a target depth calculation result corresponding to the foresight image sequence output by the module, so that a smoother and more accurate target distance measurement result sequence is obtained.
The invention has the following beneficial effects: distance calculation uses the lengths of feature line segments rather than the pixel positions of feature points, reducing the adverse effect of feature matching errors; airborne navigation information is introduced into the distance calculation model, so the method suits unmanned aerial vehicle flight scenes with constantly changing height and attitude; designing and applying a discrete extended Kalman filter further improves the accuracy of the target distance results and the smoothness of the output sequence; the method's flow is simple, the device is easy to deploy, the target distance measurement result is accurate, and the computation load is small and fast; the implementation uses pose data provided by the airborne navigation equipment, without calibrating the camera or acquiring prior information about the target.
Drawings
Fig. 1 is a schematic flow chart of a target distance measuring method based on a forward-looking image of an unmanned aerial vehicle according to the present invention.
Fig. 2 is a schematic flow chart of feature point matching provided in embodiment 1 of the present invention.
Fig. 3 is a schematic diagram of an exemplary graph of a reference frame and a current frame and a feature extraction and matching result provided in embodiment 1 of the present invention.
Fig. 4 is a schematic flowchart of distance calculation based on feature line segments according to embodiment 1 of the present invention.
Fig. 5 is a schematic flow chart of target distance sequence filtering provided in embodiment 1 of the present invention.
Fig. 6 is a schematic diagram illustrating comparison between target distance calculation and target distance measurement results before and after filtering according to embodiment 1 of the present invention.
Fig. 7 is a schematic structural diagram of a target distance measuring device based on a forward-looking image of an unmanned aerial vehicle according to the present invention.
Fig. 8 is a schematic structural diagram of a feature point matching module provided in embodiment 2 of the present invention.
Fig. 9 is a schematic structural diagram of a distance calculation module based on feature line segments according to embodiment 2 of the present invention.
Fig. 10 is a schematic structural diagram of a target distance sequence filtering module according to embodiment 2 of the present invention.
Detailed Description
As shown in fig. 1, a target distance measuring method based on a forward-looking image of an unmanned aerial vehicle includes the following steps:
(1) Selecting an image at a certain specific moment from an image sequence of the current flight phase as a reference frame;
(2) For the images of the reference frame and the current frame, SIFT feature points are extracted from a given target region, and corresponding SIFT descriptors are given;
(3) Based on the feature extraction result, the Euclidean distance between descriptors corresponding to the feature points is used as similarity measurement to carry out feature matching, and mismatching is eliminated according to the similarity measurement;
(4) Calculating the distance corresponding to each pair of characteristic line segments according to the designed distance calculation model based on the characteristic line segments by means of pose information provided by airborne navigation equipment;
(5) Acquiring a set containing distance calculation results of all the characteristic line segment pairs, and calculating a target distance in a median mode;
(6) And filtering the target distance calculation result sequence corresponding to the forward-looking image sequence by adopting a discrete extended Kalman filtering algorithm, and providing a smooth target distance measurement result with higher accuracy.
Example 1:
the embodiment 1 of the invention designs a target distance measuring method based on a forward-looking image of an unmanned aerial vehicle aiming at the characteristics of the forward-looking optical image shot in the flight process of the unmanned aerial vehicle, as shown in fig. 1, the method comprises the following steps:
step S100: and selecting a reference frame.
Specifically, an onboard optical camera shoots forward-looking images during flight, and the onboard navigation equipment records the position information at the corresponding moments. At the current time t, the currently shot image is taken as the current frame, and a reference frame is selected from the sequence of past images; the selection criterion is that the horizontal distance (baseline length b) between the reference frame and the current frame keeps a ratio of 0.25 to the target distance measurement result z_{t−1} at the previous time, i.e. b = 0.25·z_{t−1}.
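A minimal sketch of this reference-frame selection follows. The patent fixes only the criterion b = 0.25·z_{t−1}; the backward search over recorded positions and all names here are assumptions made for illustration:

```python
import math

def select_reference_frame(positions, t, z_prev, ratio=0.25):
    """Scan backwards from the current frame t and return the index of the
    first past frame whose horizontal distance to frame t reaches
    ratio * z_prev (the previous target distance estimate), together with
    that baseline length. positions: per-frame horizontal (x, y) tuples."""
    xc, yc = positions[t]
    target_baseline = ratio * z_prev
    for k in range(t - 1, -1, -1):
        b = math.hypot(positions[k][0] - xc, positions[k][1] - yc)
        if b >= target_baseline:
            return k, b
    # No past frame is far enough back; fall back to the oldest one.
    return 0, math.hypot(positions[0][0] - xc, positions[0][1] - yc)
```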
Step S110: and (4) extracting feature points.
Specifically, for a reference frame and a current frame image, SIFT feature points are extracted from a given target region, and corresponding SIFT descriptors are given. Since the method only needs to use the pixel position information of the feature points, if the pixel coordinates of two or more feature points are the same but the scale and the main direction are different, only one of the points and the descriptor thereof need to be reserved.
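The deduplication rule above (keep one point per pixel coordinate when only scale or dominant orientation differ) can be sketched as follows; the keypoint representation is an assumption, standing in for whichever SIFT implementation is used:

```python
def dedup_keypoints(keypoints):
    """Keep one keypoint per integer pixel coordinate. Each keypoint is a
    dict with at least 'pt' = (x, y); since only pixel positions feed the
    later distance model, keypoints that coincide in pixel position but
    differ in scale or dominant orientation are redundant."""
    seen, kept = set(), []
    for kp in keypoints:
        px = (round(kp["pt"][0]), round(kp["pt"][1]))
        if px not in seen:
            seen.add(px)
            kept.append(kp)
    return kept
```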
Step S120: and matching the characteristic points.
Specifically, based on the feature point extraction result, the euclidean distance between descriptors corresponding to the feature points is used as a similarity measure to perform feature matching, and mismatching is eliminated according to the measure. Further, as shown in fig. 2, in step S120, the method may include:
step S121: and calculating a feature matching metric.
Specifically, assuming that n and m SIFT feature points are respectively extracted from the target regions in the reference frame and the current frame, the sum of squared differences (SSD) between the 128-dimensional feature descriptors of any feature point f_i (i = 1, …, n) in the reference frame and any feature point g_j (j = 1, …, m) in the current frame is calculated by:

SSD(f_i, g_j) = Σ_{d=1…128} (f_i(d) − g_j(d))²

The above calculation is performed for all n × m feature point pairs and used as the feature matching metric.
Step S122: and eliminating weak matching.
Specifically, among the above n × m feature point pairs, matches that do not satisfy the following condition are regarded as weak matches and eliminated:

SSD(f_i, g_j) ≤ MT

where MT is the matching threshold, generally taken as 10.
Step S123: and eliminating fuzzy matching.
Specifically, among the above n × m feature point pairs, matches that do not satisfy the following conditions are regarded as fuzzy matches and eliminated. Let g_{j*} and g_{j′} be the nearest and second-nearest neighbors of f_i under the SSD metric, and let f_{i*} and f_{i′} likewise be the nearest and second-nearest neighbors of g_j; a match (f_i, g_j) is retained only if:

g_{j*} = g_j and f_{i*} = f_i

SSD(f_i, g_{j*}) ≤ MR·SSD(f_i, g_{j′})

SSD(f_{i*}, g_j) ≤ MR·SSD(f_{i′}, g_j)

where MR is the maximum ratio, typically taken as 0.6. Through the above steps, feature matching and mismatch elimination are completed and the feature matching result is obtained; the correspondence between matched feature points is shown as thin lines crossing the two frames in fig. 3.
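The matching stage can be sketched as below. Because the patent's exact threshold formulas are reproduced only as images, the precise way MT and MR enter the comparison here is an assumption (an SSD cutoff plus a Lowe-style nearest/second-nearest ratio test), and the function name is illustrative:

```python
def match_features(descs_ref, descs_cur, mt=10.0, mr=0.6):
    """SSD-based matching with weak-match and ambiguity rejection.
    descs_*: lists of equal-length descriptor vectors. A pair (i, j) is
    kept when the best SSD for i is at most mt and is at most mr times
    the second-best SSD."""
    matches = []
    for i, f in enumerate(descs_ref):
        ssd = [sum((a - b) ** 2 for a, b in zip(f, g)) for g in descs_cur]
        order = sorted(range(len(ssd)), key=ssd.__getitem__)
        best = order[0]
        if ssd[best] > mt:  # weak match: even the best candidate is poor
            continue
        if len(order) > 1 and ssd[best] > mr * ssd[order[1]]:  # ambiguous
            continue
        matches.append((i, best))
    return matches
```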
Step S130: and calculating the distance based on the characteristic line segments.
Specifically, the distance calculation model based on the feature line segments is used for calculating the distance corresponding to each pair of feature line segments according to the feature matching result and the position and posture information of the reference frame and the current frame recorded by the airborne navigation equipment. Further, as shown in fig. 4, in step S130, the method may include:
step S131: and constructing a characteristic line segment.
Specifically, assume the feature matching result contains p pairs of matched points between the reference frame and the current frame. In the current frame, a line segment can be constructed by connecting any two feature points in the image; in the reference frame, the corresponding feature line segment can be constructed by connecting the corresponding matched feature points. In this way, q = p(p−1)/2 pairs of feature line segments can be constructed between the two frames. For any pair of feature line segments, the lengths of the original segments in the reference frame and the current frame are denoted L_1 and L_2, respectively. Correspondingly, the pitch angles of the unmanned aerial vehicle at the moments the reference frame and the current frame were captured are θ_1 and θ_2, respectively.
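The segment-pair construction can be sketched directly from the matched point lists (names illustrative; pts_ref[k] and pts_cur[k] are assumed to be the k-th matched pair):

```python
from itertools import combinations
from math import hypot

def build_segment_pairs(pts_ref, pts_cur):
    """From p matched point pairs, form all p*(p-1)/2 feature line segment
    pairs and return their original lengths (L1 in the reference frame,
    L2 in the current frame)."""
    pairs = []
    for a, b in combinations(range(len(pts_ref)), 2):
        l1 = hypot(pts_ref[a][0] - pts_ref[b][0],
                   pts_ref[a][1] - pts_ref[b][1])
        l2 = hypot(pts_cur[a][0] - pts_cur[b][0],
                   pts_cur[a][1] - pts_cur[b][1])
        pairs.append((l1, l2))
    return pairs
```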
Step S132: and (5) feature line segment transformation and length calculation.
Specifically, the feature line segments are transformed in the longitudinal plane of the unmanned aerial vehicle during flight to meet the requirement of distance calculation. The transformation projects the feature line segment onto a vertical plane passing through the image center, along the lines of sight (LOS) of the two feature points. To calculate the transformed feature line segment length, the same operations are performed for the reference frame and the current frame, as follows.

Let the pixel coordinates of the feature points (i.e. the endpoints of the feature line segment) A and B be (x_A, y_A) and (x_B, y_B), respectively, and calculate the line-of-sight angles:

α_A = arctan((y_A − c_y)/f_y)

α_B = arctan((y_B − c_y)/f_y)

where f_y is the longitudinal focal length of the camera and c_y is the pixel ordinate of the image center point, i.e. half the image height.

Then calculate the length of the transformed feature line segment, where θ is the pitch angle of the drone at the moment the frame was captured:

L′ = f_y·|tan(α_A + θ) − tan(α_B + θ)|
step S133: and calculating the distance corresponding to the characteristic line segment.
Specifically, according to the geometric imaging relationship, the distance value z_k under the current frame corresponding to the k-th (k = 1, …, q) pair of feature line segments is calculated as:

z_k = b·L′_1/(L′_2 − L′_1)

where L′_1 and L′_2 are the transformed segment lengths in the reference frame and the current frame, respectively, and b is the baseline length. The above steps are performed for all q pairs of feature line segments, resulting in a set {z_k | k = 1, …, q} containing q distance values.
Step S140: and calculating the target distance.
Specifically, the median of the set containing all feature line segment pair distance calculation results is taken as the target distance calculation result:

ẑ_t = median{z_k | k = 1, …, q}
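Steps S133 and S140 together can be sketched as follows. The per-pair formula z_k = b·L1/(L2 − L1) is a reconstruction from similar triangles (apparent segment length scales inversely with distance while the camera closes baseline b between frames), since the patent's own formula appears only as an image; segment pairs that do not grow between frames are skipped here as degenerate:

```python
from statistics import median

def target_distance(pairs, baseline):
    """pairs: (L1, L2) transformed segment lengths in the reference and
    current frames; baseline: horizontal displacement b between the two
    frames. Returns the median of the per-pair distances as the robust
    target distance, plus the full set of per-pair values."""
    zs = [baseline * l1 / (l2 - l1) for (l1, l2) in pairs if l2 > l1]
    return median(zs), zs
```

The median makes a single grossly wrong pair (e.g. from a residual mismatch) mostly harmless, which is the robustness the text claims.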
step S150: and filtering the target distance sequence.
Specifically, a discrete extended Kalman filtering algorithm is adopted to filter a target distance calculation result sequence corresponding to the forward-looking image sequence, and a smooth target distance measurement result with higher accuracy is given. Further, as shown in fig. 5, in step S150, the method may include:
step S151: and constructing an extended Kalman filtering model.
Specifically, the state quantity of the extended Kalman filter model is determined as the target distance x_t = z_t, the observed quantity y_t is the current target distance calculation result, and the system state equation and observation equation at the current time t are established as:

x_t = f(x_{t−1}) + w = x_{t−1} − b_t + w

y_t = h(x_t) + v

where b_t is the horizontal displacement of the drone from time t−1 to time t, and w and v are the process noise and the observation noise, respectively.
Step S152: and (4) expanding design and application of a Kalman filter.
Specifically, the prediction and update equations of the designed extended Kalman filter are as follows:

x̂_{t|t−1} = f(x̂_{t−1|t−1})

P_{t|t−1} = F_t·P_{t−1|t−1}·F_tᵀ + Q

K_t = P_{t|t−1}·H_tᵀ·(H_t·P_{t|t−1}·H_tᵀ + R)⁻¹

x̂_{t|t} = x̂_{t|t−1} + K_t·(y_t − h(x̂_{t|t−1}))

P_{t|t} = (I − K_t·H_t)·P_{t|t−1}

where P is the state error covariance, Q and R are the process noise covariance and the observation noise covariance, respectively, K_t is the filter gain, and F_t and H_t are the first partial derivatives (Jacobians) of the state equation and the observation equation, respectively.
The target distance calculation result sequence corresponding to the forward-looking image sequence obtained in step S140 is used as an input of the filter, so that a smoother and more accurate target distance measurement result sequence can be obtained through output. A comparison of the target distance calculation and target distance measurement results before and after filtering is shown in fig. 6.
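A scalar Kalman filter over the distance sequence can be sketched as below. This is a simplified stand-in for the patent's discrete extended Kalman filter (whose exact state vector appears only as an image): the state is the distance itself, the process model z_t = z_{t−1} − b_t assumes the drone closes b_t on the target per step, and the noise variances are illustrative values:

```python
def filter_distance_sequence(measurements, baselines, q=0.5, r=4.0, p0=10.0):
    """measurements: raw per-frame distance estimates; baselines: per-step
    horizontal displacement b_t; q, r: process/observation noise variances;
    p0: initial state error variance. Returns the filtered sequence."""
    z, p = measurements[0], p0
    out = [z]
    for y, b in zip(measurements[1:], baselines[1:]):
        # Predict: the drone has moved b closer to the target.
        z_pred, p_pred = z - b, p + q
        # Update: blend prediction and new measurement by the Kalman gain.
        k = p_pred / (p_pred + r)
        z = z_pred + k * (y - z_pred)
        p = (1 - k) * p_pred
        out.append(z)
    return out
```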
Example 2:
Embodiment 2 of the present invention provides a target distance measuring device based on a forward-looking image of an unmanned aerial vehicle, as shown in fig. 7, comprising the following modules: a reference frame selection module 200, which selects an image at a specific moment from the image sequence of the current flight phase as the reference frame; a feature point extraction module 210, which extracts SIFT feature points in a given target region of the reference frame and current frame images and provides the corresponding SIFT descriptors; a feature point matching module 220, which performs feature matching and eliminates mismatches using the Euclidean distance between descriptors corresponding to the feature points as the similarity measure; a feature line segment-based distance calculation module 230, which calculates, using pose information provided by the airborne navigation equipment, the distance corresponding to each pair of feature line segments according to the designed feature line segment-based distance calculation model; a target distance calculation module 240, which takes the median of the set containing all feature line segment pair distance calculation results as the target distance calculation result; and a target distance sequence filtering module 250, which filters the target distance calculation result sequence corresponding to the forward-looking image sequence with a discrete extended Kalman filtering algorithm and provides a smooth target distance measurement result with higher accuracy.
In the present embodiment, further, as shown in fig. 8, the feature point matching module 220 includes: a feature matching metric calculation unit 221, configured to calculate the sum of squared differences between the 128-dimensional descriptors of the feature points in the reference frame and the current frame and use it as the feature matching metric; a weak matching elimination unit 222, for eliminating weak matches among the feature point mismatches; and a fuzzy matching elimination unit 223, for eliminating fuzzy matches among the feature point mismatches.
In this embodiment, further, as shown in fig. 9, the feature line segment based distance calculating module 230 includes: a feature line segment constructing unit 231 for constructing feature line segments in the reference frame and the current frame according to the feature matching result; the characteristic line segment transformation and length calculation unit 232 is used for transforming the characteristic line segments in a longitudinal plane of the unmanned aerial vehicle in the flight process and calculating the lengths of the transformed characteristic line segments so as to meet the requirement of distance calculation; and the distance calculating unit 233 corresponding to the feature line segments is used for calculating the distance values corresponding to each pair of feature line segments and forming a set.
In this embodiment, further, as shown in fig. 10, the target distance sequence filtering module 250 includes: an extended Kalman filter model constructing unit 251, configured to determine a state quantity and an observed quantity of the extended Kalman filter model, and construct a system state equation and an observation equation; an extended Kalman filter design and application unit 252 is configured to filter the target depth calculation result corresponding to the forward-looking image sequence output by the module 240, so as to obtain a smoother and more accurate target distance measurement result sequence.

Claims (8)

1. A target distance measuring method based on a forward-looking image of an unmanned aerial vehicle is characterized by comprising the following steps:
(1) Selecting the image at a specific moment from the image sequence of the current flight phase as the reference frame;
(2) For the reference frame and current frame images, extracting SIFT feature points from a given target region and computing the corresponding SIFT descriptors;
(3) Based on the feature extraction result, performing feature matching with the Euclidean distance between the descriptors of corresponding feature points as the similarity measure, and rejecting mismatches according to this measure;
(4) Calculating the distance corresponding to each pair of characteristic line segments according to the designed feature-line-segment-based distance calculation model, using the pose information provided by the onboard navigation equipment; the specific steps are: characteristic line segment construction, in which any two matched feature points are connected in the reference frame and the current frame to construct a characteristic line segment; characteristic line segment transformation and length calculation, in which the characteristic line segments are transformed into the longitudinal plane of the unmanned aerial vehicle during flight to meet the requirements of distance calculation; and per-segment distance calculation, in which the distance at the current frame corresponding to each pair of characteristic line segments is calculated according to the geometric imaging relationship;
(5) Obtaining the set containing the distance calculation results of all the characteristic line segments and taking its median as the target distance;
(6) Filtering the sequence of target distance calculation results corresponding to the forward-looking image sequence with a discrete extended Kalman filtering algorithm, yielding a smooth target distance measurement result with higher accuracy.
2. The target distance measuring method based on a forward-looking image of an unmanned aerial vehicle according to claim 1, wherein in step (1), the reference frame is selected using the position information recorded by the onboard navigation equipment at each moment during flight, the selection criterion being that the ratio of the horizontal distance between the reference frame and the current frame to the target distance measured at the previous moment is kept constant.
3. The method according to claim 1, wherein in step (3), the similarity measure chosen for feature matching is the Euclidean distance between the descriptors of corresponding feature points, and mismatches are rejected by screening against this similarity measure to eliminate weak matches and fuzzy matches.
4. The method according to claim 1, wherein in step (5), the target distance is taken as the median of the set of distance calculation results of all the characteristic line segments.
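The median aggregation of claim 4 (step (5)) can be sketched in a few lines; the helper name `target_distance` is illustrative, not from the patent:

```python
import statistics

def target_distance(segment_distances):
    """Step (5): aggregate the per-segment distance estimates by taking
    the median, which is robust to outliers from residual mismatches
    that survive the screening of step (3)."""
    if not segment_distances:
        raise ValueError("no characteristic line segment distances available")
    return statistics.median(segment_distances)
```

With per-segment estimates [10.0, 11.0, 9.5, 50.0, 10.5] m, the gross outlier at 50 m does not affect the median of 10.5 m, whereas a mean would be pulled to about 18 m.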
5. The target distance measuring method based on a forward-looking image of an unmanned aerial vehicle according to claim 1, wherein in step (6), filtering the target distance calculation results with the discrete extended Kalman filtering algorithm specifically comprises: constructing the extended Kalman filter model, in which appropriate state and observation quantities are selected according to the feature-line-segment-based distance calculation model, and the system state equation and observation equation are constructed; and designing and applying the extended Kalman filter, in which the prediction and update equations of the filter are derived according to the extended Kalman filtering principle and then applied.
6. An apparatus for implementing the target distance measuring method based on a forward-looking image of an unmanned aerial vehicle according to claim 1, comprising: a reference frame selection module for selecting the image at a specific moment from the image sequence of the current flight phase as the reference frame; a feature point extraction module for extracting SIFT feature points from a given target region in the reference frame and current frame images and computing the corresponding SIFT descriptors; a feature point matching module for performing feature matching with the Euclidean distance between the descriptors of corresponding feature points as the similarity measure and rejecting mismatches; a feature-line-segment-based distance calculation module for calculating the distance corresponding to each pair of characteristic line segments according to the designed feature-line-segment-based distance calculation model, using the pose information provided by the onboard navigation equipment; a target distance calculation module for taking the median of the set containing the distance calculation results of all characteristic line segment pairs as the target distance calculation result; and a target distance sequence filtering module for filtering the sequence of target distance calculation results corresponding to the forward-looking image sequence with a discrete extended Kalman filtering algorithm, yielding a smooth target distance measurement result with higher accuracy;
wherein the feature-line-segment-based distance calculation module comprises: a characteristic line segment construction unit for constructing characteristic line segments in the reference frame and the current frame according to the feature matching result; a characteristic line segment transformation and length calculation unit for transforming the characteristic line segments into the longitudinal plane of the unmanned aerial vehicle during flight and calculating the lengths of the transformed segments, so as to meet the requirements of distance calculation; and a per-segment distance calculation unit for calculating the distance value corresponding to each pair of characteristic line segments and forming these values into a set.
7. The target distance measuring device based on a forward-looking image of an unmanned aerial vehicle according to claim 6, wherein the feature point matching module comprises: a feature matching metric calculation unit for computing the difference between the 128-dimensional descriptors of the feature points in the reference frame and the current frame as the feature matching metric; a weak matching elimination unit for eliminating weak matches among the feature point mismatches; and a fuzzy matching elimination unit for eliminating fuzzy matches among the feature point mismatches.
8. The apparatus according to claim 6, wherein the target distance sequence filtering module comprises: an extended Kalman filter model construction unit for determining the state quantity and observed quantity of the extended Kalman filter model and constructing the system state equation and observation equation; and an extended Kalman filter design and application unit for filtering the target distance calculation results corresponding to the forward-looking image sequence, thereby obtaining a smoother and more accurate sequence of target distance measurements.
CN202111532928.0A 2021-12-15 2021-12-15 Target distance measuring method and device based on forward-looking image of unmanned aerial vehicle Active CN114322943B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111532928.0A CN114322943B (en) 2021-12-15 2021-12-15 Target distance measuring method and device based on forward-looking image of unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111532928.0A CN114322943B (en) 2021-12-15 2021-12-15 Target distance measuring method and device based on forward-looking image of unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN114322943A CN114322943A (en) 2022-04-12
CN114322943B true CN114322943B (en) 2023-03-28

Family

ID=81051909

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111532928.0A Active CN114322943B (en) 2021-12-15 2021-12-15 Target distance measuring method and device based on forward-looking image of unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN114322943B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109813334A (en) * 2019-03-14 2019-05-28 西安工业大学 Real-time high-precision vehicle mileage calculation method based on binocular vision

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3867410B2 (en) * 1998-09-02 2007-01-10 株式会社明電舎 Three-dimensional visual positioning method and apparatus
US8442304B2 (en) * 2008-12-29 2013-05-14 Cognex Corporation System and method for three-dimensional alignment of objects using machine vision
EP2527787B1 (en) * 2011-05-23 2019-09-11 Kabushiki Kaisha TOPCON Aerial photograph image pickup method and aerial photograph image pickup apparatus
CN108052110A (en) * 2017-09-25 2018-05-18 南京航空航天大学 UAV Formation Flight method and system based on binocular vision
CN109376785B (en) * 2018-10-31 2021-09-24 东南大学 Navigation method based on iterative extended Kalman filtering fusion inertia and monocular vision
CN110146110B (en) * 2019-05-20 2022-11-15 哈尔滨工程大学 Mismatching judgment method for ICNN data association of robot line characteristics in indoor environment

Also Published As

Publication number Publication date
CN114322943A (en) 2022-04-12

Similar Documents

Publication Publication Date Title
Kong et al. Autonomous landing of an UAV with a ground-based actuated infrared stereo vision system
Meingast et al. Vision based terrain recovery for landing unmanned aerial vehicles
EP0436213B1 (en) Obstacle detection system
Strydom et al. Visual odometry: autonomous uav navigation using optic flow and stereo
Schneider et al. Fast and effective online pose estimation and mapping for UAVs
CN113485441A (en) Distribution network inspection method combining unmanned aerial vehicle high-precision positioning and visual tracking technology
CN103149939A (en) Dynamic target tracking and positioning method of unmanned plane based on vision
EP3757606A2 (en) Dense mapping using range sensor multi-scanning and multi-view geometry from successive image frames
Sanfourche et al. Perception for UAV: Vision-Based Navigation and Environment Modeling.
Ivanovas et al. Block matching based obstacle avoidance for unmanned aerial vehicle
Oliveira et al. Real-time and post-processed georeferencing for hyperpspectral drone remote sensing
Conte et al. High accuracy ground target geo-location using autonomous micro aerial vehicle platforms
Hartley et al. Using roads for autonomous air vehicle guidance
Steffen et al. On visual real time mapping for unmanned aerial vehicles
Kamat et al. A survey on autonomous navigation techniques
Bhanu et al. Inertial navigation sensor integrated motion analysis for obstacle detection
Ramos et al. Vision-based tracking of non-cooperative space bodies to support active attitude control detection
CN114322943B (en) Target distance measuring method and device based on forward-looking image of unmanned aerial vehicle
Aminzadeh et al. Implementation and performance evaluation of optical flow navigation system under specific conditions for a flying robot
Hintze Autonomous landing of a rotary unmanned aerial vehicle in a non-cooperative environment using machine vision
Al-Kaff Vision-based navigation system for unmanned aerial vehicles
Kang et al. Development of a peripheral-central vision system for small UAS tracking
CN113589848B (en) Multi-unmanned aerial vehicle detection, positioning and tracking system and method based on machine vision
Zheng et al. Integrated navigation system with monocular vision and LIDAR for indoor UAVs
Ma et al. A review: The survey of attitude estimation in autonomous uav navigation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant