CN114689030A - Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision

Info

Publication number
CN114689030A
Authority
CN
China
Prior art keywords
target
image
template
unmanned aerial
aerial vehicle
Prior art date
Legal status
Pending
Application number
CN202210611516.4A
Other languages
Chinese (zh)
Inventor
王世勇
安帅
李茂�
雷超
倪峰棋
Current Assignee
China South Industries Group Automation Research Institute
Original Assignee
China South Industries Group Automation Research Institute
Priority date
Filing date
Publication date
Application filed by China South Industries Group Automation Research Institute
Priority to CN202210611516.4A
Publication of CN114689030A


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04 Interpretation of pictures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 5/00 Measuring height; Measuring distances transverse to line of sight; Levelling between separated points; Surveyors' levels
    • G01C 5/005 Measuring height; Measuring distances transverse to line of sight; Levelling between separated points; Surveyors' levels; altimeters for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an unmanned aerial vehicle auxiliary positioning method and system based on airborne vision. For a satellite-signal-denied environment, an airborne monocular camera and a ground visual target are used to provide positioning information to the unmanned aerial vehicle in real time, based on the attitude angle and height supplied by the flight controller and the longitude and latitude recorded before the satellite signal was lost, so that the relative position between the unmanned aerial vehicle and the ground target is maintained. The method uses passive airborne-vision positioning, which is intuitive and difficult to jam, and can serve as an independent positioning system for the unmanned aerial vehicle, providing relative position information and positioning coordinates during take-off, landing and target following. In addition, a scheme of nested large and small targets is adopted, in which the large ground target contains the small target; this solves the problem that the complete ground target cannot be captured during take-off and landing because of the limited field of view.

Description

Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision
Technical Field
The invention relates to the technical field of unmanned aerial vehicle navigation, in particular to an unmanned aerial vehicle auxiliary positioning method and system based on airborne vision.
Background
A multi-rotor unmanned aerial vehicle is a special unmanned rotorcraft with three or more rotor shafts. A motor on each shaft drives a rotor to generate lift. The collective pitch of the rotors is fixed rather than variable as in a conventional helicopter. By changing the relative speeds of the rotors, the thrust on each axis can be varied and the trajectory of the aircraft controlled.
With the development of multi-rotor unmanned aerial vehicle technology and navigation, the functions of multi-rotor unmanned aerial vehicles have become richer and their applications more diverse; they are now widely used in express delivery, traffic supervision, environmental management and other fields. Outdoor applications typically rely on the autonomous flight function of the drone, and outdoor navigation typically employs satellite navigation, which offers global, all-weather, continuous and accurate positioning with good real-time performance. Satellite navigation also has drawbacks: it is susceptible to electromagnetic interference, the operation of the satellite receiver is affected by the maneuvering of the drone, and the navigation accuracy depends heavily on satellite signal strength.
In particular, in complex terrain such as cities and mountain forests, satellite signals are easily degraded by the environment or maliciously jammed, so the satellite reception of the unmanned aerial vehicle is lost and the vehicle becomes uncontrollable or cannot take off. In a satellite-signal-denied environment, drones typically fall back on inertial navigation, which can bridge short outages but drifts when used for long periods.
Therefore, how to provide an auxiliary positioning method that can assist the positioning of an unmanned aerial vehicle in a satellite-signal-denied environment, effectively reduce its dependence on satellite signals, and improve its applicability is a technical problem that urgently needs to be solved by those skilled in the art.
Disclosure of Invention
The invention provides an unmanned aerial vehicle auxiliary positioning method and system based on airborne vision.
The invention provides the following scheme:
an unmanned aerial vehicle auxiliary positioning method based on airborne vision comprises the following steps:
acquiring attitude information and a first flight height provided by an airborne flight controller;
acquiring a target image in an onboard camera video stream, wherein the target image comprises identification features of a reference target image;
matching the reference target pattern with template target patterns contained in a template target pattern library; obtaining a template rotation angle corresponding to a template target graph with the highest similarity; so as to determine the rotation angle of the airborne camera relative to the reference target according to the template rotation angle;
calculating the centroid pixel coordinates and pixel area of the reference target graph in the target image;
calculating the physical distance between the reference target and the airborne camera according to the actual physical size of the reference target and the angle of view of the airborne camera and the pixel area so as to calculate and obtain a second flying height of the carrier according to the physical distance;
and obtaining coordinates of the reference target in a northeast coordinate system with the center of mass of the carrier as an origin through coordinate system transformation calculation according to the center of mass pixel coordinates, the attitude information and the second flying height, so as to obtain the offset of the reference target relative to the carrier in the northeast coordinate system according to the coordinate calculation.
Preferably: the acquiring of the target image in the video stream of the onboard camera comprises:
reading a frame of image of an original video stream from an on-board camera device, and dividing the image into a plurality of connected areas; screening out connected domains meeting the requirement of the area size according to the area of the connected domains; scaling the connected regions and calculating to obtain the identification features of each connected region, respectively performing similarity matching on the identification features of each connected region and the identification features of the pre-stored template target patterns, and determining the connected region with the highest identification feature similarity as a target connected region, wherein the region image contained in the target connected region is a target image.
Preferably: reading a frame of image of an original video stream from an on-board camera device, carrying out binarization processing on the image, and then segmenting the image to obtain a plurality of connected regions;
carrying out normalization processing on the target connected region, wherein the normalized image feature is:

I'(i, j) = (I(i, j) − μ) / σ

where μ is the image mean, σ² is the image variance, I' is the normalized image feature, and I(i, j) is the pixel value at the corresponding pixel coordinate.
Preferably: the similarity matching of the identification features of each communication area and the identification features of the template target patterns stored in advance respectively comprises the following steps:
the matching calculation formula is as follows:
Figure 429427DEST_PATH_IMAGE006
obtaining a candidate target corresponding to the maximum matching value as a detected target;
wherein the content of the first and second substances,
Figure 947127DEST_PATH_IMAGE007
to best match the corresponding template target index,
Figure 426650DEST_PATH_IMAGE008
is a dot product operation of the feature image matrix,
Figure 129027DEST_PATH_IMAGE009
is the pixel area;
Figure 84214DEST_PATH_IMAGE010
in the above formula, i and j are subscripts of values in two dimensions, i.e. width and height, of the image, and the value range is as follows:
Figure 733501DEST_PATH_IMAGE011
Figure 736704DEST_PATH_IMAGE012
preferably: the identifying feature comprises a full profile feature of the reference target; the reference target comprises a first target and a second target, and the contour size of the first target is larger than that of the second target;
acquiring a connected domain of the reference target graph, and determining the type of the reference target through the connected domain;
the determining the type of the reference target by the connected domain comprises:
judging whether the connected domains of the first target and the second target are both included; if so, segmenting the first target and the second target, and determining the type of the reference target according to the first flight altitude and a flight altitude interval threshold.
Preferably: the first target and the second target have the same profile features, the second targets are stacked on top of each other and the second target is located in the center of the first target.
Preferably: the method for establishing the template target graph library comprises the following steps:
normalizing the template target according to a preset configuration size, rotating the template target through 360 degrees at a preset angular interval, calculating the feature vector of the template target at each rotation angle, and storing the feature vectors as a feature vector matrix; the feature vector includes at least a mean, a centroid offset, and a standard deviation.
Preferably: the coordinate of the reference target under a northeast coordinate system with the carrier centroid as an origin is obtained through coordinate system transformation calculation according to the centroid pixel coordinate, the attitude information and the second flying height, and the coordinate comprises:
knowing the onboard camera image-center pixel (u0, v0), the field angles (FOVx, FOVy), the offset T between the onboard camera mounting position and the carrier centroid, the flying height H of the carrier at a given moment, the attitude angles (yaw, pitch, roll), and the pixel position (u, v) of the reference target pattern in the image taken by the onboard camera;
calculating the camera visual field shooting range (width, height):
width = 2 · H · tan(FOVx / 2)
height = 2 · H · tan(FOVy / 2)
pixel coordinate system (o' -u-v) to camera coordinate (oc-xc-yc-zc) system conversion:
dx = width / Nu,  dy = height / Nv

xc = (u − u0) · dx
yc = (v − v0) · dy
zc ≈ H

where dx and dy are the actual physical lengths represented by each pixel in the width and height directions, respectively, and Nu × Nv is the image resolution in pixels; the coordinate of the target along the Z axis in the camera coordinate system is approximately equal to the flight height H of the aircraft;
camera coordinate system (oc-xc-yc-zc) to carrier coordinate system (oz-xz-yz-zz) transformation:
Rz(−π/2) = | cos(−π/2)  −sin(−π/2)  0 |   |  0  1  0 |
           | sin(−π/2)   cos(−π/2)  0 | = | −1  0  0 |
           |     0           0      1 |   |  0  0  1 |

[xz, yz, zz]^T = Rz(−π/2) · [xc, yc, zc]^T + T
transformation of the carrier coordinate system (oz-xz-yz-zz) to the northeast coordinate system (O-N-E-D):
Rx(roll) = | 1      0           0      |
           | 0  cos(roll)  −sin(roll) |
           | 0  sin(roll)   cos(roll) |

Ry(pitch) = |  cos(pitch)  0  sin(pitch) |
            |      0       1      0      |
            | −sin(pitch)  0  cos(pitch) |

Rz(yaw) = | cos(yaw)  −sin(yaw)  0 |
          | sin(yaw)   cos(yaw)  0 |
          |    0           0     1 |

[N, E, D]^T = Rz(yaw) · Ry(pitch) · Rx(roll) · [xz, yz, zz]^T
preferably: and expanding the target image size by the target multiple from the reference target image size through the centroid pixel coordinates to form a search box, and detecting the target image of the subsequent frame in the search box.
An unmanned aerial vehicle assisted positioning system based on airborne vision, the system comprising:
the attitude information acquisition unit is used for acquiring attitude information and a first flight height provided by the airborne flight controller;
a target image acquisition unit for acquiring a target image in an onboard camera video stream, the target image including an identification feature of a reference target image;
the rotation angle determining unit is used for matching the reference target pattern with the template target patterns contained in the template target pattern library; obtaining a template rotation angle corresponding to a template target graph with the highest similarity; so as to determine the rotation angle of the airborne camera relative to the reference target according to the template rotation angle;
the pixel calculation unit is used for calculating the centroid pixel coordinates and the pixel area of the reference target graph in the target image;
the second flying height calculating unit is used for calculating and obtaining the physical distance between the reference target and the airborne camera according to the actual physical size of the reference target and the field angle of the airborne camera and the pixel area so as to obtain the second flying height of the carrier according to the physical distance;
and the offset calculating unit is used for obtaining the coordinates of the reference target under a northeast coordinate system with the center of mass of the carrier as an origin through coordinate system transformation calculation according to the centroid pixel coordinates, the attitude information and the second flying height, so as to obtain the offset of the reference target relative to the carrier under the northeast coordinate system according to the coordinate calculation.
According to the specific embodiment provided by the invention, the invention discloses the following technical effects:
according to the unmanned aerial vehicle auxiliary positioning method based on airborne vision, aiming at the satellite signal rejection environment, the airborne monocular camera and the ground visual target are utilized, and positioning information can be provided for the unmanned aerial vehicle in real time according to the attitude angle and the height of the unmanned aerial vehicle provided by the flight controller and the longitude and latitude information before satellite signals are lost, so that the relative position relation between the unmanned aerial vehicle and the ground target is maintained. The method adopts passive positioning of airborne vision, has the advantages of intuition, difficult interference and the like, can be used as an independent positioning system of the unmanned aerial vehicle, and provides relative position information and positioning coordinate information in the taking-off and landing and following processes of the unmanned aerial vehicle.
In addition, under the preferred embodiment, the problem that the whole ground target cannot be shot due to the angle of view in the taking-off and landing process of the unmanned aerial vehicle is solved, the method for overlapping the large and small targets is adopted, and the large ground target contains the small target. The ground target identification is mainly realized in two stages in the taking-off and landing process of the unmanned aerial vehicle. When the relative flying height is larger than the threshold value, the onboard processor identifies the ground large target, and when the relative flying height is lower than the threshold value, the onboard processor cannot detect the complete large target, and the outline of the small target is automatically detected.
Of course, it is not necessary for any product in which the invention is practiced to achieve all of the above-described advantages at the same time.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the embodiments will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
Fig. 1 is a flowchart of an auxiliary positioning method for an unmanned aerial vehicle based on airborne vision according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a ground-based reference target according to an embodiment of the present invention;
fig. 3 is a schematic block diagram of an auxiliary positioning method for an unmanned aerial vehicle based on airborne vision according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart of a target detection algorithm provided by an embodiment of the present invention;
FIG. 5 is a schematic diagram of the relationship between coordinate systems provided by the embodiment of the present invention;
FIG. 6 is a block diagram of a vision-aided positioning system provided by an embodiment of the invention;
FIG. 7(a), FIG. 7(b), FIG. 7(c), FIG. 7(d), FIG. 7(e), FIG. 7(f), FIG. 7(g) and FIG. 7(h) are schematic diagrams of the rotation state of a partial target template provided by an embodiment of the present invention;
fig. 8 is a diagram of a horizontal offset (X-axis direction) trajectory for visual-assisted positioning of an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 9 is a diagram of a horizontal offset (Y-axis direction) trajectory for vision-assisted positioning of an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 10 is a schematic diagram of an auxiliary positioning system of an unmanned aerial vehicle based on airborne vision according to an embodiment of the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. It should be apparent that the described embodiments are only some of the embodiments of the present invention, and not all of the embodiments. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present invention.
Referring to fig. 1, an auxiliary positioning method for an unmanned aerial vehicle based on airborne vision according to an embodiment of the present invention is shown in fig. 1, and the method may include:
s101: acquiring attitude information and a first flight height provided by an airborne flight controller;
s102: acquiring a target image in an onboard camera video stream, wherein the target image comprises identification features of a reference target image;
s103: matching the reference target pattern with template target patterns contained in a template target pattern library; obtaining a template rotation angle corresponding to a template target graph with the highest similarity; so as to determine the rotation angle of the airborne camera relative to the reference target according to the template rotation angle;
s104: calculating the centroid pixel coordinates and pixel area of the reference target graph in the target image;
s105: calculating the physical distance between the reference target and the airborne camera according to the actual physical size of the reference target and the angle of view of the airborne camera and the pixel area so as to calculate and obtain a second flying height of the carrier according to the physical distance;
s106: and obtaining coordinates of the reference target in a northeast coordinate system with the center of mass of the carrier as an origin through coordinate system transformation calculation according to the center of mass pixel coordinates, the attitude information and the second flying height, so as to obtain the offset of the reference target relative to the carrier in the northeast coordinate system according to the coordinate calculation.
According to the unmanned aerial vehicle auxiliary positioning method based on airborne vision provided by the embodiment of the application, for a satellite-signal-denied environment an airborne monocular camera and a ground visual target are used to provide positioning information to the unmanned aerial vehicle in real time, based on the attitude angle and height provided by the flight controller and the longitude and latitude recorded before the satellite signal was lost, so as to maintain the relative position between the unmanned aerial vehicle and the ground target. The method uses passive airborne-vision positioning, which is intuitive and difficult to jam, and can serve as an independent positioning system for the unmanned aerial vehicle, providing relative position information and positioning coordinates during take-off, landing and target following.
The target image provided by the embodiment of the present application needs to contain the pattern of the reference target. Because each frame of the onboard camera video stream may capture much more than the reference target, in order to reliably extract the identification features of the reference target pattern, the embodiment of the present application acquires the target image from the onboard camera video stream as follows:
reading a frame of image of an original video stream from an on-board camera device, and dividing the image into a plurality of connected areas; screening out connected domains meeting the requirement of the area size according to the area of the connected domains; scaling the connected regions and calculating to obtain the identification features of each connected region, respectively performing similarity matching on the identification features of each connected region and the identification features of the pre-stored template target patterns, and determining the connected region with the highest identification feature similarity as a target connected region, wherein the region image contained in the target connected region is a target image. Specifically, a frame of image of an original video stream is read from an onboard camera device, and the image is divided into a plurality of connected areas after binarization processing is performed on the image;
carrying out normalization processing on the target connected region, wherein the normalized image feature is:

I'(i, j) = (I(i, j) − μ) / σ

where μ is the image mean, σ² is the image variance, I' is the normalized image feature, and I(i, j) is the pixel value at the corresponding pixel coordinate.
The similarity matching of the identification features of each connected region with the identification features of the pre-stored template target patterns comprises the following steps:
the matching calculation formula is as follows:

k* = argmax_k (1/S) · Σ_{i=0..W−1} Σ_{j=0..H−1} T_k(i, j) · I'(i, j)

and the candidate target corresponding to the maximum matching value is obtained as the detected target, where k* is the index of the best-matching template target, T_k(i, j) · I'(i, j) is the dot product (element-wise product) of the feature image matrices, and S is the pixel area; i and j are the indices over the two image dimensions, width W and height H, with value ranges 0 ≤ i < W and 0 ≤ j < H.
the identifying feature comprises a full profile feature of the reference target; the reference target comprises a first target and a second target, and the contour size of the first target is larger than that of the second target;
acquiring a connected domain of the reference target graph, and determining the type of the reference target through the connected domain;
the determining the type of the reference target by the connected domain comprises:
judging whether a connected domain of the first target and the second target is included, if so, segmenting the first target and the second target, and determining the type of the reference target according to the first flying height and the flying height interval threshold.
The first target and the second target have the same contour features; the two targets are nested, with the second target located at the center of the first target.
In order to solve the problem that the complete ground target cannot be captured during take-off and landing because of the limited field angle, the method adopts nested large and small targets: as shown in fig. 2, the large ground target contains the small target. Ground target identification during take-off and landing is realized in two stages. When the relative flying height is above the threshold, the onboard processor identifies the large ground target; when the relative flying height is below the threshold, the complete large target can no longer be detected and the contour of the small target is detected automatically.
When the contour of the target is detected, affine rotation transformation is carried out on the target according to the estimated target rotation angle, so that the actual pixel size of the target is estimated, and finally the current flying height of the unmanned aerial vehicle is estimated according to the optical imaging principle.
The method for establishing the template target graph library comprises the following steps:
normalizing the template target according to a preset configuration size, rotating the template target through 360 degrees at a preset angular interval, calculating the feature vector of the template target at each rotation angle, and storing the feature vectors as a feature vector matrix; the feature vector includes at least a mean, a centroid offset, and a standard deviation. When laying out the reference target, it is necessary to ensure that the designated part of the reference target is oriented in the same direction as the initial position of the template target.
The coordinate of the reference target under a northeast coordinate system with the carrier centroid as an origin is obtained through coordinate system transformation calculation according to the centroid pixel coordinate, the attitude information and the second flying height, and the coordinate comprises:
knowing the image-center pixel (u0, v0) of the airborne camera, the field angles (FOVx, FOVy), the offset T between the installation position of the airborne camera and the centroid of the carrier, the flying height H of the carrier at a given moment, the attitude angles (yaw, pitch, roll), and the pixel position (u, v) of the reference target pattern in the image shot by the airborne camera;
calculating the camera visual field shooting range (width, height):
width = 2 · H · tan(FOVx / 2)
height = 2 · H · tan(FOVy / 2)
pixel coordinate system (o' -u-v) to camera coordinate (oc-xc-yc-zc) system conversion:
dx = width / Nu,  dy = height / Nv

xc = (u − u0) · dx
yc = (v − v0) · dy
zc ≈ H

where dx and dy are the actual physical lengths represented by each pixel in the width and height directions, respectively, and Nu × Nv is the image resolution in pixels; the coordinate of the target along the Z axis in the camera coordinate system is approximately equal to the flight height H of the aircraft;
camera coordinate system (oc-xc-yc-zc) to carrier coordinate system (oz-xz-yz-zz) transformation:
Rz(−π/2) = | cos(−π/2)  −sin(−π/2)  0 |   |  0  1  0 |
           | sin(−π/2)   cos(−π/2)  0 | = | −1  0  0 |
           |     0           0      1 |   |  0  0  1 |

[xz, yz, zz]^T = Rz(−π/2) · [xc, yc, zc]^T + T
transformation of the carrier coordinate system (oz-xz-yz-zz) to the northeast coordinate system (O-N-E-D):
Rx(roll) = | 1      0           0      |
           | 0  cos(roll)  −sin(roll) |
           | 0  sin(roll)   cos(roll) |

Ry(pitch) = |  cos(pitch)  0  sin(pitch) |
            |      0       1      0      |
            | −sin(pitch)  0  cos(pitch) |

Rz(yaw) = | cos(yaw)  −sin(yaw)  0 |
          | sin(yaw)   cos(yaw)  0 |
          |    0           0     1 |

[N, E, D]^T = Rz(yaw) · Ry(pitch) · Rx(roll) · [xz, yz, zz]^T
in order to further improve the search efficiency of the target image, the embodiment of the application may provide that a search box is formed by expanding the size of the reference target image by a target multiple through the centroid pixel coordinate, and the target image detection of the subsequent frame is performed in the search box.
For the positioning of a multi-rotor unmanned aerial vehicle in a satellite-signal-denied environment, the method provided by the embodiment of the application effectively solves the drift problem of inertial navigation over long positioning periods. The method has low computing-power requirements; in practical applications, an ordinary embedded computing platform meets the computational demand.
The method provided by the present application is described in detail below with reference to fig. 3, by way of example of a nested target design format.
1. Target design
The present application employs a nested target design. The target shape is shown in figure 2. The size of the large target is 1.5 meters × 1.5 meters, and the size of the small target is 0.1 meter × 0.1 meter. The small target is located at the center of the large target. The target is printed on white tarpaulin.
2. Camera selection
A near-infrared band camera is selected; the horizontal field angle of the camera is 60 degrees, the vertical field angle is 34 degrees, and the resolution is 1080P (1920 × 1080 pixels).
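For orientation, these parameters fix the ground footprint of the camera at any height H through width = 2 · H · tan(FOVx / 2), assuming the nadir-pointing mounting described later: at H = 40 m (the height threshold used in section 3.4 below) the footprint is about 2 · 40 · tan(30°) ≈ 46.2 m by 2 · 40 · tan(17°) ≈ 24.5 m, i.e., roughly 2.4 cm of ground per pixel, so the 1.5 m large target spans on the order of 60 pixels.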
3. Description of the flow of an Algorithm
3.1 target template set preparation
The target template is scaled and rotated to obtain a template set, which serves as the feature set to be matched. The algorithm uniformly scales the shortest edge of the template to 30 pixels. Fig. 7(a) through 7(h) show a subset of the rotated templates.
3.2 target template feature extraction
The template target is normalized according to a preset configuration size and rotated through 360 degrees at a preset angular interval; the feature vector of the template at each rotation angle, including the mean, centroid offset, standard deviation, etc., is calculated and stored as the target template feature vector matrix.
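As an illustration, this template preparation and feature extraction could be sketched as follows in Python with OpenCV. The 1-degree rotation interval and the exact feature formulas (mean, centroid offset from the image center, standard deviation) are assumptions made for the sketch; the patent fixes only the 30-pixel shortest edge and the feature names.

```python
import cv2
import numpy as np

def template_feature(img):
    # Illustrative feature vector: mean, centroid offset from image center, std.
    m = cv2.moments(img)
    h, w = img.shape
    cx = m["m10"] / (m["m00"] + 1e-12)
    cy = m["m01"] / (m["m00"] + 1e-12)
    offset = np.hypot(cx - w / 2.0, cy - h / 2.0)
    return np.array([img.mean(), offset, img.std()])

def build_template_library(template, short_edge=30, step_deg=1.0):
    # Scale the (grayscale) template so its shortest edge is 30 pixels,
    # then rotate it through 360 degrees at the chosen angular interval.
    scale = short_edge / min(template.shape[:2])
    tmpl = cv2.resize(template, None, fx=scale, fy=scale)
    h, w = tmpl.shape[:2]
    angles = np.arange(0.0, 360.0, step_deg)
    feats = []
    for a in angles:
        rot = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), a, 1.0)
        feats.append(template_feature(cv2.warpAffine(tmpl, rot, (w, h))))
    return angles, np.stack(feats)  # feature-vector matrix, one row per angle
```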
3.3 target detection and Pixel coordinates and Pixel area calculation
A frame of the original video stream is read from the onboard camera device; binarization and image segmentation yield a plurality of connected regions; each connected region is scaled and its feature vector is computed for similarity matching against the target templates, and the candidate region with the highest similarity is taken as the detected target image.
Specifically, as shown in fig. 4, raw frame data of the video stream is read from the onboard image pickup apparatus, and binarization processing, image segmentation, and feature extraction are performed. In order to suppress interference from a complex background, the candidate sub-region image is normalized; the normalized image feature is:

I'(i, j) = (I(i, j) − μ) / σ

where μ is the image mean and σ² is the image variance. After the features of the template and of the candidate target image have been extracted separately, the target image features are matched against the template image features; the matching calculation formula is:
k* = argmax_k (1/S) · Σ_{i=0..W−1} Σ_{j=0..H−1} T_k(i, j) · I'(i, j)

and the candidate target corresponding to the maximum matching value is taken as the detected target, where k* is the index of the best-matching template target, T_k(i, j) · I'(i, j) is the dot product (element-wise product) of the feature image matrices, and S is the pixel area; i and j are the indices over the two image dimensions, width W and height H, with value ranges 0 ≤ i < W and 0 ≤ j < H.
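A minimal sketch of this normalization and matching step, as reconstructed above; it assumes the candidate sub-region has already been rescaled to the template size and that template_stack is a (K × H × W) array of normalized rotated templates built as in section 3.2:

```python
import numpy as np

def normalize(img):
    # I'(i, j) = (I(i, j) - mu) / sigma; the small constant guards flat regions.
    img = img.astype(np.float64)
    return (img - img.mean()) / (img.std() + 1e-12)

def best_template(candidate, template_stack):
    # Area-normalized dot product (1/S) * sum_ij T_k(i, j) * I'(i, j);
    # the index of the maximum identifies the best-matching rotated template.
    cand = normalize(candidate)
    s = float(cand.size)  # pixel area S
    scores = np.tensordot(template_stack, cand, axes=([1, 2], [0, 1])) / s
    k = int(np.argmax(scores))
    return k, float(scores[k])
```

The returned index k maps back to the rotation angle of the matched template, which is how the rotation angle of the camera relative to the target is recovered in step S103.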
if the target image is not detected, the algorithm is ended, the number of detection failure frames is recorded, and the next frame is entered. And if the target is detected, calculating the connected domain of the target. If the target comprises a large target connected domain and a small target connected domain, the large target and the small target are separated. And judging and selecting to adopt a large target or a small target as a reference according to a preset threshold value of the flight altitude interval. And calculating the pixel coordinates of the target centroid in the current frame image visual field, the pixel area of the target and the corresponding rotation angle of the target template. The pixel location of the target centroid and the size of the search box expanded by a preset multiple by the target size are saved. Target detection of subsequent frames is carried out in the search box, and detection efficiency and precision are improved.
3.4 fly height estimation
When the flying height of the unmanned aerial vehicle is below 40 meters, the physical distance between the target and the camera is calculated from the known actual physical size of the target and the field angle of the camera, combined with the pixel area of the target in the image computed in step 3.3. Based on monocular ranging and the principle that nearer objects image larger, the physical distance between the target and the camera can be computed in real time from the known physical target size, the camera field angle, and the measured pixel area of the target, completing the flying height estimation.
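The patent states the monocular-ranging principle but no closed-form expression, so the following derivation is an assumption: for a nadir view, a square target of side L covers roughly S = (L·Nu / (2H·tan(FOVx/2))) · (L·Nv / (2H·tan(FOVy/2))) pixels, which can be solved for H:

```python
import math

def height_from_area(pixel_area, target_side_m=1.5,
                     fov_x_deg=60.0, fov_y_deg=34.0, img_w=1920, img_h=1080):
    # Solve the nadir-view pixel-area model above for the flight height H.
    tx = math.tan(math.radians(fov_x_deg) / 2.0)
    ty = math.tan(math.radians(fov_y_deg) / 2.0)
    return 0.5 * target_side_m * math.sqrt(img_w * img_h / (pixel_area * tx * ty))
```

With the 1.5 m target and the camera of section 2, a measured area of about 4100 pixels gives H ≈ 40 m, consistent with the threshold above.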
3.5 target offset calculation
Using the target pixel coordinates calculated in step 3.3, combined with the attitude and height information provided by the flight controller, the offset of the target in the northeast-down coordinate system is obtained through a chain of coordinate system transformations (pixel coordinate system -> camera coordinate system -> carrier coordinate system -> northeast-down coordinate system).
The image-center pixel (u0, v0) of the camera, the field angles (FOVx, FOVy), the offset T between the camera mounting position and the drone centroid, the flying height H of the drone at a given moment, the attitude angles (yaw, pitch, roll), and the pixel position (u, v) of the target in the image taken by the camera are known. During installation, the horizontal and vertical field-angle directions of the camera are required to be parallel to the Y axis and the X axis of the carrier coordinate system, respectively.
Calculating the camera visual field shooting range (width, height):
width = 2 · H · tan(FOVx / 2)
height = 2 · H · tan(FOVy / 2)
pixel coordinate system (o' -u-v) to camera coordinate (oc-xc-yc-zc) system conversion:
dx = width / Nu,  dy = height / Nv

xc = (u − u0) · dx
yc = (v − v0) · dy
zc ≈ H

where dx and dy are the actual physical lengths represented by each pixel in the width and height directions, respectively, and Nu × Nv is the image resolution in pixels. The coordinate of the target along the Z axis in the camera coordinate system is approximately equal to the aircraft flight height H.
Camera coordinate system (oc-xc-yc-zc) to carrier coordinate system (oz-xz-yz-zz) transformation:
Rz(−π/2) = | cos(−π/2)  −sin(−π/2)  0 |   |  0  1  0 |
           | sin(−π/2)   cos(−π/2)  0 | = | −1  0  0 |
           |     0           0      1 |   |  0  0  1 |

[xz, yz, zz]^T = Rz(−π/2) · [xc, yc, zc]^T + T
as can be seen from the definition of the coordinate system in fig. 5, there is a rotation of an angle of-pi/2 around the Z-axis between the camera coordinate system and the carrier coordinate system. And assume that there is some offset between the camera mounting location and the carrier centroid location. The camera coordinates are transformed to the carrier coordinates, and the camera coordinates are equivalent to rotation and translation around the Z axis of the carrier coordinate system.
Transformation of the carrier coordinate system (oz-xz-yz-zz) to the northeast coordinate system (O-N-E-D):
Rx(roll) = | 1      0           0      |
           | 0  cos(roll)  −sin(roll) |
           | 0  sin(roll)   cos(roll) |

Ry(pitch) = |  cos(pitch)  0  sin(pitch) |
            |      0       1      0      |
            | −sin(pitch)  0  cos(pitch) |

Rz(yaw) = | cos(yaw)  −sin(yaw)  0 |
          | sin(yaw)   cos(yaw)  0 |
          |    0           0     1 |

[N, E, D]^T = Rz(yaw) · Ry(pitch) · Rx(roll) · [xz, yz, zz]^T
and converting the carrier coordinate system into a northeast coordinate system by adopting the attitude angle provided by the flight controller to obtain the coordinate of the target under the northeast coordinate system with the carrier centroid as the origin, namely the offset of the target relative to the carrier. And after the target offset calculation is finished, entering the next frame and repeatedly executing the steps 2 to 5.
4. Algorithmic computing platform selection
The HiSilicon (Haisi) Hi3559A chip was chosen as the computing platform in this experiment, and the algorithm package was ported to it. The camera front end collects video and delivers it to the algorithm through a serial port; the algorithm processes it online and stores the calculation results and related parameters.
5. Unmanned flight platform selection
An eight-rotor unmanned aerial vehicle was selected as the flight platform for the suspended-payload flight experiment. A nine-axis gyroscope and a barometer, rigidly connected with the camera, were used to provide the attitude and height information of the unmanned aerial vehicle for vision-assisted positioning. The camera is mounted vertically (looking downward) on the underside of the unmanned aerial vehicle and rigidly attached.
6. Field flight experiment
The experiment was performed on a relatively open field. The target was fixed on flat ground, with the camera facing the small target. The unmanned aerial vehicle took off from the ground-fixed target, hovered after climbing to the maximum height, and then landed at the same spot. The maximum flying height was 220 meters; throughout the flight the horizontal movement range, centered on the target, did not exceed 50 meters, and the attitude angles (roll, pitch) of the unmanned aerial vehicle were required to remain below 20 degrees.
7. Assisted positioning result analysis
In the initial stage of take-off, the maximum horizontal deviation reported by the vision-assisted positioning system reached 2 meters; as the height increased, the horizontal deviation stabilized within 0.25 meter. The main reason is that during take-off and landing, when the unmanned aerial vehicle is close to the target, the target laid on the ground is disturbed by the propeller downwash, causing the detected target to shift relative to the camera.
During the hovering phase, as shown in fig. 8, the offset of the drone in the X-axis direction stays within 1 meter; as shown in fig. 9, the offset in the Y-axis direction stays within 1.5 meters. The reason is that, under wind disturbance and single-point satellite positioning, the drone drifts horizontally while hovering, so the vision-assisted positioning data fluctuates. It should be noted that the vision-assisted positioning system smooths the positioning data, which prevents large jumps from reaching the flight controller while still outputting positioning information at a frequency that satisfies the flight controller's real-time requirement.
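The smoothing method itself is not disclosed; a first-order exponential filter is one minimal possibility that damps jumps while keeping the output rate, sketched here purely as an illustration:

```python
class OffsetSmoother:
    # First-order exponential smoothing of the (x, y) offset stream.
    # alpha close to 1 tracks quickly; close to 0 smooths strongly. This is
    # an illustrative substitute for the unspecified smoothing in the test.
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.state = None

    def update(self, x, y):
        if self.state is None:
            self.state = (x, y)
        else:
            a = self.alpha
            self.state = (a * x + (1 - a) * self.state[0],
                          a * y + (1 - a) * self.state[1])
        return self.state
```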
Referring to fig. 10, corresponding to the method for assisting positioning of an unmanned aerial vehicle based on airborne vision provided in the embodiment of the present application, as shown in fig. 10, an embodiment of the present application further provides a system for assisting positioning of an unmanned aerial vehicle based on airborne vision, where the system may specifically include:
an attitude information obtaining unit 201, configured to obtain attitude information provided by an airborne flight controller and a first flight altitude;
a target image obtaining unit 202, configured to obtain a target image in a video stream of an airborne camera, where the target image includes an identification feature of a reference target image;
a rotation angle determining unit 203, configured to match the reference target pattern with a template target pattern included in a template target pattern library; obtaining a template rotation angle corresponding to a template target graph with the highest similarity; so as to determine the rotation angle of the airborne camera relative to the reference target according to the template rotation angle;
a pixel calculating unit 204, configured to calculate centroid pixel coordinates and pixel areas of the reference target image in the target image;
a second flying height calculating unit 205, configured to calculate a physical distance between the reference target and the onboard camera according to the actual physical size of the reference target and the angle of view of the onboard camera in combination with the pixel area, so as to calculate a second flying height of the carrier according to the physical distance;
and an offset calculating unit 206, configured to obtain coordinates of the reference target in a northeast coordinate system with a carrier centroid as an origin through coordinate system transformation calculation according to the centroid pixel coordinates, the attitude information, and the second flying height, so as to obtain an offset of the reference target relative to the carrier in the northeast coordinate system according to the coordinate calculation.
Referring to fig. 6, the vision-aided positioning system provided in the embodiment of the present application mainly consists of three parts: the onboard camera, the onboard processor, and the ground target. The video collected by the onboard camera is transmitted directly to the processor built into the camera, which, combined with the aircraft attitude and height information provided by the flight controller, solves for the relative position between the target and the unmanned aerial vehicle. This reduces the dependence of multi-rotor unmanned aerial vehicles on satellite signals and solves their positioning problem in satellite-signal-denied environments.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
From the above description of the embodiments, it is clear to those skilled in the art that the present application can be implemented by software plus necessary general hardware platform. Based on such understanding, the technical solutions of the present application may be essentially or partially implemented in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, etc., and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments of the present application.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, the system or system embodiments are substantially similar to the method embodiments and therefore are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for related points. The above-described system and system embodiments are only illustrative, wherein the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. An unmanned aerial vehicle auxiliary positioning method based on airborne vision is characterized by comprising the following steps:
acquiring attitude information and a first flight height provided by an airborne flight controller;
acquiring a target image in an onboard camera video stream, wherein the target image comprises identification features of a reference target image;
matching the reference target pattern with template target patterns contained in a template target pattern library; obtaining a template rotation angle corresponding to a template target graph with the highest similarity; so as to determine the rotation angle of the airborne camera relative to the reference target according to the template rotation angle;
calculating the centroid pixel coordinates and pixel area of the reference target graph in the target image;
calculating the physical distance between the reference target and the airborne camera according to the actual physical size of the reference target and the angle of view of the airborne camera and the pixel area so as to calculate and obtain a second flying height of the carrier according to the physical distance;
and obtaining coordinates of the reference target in a northeast coordinate system with the center of mass of the carrier as an origin through coordinate system transformation calculation according to the center of mass pixel coordinates, the attitude information and the second flying height, so as to obtain the offset of the reference target relative to the carrier in the northeast coordinate system according to the coordinate calculation.
2. The method of claim 1, wherein the obtaining the target image in the video stream of the onboard camera comprises:
reading a frame of image of an original video stream from an on-board camera device, and dividing the image into a plurality of connected areas; screening out connected domains meeting the requirement of the area size according to the area of the connected domains; scaling the connected regions and calculating to obtain the identification features of each connected region, respectively performing similarity matching on the identification features of each connected region and the identification features of the pre-stored template target patterns, and determining the connected region with the highest identification feature similarity as a target connected region, wherein the region image contained in the target connected region is a target image.
3. The auxiliary positioning method for the unmanned aerial vehicle based on the airborne vision as claimed in claim 2, wherein a frame of image of an original video stream is read from an airborne camera device, and the image is divided into a plurality of connected regions after being subjected to binarization processing;
carrying out normalization processing on the target connected region, wherein the image normalization characteristics are as follows:
I'(i, j) = (I(i, j) − μ) / σ

where μ is the image mean, σ² is the image variance, I' is the normalized image feature, and I(i, j) is the pixel value at the corresponding pixel coordinate.
4. The method of claim 2, wherein the similarity matching of the identification features of each connected region with the identification features of the pre-stored template target patterns comprises:
the matching calculation formula is as follows:
k* = argmax_k (1/S) · Σ_{i=0..W−1} Σ_{j=0..H−1} T_k(i, j) · I'(i, j)

obtaining the candidate target corresponding to the maximum matching value as the detected target;
where k* is the index of the best-matching template target, T_k(i, j) · I'(i, j) is the dot product (element-wise product) of the feature image matrices, and S is the pixel area; i and j are the indices over the two image dimensions, width W and height H, with value ranges 0 ≤ i < W and 0 ≤ j < H.
5. the method of claim 1, wherein the identifying features comprise full profile features of a reference target; the reference target comprises a first target and a second target, and the contour size of the first target is larger than that of the second target;
acquiring a connected domain of the reference target graph, and determining the type of the reference target through the connected domain;
the determining the type of the reference target by the connected domain comprises:
judging whether the connected domains of the first target and the second target are both included; if so, segmenting the first target and the second target, and determining the type of the reference target according to the first flight altitude and a flight altitude interval threshold.
6. An airborne-vision-based unmanned aerial vehicle auxiliary positioning method as defined in claim 5, wherein the first target and the second target have the same contour features, the two targets are nested, and the second target is located at the center of the first target.
7. The method for assisting unmanned aerial vehicle positioning based on airborne vision according to claim 1, wherein the method for establishing the template target graphic library comprises the following steps:
normalizing the template target according to a preset configuration size, rotating the template target through 360 degrees at a preset angular interval, calculating the feature vector of the template target at each rotation angle, and storing the feature vectors as a feature vector matrix; the feature vector includes at least a mean, a centroid offset, and a standard deviation.
8. The method of claim 1, wherein the obtaining coordinates of the reference target in a northeast coordinate system with a carrier centroid as an origin by coordinate system transformation calculation according to the centroid pixel coordinates, the attitude information and the second flying height comprises:
knowing the onboard camera image-center pixel (u0, v0), the field angles (FOVx, FOVy), the offset T between the onboard camera mounting position and the carrier centroid, the flying height H of the carrier at a given moment, the attitude angles (yaw, pitch, roll), and the pixel position (u, v) of the reference target pattern in the image taken by the onboard camera;
calculating the camera visual field shooting range (width, height):
width = 2 · H · tan(FOVx / 2)
height = 2 · H · tan(FOVy / 2)
pixel coordinate system (o' -u-v) to camera coordinate (oc-xc-yc-zc) system conversion:
dx = width / Nu,  dy = height / Nv

xc = (u − u0) · dx
yc = (v − v0) · dy
zc ≈ H

where dx and dy are the actual physical lengths represented by each pixel in the width and height directions, respectively, and Nu × Nv is the image resolution in pixels; the coordinate of the target along the Z axis in the camera coordinate system is approximately equal to the flight height H of the aircraft;
camera coordinate system (oc-xc-yc-zc) to carrier coordinate system (oz-xz-yz-zz) transformation:
Rz(−π/2) = | cos(−π/2)  −sin(−π/2)  0 |   |  0  1  0 |
           | sin(−π/2)   cos(−π/2)  0 | = | −1  0  0 |
           |     0           0      1 |   |  0  0  1 |

[xz, yz, zz]^T = Rz(−π/2) · [xc, yc, zc]^T + T
transformation of the carrier coordinate system (oz-xz-yz-zz) to the northeast coordinate system (O-N-E-D):
Rx(roll) = | 1      0           0      |
           | 0  cos(roll)  −sin(roll) |
           | 0  sin(roll)   cos(roll) |

Ry(pitch) = |  cos(pitch)  0  sin(pitch) |
            |      0       1      0      |
            | −sin(pitch)  0  cos(pitch) |

Rz(yaw) = | cos(yaw)  −sin(yaw)  0 |
          | sin(yaw)   cos(yaw)  0 |
          |    0           0     1 |

[N, E, D]^T = Rz(yaw) · Ry(pitch) · Rx(roll) · [xz, yz, zz]^T
9. the method of claim 1, wherein the centroid pixel coordinates are expanded by a target multiple from the size of the reference target pattern to form a search box, and target image detection of subsequent frames is performed in the search box.
10. An unmanned aerial vehicle auxiliary positioning system based on airborne vision, the system comprising:
an attitude information acquisition unit, used for acquiring the attitude information and the first flight height provided by the airborne flight controller;
a target image acquisition unit, used for acquiring a target image in the onboard camera video stream, the target image including the identification features of a reference target graphic;
a rotation angle determining unit, used for matching the reference target graphic against the template target graphics contained in the template target graphic library, obtaining the template rotation angle corresponding to the template target graphic with the highest similarity, and thereby determining the rotation angle of the airborne camera relative to the reference target;
a pixel calculation unit, used for calculating the centroid pixel coordinates and the pixel area of the reference target graphic in the target image;
a second flying height calculating unit, used for calculating the physical distance between the reference target and the airborne camera from the actual physical size of the reference target, the field angle of the airborne camera and the pixel area, so as to obtain the second flying height of the carrier from that physical distance;
and an offset calculating unit, used for obtaining, through coordinate system transformation according to the centroid pixel coordinates, the attitude information and the second flying height, the coordinates of the reference target in a north-east-down coordinate system with the carrier centroid as origin, and calculating from those coordinates the offset of the reference target relative to the carrier in the north-east-down coordinate system.
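As a worked example of the second flying height computation, the height can be recovered from the ratio of the target's known physical area to its measured pixel area, using the per-pixel ground footprint implied by the equations of claim 8; the nadir-pointing camera and the symbol names here are assumptions:

```python
import math

def second_flying_height(area_phys, area_pix, img_w, img_h, fov_x, fov_y):
    """Flight height H (same length unit as sqrt(area_phys)) recovered
    from the measured pixel area of a target of known physical area."""
    # At height H each pixel covers (2*H*tan(fov_x/2)/img_w) by
    # (2*H*tan(fov_y/2)/img_h) on the ground, so
    # area_phys = area_pix * H^2 * 4*tan(fov_x/2)*tan(fov_y/2)/(img_w*img_h)
    k = 4.0 * math.tan(fov_x / 2.0) * math.tan(fov_y / 2.0) / (img_w * img_h)
    return math.sqrt(area_phys / (area_pix * k))
```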
CN202210611516.4A 2022-06-01 2022-06-01 Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision Pending CN114689030A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210611516.4A CN114689030A (en) 2022-06-01 2022-06-01 Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210611516.4A CN114689030A (en) 2022-06-01 2022-06-01 Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision

Publications (1)

Publication Number Publication Date
CN114689030A true CN114689030A (en) 2022-07-01

Family

ID=82130978

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210611516.4A Pending CN114689030A (en) 2022-06-01 2022-06-01 Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision

Country Status (1)

Country Link
CN (1) CN114689030A (en)


Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9723831D0 (en) * 1995-11-06 2007-12-12 Secr Defence Method of weapon guidance by target
JP2005037191A (en) * 2003-07-17 2005-02-10 Asia Air Survey Co Ltd Automatic orientation method by special mark
CN103905746A (en) * 2012-12-28 2014-07-02 清华大学 Method and device for localization and superposition of sub-pixel-level image offset and video device
CN103424126A (en) * 2013-08-12 2013-12-04 西安电子科技大学 System and method for verifying visual autonomous landing simulation of unmanned aerial vehicle
CN106774423A (en) * 2017-02-28 2017-05-31 亿航智能设备(广州)有限公司 The landing method and system of a kind of unmanned plane
CN107330917A (en) * 2017-06-23 2017-11-07 歌尔股份有限公司 The track up method and tracking equipment of mobile target
CN108227685A (en) * 2018-01-08 2018-06-29 中科开元信息技术(北京)有限公司 A kind of real-time control system for being classified remotely pilotless vehicle
US20200342360A1 (en) * 2018-06-08 2020-10-29 Tencent Technology (Shenzhen) Company Limited Image processing method and apparatus, and computer-readable medium, and electronic device
CN108873917A (en) * 2018-07-05 2018-11-23 太原理工大学 A kind of unmanned plane independent landing control system and method towards mobile platform
CN109409387A (en) * 2018-11-06 2019-03-01 深圳增强现实技术有限公司 The acquisition direction of image capture device determines method, apparatus and electronic equipment
CN109341543A (en) * 2018-11-13 2019-02-15 厦门市汉飞鹰航空科技有限公司 A kind of height calculation method of view-based access control model image
US20210390301A1 (en) * 2019-03-08 2021-12-16 Shen Zhen Clearvision Robotics, Inc, Limited Indoor vision positioning system and mobile robot
CN110222612A (en) * 2019-05-27 2019-09-10 北京交通大学 Dynamic target recognition and tracking for unmanned plane Autonomous landing
CN111079985A (en) * 2019-11-26 2020-04-28 昆明理工大学 Criminal case criminal period prediction method based on BERT and fused with distinguishable attribute features
CN111598952A (en) * 2020-05-21 2020-08-28 华中科技大学 Multi-scale cooperative target design and online detection and identification method and system
CN112215860A (en) * 2020-09-23 2021-01-12 国网福建省电力有限公司漳州供电公司 Unmanned aerial vehicle positioning method based on image processing
CN112731966A (en) * 2020-12-22 2021-04-30 广州优飞信息科技有限公司 Special landing control method and device for multi-rotor unmanned aerial vehicle integrating vision
CN112819094A (en) * 2021-02-25 2021-05-18 北京时代民芯科技有限公司 Target detection and identification method based on structural similarity measurement
CN113222838A (en) * 2021-05-07 2021-08-06 国网山西省电力公司吕梁供电公司 Unmanned aerial vehicle autonomous line patrol method based on visual positioning
CN113989174A (en) * 2021-10-29 2022-01-28 北京百度网讯科技有限公司 Image fusion method and training method and device of image fusion model
CN114200396A (en) * 2021-11-01 2022-03-18 中国人民解放军91977部队 Tethered unmanned aerial vehicle photoelectric positioning system independent of satellite navigation technology
CN114018155A (en) * 2021-11-19 2022-02-08 上海交通大学 Method and system for detecting precision of chemical milling laser engraving profile

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Wan Gang et al.: "UAV Surveying and Mapping Technology and Application", Surveying and Mapping Press, 30 December 2015 *
Zeng Zhenhua et al.: "Research on the control system for autonomous precision landing of multi-rotor UAVs", Journal of Guangdong University of Technology *
Hong Fuxiang et al.: "A machine-vision-based method for precise UAV landing on a concentric-circle target", Chinese Journal of Quantum Electronics *
Nie Xuan et al.: "A visual positioning method based on cooperative targets", Guidance & Fuze *
Xie Hongsheng: "Several Issues in Image Retrieval Based on Support Vector Machines", Shandong People's Publishing House, 30 October 2013 *
Han Hangdi et al.: "Research on the autonomous landing method of vision-based automatic battery-swapping UAVs", Electrical Engineering & Automation *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116051628A (en) * 2023-01-16 2023-05-02 北京卓翼智能科技有限公司 Unmanned aerial vehicle positioning method and device, electronic equipment and storage medium
CN116051628B (en) * 2023-01-16 2023-10-27 北京卓翼智能科技有限公司 Unmanned aerial vehicle positioning method and device, electronic equipment and storage medium
CN116164711A (en) * 2023-03-09 2023-05-26 广东精益空间信息技术股份有限公司 Unmanned aerial vehicle mapping method, unmanned aerial vehicle mapping system, unmanned aerial vehicle mapping medium and unmanned aerial vehicle mapping computer
CN116164711B (en) * 2023-03-09 2024-03-29 广东精益空间信息技术股份有限公司 Unmanned aerial vehicle mapping method, unmanned aerial vehicle mapping system, unmanned aerial vehicle mapping medium and unmanned aerial vehicle mapping computer

Similar Documents

Publication Publication Date Title
CN111326023B (en) Unmanned aerial vehicle route early warning method, device, equipment and storage medium
Zhao et al. Detection, tracking, and geolocation of moving vehicle from uav using monocular camera
CN106874854B (en) Unmanned aerial vehicle tracking method based on embedded platform
CN103149939B (en) A kind of unmanned plane dynamic target tracking of view-based access control model and localization method
CN102353377B (en) High altitude long endurance unmanned aerial vehicle integrated navigation system and navigating and positioning method thereof
CN103822635B (en) The unmanned plane during flying spatial location real-time computing technique of view-based access control model information
Martínez et al. On-board and ground visual pose estimation techniques for UAV control
Yahyanejad et al. Incremental mosaicking of images from autonomous, small-scale uavs
CN113485441A (en) Distribution network inspection method combining unmanned aerial vehicle high-precision positioning and visual tracking technology
CN107240063A (en) A kind of autonomous landing method of rotor wing unmanned aerial vehicle towards mobile platform
CN108292140A (en) System and method for making a return voyage automatically
CN104808685A (en) Vision auxiliary device and method for automatic landing of unmanned aerial vehicle
CN109212545A (en) Multiple source target following measuring system and tracking based on active vision
CN105549614A (en) Target tracking method of unmanned plane
CN114689030A (en) Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision
CN105644785A (en) Unmanned aerial vehicle landing method based on optical flow method and horizon line detection
CN106949880B (en) The method that area's unmanned plane image part degree of overlapping crosses high disposal is surveyed in height above sea level big rise and fall
CN109460046B (en) Unmanned aerial vehicle natural landmark identification and autonomous landing method
CN109341686B (en) Aircraft landing pose estimation method based on visual-inertial tight coupling
CN111426320A (en) Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter
CN110058604A (en) A kind of accurate landing system of unmanned plane based on computer vision
CN114034296A (en) Navigation signal interference source detection and identification method and system
Ghosh et al. AirTrack: Onboard deep learning framework for long-range aircraft detection and tracking
CN109764864B (en) Color identification-based indoor unmanned aerial vehicle pose acquisition method and system
CN116202489A (en) Method and system for co-locating power transmission line inspection machine and pole tower and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20220701)